WorldWideScience

Sample records for non-parametric vote-counting approach

  1. Vote Counting as Mathematical Proof

    DEFF Research Database (Denmark)

    Schürmann, Carsten; Pattinson, Dirk

    2015-01-01

    Trust in the correctness of an election outcome requires proof of the correctness of vote counting. By formalising particular voting protocols as rules, correctness of vote counting amounts to verifying that all rules have been applied correctly. A proof of the outcome of any particular election ... Using a rule-based formalisation of voting protocols inside a theorem prover, we synthesise vote counting programs that are not only provably correct, but also produce independently verifiable certificates. These programs are generated from a (formal) proof that every initial set of ballots allows one to decide the election winner...
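
    As a loose illustration of the certificate idea described in this record (not the authors' theorem-prover synthesis), the sketch below counts first-past-the-post ballots while logging every rule application, so the outcome can be re-checked independently of the counting program; all names are hypothetical.

```python
# Hypothetical sketch: first-past-the-post counting with a replayable
# certificate; an illustration of the idea only, not the authors' method.
from collections import Counter

def count_votes(ballots):
    """Apply the 'count one ballot' rule repeatedly, logging each step."""
    tally, certificate = Counter(), []
    for i, choice in enumerate(ballots):
        tally[choice] += 1
        certificate.append((i, choice, dict(tally)))   # rule application
    winner = max(tally, key=tally.get)
    certificate.append(("declare_winner", winner, dict(tally)))
    return winner, certificate

def verify(certificate):
    """Replay the certificate independently of the counting program."""
    tally = Counter()
    for _, choice, claimed in certificate[:-1]:
        tally[choice] += 1
        if dict(tally) != claimed:        # forged or skipped step
            return False
    kind, winner, final = certificate[-1]
    return kind == "declare_winner" and winner == max(final, key=final.get)

winner, cert = count_votes(["A", "B", "A", "A", "C"])
assert winner == "A" and verify(cert)
```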

  2. A non-parametric approach to investigating fish population dynamics

    National Research Council Canada - National Science Library

    Cook, R.M; Fryer, R.J

    2001-01-01

    .... Using a non-parametric model for the stock-recruitment relationship it is possible to avoid defining specific functions relating recruitment to stock size while also providing a natural framework to model process error...

  3. Non-parametric approach to the study of phenotypic stability.

    Science.gov (United States)

    Ferreira, D F; Fernandes, S B; Bruzi, A T; Ramalho, M A P

    2016-02-19

    The aim of this study was to undertake the theoretical derivations of non-parametric methods, which use linear regressions based on rank order, for stability analyses. These methods extend different parametric methods used for stability analyses, and the results were compared with a standard non-parametric method. Intensive computational methods (e.g., bootstrap and permutation) were applied, and data from the plant-breeding program of the Biology Department of UFLA (Minas Gerais, Brazil) were used to illustrate and compare the tests. The non-parametric stability methods were effective for the evaluation of phenotypic stability. In the presence of variance heterogeneity, the non-parametric methods exhibited greater power of discrimination when determining the phenotypic stability of genotypes.

  4. ANALYSIS OF TIED DATA: AN ALTERNATIVE NON-PARAMETRIC APPROACH

    Directory of Open Access Journals (Sweden)

    I. C. A. OYEKA

    2012-02-01

    This paper presents a non-parametric statistical method of analyzing two-sample data that makes provision for the possibility of ties in the data. A test statistic is developed and shown to be free of the effect of any possible ties in the data. An illustrative example is provided, and the method is shown to compare favourably with its competitor, the Mann-Whitney test, and to be more powerful than the latter when ties are present.
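
    The paper's own tie-adjusted statistic is not reproduced in this record, so the sketch below only exercises its named competitor, the Mann-Whitney test, whose asymptotic form applies the standard tie correction to the variance (SciPy handles this automatically).

```python
# Usage of the named competitor; the asymptotic method applies the
# standard tie correction to the variance of U.
from scipy.stats import mannwhitneyu

x = [3, 5, 5, 7, 9, 9, 9]      # sample 1, with ties
y = [2, 5, 6, 6, 9, 10]        # sample 2, ties across samples too
u, p = mannwhitneyu(x, y, alternative="two-sided", method="asymptotic")
print(f"U = {u}, p = {p:.3f}")
```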

  5. Binary Classifier Calibration Using a Bayesian Non-Parametric Approach.

    Science.gov (United States)

    Naeini, Mahdi Pakdaman; Cooper, Gregory F; Hauskrecht, Milos

    Learning probabilistic predictive models that are well calibrated is critical for many prediction and decision-making tasks in data mining. This paper presents two new non-parametric methods for calibrating outputs of binary classification models: a method based on Bayes optimal selection and a method based on Bayesian model averaging. The advantage of these methods is that they are independent of the algorithm used to learn a predictive model, and they can be applied in a post-processing step, after the model is learned. This makes them applicable to a wide variety of machine learning models and methods. These calibration methods, as well as other methods, are tested on a variety of datasets in terms of both discrimination and calibration performance. The results show the methods either outperform or are comparable in performance to the state-of-the-art calibration methods.

  6. A New Non-Parametric Approach to Galaxy Morphological Classification

    CERN Document Server

    Lotz, Jennifer M.; Primack, Joel; Madau, Piero

    2003-01-01

    We present two new non-parametric methods for quantifying galaxy morphology: the relative distribution of the galaxy pixel flux values (the Gini coefficient or G) and the second-order moment of the brightest 20% of the galaxy's flux (M20). We test the robustness of G and M20 to decreasing signal-to-noise and spatial resolution, and find that both measures are reliable to within 10% at average signal-to-noise per pixel greater than 3 and resolutions better than 1000 pc and 500 pc, respectively. We have measured G and M20, as well as concentration (C), asymmetry (A), and clumpiness (S) in the rest-frame near-ultraviolet/optical wavelengths for 150 bright local "normal" Hubble type galaxies (E-Sd) and 104 ultra-luminous infrared galaxies (ULIRGs) at 0.05 < z < 0.25. We find that most local galaxies follow a tight sequence in G-M20-C, where early types have high G and C and low M20, and late-type spirals have lower G and C and higher M20. The majority of ULIRGs lie above the normal galaxy G-M20 sequence...
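
    A minimal sketch of the two statistics on an already-segmented list of galaxy pixels; the flux-weighted centroid below is a simplification of the moment-minimising centre the authors adopt.

```python
# Sketch of G and M20 on segmented galaxy pixels (x, y: coordinates,
# f: fluxes).
import numpy as np

def gini(f):
    f = np.sort(np.abs(f))                       # ascending |flux|
    n = f.size
    i = np.arange(1, n + 1)
    return ((2 * i - n - 1) * f).sum() / (f.mean() * n * (n - 1))

def m20(x, y, f):
    xc, yc = np.average(x, weights=f), np.average(y, weights=f)
    mi = f * ((x - xc) ** 2 + (y - yc) ** 2)     # second-order moments
    order = np.argsort(f)[::-1]                  # brightest pixels first
    cum = np.cumsum(f[order])
    k = np.searchsorted(cum, 0.2 * f.sum()) + 1  # pixels holding 20% flux
    return np.log10(mi[order[:k]].sum() / mi.sum())

x, y = np.meshgrid(np.arange(50), np.arange(50))
f = np.exp(-((x - 25) ** 2 + (y - 25) ** 2) / 50.0).ravel()
print(f"G = {gini(f):.2f}, M20 = {m20(x.ravel(), y.ravel(), f):.2f}")
```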

  7. Homothetic Efficiency and Test Power: A Non-Parametric Approach

    NARCIS (Netherlands)

    J. Heufer (Jan); P. Hjertstrand (Per)

    2015-01-01

    We provide a nonparametric revealed preference approach to demand analysis based on homothetic efficiency. Homotheticity is a useful restriction, but data rarely satisfy testable conditions. To overcome this we provide a way to estimate the homothetic efficiency of...

  8. Non-parametric Tuning of PID Controllers: A Modified Relay-Feedback-Test Approach

    CERN Document Server

    Boiko, Igor

    2013-01-01

    The relay feedback test (RFT) has become a popular and efficient tool used in process identification and automatic controller tuning. Non-parametric Tuning of PID Controllers couples new modifications of the classical RFT with application-specific optimal tuning rules to form a non-parametric method of test-and-tuning. Test and tuning are coordinated through a set of common parameters so that a PID controller can obtain the desired gain or phase margins in a system exactly, even with unknown process dynamics. The concept of process-specific optimal tuning rules in the non-parametric setup, with corresponding tuning rules for flow, level, pressure, and temperature control loops, is presented in the text. Common problems of tuning accuracy based on parametric and non-parametric approaches are addressed. In addition, the text treats the parametric approach to tuning based on the modified RFT approach and the exact model of oscillations in the system under test using the locus of a perturbed relay system (LPRS) meth...
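
    The book's modified RFT and its application-specific rules are not reproduced in this record; the sketch below shows the classical relay-feedback baseline it builds on: read off the oscillation induced by the relay, estimate the ultimate gain and period, and apply textbook Ziegler-Nichols PID rules.

```python
# Classical RFT baseline (not the book's modified test): a relay of
# amplitude h induces a limit cycle of amplitude a and period Tu, giving
# the ultimate gain Ku = 4h / (pi * a); Ziegler-Nichols rules follow.
import math

def classical_rft_pid(h, a, tu):
    ku = 4.0 * h / (math.pi * a)          # ultimate gain (describing function)
    return 0.6 * ku, tu / 2.0, tu / 8.0   # Kp, Ti, Td (Ziegler-Nichols PID)

kp, ti, td = classical_rft_pid(h=1.0, a=0.25, tu=12.0)
print(f"Kp = {kp:.2f}, Ti = {ti:.1f} s, Td = {td:.2f} s")
```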

  9. The geometry of distributional preferences and a non-parametric identification approach: The Equality Equivalence Test.

    Science.gov (United States)

    Kerschbamer, Rudolf

    2015-05-01

    This paper proposes a geometric delineation of distributional preference types and a non-parametric approach for their identification in a two-person context. It starts with a small set of assumptions on preferences and shows that this set (i) naturally results in a taxonomy of distributional archetypes that nests all empirically relevant types considered in previous work; and (ii) gives rise to a clean experimental identification procedure - the Equality Equivalence Test - that discriminates between archetypes according to core features of preferences rather than properties of specific modeling variants. As a by-product the test yields a two-dimensional index of preference intensity.

  10. A non-parametric approach to estimate the total deviation index for non-normal data.

    Science.gov (United States)

    Perez-Jaume, Sara; Carrasco, Josep L

    2015-11-10

    Concordance indices are used to assess the degree of agreement between different methods that measure the same characteristic. In this context, the total deviation index (TDI) is an unscaled concordance measure that quantifies the extent to which readings from the same subject obtained by different methods may differ with a certain probability. Common approaches to estimate the TDI assume data are normally distributed and linearity between response and effects (subjects, methods and random error). Here, we introduce a new non-parametric methodology for estimation and inference of the TDI that can deal with any kind of quantitative data. The present study introduces this non-parametric approach and compares it with the already established methods in two real case examples that represent situations of non-normal data (more specifically, skewed data and count data). The performance of the already established methodologies and our approach in these contexts is assessed by means of a simulation study.
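
    A minimal sketch of the non-parametric idea, assuming paired readings from two methods: the TDI at probability p is estimated as the empirical p-quantile of the absolute paired differences, with a bootstrap supplying inference (the paper's exact estimator and inference procedure may differ).

```python
# Non-parametric TDI as an empirical quantile of |paired differences|,
# with a bootstrap confidence interval; the data are synthetic counts.
import numpy as np

rng = np.random.default_rng(0)

def tdi(d, p=0.9):
    return np.quantile(np.abs(d), p)

d = rng.poisson(3, 100) - rng.poisson(3, 100)   # paired count differences
boot = np.array([tdi(rng.choice(d, d.size, replace=True))
                 for _ in range(2000)])
lo, hi = np.quantile(boot, [0.025, 0.975])
print(f"TDI(0.9) = {tdi(d):.2f}, 95% bootstrap CI = ({lo:.2f}, {hi:.2f})")
```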

  11. A non-parametric Bayesian approach for clustering and tracking non-stationarities of neural spikes.

    Science.gov (United States)

    Shalchyan, Vahid; Farina, Dario

    2014-02-15

    Neural spikes from multiple neurons recorded in a multi-unit signal are usually separated by clustering. Drifts in the position of the recording electrode relative to the neurons over time cause gradual changes in the position and shapes of the clusters, challenging the clustering task. By dividing the data into short time intervals, Bayesian tracking of the clusters based on a Gaussian cluster model has been previously proposed. However, the Gaussian cluster model is often not verified for neural spikes. We present a Bayesian clustering approach that makes no assumptions on the distribution of the clusters and uses kernel-based density estimation of the clusters in every time interval as a prior for Bayesian classification of the data in the subsequent time interval. The proposed method was tested and compared to the Gaussian model-based approach for cluster tracking by using both simulated and experimental datasets. The results showed that the proposed non-parametric kernel-based density estimation of the clusters outperformed the sequential Gaussian model fitting in both simulated and experimental data tests. Using non-parametric kernel density-based clustering that makes no assumptions on the distribution of the clusters enhances the ability to track cluster non-stationarity over time with respect to the Gaussian cluster modelling approach.

  12. Comparative Study of Parametric and Non-parametric Approaches in Fault Detection and Isolation

    DEFF Research Database (Denmark)

    Katebi, S.D.; Blanke, M.; Katebi, M.R.

    This report describes a comparative study between two approaches to fault detection and isolation in dynamic systems. The first approach uses a parametric model of the system. The main components of such techniques are residual and signature generation for processing and analysis. The second... approach is non-parametric in the sense that the signature analysis is only dependent on the frequency or time domain information extracted directly from the input-output signals. Based on these approaches, two different fault monitoring schemes are developed where the feature extraction and fault decision... algorithms employed are adopted from template matching in pattern recognition. Extensive simulation studies are performed to demonstrate satisfactory performance of the proposed techniques. The advantages and disadvantages of each approach are discussed and analyzed...

  13. Assessing T cell clonal size distribution: a non-parametric approach.

    Science.gov (United States)

    Bolkhovskaya, Olesya V; Zorin, Daniil Yu; Ivanchenko, Mikhail V

    2014-01-01

    Clonal structure of the human peripheral T-cell repertoire is shaped by a number of homeostatic mechanisms, including antigen presentation, cytokine and cell regulation. Its accurate tuning leads to a remarkable ability to combat pathogens in all their variety, while systemic failures may lead to severe consequences like autoimmune diseases. Here we develop and make use of a non-parametric statistical approach to assess T cell clonal size distributions from recent next generation sequencing data. For 41 healthy individuals and a patient with ankylosing spondylitis who underwent treatment, we invariably find power law scaling over several decades and for the first time calculate quantitatively meaningful values of the decay exponent. The exponent proved to be much the same among healthy donors, significantly different for the autoimmune patient before therapy, and converging towards a typical value afterwards. We discuss implications of the findings for theoretical understanding and mathematical modeling of adaptive immunity.
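
    A sketch of how a decay exponent can be estimated from clone sizes above a cutoff, using the standard continuous maximum-likelihood ("Hill") estimator for a power-law tail; the authors' non-parametric procedure is not reproduced in the record, so this is only an illustration on toy data.

```python
# Illustrative power-law exponent fit: alpha = 1 + n / sum(ln(x / xmin)).
import numpy as np

def power_law_exponent(sizes, xmin):
    x = np.asarray(sizes, dtype=float)
    x = x[x >= xmin]                      # keep the tail above the cutoff
    return 1.0 + x.size / np.log(x / xmin).sum()

rng = np.random.default_rng(1)
clone_sizes = np.round(rng.pareto(1.8, 5000) + 1)   # synthetic repertoire
print(f"decay exponent ≈ {power_law_exponent(clone_sizes, xmin=2):.2f}")
```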

  14. Assessing T cell clonal size distribution: a non-parametric approach.

    Directory of Open Access Journals (Sweden)

    Olesya V Bolkhovskaya

    Clonal structure of the human peripheral T-cell repertoire is shaped by a number of homeostatic mechanisms, including antigen presentation, cytokine and cell regulation. Its accurate tuning leads to a remarkable ability to combat pathogens in all their variety, while systemic failures may lead to severe consequences like autoimmune diseases. Here we develop and make use of a non-parametric statistical approach to assess T cell clonal size distributions from recent next generation sequencing data. For 41 healthy individuals and a patient with ankylosing spondylitis who underwent treatment, we invariably find power law scaling over several decades and for the first time calculate quantitatively meaningful values of the decay exponent. The exponent proved to be much the same among healthy donors, significantly different for the autoimmune patient before therapy, and converging towards a typical value afterwards. We discuss implications of the findings for theoretical understanding and mathematical modeling of adaptive immunity.

  15. Factors associated with malnutrition among tribal children in India: a non-parametric approach.

    Science.gov (United States)

    Debnath, Avijit; Bhattacharjee, Nairita

    2014-06-01

    The purpose of this study is to identify the determinants of malnutrition among the tribal children in India. The investigation is based on secondary data compiled from the National Family Health Survey-3. We used a classification and regression tree model, a non-parametric approach, to address the objective. Our analysis shows that breastfeeding practice, economic status, antenatal care of mother and women's decision-making autonomy are negatively associated with malnutrition among tribal children. We identify maternal malnutrition and urban concentration of household as the two risk factors for child malnutrition. The identified associated factors may be used for designing and targeting preventive programmes for malnourished tribal children.
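
    A minimal sketch of the classification-tree idea with scikit-learn; the data and feature names below are synthetic placeholders for the NFHS-3 survey variables the study analyses.

```python
# Classification-tree sketch on synthetic binary survey indicators.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(2)
X = rng.integers(0, 2, size=(500, 4))              # 0/1 survey indicators
y = ((X[:, 0] == 0) & (X[:, 2] == 0)).astype(int)  # synthetic risk rule

tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=20).fit(X, y)
print(export_text(tree, feature_names=["breastfed", "wealth",
                                       "antenatal_care", "maternal_autonomy"]))
```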

  16. Rural-urban Migration and Dynamics of Income Distribution in China: A Non-parametric Approach

    Institute of Scientific and Technical Information of China (English)

    Yong Liu; Wei Zou

    2011-01-01

    Extending the income dynamics approach in Quah (2003), the present paper studies the enlarging income inequality in China over the past three decades from the viewpoint of rural-urban migration and economic transition. We establish non-parametric estimations of rural and urban income distribution functions in China, and aggregate a population-weighted, nationwide income distribution function taking into account rural-urban differences in technological progress and price indexes. We calculate 12 inequality indexes through non-parametric estimation to overcome the biases in existing parametric estimation and, therefore, provide more accurate measurement of income inequality. Policy implications have been drawn based on our research.

  17. A Non-parametric Approach to the Overall Estimate of Cognitive Load Using NIRS Time Series.

    Science.gov (United States)

    Keshmiri, Soheil; Sumioka, Hidenobu; Yamazaki, Ryuji; Ishiguro, Hiroshi

    2017-01-01

    We present a non-parametric approach to prediction of the n-back (n ∈ {1, 2}) task as a proxy measure of mental workload using Near Infrared Spectroscopy (NIRS) data. In particular, we focus on measuring the mental workload through hemodynamic responses in the brain induced by these tasks, thereby realizing the potential they offer for detection in real-world scenarios (e.g., the difficulty of a conversation). Our approach takes advantage of the intrinsic linearity inherent in the components of the NIRS time series to adopt a one-step regression strategy. We demonstrate the correctness of our approach through its mathematical analysis. Furthermore, we study the performance of our model in an inter-subject setting in contrast with state-of-the-art techniques in the literature to show a significant improvement in prediction of these tasks (82.50 and 86.40% for female and male participants, respectively). Moreover, our empirical analysis suggests a gender-difference effect on the performance of the classifiers (with male data exhibiting higher non-linearity), along with left-lateralized activation in both genders, with higher specificity in females.

  18. A Non-parametric Approach to Constrain the Transfer Function in Reverberation Mapping

    Science.gov (United States)

    Li, Yan-Rong; Wang, Jian-Min; Bai, Jin-Ming

    2016-11-01

    Broad emission lines of active galactic nuclei stem from a spatially extended region (broad-line region, BLR) that is composed of discrete clouds and photoionized by the central ionizing continuum. The temporal behaviors of these emission lines are blurred echoes of continuum variations (i.e., reverberation mapping, RM) and directly reflect the structures and kinematic information of BLRs through the so-called transfer function (also known as the velocity-delay map). Based on the previous works of Rybicki and Press and Zu et al., we develop an extended, non-parametric approach to determine the transfer function for RM data, in which the transfer function is expressed as a sum of a family of relatively displaced Gaussian response functions. Therefore, arbitrary shapes of transfer functions associated with complicated BLR geometry can be seamlessly included, enabling us to relax the presumption of a specified transfer function frequently adopted in previous studies and to let it be determined by observation data. We formulate our approach in a previously well-established framework that incorporates the statistical modeling of continuum variations as a damped random walk process and takes into account long-term secular variations which are irrelevant to RM signals. The application to RM data shows the fidelity of our approach.
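
    A sketch of the response model described above: the line light curve is the continuum convolved with a transfer function expressed as a sum of relatively displaced Gaussians. The continuum below is a toy random walk rather than a fitted damped-random-walk process, and all parameter values are arbitrary.

```python
# Transfer function built from displaced Gaussians, convolved with a toy
# continuum to produce the delayed emission-line response.
import numpy as np

def transfer_function(tau, weights, centres, sigma):
    comps = np.exp(-0.5 * ((tau[:, None] - centres) / sigma) ** 2)
    return comps @ weights

dt = 1.0                                       # days per grid point
tau = np.arange(0.0, 60.0, dt)
psi = transfer_function(tau, weights=np.array([1.0, 0.5]),
                        centres=np.array([10.0, 30.0]), sigma=4.0)

rng = np.random.default_rng(3)
continuum = np.cumsum(rng.normal(size=400))    # toy continuum light curve
line = np.convolve(continuum, psi)[:continuum.size] * dt
```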

  19. Spectral decompositions of multiple time series: a Bayesian non-parametric approach.

    Science.gov (United States)

    Macaro, Christian; Prado, Raquel

    2014-01-01

    We consider spectral decompositions of multiple time series that arise in studies where the interest lies in assessing the influence of two or more factors. We write the spectral density of each time series as a sum of the spectral densities associated with the different levels of the factors. We then use Whittle's approximation to the likelihood function and follow a Bayesian non-parametric approach to obtain posterior inference on the spectral densities based on Bernstein-Dirichlet prior distributions. The prior is strategically important as it carries identifiability conditions for the models and allows us to quantify our degree of confidence in such conditions. A Markov chain Monte Carlo (MCMC) algorithm for posterior inference within this class of frequency-domain models is presented. We illustrate the approach by analyzing simulated and real data via spectral one-way and two-way models. In particular, we present an analysis of functional magnetic resonance imaging (fMRI) brain responses measured in individuals who participated in a designed experiment to study pain perception in humans.

  1. A Non-parametric Approach to Constrain the Transfer Function in Reverberation Mapping

    CERN Document Server

    Li, Yan-Rong; Bai, Jin-Ming

    2016-01-01

    Broad emission lines of active galactic nuclei stem from a spatially extended region (broad-line region; BLR) that is composed of discrete clouds and photoionized by the central ionizing continuum. The temporal behaviors of these emission lines are blurred echoes of the continuum variations (i.e., reverberation mapping; RM) and directly reflect the structure and kinematic information of BLRs through the so-called transfer function (also known as the velocity-delay map). Based on the previous works of Rybicki & Press (1992) and Zu et al. (2011), we develop an extended, non-parametric approach to determine the transfer function for RM data, in which the transfer function is expressed as a sum of a family of relatively displaced Gaussian response functions. As such, arbitrary shapes of transfer functions associated with complicated BLR geometry can be seamlessly included, enabling us to relax the presumption of a specified transfer function frequently adopted in previous studies and to let it be determined by obs...

  2. Population pharmacokinetics of nevirapine in Malaysian HIV patients: a non-parametric approach.

    Science.gov (United States)

    Mustafa, Suzana; Yusuf, Wan Nazirah Wan; Woillard, Jean Baptiste; Choon, Tan Soo; Hassan, Norul Badriah

    2016-07-01

    Nevirapine is the first non-nucleoside reverse-transcriptase inhibitor approved and is widely used in combination therapy to treat HIV-1 infection. The pharmacokinetics of nevirapine has been extensively studied in various populations with a parametric approach. Hence, this study aimed to determine population pharmacokinetic parameters in Malaysian HIV-infected patients with a non-parametric approach, which, unlike a parametric approach, allows the detection of outliers or non-normal distributions. Nevirapine population pharmacokinetics was modelled with Pmetrics. A total of 708 observations from 112 patients were included in the model building and validation analysis. Evaluation of the model was based on a visual inspection of observed versus predicted (population and individual) concentrations and plots of weighted residual error versus concentration. Accuracy and robustness of the model were evaluated by visual predictive check (VPC). The median parameter estimates obtained from the final model were used to predict individual nevirapine plasma area under the curve (AUC) in the validation dataset. The Bland-Altman plot was used to compare the predicted AUC with the trapezoidal AUC. The median nevirapine clearance was 2.92 L/h, the median rate of absorption was 2.55/h and the volume of distribution was 78.23 L. Nevirapine pharmacokinetics were best described by a one-compartment model with first-order absorption and a lag time. Weighted residuals for the selected model were homogeneously distributed over the concentration and time range. The developed model adequately estimated AUC. In conclusion, a model to describe the pharmacokinetics of nevirapine was developed. The developed model adequately describes nevirapine population pharmacokinetics in HIV-infected patients in Malaysia.
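
    A sketch of the selected structural model (one compartment, first-order absorption, lag time) evaluated at the reported medians; the dose, bioavailability (F = 1) and lag value below are assumptions for illustration, not values from the study.

```python
# One-compartment, first-order absorption model with lag time at the
# reported medians (CL = 2.92 L/h, V = 78.23 L, ka = 2.55 /h).
import numpy as np

CL, V, ka = 2.92, 78.23, 2.55        # reported median estimates
ke = CL / V                          # elimination rate constant (1/h)
dose, F, tlag = 200.0, 1.0, 0.5      # mg, -, h  (assumed values)

def conc(t):
    tp = np.maximum(t - tlag, 0.0)
    return F * dose * ka / (V * (ka - ke)) * (np.exp(-ke * tp) - np.exp(-ka * tp))

t = np.linspace(0.0, 24.0, 97)
c = conc(t)
auc = np.sum((c[1:] + c[:-1]) / 2 * np.diff(t))     # trapezoidal AUC
print(f"Cmax ≈ {c.max():.2f} mg/L, AUC(0-24 h) ≈ {auc:.1f} mg·h/L")
```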

  3. Performances and Spending Efficiency in Higher Education: A European Comparison through Non-Parametric Approaches

    Science.gov (United States)

    Agasisti, Tommaso

    2011-01-01

    The objective of this paper is an efficiency analysis concerning higher education systems in European countries. Data have been extracted from OECD data-sets (Education at a Glance, several years), using a non-parametric technique--data envelopment analysis--to calculate efficiency scores. This paper represents the first attempt to conduct such an…

  4. Estimating Financial Risk Measures for Futures Positions: A Non-Parametric Approach

    OpenAIRE

    Cotter, John; Dowd, Kevin

    2011-01-01

    This paper presents non-parametric estimates of spectral risk measures applied to long and short positions in 5 prominent equity futures contracts. It also compares these to estimates of two popular alternative measures, the Value-at-Risk (VaR) and Expected Shortfall (ES). The spectral risk measures are conditioned on the coefficient of absolute risk aversion, and the latter two are conditioned on the confidence level. Our findings indicate that all risk measures increase dramatically and the...
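
    A sketch of the three measures on simulated heavy-tailed losses, using the exponential weighting function commonly paired with spectral risk measures (k plays the role of the coefficient of absolute risk aversion); the paper's futures data and estimation details are not reproduced.

```python
# VaR, ES and an exponential-spectrum spectral risk measure on toy losses.
import numpy as np

rng = np.random.default_rng(4)
losses = rng.standard_t(4, 100_000)                # toy futures P&L losses

def var_es(x, alpha=0.99):
    var = np.quantile(x, alpha)
    return var, x[x >= var].mean()

def spectral_rm(x, k=25.0, n=10_000):
    p = (np.arange(n) + 0.5) / n                       # probability grid
    phi = k * np.exp(-k * (1 - p)) / (1 - np.exp(-k))  # risk spectrum, ∫phi=1
    return (phi * np.quantile(x, p)).mean()            # ≈ ∫ phi(p) q(p) dp

var, es = var_es(losses)
print(f"VaR99 = {var:.2f}, ES99 = {es:.2f}, SRM(k=25) = {spectral_rm(losses):.2f}")
```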

  5. Evaluation of world's largest social welfare scheme: An assessment using non-parametric approach.

    Science.gov (United States)

    Singh, Sanjeet

    2016-08-01

    Mahatma Gandhi National Rural Employment Guarantee Act (MGNREGA) is the world's largest social welfare scheme in India for poverty alleviation through rural employment generation. This paper aims to evaluate and rank the performance of the states in India under the MGNREGA scheme. A non-parametric approach, Data Envelopment Analysis (DEA), is used to calculate the overall technical, pure technical, and scale efficiencies of states in India. The sample data are drawn from the annual official reports published by the Ministry of Rural Development, Government of India. Based on three selected input parameters (expenditure indicators) and five output parameters (employment generation indicators), I apply both input- and output-oriented DEA models to estimate how well the states utilized their resources and generated outputs during the financial year 2013-14. The relative performance evaluation has been made under the assumption of constant returns to scale and also under variable returns to scale to assess the impact of scale on performance. The results indicate that the main sources of inefficiency are both the technical and the managerial practices adopted. Eleven states are overall technically efficient and operate at the optimum scale, whereas 18 states are pure technically (managerially) efficient. It has been found that for some states it is necessary to alter the scheme size to perform at par with the best-performing states. For inefficient states, optimal input and output targets along with the resource savings and output gains are calculated. Analysis shows that if all inefficient states operated at optimal input and output levels, on average 17.89% of total expenditure and a total amount of $780 million could have been saved in a single year. Most of the inefficient states perform poorly when it comes to the participation of women and disadvantaged sections (SC&ST) in the scheme. In order to catch up with the performance of the best-performing states, inefficient states on average need to enhance...
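
    A minimal sketch of the input-oriented, constant-returns DEA model (CCR) solved as a linear program with SciPy; the synthetic inputs and outputs below stand in for the expenditure and employment-generation indicators used in the study.

```python
# Input-oriented CCR efficiency of one decision-making unit (state).
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y, j):
    """X: (m inputs, n DMUs); Y: (s outputs, n DMUs); returns theta of DMU j."""
    (m, n), s = X.shape, Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                 # minimise theta
    A1 = np.hstack([-X[:, [j]], X])             # X @ lam <= theta * x_j
    A2 = np.hstack([np.zeros((s, 1)), -Y])      # Y @ lam >= y_j
    res = linprog(c, A_ub=np.vstack([A1, A2]),
                  b_ub=np.r_[np.zeros(m), -Y[:, j]],
                  bounds=[(0, None)] * (1 + n))
    return res.fun                              # theta* in (0, 1]

rng = np.random.default_rng(5)
X = rng.uniform(1, 10, (3, 8))      # 3 expenditure indicators, 8 "states"
Y = rng.uniform(1, 10, (5, 8))      # 5 employment-generation indicators
print(np.round([dea_ccr_input(X, Y, j) for j in range(8)], 3))
```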

  6. A multitemporal and non-parametric approach for assessing the impacts of drought on vegetation greenness

    DEFF Research Database (Denmark)

    Carrao, Hugo; Sepulcre, Guadalupe; Horion, Stéphanie Marie Anne F;

    2013-01-01

    This study evaluates the relationship between the frequency and duration of meteorological droughts and the subsequent temporal changes in the quantity of actively photosynthesizing biomass (greenness) estimated from satellite imagery on rainfed croplands in Latin America, for the period between 1998 and 2010. The time-series analysis of vegetation greenness is performed during the growing season with a non-parametric method, namely the seasonal Relative Greenness (RG) of spatially accumulated fAPAR. The Global Land Cover map of 2000 and the GlobCover maps of 2005/2006 and 2009... An innovative non-... Full Data Reanalysis precipitation time-series product, which ranges from January 1901 to December 2010 and is interpolated at the spatial resolution of 1° (decimal degree, DD). Vegetation greenness composites are derived from 10-daily SPOT-VEGETATION images at the spatial resolution of 1/112° DD...

  7. Non-parametric Estimation approach in statistical investigation of nuclear spectra

    CERN Document Server

    Jafarizadeh, M A; Sabri, H; Maleki, B Rashidian

    2011-01-01

    In this paper, Kernel Density Estimation (KDE) is used as a non-parametric estimation method to investigate the statistical properties of nuclear spectra. The deviation towards regular or chaotic dynamics is exhibited by closer distances to the Poisson or Wigner limits, respectively, as evaluated by the Kullback-Leibler Divergence (KLD) measure. Spectral statistics of different sequences, prepared from nuclei corresponding to the three dynamical symmetry limits of the Interacting Boson Model (IBM) and from oblate and prolate nuclei, are analyzed (with purely experimental data), as is the pairing effect on nuclear level statistics. The KDE-based estimated density functions confirm previous predictions with minimum uncertainty (evaluated with the Integrated Absolute Error, IAE) in comparison to the Maximum Likelihood (ML)-based method. The increase in the regularity of spectra due to the pairing effect is also revealed.
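
    A sketch of the comparison described above: estimate the nearest-neighbour spacing density with a KDE and measure its Kullback-Leibler divergence to the Poisson (regular) and Wigner (chaotic) limits; spacings are assumed already unfolded to unit mean, and the toy data are drawn from the Poisson case.

```python
# KDE of level spacings and KL divergence to the Poisson/Wigner limits.
import numpy as np
from scipy.stats import gaussian_kde

def kld(p, q, ds):
    m = (p > 0) & (q > 0)
    return np.sum(p[m] * np.log(p[m] / q[m])) * ds

s = np.linspace(0.01, 4.0, 400)
ds = s[1] - s[0]
poisson = np.exp(-s)                                  # regular limit
wigner = (np.pi * s / 2) * np.exp(-np.pi * s**2 / 4)  # chaotic limit

rng = np.random.default_rng(6)
p_hat = gaussian_kde(rng.exponential(1.0, 300))(s)    # estimated density

print(f"KLD to Poisson: {kld(p_hat, poisson, ds):.3f}")
print(f"KLD to Wigner:  {kld(p_hat, wigner, ds):.3f}")
```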

  8. A Non-Parametric Delphi Approach to Foster Innovation Policy Debate in Spain

    Directory of Open Access Journals (Sweden)

    Juan Carlos Salazar-Elena

    2016-05-01

    The aim of this paper is to identify some changes needed in Spain's innovation policy to fill the gap between its innovation results and those of other European countries in lieu of sustainable leadership. To do this we apply the Delphi methodology to experts from academia, business, and government. To overcome the shortcomings of traditional descriptive methods, we develop an inferential analysis by following a non-parametric bootstrap method which enables us to identify important changes that should be implemented. Particularly interesting is the support found for improving the interconnections among the relevant agents of the innovation system (instead of focusing exclusively on the provision of knowledge and technological inputs through R&D activities), or the support found for "soft" policy instruments aimed at providing a homogeneous framework to assess the innovation capabilities of firms (e.g., for funding purposes). Attention to potential innovators among small and medium enterprises (SMEs) and traditional industries is particularly encouraged by the experts.

  9. Cancer driver gene discovery through an integrative genomics approach in a non-parametric Bayesian framework.

    Science.gov (United States)

    Yang, Hai; Wei, Qiang; Zhong, Xue; Yang, Hushan; Li, Bingshan

    2017-02-15

    A comprehensive catalogue of genes that drive tumor initiation and progression in cancer is key to advancing diagnostics, therapeutics and treatment. Given the complexity of cancer, the catalogue is far from complete yet. Increasing evidence shows that driver genes exhibit consistent aberration patterns across multiple omics in tumors. In this study, we aim to leverage the complementary information encoded in each of the omics data to identify novel driver genes through an integrative framework. Specifically, we integrated mutations, gene expression, DNA copy numbers, DNA methylation and protein abundance, all available in The Cancer Genome Atlas (TCGA), and developed iDriver, a non-parametric Bayesian framework based on multivariate statistical modeling to identify driver genes in an unsupervised fashion. iDriver captures the inherent clusters of gene aberrations and constructs the background distribution that is used to assess and calibrate the confidence of driver genes identified through multi-dimensional genomic data. We applied the method to 4 cancer types in TCGA and identified candidate driver genes that are highly enriched with known drivers (e.g., P < 3.40 × 10^-36 for breast cancer). We are particularly interested in novel genes and observed multiple lines of supporting evidence. Using systematic evaluation from multiple independent aspects, we identified 45 candidate driver genes that were not previously known across these 4 cancer types. The finding has important implications: integrating additional genomic data with multivariate statistics can help identify cancer drivers and guide the next stage of cancer genomics research. The C++ source code is freely available at https://medschool.vanderbilt.edu/cgg/. Contact: hai.yang@vanderbilt.edu or bingshan.li@Vanderbilt.Edu. Supplementary data are available at Bioinformatics online.

  10. Non-parametric frontier approach to modelling the relationships among population, GDP, energy consumption and CO{sub 2} emissions

    Energy Technology Data Exchange (ETDEWEB)

    Lozano, Sebastian; Gutierrez, Ester [University of Seville, E.S.I., Department of Industrial Management, Camino de los Descubrimientos, s/n, 41092 Sevilla (Spain)

    2008-07-15

    In this paper, a non-parametric approach based on Data Envelopment Analysis (DEA) is proposed as an alternative to the Kaya identity (a.k.a. ImPACT). This frontier method identifies and extends existing best practices. Population and GDP are considered as input and output, respectively. Both primary energy consumption and Greenhouse Gas (GHG) emissions are considered as undesirable outputs. Several Linear Programming models are formulated with different aims, namely: (a) determine efficiency levels; (b) estimate the maximum GDP compatible with given levels of population, energy intensity and carbonization intensity; and (c) estimate the minimum level of GHG emissions compatible with given levels of population, GDP, energy intensity or carbonization index. The case of the United States of America is used as an illustration of the proposed approach.

  11. rSeqNP: a non-parametric approach for detecting differential expression and splicing from RNA-Seq data.

    Science.gov (United States)

    Shi, Yang; Chinnaiyan, Arul M; Jiang, Hui

    2015-07-01

    High-throughput sequencing of transcriptomes (RNA-Seq) has become a powerful tool to study gene expression. Here we present an R package, rSeqNP, which implements a non-parametric approach to test for differential expression and splicing from RNA-Seq data. rSeqNP uses permutation tests to assess statistical significance and can be applied to a variety of experimental designs. By combining information across isoforms, rSeqNP is able to detect more differentially expressed or spliced genes from RNA-Seq data. The R package, with its source code and documentation, is freely available at http://www-personal.umich.edu/∼jianghui/rseqnp/. Contact: jianghui@umich.edu. Supplementary data are available at Bioinformatics online.
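
    A minimal sketch of the permutation idea underlying rSeqNP, for a single gene and a two-group design (the package itself combines isoform-level statistics across a gene and supports more designs):

```python
# Two-group permutation test for one gene's expression values.
import numpy as np

def permutation_pvalue(a, b, n_perm=10_000, seed=7):
    rng = np.random.default_rng(seed)
    pooled = np.concatenate([a, b])
    observed = abs(a.mean() - b.mean())
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)                     # relabel samples at random
        hits += abs(pooled[:a.size].mean() - pooled[a.size:].mean()) >= observed
    return (hits + 1) / (n_perm + 1)

g1 = np.array([5.2, 4.8, 6.1, 5.9, 4.5])   # condition A
g2 = np.array([6.8, 7.3, 6.0, 7.9, 7.1])   # condition B
print(f"p = {permutation_pvalue(g1, g2):.4f}")
```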

  12. Prediction intervals for future BMI values of individual children: a non-parametric approach by quantile boosting.

    Science.gov (United States)

    Mayr, Andreas; Hothorn, Torsten; Fenske, Nora

    2012-01-25

    The construction of prediction intervals (PIs) for future body mass index (BMI) values of individual children, based on a recent German birth cohort study with n = 2007 children, is problematic for standard parametric approaches, as the BMI distribution in childhood is typically skewed depending on age. We avoid distributional assumptions by directly modelling the borders of PIs by additive quantile regression, estimated by boosting. We rely on the concept of conditional coverage to assess the accuracy of PIs. As conditional coverage can hardly be evaluated in practical applications, we conduct a simulation study before fitting child- and covariate-specific PIs for future BMI values and BMI patterns for the present data. The results of our simulation study suggest that PIs fitted by quantile boosting cover future observations with the predefined coverage probability and outperform the benchmark approach. For the prediction of future BMI values, quantile boosting automatically selects informative covariates and adapts to the age-specific skewness of the BMI distribution. The lengths of the estimated PIs are child-specific and increase, as expected, with the age of the child. Quantile boosting is a promising approach to construct PIs with correct conditional coverage in a non-parametric way. It is in particular suitable for the prediction of BMI patterns depending on covariates, since it provides an interpretable predictor structure, inherent variable selection properties and can even account for longitudinal data structures.
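
    A sketch of covariate-specific PI borders via quantile regression; scikit-learn's quantile loss in gradient boosting stands in here for the boosted additive quantile regression used in the paper, and the age/BMI data are synthetic with age-dependent skewness.

```python
# 90% prediction-interval borders from two quantile-loss boosted models.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(8)
age = rng.uniform(0, 10, (800, 1))                       # years
bmi = 15 + 0.4 * age[:, 0] + rng.gamma(2.0, 0.5 + 0.1 * age[:, 0])

lower = GradientBoostingRegressor(loss="quantile", alpha=0.05).fit(age, bmi)
upper = GradientBoostingRegressor(loss="quantile", alpha=0.95).fit(age, bmi)

grid = np.linspace(0, 10, 5).reshape(-1, 1)
for a, lo, hi in zip(grid[:, 0], lower.predict(grid), upper.predict(grid)):
    print(f"age {a:4.1f}: 90% PI = ({lo:.1f}, {hi:.1f})")
```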

  13. A Non-parametric Approach to Estimate Location Parameters under Simple Order

    Institute of Scientific and Technical Information of China (English)

    孙旭

    2005-01-01

    This paper deals with estimating parameters under simple order when samples come from location models. Based on the idea of the Hodges-Lehmann estimator (H-L estimator), a new approach to estimating the parameters is proposed, which differs from the classical L1 isotonic regression and L2 isotonic regression. An algorithm to compute the estimators is given. Simulations by the Monte Carlo method are applied to compare the likelihood functions with respect to the L1 estimators and the weighted isotonic H-L estimators.
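
    A minimal sketch under simple order: compute the Hodges-Lehmann location estimate of each sample (median of Walsh averages), then enforce mu_1 <= ... <= mu_k by pooling adjacent violators; the paper's weighted isotonic H-L variant differs in detail.

```python
# H-L estimates per sample, followed by a pool-adjacent-violators pass.
import itertools
import numpy as np

def hodges_lehmann(x):
    walsh = [(a + b) / 2 for a, b in
             itertools.combinations_with_replacement(x, 2)]
    return float(np.median(walsh))

def pava(mu, w=None):
    """Weighted isotonic (non-decreasing) fit by pooling adjacent violators."""
    w = [1.0] * len(mu) if w is None else list(w)
    blocks = [[m, wt, 1] for m, wt in zip(mu, w)]   # value, weight, size
    i = 0
    while i < len(blocks) - 1:
        if blocks[i][0] > blocks[i + 1][0]:         # violation: merge blocks
            v0, w0, n0 = blocks[i]
            v1, w1, n1 = blocks[i + 1]
            blocks[i:i + 2] = [[(v0*w0 + v1*w1) / (w0 + w1), w0 + w1, n0 + n1]]
            i = max(i - 1, 0)
        else:
            i += 1
    return [b[0] for b in blocks for _ in range(b[2])]

samples = [[3.1, 2.7, 3.6], [2.9, 3.0, 2.5], [4.2, 3.8, 4.5]]
print(pava([hodges_lehmann(s) for s in samples]))
```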

  14. Economic capacity estimation in fisheries: A non-parametric ray approach

    Energy Technology Data Exchange (ETDEWEB)

    Pascoe, Sean; Tingley, Diana [Centre for the Economics and Management of Aquatic Resources (CEMARE), University of Portsmouth, Boathouse No. 6, College Road, HM Naval Base, Portsmouth PO1 3LJ (United Kingdom)

    2006-05-15

    Data envelopment analysis (DEA) has generally been adopted as the most appropriate methodology for the estimation of fishing capacity, particularly in multi-species fisheries. More recently, economic DEA methods have been developed that incorporate the costs and benefits of increasing capacity utilisation. One such method was applied to estimate the capacity utilisation and output of the Scottish fleet. By comparing the results of the economic and traditional DEA approaches, it can be concluded that many fleet segments are operating at or close to full capacity, and that the vessels defining the frontier are operating consistent with profit maximising behaviour. (author)

  15. A non-parametric approach for detecting gene-gene interactions associated with age-at-onset outcomes.

    Science.gov (United States)

    Li, Ming; Gardiner, Joseph C; Breslau, Naomi; Anthony, James C; Lu, Qing

    2014-07-01

    Cox-regression-based methods have been commonly used for the analyses of survival outcomes, such as age-at-disease-onset. These methods generally assume the hazard functions are proportional among various risk groups. However, such an assumption may not be valid in genetic association studies, especially when complex interactions are involved. In addition, genetic association studies commonly adopt case-control designs. Direct use of Cox regression to case-control data may yield biased estimators and incorrect statistical inference. We propose a non-parametric approach, the weighted Nelson-Aalen (WNA) approach, for detecting genetic variants that are associated with age-dependent outcomes. The proposed approach can be directly applied to prospective cohort studies, and can be easily extended for population-based case-control studies. Moreover, it does not rely on any assumptions of the disease inheritance models, and is able to capture high-order gene-gene interactions. Through simulations, we show the proposed approach outperforms Cox-regression-based methods in various scenarios. We also conduct an empirical study of progression of nicotine dependence by applying the WNA approach to three independent datasets from the Study of Addiction: Genetics and Environment. In the initial dataset, two SNPs, rs6570989 and rs2930357, located in genes GRIK2 and CSMD1, are found to be significantly associated with the progression of nicotine dependence (ND). The joint association is further replicated in two independent datasets. Further analysis suggests that these two genes may interact and be associated with the progression of ND. As demonstrated by the simulation studies and real data analysis, the proposed approach provides an efficient tool for detecting genetic interactions associated with age-at-onset outcomes.
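
    The weighting scheme of the WNA approach is not given in this record; the sketch below shows the ordinary Nelson-Aalen cumulative-hazard estimator that it generalises.

```python
# Nelson-Aalen cumulative hazard H(t) = sum of d_i / n_i over event times.
import numpy as np

def nelson_aalen(times, events):
    order = np.argsort(times)
    t = np.asarray(times, dtype=float)[order]
    e = np.asarray(events)[order]               # 1 = onset observed
    at_risk = t.size - np.arange(t.size)        # n_i just before each time
    H = np.cumsum(e / at_risk)                  # increments d_i / n_i
    return t[e == 1], H[e == 1]

t, H = nelson_aalen([2, 3, 3, 5, 8, 9], [1, 1, 0, 1, 0, 1])
print(dict(zip(t, np.round(H, 3))))
```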

  16. CAUSALITY BETWEEN GDP, ENERGY AND COAL CONSUMPTION IN INDIA, 1970-2011: A NON-PARAMETRIC BOOTSTRAP APPROACH

    Directory of Open Access Journals (Sweden)

    Rohin Anhal

    2013-10-01

    The aim of this paper is to examine the direction of causality between real GDP on the one hand and final energy and coal consumption on the other in India, for the period from 1970 to 2011. The methodology adopted is the non-parametric bootstrap procedure, which is used to construct the critical values for the hypothesis of causality. The results of the bootstrap tests show that for total energy consumption, there exists no causal relationship in either direction with GDP of India. However, if coal consumption is considered, we find evidence in support of unidirectional causality running from coal consumption to GDP. This clearly has important implications for the Indian economy. The most important implication is that curbing coal consumption in order to reduce carbon emissions would in turn have a limiting effect on economic growth. Our analysis contributes to the literature in three distinct ways. First, this is the first paper to use the bootstrap method to examine the growth-energy connection for the Indian economy. Second, we analyze data for the time period 1970 to 2011, thereby utilizing recently available data that has not been used by others. Finally, in contrast to recently done studies, we adopt a disaggregated approach for the analysis of the growth-energy nexus by considering not only aggregate energy consumption, but coal consumption as well.

  17. A Non-parametric Approach to Measuring the $K^-\pi^+$ Amplitudes in $D^+ \to K^-K^+\pi^+$ Decay

    CERN Document Server

    Link, J M; Alimonti, G; Anjos, J C; Arena, V; Barberis, S; Bediaga, I; Benussi, L; Bianco, S; Boca, G; Bonomi, G; Boschini, M; Butler, J N; Carrillo, S; Casimiro, E; Castromonte, C; Cawlfield, C; Cerutti, A; Cheung, H W K; Chiodini, G; Cho, K; Chung, Y S; Cinquini, L; Cuautle, E; Cumalat, J P; D'Angelo, P; Davenport, T F; De Miranda, J M; Di Corato, M; Dini, P; Dos Reis, A C; Edera, L; Engh, D; Erba, S; Fabbri, F L; Frisullo, V; Gaines, I; Garbincius, P H; Gardner, R; Garren, L A; Gianini, G; Gottschalk, E; Göbel, C; Handler, T; Hernández, H; Hosack, M; Inzani, P; Johns, W E; Kang, J S; Kasper, P H; Kim, D Y; Ko, B R; Kreymer, A E; Kryemadhi, A; Kutschke, R; Kwak, J W; Lee, K B; Leveraro, F; Liguori, G; Lopes-Pegna, D; Luiggi, E; López, A M; Machado, A A; Magnin, J; Malvezzi, S; Massafferri, A; Menasce, D; Merlo, M M; Mezzadri, M; Mitchell, R; Moroni, L; Méndez, H; Nehring, M; O'Reilly, B; Otalora, J; Pantea, D; Paris, A; Park, H; Pedrini, D; Pepe, I M; Polycarpo, E; Pontoglio, C; Prelz, F; Quinones, J; Rahimi, A; Ramírez, J E; Ratti, S P; Reyes, M; Riccardi, C; Rovere, M; Sala, S; Segoni, I; Sheaff, M; Sheldon, P D; Stenson, K; Sánchez-Hernández, A; Uribe, C; Vaandering, E W; Vitulo, P; Vázquez, F; Wang, M; Webster, M; Wilson, J R; Wiss, J; Yager, P M; Zallo, A; Zhang, Y

    2007-01-01

    Using a large sample of $D^+ \to K^-K^+\pi^+$ decays collected by the FOCUS photoproduction experiment at Fermilab, we present the first non-parametric analysis of the $K^-\pi^+$ amplitudes in $D^+ \to K^-K^+\pi^+$ decay. The technique is similar to the technique used for our non-parametric measurements of semileptonic form factors. Although these results are in rough agreement with those of E687, we observe a wider S-wave contribution than the standard PDG Breit-Wigner parameterization. We have some weaker evidence for the existence of a new, D-wave component at low values of the $K^-\pi^+$ mass.

  18. A non-parametric meta-analysis approach for combining independent microarray datasets: application using two microarray datasets pertaining to chronic allograft nephropathy

    Directory of Open Access Journals (Sweden)

    Archer Kellie J

    2008-02-01

    Background: With the popularity of DNA microarray technology, multiple groups of researchers have studied the gene expression of similar biological conditions. Different methods have been developed to integrate the results from various microarray studies, though most of them rely on distributional assumptions, such as the t-statistic based, mixed-effects model, or Bayesian model methods. However, often the sample size for each individual microarray experiment is small. Therefore, in this paper we present a non-parametric meta-analysis approach for combining data from independent microarray studies, and illustrate its application on two independent Affymetrix GeneChip studies that compared the gene expression of biopsies from kidney transplant recipients with chronic allograft nephropathy (CAN) to those with normal functioning allograft. Results: The simulation study comparing the non-parametric meta-analysis approach to a commonly used t-statistic based approach shows that the non-parametric approach has better sensitivity and specificity. For the application on the two CAN studies, we identified 309 distinct genes that expressed differently in CAN. By applying Fisher's exact test to identify enriched KEGG pathways among those genes called differentially expressed, we found 6 KEGG pathways to be over-represented among the identified genes. We used the expression measurements of the identified genes as predictors to predict the class labels for 6 additional biopsy samples, and the predicted results all conformed to their pathologist-diagnosed class labels. Conclusion: We present a new approach for combining data from multiple independent microarray studies. This approach is non-parametric and does not rely on any distributional assumptions. The rationale behind the approach is logically intuitive and can be easily understood by researchers not having advanced training in statistics. Some of the identified genes and pathways have been...

  19. A non-parametric meta-analysis approach for combining independent microarray datasets: application using two microarray datasets pertaining to chronic allograft nephropathy.

    Science.gov (United States)

    Kong, Xiangrong; Mas, Valeria; Archer, Kellie J

    2008-02-26

    With the popularity of DNA microarray technology, multiple groups of researchers have studied the gene expression of similar biological conditions. Different methods have been developed to integrate the results from various microarray studies, though most of them rely on distributional assumptions, such as the t-statistic based, mixed-effects model, or Bayesian model methods. However, often the sample size for each individual microarray experiment is small. Therefore, in this paper we present a non-parametric meta-analysis approach for combining data from independent microarray studies, and illustrate its application on two independent Affymetrix GeneChip studies that compared the gene expression of biopsies from kidney transplant recipients with chronic allograft nephropathy (CAN) to those with normal functioning allograft. The simulation study comparing the non-parametric meta-analysis approach to a commonly used t-statistic based approach shows that the non-parametric approach has better sensitivity and specificity. For the application on the two CAN studies, we identified 309 distinct genes that expressed differently in CAN. By applying Fisher's exact test to identify enriched KEGG pathways among those genes called differentially expressed, we found 6 KEGG pathways to be over-represented among the identified genes. We used the expression measurements of the identified genes as predictors to predict the class labels for 6 additional biopsy samples, and the predicted results all conformed to their pathologist diagnosed class labels. We present a new approach for combining data from multiple independent microarray studies. This approach is non-parametric and does not rely on any distributional assumptions. The rationale behind the approach is logically intuitive and can be easily understood by researchers not having advanced training in statistics. Some of the identified genes and pathways have been reported to be relevant to renal diseases. Further study on the

  20. Non-parametric Bayesian approach to post-translational modification refinement of predictions from tandem mass spectrometry.

    Science.gov (United States)

    Chung, Clement; Emili, Andrew; Frey, Brendan J

    2013-04-01

    Tandem mass spectrometry (MS/MS) is a dominant approach for large-scale high-throughput post-translational modification (PTM) profiling. Although current state-of-the-art blind PTM spectral analysis algorithms can predict thousands of modified peptides (PTM predictions) in an MS/MS experiment, a significant percentage of these predictions have inaccurate modification mass estimates and false modification site assignments. This problem can be addressed by post-processing the PTM predictions with a PTM refinement algorithm. We developed a novel PTM refinement algorithm, iPTMClust, which extends a recently introduced PTM refinement algorithm PTMClust and uses a non-parametric Bayesian model to better account for uncertainties in the quantity and identity of PTMs in the input data. The use of this new modeling approach enables iPTMClust to provide a confidence score per modification site that allows fine-tuning and interpreting resulting PTM predictions. The primary goal behind iPTMClust is to improve the quality of the PTM predictions. First, to demonstrate that iPTMClust produces sensible and accurate cluster assignments, we compare it with k-means clustering, mixtures of Gaussians (MOG) and PTMClust on a synthetically generated PTM dataset. Second, in two separate benchmark experiments using PTM data taken from a phosphopeptide and a yeast proteome study, we show that iPTMClust outperforms state-of-the-art PTM prediction and refinement algorithms, including PTMClust. Finally, we illustrate the general applicability of our new approach on a set of human chromatin protein complex data, where we are able to identify putative novel modified peptides and modification sites that may be involved in the formation and regulation of protein complexes. Our method facilitates accurate PTM profiling, which is an important step in understanding the mechanisms behind many biological processes and should be an integral part of any proteomic study. Our algorithm is implemented in

  1. Non-parametric data-based approach for the quantification and communication of uncertainties in river flood forecasts

    Science.gov (United States)

    Van Steenbergen, N.; Willems, P.

    2012-04-01

    Reliable flood forecasts are the most important non-structural measure to reduce the impact of floods. However, flood forecasting systems are subject to uncertainty originating from the input data, model structure and model parameters of the different hydraulic and hydrological submodels. To quantify this uncertainty a non-parametric data-based approach has been developed. This approach analyses the historical forecast residuals (differences between the predictions and the observations at river gauging stations) without using a predefined statistical error distribution. Because the residuals are correlated with the value of the forecasted water level and the lead time, the residuals are split up into discrete classes of simulated water levels and lead times. For each class, percentile values of the model residuals are calculated and stored in a 'three-dimensional error' matrix. By 3D interpolation in this error matrix, the uncertainty in new forecasted water levels can be quantified. In addition to the quantification of the uncertainty, the communication of this uncertainty is equally important. The communication has to be done in a consistent way, reducing the chance of misinterpretation. Also, the communication needs to be adapted to the audience: the majority of the larger public is not interested in in-depth information on the uncertainty of the predicted water levels, but only in the likelihood of exceedance of certain alarm levels. Water managers need more information, e.g. time-dependent uncertainty information, because they rely on this information to undertake the appropriate flood mitigation action. There are various ways of presenting uncertainty information (numerical, linguistic, graphical, time (in)dependent, etc.), each with advantages and disadvantages for a specific audience. A useful method to communicate uncertainty of flood forecasts is probabilistic flood mapping. These maps give a representation of the...
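
    A minimal sketch of the error-matrix idea: bin historical residuals by forecasted water level and lead time, store percentiles per class, and look up the band for a new forecast (the approach described above additionally interpolates across classes); the data are synthetic.

```python
# Per-class residual percentiles form the "three-dimensional error" matrix.
import numpy as np

def build_error_matrix(level, lead, resid, level_edges, lead_edges,
                       percentiles=(5, 50, 95)):
    shape = (len(level_edges) - 1, len(lead_edges) - 1, len(percentiles))
    matrix = np.full(shape, np.nan)
    li = np.digitize(level, level_edges) - 1
    ti = np.digitize(lead, lead_edges) - 1
    for i in range(shape[0]):
        for j in range(shape[1]):
            r = resid[(li == i) & (ti == j)]
            if r.size:
                matrix[i, j] = np.percentile(r, percentiles)
    return matrix

rng = np.random.default_rng(9)
level = rng.uniform(0, 5, 5000)                  # forecasted level (m)
lead = rng.uniform(0, 48, 5000)                  # lead time (h)
resid = rng.normal(0, 0.1 + 0.02 * lead)         # error grows with lead time

M = build_error_matrix(level, lead, resid,
                       np.linspace(0, 5, 6), np.linspace(0, 48, 5))
print(M[2, 3])          # 5/50/95% residual band for that class
```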

  2. Parametric and Non-Parametric System Modelling

    DEFF Research Database (Denmark)

    Nielsen, Henrik Aalborg

    1999-01-01

    ... considered. It is shown that adaptive estimation in conditional parametric models can be performed by combining the well-known methods of local polynomial regression and recursive least squares with exponential forgetting. The approach used for estimation in conditional parametric models also highlights how... For this purpose non-parametric methods together with additive models are suggested. Also, a new approach specifically designed to detect non-linearities is introduced. Confidence intervals are constructed by use of bootstrapping. As a link between non-parametric and parametric methods a paper dealing with neural... The focus is on combinations of parametric and non-parametric methods of regression. This combination can be in terms of additive models where, e.g., one or more non-parametric terms are added to a linear regression model. It can also be in terms of conditional parametric models where the coefficients...

  3. Estimation of the limit of detection with a bootstrap-derived standard error by a partly non-parametric approach. Application to HPLC drug assays

    DEFF Research Database (Denmark)

    Linnet, Kristian

    2005-01-01

    Bootstrap, HPLC, limit of blank, limit of detection, non-parametric statistics, type I and II errors...
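
    A sketch of the partly non-parametric idea suggested by the title and keywords: estimate the limit of blank as an upper percentile of blank responses and attach a bootstrap-derived standard error; details of the paper's procedure are not reproduced in this record.

```python
# Non-parametric limit of blank with a bootstrap standard error.
import numpy as np

rng = np.random.default_rng(10)
blanks = rng.gamma(2.0, 0.5, 60)                 # synthetic skewed blanks

def lob(x):
    return np.percentile(x, 95)                  # 95th percentile of blanks

boot = np.array([lob(rng.choice(blanks, blanks.size, replace=True))
                 for _ in range(2000)])
print(f"LoB = {lob(blanks):.2f}, bootstrap SE = {boot.std(ddof=1):.2f}")
```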

  4. Evaluation of model-based versus non-parametric monaural noise-reduction approaches for hearing aids.

    Science.gov (United States)

    Harlander, Niklas; Rosenkranz, Tobias; Hohmann, Volker

    2012-08-01

    Single-channel noise reduction has been well investigated and seems to have reached its limits in terms of speech intelligibility improvement; however, the quality of such schemes can still be advanced. This study tests to what extent novel model-based processing schemes might improve performance, in particular for non-stationary noise conditions. Two prototype model-based algorithms, a speech-model-based and an auditory-model-based algorithm, were compared to a state-of-the-art non-parametric minimum statistics algorithm. A speech intelligibility test, preference rating, and listening effort scaling were performed. Additionally, three objective quality measures for the signal, background, and overall distortions were applied. For a better comparison of all algorithms, particular attention was given to the use of a similar Wiener-based gain rule. The perceptual investigation was performed with fourteen hearing-impaired subjects. The results revealed that the non-parametric algorithm and the auditory-model-based algorithm did not affect speech intelligibility, whereas the speech-model-based algorithm slightly decreased intelligibility. In terms of subjective quality, both model-based algorithms performed better than the unprocessed condition and the reference, in particular for highly non-stationary noise environments. The data support the hypothesis that model-based algorithms are promising for improving performance in non-stationary noise conditions.

  5. Detrending the long-term stellar activity and the systematics of the Kepler data with a non-parametric approach

    CERN Document Server

    Danielski, C; Tinetti, G

    2013-01-01

    The NASA Kepler mission is delivering groundbreaking results, with an increasing number of Earth-sized and moon-sized objects being discovered. A high photometric precision can be reached only through a thorough removal of the stellar activity and the instrumental systematics. We have explored here the possibility of using non-parametric methods to analyse the Simple Aperture Photometry data observed by the Kepler mission. We focused on a sample of stellar light curves with different effective temperatures and flux modulations, and we found that Gaussian-Process-based techniques can very effectively correct the instrumental systematics along with the long-term stellar activity. Our method can disentangle astrophysical features (events), such as planetary transits, flares or general sudden variations in the intensity, from the star signal, and it is very efficient as it requires only a few training iterations of the Gaussian Process model. The results obtained show the potential of our method to isolate the ma...
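
    A minimal sketch of the idea with scikit-learn's Gaussian process tools (Python; the light curve, kernel choice and transit mask below are invented for illustration and do not reproduce the authors' actual implementation):

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        rng = np.random.default_rng(1)

        # Toy light curve: a slow stellar/instrumental trend plus box transits.
        t = np.linspace(0.0, 30.0, 600)                  # time in days
        trend = 1.0 + 0.004 * np.sin(2 * np.pi * t / 11.3)
        flux = trend + rng.normal(0.0, 3e-4, t.size)
        in_transit = (t % 7.5) < 0.2                     # hypothetical planet
        flux[in_transit] -= 2e-3

        # A long length-scale RBF kernel captures the smooth trend, while the
        # white kernel absorbs photon noise; short events stay in the residuals.
        kernel = 1.0 * RBF(length_scale=5.0) + WhiteKernel(noise_level=1e-7)
        gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
        gp.fit(t[~in_transit].reshape(-1, 1), flux[~in_transit])
        detrended = flux - gp.predict(t.reshape(-1, 1))

        print(f"recovered transit depth: {-detrended[in_transit].mean():.4f}")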

  6. Parametric versus non-parametric simulation

    OpenAIRE

    Dupeux, Bérénice; Buysse, Jeroen

    2014-01-01

    Most ex-ante impact assessment policy models have been based on a parametric approach. We develop a novel non-parametric approach, called Inverse DEA. We use non-parametric efficiency analysis to determine the farm's technology and behaviour. Then, we compare the parametric approach and the Inverse DEA models against a known data-generating process. We use a bio-economic model as a data-generating process, reflecting a real-world situation where non-linear relationships often exist. Results s...

  7. Selection bias, vote counting, and money-priming effects: A comment on Rohrer, Pashler, and Harris (2015) and Vohs (2015).

    Science.gov (United States)

    Vadillo, Miguel A; Hardwicke, Tom E; Shanks, David R

    2016-05-01

    When a series of studies fails to replicate a well-documented effect, researchers might be tempted to use a "vote counting" approach to decide whether the effect is reliable; that is, simply comparing the number of successful and unsuccessful replications. Vohs's (2015) response to the absence of money priming effects reported by Rohrer, Pashler, and Harris (2015) provides an example of this approach. Unfortunately, vote counting is a poor strategy to assess the reliability of psychological findings because it neglects the impact of selection bias and questionable research practices. In the present comment, we show that a range of meta-analytic tools indicate irregularities in the money priming literature discussed by Rohrer et al. and Vohs, which all point to the conclusion that these effects are distorted by selection bias, reporting biases, or p-hacking. This could help to explain why money-priming effects have proven unreliable in a number of direct replication attempts in which biases have been minimized through preregistration or transparent reporting. Our major conclusion is that the simple proportion of significant findings is a poor guide to the reliability of research and that preregistered replications are an essential means to assess the reliability of money-priming effects.
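
    The core point is easy to demonstrate by simulation. A sketch (Python; the publication rates are invented for illustration) of how selection bias makes vote counting misleading even when the true effect is exactly zero:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)

        # 500 two-group studies of a NULL effect (true d = 0). Significant
        # results are always 'published'; non-significant ones only 10% of
        # the time (a hypothetical file-drawer rate).
        published_sig = []
        for _ in range(500):
            a = rng.normal(0.0, 1.0, 30)      # 'treatment' group
            b = rng.normal(0.0, 1.0, 30)      # control group
            sig = stats.ttest_ind(a, b).pvalue < 0.05
            if sig or rng.random() < 0.10:
                published_sig.append(sig)

        frac = np.mean(published_sig)
        print(f"{len(published_sig)} published studies, {frac:.0%} significant")
        # Vote counting on the published record sees far more 'successes'
        # than the nominal 5% false-positive rate, despite a true effect of 0.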

  8. A simple 2D non-parametric resampling statistical approach to assess confidence in species identification in DNA barcoding--an alternative to likelihood and bayesian approaches.

    Science.gov (United States)

    Jin, Qian; He, Li-Jun; Zhang, Ai-Bing

    2012-01-01

    In the recent worldwide campaign for the global biodiversity inventory via DNA barcoding, a simple and easily used measure of confidence for assigning sequences to species in DNA barcoding has not been established so far, although the likelihood ratio test and the Bayesian approach have been proposed to address this issue from a statistical point of view. The TDR (Two-Dimensional non-parametric Resampling) measure newly proposed in this study offers users a simple and easy approach to evaluate the confidence of species membership in DNA barcoding projects. We assessed the validity and robustness of the TDR approach using datasets simulated under coalescent models, and an empirical dataset, and found that the TDR measure is very robust in assessing species membership in DNA barcoding. In contrast to the likelihood ratio test and the Bayesian approach, the TDR method stands out due to its simplicity in both concepts and calculations, with little in the way of restrictive population genetic assumptions. To implement this approach we have developed a computer program package (TDR1.0beta) freely available from ftp://202.204.209.200/education/video/TDR1.0beta.rar.

  9. Analysing Vote Counting Algorithms Via Logic - And its Application to the CADE Election Scheme

    DEFF Research Database (Denmark)

    Schürmann, Carsten; Beckert, Bernhard; Gore, Rajeev

    2013-01-01

    We present a method for using first-order logic to specify the semantics of preferences as used in common vote counting algorithms. We also present a corresponding system that uses Celf linear-logic programs to describe voting algorithms and which generates explicit examples when the algorithm departs from its specification. When we applied our method and system to analyse the vote counting algorithm used for electing the CADE Board of Trustees, we found that it strictly differs from the standard definition of Single Transferable Vote (STV). We therefore argue that "STV" is a misnomer ...

  10. Non-parametric convolution based image-segmentation of ill-posed objects applying context window approach

    CERN Document Server

    Kumar, Upendra; Pal, Manoj Kumar

    2012-01-01

    Context-dependence in the human cognition process is a well-established fact. Following this, we introduce an image segmentation method that can use context to classify a pixel on the basis of its membership of a particular object class of the concerned image. In the broad methodological steps, each pixel is defined by the context window (CW) surrounding it, the size of which was fixed heuristically. The CW texture, defined by the intensities of its pixels, was convolved with weights optimized through a non-parametric function supported by a backpropagation network, and the result of the convolution was used to classify the pixels. The training data points (i.e., pixels) were carefully chosen to include all varieties of context: i) points within the object, ii) points near the edge but inside the objects, iii) points at the border of the objects, iv) points near the edge but outside the objects, v) points near or at the edge of the image frame. Moreover, the training data points were selected from all the images within image-d...

  11. On the measurement of capacity utilisation and cost efficiency: a non-parametric approach at firm level

    Directory of Open Access Journals (Sweden)

    Prior Diego

    2002-01-01

    Full Text Available In this paper we evaluate the inefficiency generated by an inadequate structure of the fixed inputs and by the difficulty of adjusting them in the short run, in a sample of Romanian firms in the chemical industry over the period 1996-1997. We use Data Envelopment Analysis (DEA) and apply this methodology in an innovative setting, using a cost analysis instead of the technical efficiency approach. The results show inefficiency in most of the cases, due to a low degree of capacity utilisation.
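
    For readers unfamiliar with the cost variant of DEA, a toy sketch (Python; five hypothetical firms with two inputs and one output, and SciPy's linear-programming solver standing in for dedicated DEA software):

        import numpy as np
        from scipy.optimize import linprog

        # Hypothetical data: 5 firms, 2 inputs, 1 output.
        X = np.array([[2., 4.], [3., 2.], [5., 5.], [4., 7.], [6., 3.]])
        y = np.array([10., 10., 14., 12., 13.])
        w = np.array([1.0, 0.5])                      # input prices

        def cost_efficiency(k):
            """DEA cost-minimisation LP (variable returns to scale) for firm k:
            choose target inputs x and intensities lam to minimise w.x subject
            to X'lam <= x, y'lam >= y_k, sum(lam) = 1, lam >= 0."""
            n, m = X.shape
            c = np.concatenate([w, np.zeros(n)])      # decision vars: [x, lam]
            A_ub = np.hstack([-np.eye(m), X.T])       # X'lam - x <= 0
            b_ub = np.zeros(m)
            A_ub = np.vstack([A_ub, np.concatenate([np.zeros(m), -y])])
            b_ub = np.append(b_ub, -y[k])             # produce at least y_k
            A_eq = np.concatenate([np.zeros(m), np.ones(n)])[None, :]
            res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                          bounds=[(0, None)] * (m + n), method="highs")
            return res.fun / (w @ X[k])               # minimum / actual cost

        for k in range(len(y)):
            print(f"firm {k}: cost efficiency = {cost_efficiency(k):.3f}")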

  12. Non-Parametric Inference in Astrophysics

    CERN Document Server

    Wasserman, Larry; Miller, Christopher J.; Nichol, Robert C.; Genovese, Chris; Jang, Woncheol; Connolly, Andrew J.; Moore, Andrew W.; Schneider, Jeff; the PICA group

    2001-01-01

    We discuss non-parametric density estimation and regression for astrophysics problems. In particular, we show how to compute non-parametric confidence intervals for the location and size of peaks of a function. We illustrate these ideas with recent data on the Cosmic Microwave Background. We also briefly discuss non-parametric Bayesian inference.

  13. Selection Bias, Vote Counting, and Money-Priming Effects: A Comment on Rohrer, Pashler, and Harris (2015) and Vohs (2015)

    Science.gov (United States)

    2016-01-01

    When a series of studies fails to replicate a well-documented effect, researchers might be tempted to use a “vote counting” approach to decide whether the effect is reliable—that is, simply comparing the number of successful and unsuccessful replications. Vohs’s (2015) response to the absence of money priming effects reported by Rohrer, Pashler, and Harris (2015) provides an example of this approach. Unfortunately, vote counting is a poor strategy to assess the reliability of psychological findings because it neglects the impact of selection bias and questionable research practices. In the present comment, we show that a range of meta-analytic tools indicate irregularities in the money priming literature discussed by Rohrer et al. and Vohs, which all point to the conclusion that these effects are distorted by selection bias, reporting biases, or p-hacking. This could help to explain why money-priming effects have proven unreliable in a number of direct replication attempts in which biases have been minimized through preregistration or transparent reporting. Our major conclusion is that the simple proportion of significant findings is a poor guide to the reliability of research and that preregistered replications are an essential means to assess the reliability of money-priming effects. PMID:27077759

  14. A Non-Parametric Approach for the Activation Detection of Block Design fMRI Simulated Data Using Self-Organizing Maps and Support Vector Machine.

    Science.gov (United States)

    Bahrami, Sheyda; Shamsi, Mousa

    2017-01-01

    Functional magnetic resonance imaging (fMRI) is a popular method to probe the functional organization of the brain using hemodynamic responses. In this method, volume images of the entire brain are obtained with very good spatial resolution and low temporal resolution. However, they always suffer from high dimensionality in the face of classification algorithms. In this work, we combine a support vector machine (SVM) with a self-organizing map (SOM) to obtain a feature-based classification: SOM is used for feature extraction and for labeling the datasets, and a linear-kernel SVM is then used for detecting the active areas. SOM has two major advantages: (i) it reduces the dimension of the data sets, lowering computational complexity, and (ii) it is useful for identifying brain regions with small onset differences in hemodynamic responses. Our non-parametric model is compared with parametric and non-parametric methods. We use simulated fMRI data sets with block-design inputs and consider a contrast-to-noise ratio (CNR) of 0.6 for the simulated datasets; the simulated fMRI dataset has a contrast of 1-4% in active areas. The accuracy of our proposed method is 93.63% and the error rate is 6.37%.

  15. Non-parametric partitioning of SAR images

    Science.gov (United States)

    Delyon, G.; Galland, F.; Réfrégier, Ph.

    2006-09-01

    We describe and analyse a generalization of a parametric segmentation technique, adapted to Gamma-distributed SAR images, to a simple non-parametric noise model. The partition is obtained by minimizing the stochastic complexity of a version of the SAR image quantized on Q levels, which leads to a criterion without parameters to be tuned by the user. We analyse the reliability of the proposed approach on synthetic images. The quality of the obtained partition is studied for different possible strategies; in particular, we discuss the reliability of the proposed optimization procedure. Finally, we study precisely the performance of the proposed approach in comparison with the statistical parametric technique adapted to Gamma noise. These studies are carried out by analyzing the number of misclassified pixels, the standard Hausdorff distance and the number of estimated regions.

  16. Bayesian non parametric modelling of Higgs pair production

    Science.gov (United States)

    Scarpa, Bruno; Dorigo, Tommaso

    2017-03-01

    Statistical classification models are commonly used to separate a signal from a background. In this talk we face the problem of isolating the signal of Higgs pair production using the decay channel in which each boson decays into a pair of b-quarks. Typically in this context non-parametric methods are used, such as Random Forests or different types of boosting tools. We remain in the same non-parametric framework, but we propose to face the problem following a Bayesian approach. A Dirichlet process is used as a prior for the random effects in a logit model, which is fitted by leveraging the Polya-Gamma data augmentation. Refinements of the model include the insertion into the simple model of P-splines to relate explanatory variables with the response, and the use of Bayesian trees (BART) to describe the atoms in the Dirichlet process.

  17. Bayesian non parametric modelling of Higgs pair production

    Directory of Open Access Journals (Sweden)

    Scarpa Bruno

    2017-01-01

    Full Text Available Statistical classification models are commonly used to separate a signal from a background. In this talk we face the problem of isolating the signal of Higgs pair production using the decay channel in which each boson decays into a pair of b-quarks. Typically in this context non-parametric methods are used, such as Random Forests or different types of boosting tools. We remain in the same non-parametric framework, but we propose to face the problem following a Bayesian approach. A Dirichlet process is used as a prior for the random effects in a logit model, which is fitted by leveraging the Polya-Gamma data augmentation. Refinements of the model include the insertion into the simple model of P-splines to relate explanatory variables with the response, and the use of Bayesian trees (BART) to describe the atoms in the Dirichlet process.

  18. Non-Parametric Estimation of Correlation Functions

    DEFF Research Database (Denmark)

    Brincker, Rune; Rytter, Anders; Krenk, Steen

    In this paper three methods of non-parametric correlation function estimation are reviewed and evaluated: the direct method, estimation by the Fast Fourier Transform and finally estimation by the Random Decrement technique. The basic ideas of the techniques are reviewed, sources of bias are pointed out, and methods to prevent bias are presented. The techniques are evaluated by comparing their speed and accuracy on the simple case of estimating auto-correlation functions for the response of a single degree-of-freedom system loaded with white noise.
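
    As a small illustration of the FFT route (Python; the single degree-of-freedom system and its integration scheme are chosen freely for the sketch):

        import numpy as np

        rng = np.random.default_rng(3)

        # White-noise response of a lightly damped SDOF oscillator,
        # integrated with a crude semi-implicit Euler scheme.
        fs, T = 100.0, 200.0                    # sample rate (Hz), length (s)
        dt = 1.0 / fs
        wn, zeta = 2 * np.pi * 2.0, 0.02        # natural frequency, damping
        f = rng.normal(0.0, 1.0, int(T * fs))
        x = np.zeros(f.size)
        v = 0.0
        for i in range(1, f.size):
            v += (f[i - 1] - 2 * zeta * wn * v - wn**2 * x[i - 1]) * dt
            x[i] = x[i - 1] + v * dt

        def autocorr_fft(x, max_lag):
            """Unbiased auto-correlation estimate via the FFT; zero-padding
            to 2n avoids the bias from circular wrap-around."""
            n = x.size
            X = np.fft.rfft(x - x.mean(), 2 * n)
            r = np.fft.irfft(X * np.conj(X))[:max_lag]
            return r / (n - np.arange(max_lag))   # products per lag

        r = autocorr_fft(x, 500)
        print(f"R(0) = {r[0]:.4g}, R(5 s) = {r[499]:.4g}")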

  19. Lottery spending: a non-parametric analysis.

    Science.gov (United States)

    Garibaldi, Skip; Frisoli, Kayla; Ke, Li; Lim, Melody

    2015-01-01

    We analyze the spending of individuals in the United States on lottery tickets in an average month, as reported in surveys. We view these surveys as sampling from an unknown distribution, and we use non-parametric methods to compare properties of this distribution for various demographic groups, as well as claims that some properties of this distribution are constant across surveys. We find that the observed higher spending by Hispanic lottery players can be attributed to differences in education levels, and we dispute previous claims that the top 10% of lottery players consistently account for 50% of lottery sales.

  20. Lottery spending: a non-parametric analysis.

    Directory of Open Access Journals (Sweden)

    Skip Garibaldi

    Full Text Available We analyze the spending of individuals in the United States on lottery tickets in an average month, as reported in surveys. We view these surveys as sampling from an unknown distribution, and we use non-parametric methods to compare properties of this distribution for various demographic groups, as well as claims that some properties of this distribution are constant across surveys. We find that the observed higher spending by Hispanic lottery players can be attributed to differences in education levels, and we dispute previous claims that the top 10% of lottery players consistently account for 50% of lottery sales.

  1. An updated weight of evidence approach to the aquatic hazard assessment of Bisphenol A and the derivation a new predicted no effect concentration (Pnec) using a non-parametric methodology.

    Science.gov (United States)

    Wright-Walters, Maxine; Volz, Conrad; Talbott, Evelyn; Davis, Devra

    2011-01-15

    An aquatic hazard assessment establishes a derived predicted no effect concentration (PNEC) below which it is assumed that aquatic organisms will not suffer adverse effects from exposure to a chemical. An aquatic hazard assessment of the endocrine disruptor Bisphenol A [BPA; 2,2-bis(4-hydroxyphenyl)propane] was conducted using a weight of evidence approach, based on the ecotoxicological endpoints of survival, growth and development, and reproduction. New evidence has emerged suggesting that aquatic systems may not be sufficiently protected from adverse effects of BPA exposure at the current PNEC value of 100 μg/L. Against this background: (1) an aquatic hazard assessment for BPA was conducted using a weight of evidence approach; (2) a PNEC value was derived using a non-parametric hazardous concentration for 5% of species (HC(5)) approach; and (3) the derived BPA hazard assessment values were compared to aquatic environmental concentrations of BPA to determine whether aquatic species are sufficiently protected from BPA exposure. A total of 61 studies yielded 94 no observed effect concentrations (NOECs) and a toxicity dataset which suggests that the aquatic adverse effects on mortality, growth and development, and reproduction are most likely to occur between the concentrations of 0.0483 μg/L and 2280 μg/L. This finding is within the range for aquatic adverse estrogenic effects reported in the literature. A PNEC of 0.06 μg/L was calculated; its 95% confidence interval was found to be (0.02, 3.40) μg/L. Thus, using the weight of evidence approach based on repeated measurements of these endpoints, the results indicate that currently observed BPA concentrations in surface waters exceed this newly derived PNEC value of 0.06 μg/L. This indicates that some aquatic receptors may be at risk of adverse effects on survival, growth and development, and reproduction from BPA exposure at environmentally relevant concentrations. Copyright © 2010 Elsevier B.V. All rights reserved.
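
    The non-parametric HC(5) step lends itself to a short sketch (Python). The 94 NOECs below are randomly generated stand-ins spanning the range quoted in the abstract, not the study's data:

        import numpy as np

        rng = np.random.default_rng(4)

        # Hypothetical stand-in for the 94 species NOECs (ug/L), log-uniform
        # over the range reported in the abstract.
        noec = np.exp(rng.uniform(np.log(0.0483), np.log(2280.0), 94))

        def hc5(sample):
            """Non-parametric HC5: the empirical 5th percentile of the species
            sensitivity distribution, with no distributional assumption."""
            return np.percentile(sample, 5)

        # Percentile bootstrap for a confidence interval around the HC5/PNEC.
        boot = np.array([hc5(rng.choice(noec, noec.size, replace=True))
                         for _ in range(10000)])
        lo, hi = np.percentile(boot, [2.5, 97.5])
        print(f"HC5 = {hc5(noec):.3g} ug/L, 95% CI ({lo:.3g}, {hi:.3g}) ug/L")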

  2. A non-parametric framework for estimating threshold limit values

    Directory of Open Access Journals (Sweden)

    Ulm Kurt

    2005-11-01

    Full Text Available Abstract. Background: To estimate a threshold limit value for a compound known to have harmful health effects, an 'elbow' threshold model is usually applied. We are interested in flexible non-parametric alternatives. Methods: We describe how a step function model fitted by isotonic regression can be used to estimate threshold limit values. This method returns a set of candidate locations, and we discuss two algorithms to select the threshold among them: the reduced isotonic regression and an algorithm considering the closed family of hypotheses. We assess the performance of these two alternative approaches under different scenarios in a simulation study. We illustrate the framework by analysing data from a study conducted by the German Research Foundation aiming to set a threshold limit value for exposure to total dust at the workplace, as a causal agent for developing chronic bronchitis. Results: In the paper we demonstrate the use and the properties of the proposed methodology along with the results from an application. The method appears to detect the threshold with satisfactory success. However, its performance can be compromised by the low power to reject the constant-risk assumption when the true dose-response relationship is weak. Conclusion: The estimation of thresholds based on the isotonic framework is conceptually simple and sufficiently powerful. Given that in the threshold value estimation context there is no gold standard method, the proposed model provides a useful non-parametric alternative to the standard approaches and can corroborate or challenge their findings.
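
    A minimal version of the isotonic step-function fit is easy to sketch with scikit-learn (Python; the dose-response data and the true threshold at 4 are invented for illustration):

        import numpy as np
        from sklearn.isotonic import IsotonicRegression

        rng = np.random.default_rng(5)

        # Hypothetical exposure-response data: risk is flat up to a threshold
        # (here 4, in arbitrary dose units) and rises linearly above it.
        dose = rng.uniform(0.0, 10.0, 400)
        p_true = 0.10 + 0.05 * np.clip(dose - 4.0, 0.0, None)
        case = (rng.random(400) < p_true).astype(float)

        # Step-function fit: risk constrained to be non-decreasing in dose.
        iso = IsotonicRegression(increasing=True)
        risk = iso.fit_transform(dose, case)

        # Candidate threshold locations are the doses where the fitted step
        # function jumps; the selection step (reduced isotonic regression or
        # closed testing) would then choose among them.
        order = np.argsort(dose)
        d, r = dose[order], risk[order]
        candidates = d[np.nonzero(np.diff(r) > 0)[0] + 1]
        print("candidate thresholds:", np.round(candidates, 2))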

  3. Non-Parametric Bayesian Updating within the Assessment of Reliability for Offshore Wind Turbine Support Structures

    DEFF Research Database (Denmark)

    Ramirez, José Rangel; Sørensen, John Dalsgaard

    2011-01-01

    This work illustrates the updating and incorporation of information in the assessment of fatigue reliability for offshore wind turbines. The new information, coming from external and condition monitoring, can be used for direct updating of the stochastic variables through a non-parametric Bayesian updating approach and can be integrated in the reliability analysis by a third-order polynomial chaos expansion approximation. Although classical Bayesian updating approaches are often used because of their parametric formulation, non-parametric approaches are better alternatives for multi-parametric updating with a non-conjugating formulation. The results in this paper show the influence on the time-dependent updated reliability when non-parametric and classical Bayesian approaches are used. Further, the influence on the reliability of the number of updated parameters is illustrated.

  4. On Parametric (and Non-Parametric Variation

    Directory of Open Access Journals (Sweden)

    Neil Smith

    2009-11-01

    Full Text Available This article raises the issue of the correct characterization of 'Parametric Variation' in syntax and phonology. After specifying their theoretical commitments, the authors outline the relevant parts of the Principles-and-Parameters framework, and draw a three-way distinction among Universal Principles, Parameters, and Accidents. The core of the contribution then consists of an attempt to provide identity criteria for parametric, as opposed to non-parametric, variation. Parametric choices must be antecedently known, and it is suggested that they must also satisfy seven individually necessary and jointly sufficient criteria. These are that they be cognitively represented, systematic, dependent on the input, deterministic, discrete, mutually exclusive, and irreversible.

  5. Non-parametric seismic hazard analysis in the presence of incomplete data

    Science.gov (United States)

    Yazdani, Azad; Mirzaei, Sajjad; Dadkhah, Koroush

    2017-01-01

    The distribution of earthquake magnitudes plays a crucial role in the estimation of seismic hazard parameters. Due to the complexity of earthquake magnitude distributions, non-parametric approaches are recommended over classical parametric methods. The main deficiency of the non-parametric approach is the lack of complete magnitude data in almost all cases. This study aims to introduce an imputation procedure for completing earthquake catalog data that will allow the catalog to be used for non-parametric density estimation. Using a Monte Carlo simulation, the efficiency of the introduced approach is investigated. This study indicates that when a magnitude catalog is incomplete, the imputation procedure can provide an appropriate tool for seismic hazard assessment. As an illustration, the imputation procedure was applied to estimate the earthquake magnitude distribution in Tehran, the capital city of Iran.

  6. Using non-parametric methods in econometric production analysis

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    Econometric estimation of production functions is one of the most common methods in applied economic production analysis. These studies usually apply parametric estimation techniques, which obligate the researcher to specify the functional form of the production function. Most often, neither the Cobb-Douglas function nor the Translog function is consistent with the “true” relationship between the inputs and the output in our data set. We solve this problem by using non-parametric regression. This approach delivers reasonable results, which are on average not too different from the parametric ... results—including measures that are of interest to applied economists, such as elasticities. Therefore, we propose to use non-parametric econometric methods. First, they can be applied to verify the functional form used in parametric estimations of production functions. Second, they can be directly used ...

  7. Non-parametric tests of productive efficiency with errors-in-variables

    NARCIS (Netherlands)

    Kuosmanen, T.K.; Post, T.; Scholtes, S.

    2007-01-01

    We develop a non-parametric test of productive efficiency that accounts for errors-in-variables, following the approach of Varian [1985. Nonparametric analysis of optimizing behavior with measurement error. Journal of Econometrics 30(1/2), 445-458]. The test is based on the general Pareto-Koopmans

  8. Biological parametric mapping with robust and non-parametric statistics.

    Science.gov (United States)

    Yang, Xue; Beason-Held, Lori; Resnick, Susan M; Landman, Bennett A

    2011-07-15

    Mapping the quantitative relationship between structure and function in the human brain is an important and challenging problem. Numerous volumetric, surface, region-of-interest and voxelwise image processing techniques have been developed to statistically assess potential correlations between imaging and non-imaging metrics. Recently, biological parametric mapping has extended the widely popular statistical parametric mapping approach to enable application of the general linear model to multiple image modalities (both for regressors and regressands) along with scalar valued observations. This approach offers great promise for direct, voxelwise assessment of structural and functional relationships with multiple imaging modalities. However, as presented, the biological parametric mapping approach is not robust to outliers and may lead to invalid inferences (e.g., artifactual low p-values) due to slight mis-registration or variation in anatomy between subjects. To enable widespread application of this approach, we introduce robust regression and non-parametric regression in the neuroimaging context of application of the general linear model. Through simulation and empirical studies, we demonstrate that our robust approach reduces sensitivity to outliers without substantial degradation in power. The robust approach and associated software package provide a reliable way to quantitatively assess voxelwise correlations between structural and functional neuroimaging modalities. Copyright © 2011 Elsevier Inc. All rights reserved.
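
    The robust-regression ingredient can be illustrated in a few lines with statsmodels (Python; the single 'voxel' below, with two artificial outliers standing in for mis-registered subjects, is invented for the sketch):

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(6)

        # One voxel's worth of data: regress a functional measure on a
        # structural one across 40 subjects, with two outlying subjects.
        struct = rng.normal(0.0, 1.0, 40)
        func = 0.5 * struct + rng.normal(0.0, 0.3, 40)
        func[:2] += 4.0                      # artifactual outliers

        X = sm.add_constant(struct)
        ols = sm.OLS(func, X).fit()
        rlm = sm.RLM(func, X, M=sm.robust.norms.HuberT()).fit()  # robust fit

        print("OLS slope:   %.3f (p=%.3g)" % (ols.params[1], ols.pvalues[1]))
        print("Huber slope: %.3f (p=%.3g)" % (rlm.params[1], rlm.pvalues[1]))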

  9. Non-parametric analysis of rating transition and default data

    DEFF Research Database (Denmark)

    Fledelius, Peter; Lando, David; Perch Nielsen, Jens

    2004-01-01

    We demonstrate the use of non-parametric intensity estimation - including construction of pointwise confidence sets - for analyzing rating transition data. We find that transition intensities away from the class studied here for illustration strongly depend on the direction of the previous move, but that this dependence vanishes after 2-3 years.

  10. A non-parametric model for the cosmic velocity field

    NARCIS (Netherlands)

    Branchini, E; Teodoro, L; Frenk, CS; Schmoldt; Efstathiou, G; White, SDM; Saunders, W; Sutherland, W; Rowan-Robinson, M; Keeble, O; Tadros, H; Maddox, S; Oliver, S

    1999-01-01

    We present a self-consistent non-parametric model of the local cosmic velocity field derived from the distribution of IRAS galaxies in the PSCz redshift survey. The survey has been analysed using two independent methods, both based on the assumptions of gravitational instability and linear biasing.

  11. Non-parametric Bayesian inference for inhomogeneous Markov point processes

    DEFF Research Database (Denmark)

    Berthelsen, Kasper Klitgaard; Møller, Jesper

    With reference to a specific data set, we consider how to perform a flexible non-parametric Bayesian analysis of an inhomogeneous point pattern modelled by a Markov point process, with a location dependent first order term and pairwise interaction only. A priori we assume that the first order term...

  12. Non-parametric analysis of rating transition and default data

    DEFF Research Database (Denmark)

    Fledelius, Peter; Lando, David; Perch Nielsen, Jens

    2004-01-01

    We demonstrate the use of non-parametric intensity estimation - including construction of pointwise confidence sets - for analyzing rating transition data. We find that transition intensities away from the class studied here for illustration strongly depend on the direction of the previous move...

  13. A Comparison of Parametric and Non-Parametric Methods Applied to a Likert Scale.

    Science.gov (United States)

    Mircioiu, Constantin; Atkinson, Jeffrey

    2017-05-10

    A trenchant and passionate dispute over the use of parametric versus non-parametric methods for the analysis of Likert scale ordinal data has raged for the past eight decades. The answer is not a simple "yes" or "no" but is related to hypotheses, objectives, risks, and paradigms. In this paper, we took a pragmatic approach. We applied both types of methods to the analysis of actual Likert data on responses from different professional subgroups of European pharmacists regarding competencies for practice. Results obtained show that with "large" (>15) numbers of responses and similar (but clearly not normal) distributions from different subgroups, parametric and non-parametric analyses give in almost all cases the same significant or non-significant results for inter-subgroup comparisons. Parametric methods were more discriminant in the cases of non-similar conclusions. Considering that the largest differences in opinions occurred in the upper part of the 4-point Likert scale (ranks 3 "very important" and 4 "essential"), a "score analysis" based on this part of the data was undertaken. This transformation of the ordinal Likert data into binary scores produced a graphical representation that was visually easier to understand as differences were accentuated. In conclusion, in this case of Likert ordinal data with high response rates, restraining the analysis to non-parametric methods leads to a loss of information. The addition of parametric methods, graphical analysis, analysis of subsets, and transformation of data leads to more in-depth analyses.
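
    A compact illustration of running both families of tests, plus the binary 'score analysis', on simulated Likert responses (Python; the subgroup response frequencies are invented, not the surveyed pharmacists' data):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)

        # Two professional subgroups rating one competency on a 4-point
        # Likert scale (1 = not important ... 4 = essential).
        grp_a = rng.choice([1, 2, 3, 4], 120, p=[0.05, 0.15, 0.40, 0.40])
        grp_b = rng.choice([1, 2, 3, 4], 90, p=[0.10, 0.25, 0.40, 0.25])

        # Parametric vs. non-parametric comparison of the two subgroups.
        _, p_t = stats.ttest_ind(grp_a, grp_b)
        _, p_u = stats.mannwhitneyu(grp_a, grp_b, alternative="two-sided")
        print(f"t-test p = {p_t:.4f}; Mann-Whitney p = {p_u:.4f}")

        # 'Score analysis': collapse ranks 3-4 ('very important'/'essential')
        # into a binary score and compare the two proportions.
        k = np.array([(grp_a >= 3).sum(), (grp_b >= 3).sum()])
        n = np.array([grp_a.size, grp_b.size])
        chi2, p_chi, _, _ = stats.chi2_contingency(np.column_stack([k, n - k]))
        print(f"upper-scale proportions {k[0]/n[0]:.2f} vs {k[1]/n[1]:.2f}; "
              f"chi-square p = {p_chi:.4f}")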

  14. Using non-parametric methods in econometric production analysis

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    2012-01-01

    Econometric estimation of production functions is one of the most common methods in applied economic production analysis. These studies usually apply parametric estimation techniques, which obligate the researcher to specify a functional form of the production function, of which the Cobb ... by investigating the relationship between the elasticity of scale and the farm size. We use a balanced panel data set of 371 specialised crop farms for the years 2004-2007. A non-parametric specification test shows that neither the Cobb-Douglas function nor the Translog function are consistent with the "true" ... parameter estimates, but also in biased measures which are derived from the parameters, such as elasticities. Therefore, we propose to use non-parametric econometric methods. First, these can be applied to verify the functional form used in parametric production analysis. Second, they can be directly used ...

  15. Non-parametric versus parametric methods in environmental sciences

    Directory of Open Access Journals (Sweden)

    Muhammad Riaz

    2016-01-01

    Full Text Available This report intends to highlight the importance of considering the background assumptions required for the analysis of real datasets in different disciplines. We provide a comparative discussion of parametric methods (which depend on distributional assumptions, like normality) relative to non-parametric methods (which are free from many distributional assumptions). We have chosen a real dataset from environmental sciences (one of the application areas). The findings may be extended to other disciplines following the same spirit.

  16. Non-parametric Morphologies of Mergers in the Illustris Simulation

    CERN Document Server

    Bignone, Lucas A; Sillero, Emanuel; Pedrosa, Susana E; Pellizza, Leonardo J; Lambas, Diego G

    2016-01-01

    We study non-parametric morphologies of merger events in a cosmological context, using the Illustris project. We produce mock g-band images comparable to observational surveys from the publicly available Illustris simulation idealized mock images at $z=0$. We then measure non-parametric indicators: asymmetry, Gini, $M_{20}$, clumpiness and concentration for a set of galaxies with $M_* >10^{10}$ M$_\odot$. We correlate these automatic statistics with the recent merger history of galaxies and with the presence of close companions. Our main contribution is to assess, in a cosmological framework, the empirically derived non-parametric demarcation line and average time-scales used to determine the merger rate observationally. We found that 98 per cent of galaxies above the demarcation line have a close companion or have experienced a recent merger event. On average, merger signatures obtained from the $G-M_{20}$ criteria anticorrelate clearly with the time elapsed since the last merger event. We also find that the a...
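
    Of the indicators listed, the Gini coefficient is the quickest to sketch (Python; the two toy 'images' below are invented, and the estimator follows the standard sorted-flux formula):

        import numpy as np

        rng = np.random.default_rng(13)

        def gini(flux):
            """Gini coefficient of pixel fluxes: ~0 for uniformly spread
            light, approaching 1 when flux is concentrated in few pixels."""
            f = np.sort(np.abs(flux).ravel())
            n = f.size
            i = np.arange(1, n + 1)
            return ((2 * i - n - 1) * f).sum() / (f.mean() * n * (n - 1))

        # Toy 'galaxy images': a smooth disc and a double-nucleus merger.
        yy, xx = np.mgrid[-32:32, -32:32]
        disc = np.exp(-np.hypot(xx, yy) / 8)
        merger = (np.exp(-np.hypot(xx - 8, yy) / 3)
                  + np.exp(-np.hypot(xx + 8, yy) / 3))
        for name, img in [("disc", disc), ("merger", merger)]:
            img = img + rng.normal(0.0, 1e-3, img.shape)
            print(f"{name}: Gini = {gini(img):.3f}")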

  17. Posterior contraction rate for non-parametric Bayesian estimation of the dispersion coefficient of a stochastic differential equation

    NARCIS (Netherlands)

    Gugushvili, S.; Spreij, P.

    2016-01-01

    We consider the problem of non-parametric estimation of the deterministic dispersion coefficient of a linear stochastic differential equation based on discrete-time observations of its solution. We take a Bayesian approach to the problem and, under suitable regularity assumptions, derive the posterior...

  18. A Non-Parametric Approach to Estimating Location Parameters under Simple Order%简单半序约束下估计位置参数的一个非参方法

    Institute of Scientific and Technical Information of China (English)

    孙旭

    2005-01-01

    This paper deals with estimating parameters under simple order when samples come from location models. Based on the idea of the Hodges and Lehmann estimator (H-L estimator), a new approach to estimating parameters is proposed, which differs from the classical L1 isotonic regression and L2 isotonic regression. An algorithm to compute the estimators is given. Simulation by the Monte Carlo method is applied to compare the likelihood functions with respect to the L1 estimators and the weighted isotonic H-L estimators.

  19. Generalized Correlation Coefficient for Non-Parametric Analysis of Microarray Time-Course Data.

    Science.gov (United States)

    Tan, Qihua; Thomassen, Mads; Burton, Mark; Mose, Kristian Fredløv; Andersen, Klaus Ejner; Hjelmborg, Jacob; Kruse, Torben

    2017-06-06

    Modeling complex time-course patterns is a challenging issue in microarray study due to complex gene expression patterns in response to the time-course experiment. We introduce the generalized correlation coefficient and propose a combinatory approach for detecting, testing and clustering the heterogeneous time-course gene expression patterns. Application of the method identified nonlinear time-course patterns in high agreement with parametric analysis. We conclude that the non-parametric nature of the generalized correlation analysis could make it a useful and efficient tool for analyzing microarray time-course data and for exploring the complex relationships in the omics data for studying their association with disease and health.

  20. Generalized Correlation Coefficient for Non-Parametric Analysis of Microarray Time-Course Data

    DEFF Research Database (Denmark)

    Tan, Qihua; Thomassen, Mads; Burton, Mark

    2017-01-01

    Modeling complex time-course patterns is a challenging issue in microarray study due to complex gene expression patterns in response to the time-course experiment. We introduce the generalized correlation coefficient and propose a combinatory approach for detecting, testing and clustering the heterogeneous time-course gene expression patterns. Application of the method identified nonlinear time-course patterns in high agreement with parametric analysis. We conclude that the non-parametric nature of the generalized correlation analysis could make it a useful and efficient tool for analyzing microarray time-course data and for exploring the complex relationships in the omics data for studying their association with disease and health.

  1. Generalized Correlation Coefficient for Non-Parametric Analysis of Microarray Time-Course Data

    DEFF Research Database (Denmark)

    Tan, Qihua; Thomassen, Mads; Burton, Mark

    2017-01-01

    Modeling complex time-course patterns is a challenging issue in microarray study due to complex gene expression patterns in response to the time-course experiment. We introduce the generalized correlation coefficient and propose a combinatory approach for detecting, testing and clustering the heterogeneous time-course gene expression patterns. Application of the method identified nonlinear time-course patterns in high agreement with parametric analysis. We conclude that the non-parametric nature of the generalized correlation analysis could make it a useful and efficient tool for analyzing microarray ...

  2. Digital spectral analysis parametric, non-parametric and advanced methods

    CERN Document Server

    Castanié, Francis

    2013-01-01

    Digital Spectral Analysis provides a single source that offers complete coverage of the spectral analysis domain. This self-contained work includes details on advanced topics that are usually presented in scattered sources throughout the literature. The theoretical principles necessary for the understanding of spectral analysis are discussed in the first four chapters: fundamentals, digital signal processing, estimation in spectral analysis, and time-series models. An entire chapter is devoted to the non-parametric methods most widely used in industry. High resolution methods a...

  3. 基于非参数Malmquist指数方法的我国基础设施投资生产率研究%A study of the Malmquist productivity index of infrastructure investments using a non-parametric approach

    Institute of Scientific and Technical Information of China (English)

    李玉龙; 李忠富

    2011-01-01

    Gross regional product, per capita gross regional product and urbanization level were regarded as indicators measuring the output of infrastructure investments; using multiple indicators reflects the performance of infrastructure investments more comprehensively. The Malmquist productivity index of China's infrastructure investments over 1999-2006 was calculated by applying the non-parametric DEA approach and used to analyse and evaluate infrastructure investment efficiency, overcoming the deficiency of parametric methods, which can measure infrastructure investment performance with only a single indicator. The results showed that the pulling effect of infrastructure investments on economic development is gradually diminishing, and that the focus has shifted from infrastructure investments promoting economic development in the eastern coastal areas towards the role of infrastructure investments in promoting the economic development of the central and western regions. At the same time, the scale efficiency of infrastructure investments has improved year by year, while the pure technical efficiency of infrastructure allocation in each region has changed little, indicating that the current scale of infrastructure investment is appropriate and that the key to reversing declining productivity is to promote technological innovation in infrastructure investment and allocation.

  4. Using Mathematica to build Non-parametric Statistical Tables

    Directory of Open Access Journals (Sweden)

    Gloria Perez Sainz de Rozas

    2003-01-01

    Full Text Available In this paper, I present computational procedures to obtain statistical tables: the tables of the asymptotic distribution and the exact distribution of the Kolmogorov-Smirnov statistic Dn for one population, the table of the distribution of the runs statistic R, the table of the distribution of the Wilcoxon signed-rank statistic W+ and the table of the distribution of the Mann-Whitney statistic Ux, using Mathematica, Version 3.9 under Windows 98. I think this is an interesting question because many statistical packages give only the asymptotic significance level in statistical tests, and with these procedures one can easily calculate the exact significance levels and the left-tail and right-tail probabilities of non-parametric distributions. I have used Mathematica to make these calculations because one can use its symbolic language to solve recursion relations. It is very easy to generate the format of the tables, and it is possible to obtain any table of the mentioned non-parametric distributions with any precision, not only with the standard parameters most used in statistics, and without transcription mistakes. Furthermore, using similar procedures, we can generate tables for the following distribution functions: Binomial, Poisson, Hypergeometric, Normal, χ² Chi-Square, T-Student, F-Snedecor, Geometric, Gamma and Beta.
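
    The recursion behind such exact tables is compact in any language; a sketch in Python (rather than Mathematica) for the null distribution of the Wilcoxon signed-rank statistic W+:

        def wilcoxon_counts(n):
            """Exact null distribution of the Wilcoxon signed-rank statistic
            W+ for sample size n, via the classical recursion
                f_k(w) = f_{k-1}(w) + f_{k-1}(w - k),
            i.e. rank k is either excluded from or included in the set of
            positive ranks."""
            counts = [1]                       # f_0: only w = 0 is possible
            for k in range(1, n + 1):
                new = [0] * (len(counts) + k)
                for w, c in enumerate(counts):
                    new[w] += c                # rank k not in the positive set
                    new[w + k] += c            # rank k in the positive set
                counts = new
            return counts                      # counts[w]/2**n = P(W+ = w)

        n = 10
        counts = wilcoxon_counts(n)
        total = 2 ** n
        # Exact left-tail probabilities, as they appear in statistical tables.
        cum = 0
        for w, c in enumerate(counts[:11]):
            cum += c
            print(f"P(W+ <= {w:2d}) = {cum / total:.5f}")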

  5. Using non-parametric methods in econometric production analysis

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    2012-01-01

    Econometric estimation of production functions is one of the most common methods in applied economic production analysis. These studies usually apply parametric estimation techniques, which obligate the researcher to specify a functional form of the production function, of which the Cobb-Douglas a... parameter estimates, but also in biased measures which are derived from the parameters, such as elasticities. Therefore, we propose to use non-parametric econometric methods. First, these can be applied to verify the functional form used in parametric production analysis. Second, they can be directly used to estimate production functions without the specification of a functional form. Therefore, they avoid possible misspecification errors due to the use of an unsuitable functional form. In this paper, we use parametric and non-parametric methods to identify the optimal size of Polish crop farms...

  6. Non-parametric transformation for data correlation and integration: From theory to practice

    Energy Technology Data Exchange (ETDEWEB)

    Datta-Gupta, A.; Xue, Guoping; Lee, Sang Heon [Texas A&M Univ., College Station, TX (United States)]

    1997-08-01

    The purpose of this paper is two-fold. First, we introduce the use of non-parametric transformations for correlating petrophysical data during reservoir characterization. Such transformations are completely data driven and do not require an a priori functional relationship between response and predictor variables, as is the case with traditional multiple regression. The transformations are very general, computationally efficient and can easily handle mixed data types, for example, continuous variables such as porosity and permeability, and categorical variables such as rock type and lithofacies. The power of the non-parametric transformation techniques for data correlation has been illustrated through synthetic and field examples. Second, we utilize these transformations to propose a two-stage approach for data integration during heterogeneity characterization. The principal advantages of our approach over traditional cokriging or cosimulation methods are: (1) it does not require a linear relationship between primary and secondary data, (2) it exploits the secondary information to its fullest potential by maximizing the correlation between the primary and secondary data, (3) it can be easily applied to cases where several types of secondary or soft data are involved, and (4) it significantly reduces variance function calculations and thus greatly facilitates non-Gaussian cosimulation. We demonstrate the data integration procedure using synthetic and field examples. The field example involves estimation of pore-footage distribution using well data and multiple seismic attributes.

  7. Non-parametric estimation of Fisher information from real data

    CERN Document Server

    Shemesh, Omri Har; Miñano, Borja; Hoekstra, Alfons G; Sloot, Peter M A

    2015-01-01

    The Fisher Information matrix is a widely used measure for applications ranging from statistical inference, information geometry, experiment design, to the study of criticality in biological systems. Yet there is no commonly accepted non-parametric algorithm to estimate it from real data. In this rapid communication we show how to accurately estimate the Fisher information in a nonparametric way. We also develop a numerical procedure to minimize the errors by choosing the interval of the finite difference scheme necessary to compute the derivatives in the definition of the Fisher information. Our method uses the recently published "Density Estimation using Field Theory" algorithm to compute the probability density functions for continuous densities. We use the Fisher information of the normal distribution to validate our method and as an example we compute the temperature component of the Fisher Information Matrix in the two dimensional Ising model and show that it obeys the expected relation to the heat capa...

  8. A Non-Parametric Spatial Independence Test Using Symbolic Entropy

    Directory of Open Access Journals (Sweden)

    López Hernández, Fernando

    2008-01-01

    Full Text Available In the present paper, we construct a new, simple, consistent and powerful test for spatial independence, called the SG test, by using symbolic dynamics and symbolic entropy as a measure of spatial dependence. We also give a standard asymptotic distribution of an affine transformation of the symbolic entropy under the null hypothesis of independence in the spatial process. The test statistic and its standard limit distribution, with the proposed symbolization, are invariant to any monotonic transformation of the data. The test applies to discrete or continuous distributions. Given that the test is based on entropy measures, it avoids smoothed non-parametric estimation. We include a Monte Carlo study of our test, together with the well-known Moran's I, the SBDS (de Graaff et al., 2001) and the (Brett and Pinkse, 1997) non-parametric tests, in order to illustrate our approach.

  9. Multi-Directional Non-Parametric Analysis of Agricultural Efficiency

    DEFF Research Database (Denmark)

    Balezentis, Tomas

    This thesis seeks to develop methodologies for the assessment of agricultural efficiency and to apply them to Lithuanian family farms. In particular, we focus on three objectives throughout the research: (i) to perform a fully non-parametric analysis of efficiency effects, (ii) to extend ... relative to labour, intermediate consumption and land (in some cases land was not treated as a discretionary input). These findings call for further research on the relationships among financial structure, investment decisions, and efficiency in Lithuanian family farms. Application of different techniques ... of stochasticity associated with Lithuanian family farm performance. The former technique showed that the farms differed in terms of the mean values and variance of the efficiency scores over time, with some clear patterns prevailing throughout the whole research period. The fuzzy Free Disposal Hull showed ...

  10. Application of the LSQR algorithm in non-parametric estimation of aerosol size distribution

    Science.gov (United States)

    He, Zhenzong; Qi, Hong; Lew, Zhongyuan; Ruan, Liming; Tan, Heping; Luo, Kun

    2016-05-01

    Based on the Least Squares QR decomposition (LSQR) algorithm, the aerosol size distribution (ASD) is retrieved in a non-parametric approach. The direct problem is solved by the Anomalous Diffraction Approximation (ADA) and the Lambert-Beer Law. An optimal wavelength selection method is developed to improve the retrieval accuracy of the ASD. The proposed optimal wavelength set is selected by a method which makes the measurement signals sensitive to wavelength and effectively decreases the degree of ill-conditioning of the coefficient matrix of the linear systems, enhancing the anti-interference ability of the retrieval results. Two common kinds of monomodal and bimodal ASDs, log-normal (L-N) and Gamma distributions, are estimated, respectively. Numerical tests show that the LSQR algorithm can be successfully applied to retrieve the ASD with high stability in the presence of random noise and low susceptibility to the shape of distributions. Finally, the experimentally measured ASD over Harbin, China is recovered reasonably. All the results confirm that the LSQR algorithm combined with the optimal wavelength selection method is an effective and reliable technique for non-parametric estimation of the ASD.
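
    The inversion step can be sketched with SciPy's LSQR (Python). The forward kernel below is a made-up smooth matrix standing in for the ADA/Beer-law model, so only the numerical pattern, not the physics, is illustrated:

        import numpy as np
        from scipy.sparse.linalg import lsqr

        rng = np.random.default_rng(8)

        # Toy linear forward model tau = K f: extinction at n_wl wavelengths
        # from a size distribution f discretised over n_bins radius bins.
        n_bins, n_wl = 10, 15
        r = np.linspace(0.1, 2.0, n_bins)          # radius (um)
        wl = np.linspace(0.4, 1.0, n_wl)           # wavelength (um)
        K = np.array([[np.pi * ri**2 * np.exp(-ri / w) for ri in r]
                      for w in wl])                # smooth, ill-conditioned

        f_true = np.exp(-0.5 * ((np.log(r) - np.log(0.6)) / 0.4) ** 2)
        tau = K @ f_true + rng.normal(0.0, 1e-3, n_wl)

        # Damped LSQR stabilises the ill-conditioned inversion; the damping
        # parameter plays the role of a regularisation weight.
        f_est = lsqr(K, tau, damp=1e-3)[0]
        rel = np.linalg.norm(f_est - f_true) / np.linalg.norm(f_true)
        print(f"relative reconstruction error: {rel:.3f}")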

  11. Non-parametric iterative model constraint graph min-cut for automatic kidney segmentation.

    Science.gov (United States)

    Freiman, M; Kronman, A; Esses, S J; Joskowicz, L; Sosna, J

    2010-01-01

    We present a new non-parametric model constraint graph min-cut algorithm for automatic kidney segmentation in CT images. The segmentation is formulated as a maximum a-posteriori estimation of a model-driven Markov random field. A non-parametric hybrid shape and intensity model is treated as a latent variable in the energy functional. The latent model and labeling map that minimize the energy functional are then simultaneously computed with an expectation maximization approach. The main advantages of our method are that it does not assume a fixed parametric prior model, which is subject to inter-patient variability and registration errors, and that it combines both the model and the image information into a unified graph min-cut based segmentation framework. We evaluated our method on 20 kidneys from 10 CT datasets with and without contrast agent for which ground-truth segmentations were generated by averaging three manual segmentations. Our method yields an average volumetric overlap error of 10.95%, and average symmetric surface distance of 0.79 mm. These results indicate that our method is accurate and robust for kidney segmentation.

  12. APPLICATION OF PARAMETRIC AND NON-PARAMETRIC BENCHMARKING METHODS IN COST EFFICIENCY ANALYSIS OF THE ELECTRICITY DISTRIBUTION SECTOR

    Directory of Open Access Journals (Sweden)

    Andrea Furková

    2007-06-01

    Full Text Available This paper explores the application of parametric and non-parametric benchmarking methods in measuring the cost efficiency of Slovak and Czech electricity distribution companies. We compare the relative cost efficiency of Slovak and Czech distribution companies using two benchmarking methods: the non-parametric Data Envelopment Analysis (DEA) and the parametric Stochastic Frontier Analysis (SFA). The first part of the analysis was based on DEA models; the traditional cross-section CCR and BCC models were modified for cost efficiency estimation. In the further analysis we focus on two versions of the stochastic frontier cost function using panel data: the MLE model and the GLS model. These models have been applied to an unbalanced panel of 11 (Slovakia 3 and Czech Republic 8) regional electricity distribution utilities over the period from 2000 to 2004. The differences in estimated scores, parameters and rankings of utilities were analyzed. We observed significant differences between the parametric methods and the DEA approach.

  13. Non-parametric and least squares Langley plot methods

    Directory of Open Access Journals (Sweden)

    P. W. Kiedron

    2015-04-01

    Full Text Available Langley plots are used to calibrate sun radiometers, primarily for the measurement of the aerosol component of the atmosphere that attenuates (scatters and absorbs) incoming direct solar radiation. In principle, the calibration of a sun radiometer is a straightforward application of the Bouguer-Lambert-Beer law V = V0 e^(−τ·m), where a plot of ln(V) (voltage) vs. m (air mass) yields a straight line with intercept ln(V0). This ln(V0) subsequently can be used to solve for τ for any measurement of V and calculation of m. This calibration works well on some high mountain sites, but the application of the Langley plot calibration technique is more complicated at other, more interesting, locales. This paper is concerned with ferreting out calibrations at difficult sites and examining and comparing a number of conventional and non-conventional methods for obtaining successful Langley plots. The eleven techniques discussed indicate that both least squares and various non-parametric techniques produce satisfactory calibrations, with no significant differences among them, when the time series of ln(V0) values are smoothed and interpolated with median and mean moving-window filters.
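
    The two fitting routes can be put side by side in a few lines (Python; the synthetic V0, tau and noise level are invented, and a Theil-Sen median-of-slopes fit stands in for the paper's non-parametric family):

        import numpy as np

        rng = np.random.default_rng(9)

        # Synthetic Langley data: V = V0 * exp(-tau * m) with noise,
        # for air masses m between 2 and 6.
        V0_true, tau_true = 1.35, 0.12
        m = np.linspace(2.0, 6.0, 50)
        V = V0_true * np.exp(-tau_true * m) * np.exp(rng.normal(0, 0.01, m.size))

        # Least-squares Langley plot: ln V = ln V0 - tau * m.
        slope, intercept = np.polyfit(m, np.log(V), 1)
        print(f"least squares: V0 = {np.exp(intercept):.3f}, tau = {-slope:.3f}")

        # A simple non-parametric alternative (Theil-type): the median of all
        # pairwise slopes, far less sensitive to outlying points.
        i, j = np.triu_indices(m.size, k=1)
        slopes = (np.log(V)[j] - np.log(V)[i]) / (m[j] - m[i])
        b = np.median(slopes)
        a = np.median(np.log(V) - b * m)
        print(f"Theil-Sen:     V0 = {np.exp(a):.3f}, tau = {-b:.3f}")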

  14. Parametric and non-parametric modeling of short-term synaptic plasticity. Part II: Experimental study.

    Science.gov (United States)

    Song, Dong; Wang, Zhuo; Marmarelis, Vasilis Z; Berger, Theodore W

    2009-02-01

    This paper presents a synergistic parametric and non-parametric modeling study of short-term plasticity (STP) in the Schaffer collateral to hippocampal CA1 pyramidal neuron (SC) synapse. Parametric models in the form of sets of differential and algebraic equations have been proposed on the basis of the current understanding of biological mechanisms active within the system. Non-parametric Poisson-Volterra models are obtained herein from broadband experimental input-output data. The non-parametric model is shown to provide better prediction of the experimental output than a parametric model with a single set of facilitation/depression (FD) process. The parametric model is then validated in terms of its input-output transformational properties using the non-parametric model since the latter constitutes a canonical and more complete representation of the synaptic nonlinear dynamics. Furthermore, discrepancies between the experimentally-derived non-parametric model and the equivalent non-parametric model of the parametric model suggest the presence of multiple FD processes in the SC synapses. Inclusion of an additional set of FD process in the parametric model makes it replicate better the characteristics of the experimentally-derived non-parametric model. This improved parametric model in turn provides the requisite biological interpretability that the non-parametric model lacks.

  15. Parametric modeling of DSC-MRI data with stochastic filtration and optimal input design versus non-parametric modeling.

    Science.gov (United States)

    Kalicka, Renata; Pietrenko-Dabrowska, Anna

    2007-03-01

    In the paper, MRI measurements are used for the assessment of brain tissue perfusion and other features and functions of the brain (cerebral blood flow - CBF, cerebral blood volume - CBV, mean transit time - MTT). Perfusion is an important indicator of tissue viability and functioning, as in pathological tissue blood flow, vascular and tissue structure are altered with respect to normal tissue. MRI enables diagnosing diseases at an early stage of their course. The parametric and non-parametric approaches to the identification of MRI models are presented and compared. The non-parametric modeling adopts gamma variate functions. A parametric three-compartmental catenary model, based on the general kinetic model, is also proposed. The parameters of the models are estimated on the basis of experimental data. The goodness of fit of the gamma variate and the three-compartmental models to the data and the accuracy of the parameter estimates are compared. Kalman filtering, smoothing the measurements, was adopted to improve the estimation accuracy of the parametric model. Parametric modeling gives a better fit and better parameter estimates than non-parametric modeling and allows insight into the functioning of the system. To improve the accuracy, optimal experiment design related to the input signal was performed.
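
    The non-parametric side, a gamma-variate fit to a bolus passage curve, sketches as follows (Python; the curve shape and parameters are invented for illustration):

        import numpy as np
        from scipy.optimize import curve_fit

        rng = np.random.default_rng(10)

        def gamma_variate(t, A, t0, alpha, beta):
            """Gamma-variate bolus model: zero before arrival time t0,
            A * (t - t0)**alpha * exp(-(t - t0)/beta) afterwards."""
            dt = np.clip(t - t0, 0.0, None)
            return A * dt**alpha * np.exp(-dt / beta)

        # Synthetic DSC-MRI concentration-time curve (arbitrary units).
        t = np.arange(0.0, 60.0, 1.5)               # seconds
        c = (gamma_variate(t, 80.0, 10.0, 2.5, 3.0)
             + rng.normal(0.0, 3.0, t.size))

        popt, _ = curve_fit(gamma_variate, t, c, p0=(50.0, 8.0, 2.0, 2.0),
                            maxfev=20000)
        fit = gamma_variate(t, *popt)
        cbv = fit.sum() * 1.5                       # area under curve ~ CBV
        mtt = (t * fit).sum() / fit.sum()           # first moment ~ MTT
        print("fitted (A, t0, alpha, beta):", np.round(popt, 2))
        print(f"CBV ~ {cbv:.1f} a.u.*s, MTT ~ {mtt:.1f} s")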

  16. Non-parametric kernel density estimation of species sensitivity distributions in developing water quality criteria of metals.

    Science.gov (United States)

    Wang, Ying; Wu, Fengchang; Giesy, John P; Feng, Chenglian; Liu, Yuedan; Qin, Ning; Zhao, Yujie

    2015-09-01

    Due to use of different parametric models for establishing species sensitivity distributions (SSDs), comparison of water quality criteria (WQC) for metals of the same group or period in the periodic table is uncertain and results can be biased. To address this inadequacy, a new probabilistic model, based on non-parametric kernel density estimation was developed and optimal bandwidths and testing methods are proposed. Zinc (Zn), cadmium (Cd), and mercury (Hg) of group IIB of the periodic table are widespread in aquatic environments, mostly at small concentrations, but can exert detrimental effects on aquatic life and human health. With these metals as target compounds, the non-parametric kernel density estimation method and several conventional parametric density estimation methods were used to derive acute WQC of metals for protection of aquatic species in China that were compared and contrasted with WQC for other jurisdictions. HC5 values for protection of different types of species were derived for three metals by use of non-parametric kernel density estimation. The newly developed probabilistic model was superior to conventional parametric density estimations for constructing SSDs and for deriving WQC for these metals. HC5 values for the three metals were inversely proportional to atomic number, which means that the heavier atoms were more potent toxicants. The proposed method provides a novel alternative approach for developing SSDs that could have wide application prospects in deriving WQC and use in assessment of risks to ecosystems.
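
    A minimal sketch of the core idea (our illustration, with made-up toxicity values rather than the paper's data sets): build the SSD as a kernel density estimate over log-transformed species toxicity values and read off HC5 as the 5th percentile of the estimated distribution:

        import numpy as np
        from scipy.stats import gaussian_kde

        # Hypothetical acute toxicity endpoints (ug/L) for ten species.
        toxicity = np.array([12., 18., 25., 40., 55., 90., 130., 210., 400., 800.])
        log_tox = np.log10(toxicity)

        # Non-parametric SSD: Gaussian KDE on log-toxicity (Scott's rule bandwidth).
        kde = gaussian_kde(log_tox)

        # HC5 = concentration hazardous to 5% of species: numerically invert the CDF.
        grid = np.linspace(log_tox.min() - 1.0, log_tox.max() + 1.0, 2000)
        cdf = np.cumsum(kde(grid))
        cdf /= cdf[-1]
        hc5 = 10.0 ** grid[np.searchsorted(cdf, 0.05)]
        print("HC5 ~ %.1f ug/L" % hc5)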

  17. Non-parametric combination and related permutation tests for neuroimaging.

    Science.gov (United States)

    Winkler, Anderson M; Webster, Matthew A; Brooks, Jonathan C; Tracey, Irene; Smith, Stephen M; Nichols, Thomas E

    2016-04-01

    In this work, we show how permutation methods can be applied to combination analyses such as those that include multiple imaging modalities, multiple data acquisitions of the same modality, or simply multiple hypotheses on the same data. Using the well-known definition of union-intersection tests and closed testing procedures, we use synchronized permutations to correct for such multiplicity of tests, allowing the flexibility to integrate imaging data with different spatial resolutions and with surface- and/or volume-based representations of the brain, as well as non-imaging data. For the problem of joint inference, we propose and evaluate a modification of the recently introduced non-parametric combination (NPC) methodology, such that instead of a two-phase algorithm and large data storage requirements, the inference can be performed in a single phase, with reasonable computational demands. The method compares favorably to classical multivariate tests (such as MANCOVA), even when the latter are assessed using permutations. We also evaluate, in the context of permutation tests, various combining methods that have been proposed in the past decades, and identify those that provide the best control over error rate and power across a range of situations. We show that one of these, the method of Tippett, provides a link between correction for the multiplicity of tests and their combination. Finally, we discuss how the correction can solve certain problems of multiple comparisons in one-way ANOVA designs, and how the combination is distinguished from conjunctions, even though both can be assessed using permutation tests. We also provide a common algorithm that accommodates combination and correction.
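
    The following toy sketch (ours; it compresses the NPC idea into a few lines and is not the paper's optimized single-phase algorithm) illustrates synchronized sign-flip permutations of two partial tests on the same subjects, combined with the Fisher and Tippett functions:

        import numpy as np

        rng = np.random.default_rng(2)
        n, B = 30, 2000
        mod1 = rng.normal(0.4, 1.0, n)          # subject effects, modality 1
        mod2 = rng.normal(0.3, 1.0, n)          # subject effects, modality 2

        obs = np.array([mod1.mean(), mod2.mean()])
        perm = np.empty((B, 2))
        for b in range(B):
            flips = rng.choice([-1.0, 1.0], n)  # one sign-flip vector per draw,
            perm[b] = [(flips * mod1).mean(),   # applied to BOTH modalities:
                       (flips * mod2).mean()]   # this synchronizes the tests

        # Partial p-values for the observed statistics and for every permutation.
        p_obs = (1 + (perm >= obs).sum(0)) / (B + 1)
        p_perm = (1 + (perm[:, None, :] >= perm[None, :, :]).sum(0)) / (B + 1)

        combine = {"Fisher": lambda p: -2 * np.log(p).sum(-1),
                   "Tippett": lambda p: 1 - p.min(-1)}
        for name, f in combine.items():
            p_joint = (1 + (f(p_perm) >= f(p_obs)).sum()) / (B + 1)
            print(name, "joint p =", p_joint)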

  18. Non-parametric frequency analysis of extreme values for integrated disaster management considering probable maximum events

    Science.gov (United States)

    Takara, K. T.

    2015-12-01

    This paper describes a non-parametric frequency analysis method for hydrological extreme-value samples with a size larger than 100, verifying the estimation accuracy with computer-intensive statistics (CIS) resampling methods such as the bootstrap. Probable maximum values are also incorporated into the analysis for extreme events larger than the design level of flood control. Traditional parametric frequency analysis methods for extreme values include the following steps: Step 1: Collecting and checking extreme-value data; Step 2: Enumerating probability distributions that would be fitted well to the data; Step 3: Parameter estimation; Step 4: Testing goodness of fit; Step 5: Checking the variability of quantile (T-year event) estimates by the jackknife resampling method; and Step 6: Selection of the best distribution (final model). The non-parametric method (NPM) proposed here can skip Steps 2, 3, 4 and 6. Comparing traditional parametric methods (PM) with the NPM, this paper shows that PM often underestimates 100-year quantiles for annual maximum rainfall samples with records of more than 100 years. Overestimation examples are also demonstrated. Bootstrap resampling can correct the bias of the NPM and can also quantify the estimation accuracy through the bootstrap standard error. The NPM thus avoids various difficulties encountered in the above-mentioned steps of the traditional PM. Probable maximum events are also incorporated into the NPM as an upper bound of the hydrological variable: probable maximum precipitation (PMP) and probable maximum flood (PMF) can be combined with the NPM as new parameter values. An idea of how to incorporate these values into frequency analysis is proposed for better management of disasters that exceed the design level. The idea stimulates a more integrated approach by geoscientists and statisticians, and encourages practitioners to consider the worst cases of disasters in their disaster management planning and practices.
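
    A compact sketch of the NPM quantile-plus-bootstrap step described above (our illustration; the Gumbel sample stands in for a real annual-maximum rainfall record):

        import numpy as np

        rng = np.random.default_rng(3)
        x = rng.gumbel(100.0, 30.0, 120)        # 120 years of annual maxima (mm)

        T = 100                                 # return period of interest
        p = 1.0 - 1.0 / T

        # Non-parametric point estimate: empirical quantile of the sample.
        q_hat = np.quantile(x, p)

        # Bootstrap resampling: bias correction and standard error of the estimate.
        boot = np.array([np.quantile(rng.choice(x, x.size), p) for _ in range(4000)])
        q_corrected = q_hat - (boot.mean() - q_hat)
        print(f"100-year event ~ {q_corrected:.1f} mm (bootstrap SE {boot.std(ddof=1):.1f})")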

  19. An artificial neural network architecture for non-parametric visual odometry in wireless capsule endoscopy

    Science.gov (United States)

    Dimas, George; Iakovidis, Dimitris K.; Karargyris, Alexandros; Ciuti, Gastone; Koulaouzidis, Anastasios

    2017-09-01

    Wireless capsule endoscopy is a non-invasive screening procedure of the gastrointestinal (GI) tract performed with an ingestible capsule endoscope (CE) of the size of a large vitamin pill. Such endoscopes are equipped with a usually low-frame-rate color camera which enables the visualization of the GI lumen and the detection of pathologies. The localization of the commercially available CEs is performed in the 3D abdominal space using radio-frequency (RF) triangulation from external sensor arrays, in combination with transit time estimation. State-of-the-art approaches, such as magnetic localization, which have been experimentally proved more accurate than the RF approach, are still at an early stage. Recently, we have demonstrated that CE localization is feasible using solely visual cues and geometric models. However, such approaches depend on camera parameters, many of which are unknown. In this paper the authors propose a novel non-parametric visual odometry (VO) approach to CE localization based on a feed-forward neural network architecture. The effectiveness of this approach in comparison to state-of-the-art geometric VO approaches is validated using a robotic-assisted in vitro experimental setup.

  20. Continuous/discrete non parametric Bayesian belief nets with UNICORN and UNINET

    NARCIS (Netherlands)

    Cooke, R.M.; Kurowicka, D.; Hanea, A.M.; Morales Napoles, O.; Ababei, D.A.; Ale, B.J.M.; Roelen, A.

    2007-01-01

    Hanea et al. (2006) presented a method for quantifying and computing continuous/discrete non parametric Bayesian Belief Nets (BBN). Influences are represented as conditional rank correlations, and the joint normal copula enables rapid sampling and conditionalization. Further mathematical background

  1. Kernel bandwidth estimation for non-parametric density estimation: a comparative study

    CSIR Research Space (South Africa)

    Van der Walt, CM

    2013-12-01

    We investigate the performance of conventional bandwidth estimators for non-parametric kernel density estimation on a number of representative pattern-recognition tasks, to gain a better understanding of the behaviour of these estimators in high...

  2. Non-Parametric Evolutionary Algorithm for Estimating Root Zone Soil Moisture

    Science.gov (United States)

    Mohanty, B.; Shin, Y.; Ines, A. M.

    2013-12-01

    Prediction of root zone soil moisture is critical for water resources management. In this study, we explored a non-parametric evolutionary algorithm for estimating root zone soil moisture from a time series of spatially-distributed rainfall across multiple weather locations in two different hydro-climatic regions. A new genetic algorithm-based hidden Markov model (HMMGA) was developed to estimate long-term root zone soil moisture dynamics at different soil depths. Also, we analyzed the rainfall occurrence probabilities and dry/wet spell lengths reproduced by this approach. The HMMGA was used to estimate the optimal state sequences (weather states) based on the precipitation history. Historical root zone soil moisture statistics were then determined based on the weather state conditions. To test the new approach, we selected two different soil moisture fields, Oklahoma (130 km x 130 km) and Illinois (300 km x 500 km), during 1995 to 2009 and 1994 to 2010, respectively. We found that the newly developed framework performed well in predicting root zone soil moisture dynamics at both spatial scales. Also, the reproduced rainfall occurrence probabilities and dry/wet spell lengths matched the observations well across spatio-temporal scales. Since the proposed algorithm requires only precipitation and historical soil moisture data from existing, established weather stations, it can serve as an attractive alternative for predicting root zone soil moisture in the future using climate change scenarios and root zone soil moisture history.

  3. Structuring feature space: a non-parametric method for volumetric transfer function generation.

    Science.gov (United States)

    Maciejewski, Ross; Woo, Insoo; Chen, Wei; Ebert, David S

    2009-01-01

    The use of multi-dimensional transfer functions for direct volume rendering has been shown to be an effective means of extracting materials and their boundaries for both scalar and multivariate data. The most common multi-dimensional transfer function consists of a two-dimensional (2D) histogram with axes representing a subset of the feature space (e.g., value vs. value gradient magnitude), with each entry in the 2D histogram being the number of voxels at a given feature space pair. Users then assign color and opacity to the voxel distributions within the given feature space through the use of interactive widgets (e.g., box, circular, triangular selection). Unfortunately, such tools lead users through a trial-and-error approach as they assess which data values within the feature space map to a given area of interest within the volumetric space. In this work, we propose the addition of non-parametric clustering within the transfer function feature space in order to extract patterns and guide transfer function generation. We apply a non-parametric kernel density estimation to group voxels of similar features within the 2D histogram. These groups are then binned and colored based on their estimated density, and the user may interactively grow and shrink the binned regions to explore feature boundaries and extract regions of interest. We also extend this scheme to temporal volumetric data in which time steps of 2D histograms are composited into a histogram volume. A three-dimensional (3D) density estimation is then applied, and users can explore regions within the feature space across time without adjusting the transfer function at each time step. Our work enables users to effectively explore the structures found within a feature space of the volume and provide a context in which the user can understand how these structures relate to their volumetric data. We provide tools for enhanced exploration and manipulation of the transfer function, and we show that the initial
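
    Although the record above is clipped, the clustering step it describes can be sketched in a few lines (our illustration on a synthetic two-material volume; names and thresholds are arbitrary):

        import numpy as np
        from scipy.stats import gaussian_kde

        # Synthetic volume with two "materials" plus noise.
        rng = np.random.default_rng(4)
        vol = np.where(rng.random((32, 32, 32)) < 0.5, 0.3, 0.7)
        vol += rng.normal(0.0, 0.02, vol.shape)

        value = vol.ravel()
        gradmag = np.linalg.norm(np.gradient(vol), axis=0).ravel()

        # Non-parametric density over the 2D feature space (value, |gradient|),
        # fitted on a subsample of voxels for speed.
        idx = rng.choice(value.size, 4000, replace=False)
        kde = gaussian_kde(np.vstack([value[idx], gradmag[idx]]))
        density = kde(np.vstack([value, gradmag]))

        # Bin voxels by estimated density: high-density bins gather material
        # interiors, low-density bins gather boundaries -- a starting point
        # for assigning color and opacity in a transfer function.
        bins = np.digitize(density, np.quantile(density, [0.25, 0.5, 0.75, 0.95]))
        print("voxels per density bin:", np.bincount(bins))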

  4. THE DARK MATTER PROFILE OF THE MILKY WAY: A NON-PARAMETRIC RECONSTRUCTION

    Energy Technology Data Exchange (ETDEWEB)

    Pato, Miguel [The Oskar Klein Centre for Cosmoparticle Physics, Department of Physics, Stockholm University, AlbaNova, SE-106 91 Stockholm (Sweden); Iocco, Fabio [ICTP South American Institute for Fundamental Research, and Instituto de Física Teórica—Universidade Estadual Paulista (UNESP), Rua Dr. Bento Teobaldo Ferraz 271, 01140-070 São Paulo, SP (Brazil)

    2015-04-10

    We present the results of a new, non-parametric method to reconstruct the Galactic dark matter profile directly from observations. Using the latest kinematic data to track the total gravitational potential and the observed distribution of stars and gas to set the baryonic component, we infer the dark matter contribution to the circular velocity across the Galaxy. The radial derivative of this dynamical contribution is then estimated to extract the dark matter profile. The innovative feature of our approach is that it makes no assumption on the functional form or shape of the profile, thus allowing for a clean determination with no theoretical bias. We illustrate the power of the method by constraining the spherical dark matter profile between 2.5 and 25 kpc away from the Galactic center. The results show that the proposed method, free of widely used assumptions, can already be applied to pinpoint the dark matter distribution in the Milky Way with competitive accuracy, and paves the way for future developments.

  5. Non-parametric Deprojection of Surface Brightness Profiles of Galaxies in Generalised Geometries

    CERN Document Server

    Chakrabarty, Dalia

    2009-01-01

    We present a new Bayesian non-parametric deprojection algorithm DOPING (Deprojection of Observed Photometry using an INverse Gambit), that is designed to extract 3-D luminosity density distributions $\rho$ from observed surface brightness maps $I$, in generalised geometries, while taking into account changes in intrinsic shape with radius, using a penalised likelihood approach and an MCMC optimiser. We provide the most likely solution to the integral equation that represents deprojection of the measured $I$ to $\rho$. In order to keep the solution modular, we choose to express $\rho$ as a function of the line-of-sight (LOS) coordinate $z$. We calculate the extent of the system along the ${\bf z}$-axis, for a given point on the image that lies within an identified isophotal annulus. The extent along the LOS is binned and density is held a constant over each such $z$-bin. The code begins with a seed density and at the beginning of an iterative step, the trial $\rho$ is updated. Comparison of the projection of ...

  6. Multi-Directional Non-Parametric Analysis of Agricultural Efficiency

    DEFF Research Database (Denmark)

    Balezentis, Tomas

the Multi-Directional Efficiency Analysis approach, (iii) to account for uncertainties via the use of probabilistic and fuzzy measures. Therefore, the thesis encompasses six papers dedicated to (the combinations of) these objectives. One of the main contributions of this thesis is a number of extensions... relative to labour, intermediate consumption and land (in some cases land was not treated as a discretionary input). These findings call for further research on relationships among financial structure, investment decisions, and efficiency in Lithuanian family farms. Application of different techniques...

  7. Power of non-parametric linkage analysis in mapping genes contributing to human longevity in long-lived sib-pairs

    DEFF Research Database (Denmark)

    Tan, Qihua; Zhao, J H; Iachine, I

    2004-01-01

    This report investigates the power issue in applying the non-parametric linkage analysis of affected sib-pairs (ASP) [Kruglyak and Lander, 1995: Am J Hum Genet 57:439-454] to localize genes that contribute to human longevity using long-lived sib-pairs. Data were simulated by introducing a recently developed statistical model for measuring marker-longevity associations [Yashin et al., 1999: Am J Hum Genet 65:1178-1193], enabling direct power comparison between linkage and association approaches. The non-parametric linkage (NPL) scores estimated in the region harboring the causal allele are evaluated... in case of a dominant effect. Although the power issue may depend heavily on the true genetic nature of maintaining survival, our study suggests that results from small-scale sib-pair investigations should be interpreted with caution, given the complexity of human longevity....

  8. A non-parametric peak finder algorithm and its application in searches for new physics

    CERN Document Server

    Chekanov, S

    2011-01-01

    We have developed an algorithm for non-parametric fitting and extraction of statistically significant peaks in the presence of statistical and systematic uncertainties. Applications of this algorithm for analysis of high-energy collision data are discussed. In particular, we illustrate how to use this algorithm in general searches for new physics in invariant-mass spectra using pp Monte Carlo simulations.

  9. Non-parametric Estimation of Diffusion-Paths Using Wavelet Scaling Methods

    DEFF Research Database (Denmark)

    Høg, Esben

    In continuous time, diffusion processes have been used for modelling financial dynamics for a long time. For example the Ornstein-Uhlenbeck process (the simplest mean-reverting process) has been used to model non-speculative price processes. We discuss non-parametric estimation of these processes...

  10. Non-parametric system identification from non-linear stochastic response

    DEFF Research Database (Denmark)

    Rüdinger, Finn; Krenk, Steen

    2001-01-01

    An estimation method is proposed for identification of non-linear stiffness and damping of single-degree-of-freedom systems under stationary white noise excitation. Non-parametric estimates of the stiffness and damping along with an estimate of the white noise intensity are obtained by suitable p...

  11. Non-Parametric Bayesian Updating within the Assessment of Reliability for Offshore Wind Turbine Support Structures

    DEFF Research Database (Denmark)

    Ramirez, José Rangel; Sørensen, John Dalsgaard

    2011-01-01

    This work illustrates the updating and incorporation of information in the assessment of fatigue reliability for offshore wind turbines. The new information, coming from external and condition monitoring, can be used to direct updating of the stochastic variables through a non-parametric Bayesian u...

  12. Comparison of reliability techniques of parametric and non-parametric method

    Directory of Open Access Journals (Sweden)

    C. Kalaiselvan

    2016-06-01

    Reliability of a product or system is the probability that the product performs its intended function adequately for the stated period of time under stated operating conditions; it is a function of time. The most widely used nano ceramic capacitors, C0G and X7R, are used in this reliability study to generate the time-to-failure (TTF) data. The time-to-failure data are identified by Accelerated Life Test (ALT) and Highly Accelerated Life Testing (HALT). The tests are conducted at high stress levels to generate more failures within a short interval of time. Parametric and non-parametric reliability methods are used to convert accelerated-condition results to actual conditions. In this paper, a comparative study of parametric and non-parametric methods for analyzing the failure data has been done: the Weibull distribution is adopted for the parametric method, while the Kaplan–Meier and Simple Actuarial methods are adopted for the non-parametric method. The mean time to failure (MTTF) identified under accelerated conditions is the same for the parametric and non-parametric methods, to within a small relative deviation.
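
    Since the record above leans on the Kaplan–Meier estimator, a from-scratch sketch may help (our code; the failure times are invented, and 0 in the event vector marks a censored unit):

        import numpy as np

        # Hypothetical time-to-failure data (hours); event = 0 means censored.
        time  = np.array([120, 150, 150, 210, 280, 340, 400, 400, 520, 610])
        event = np.array([  1,   1,   0,   1,   1,   0,   1,   1,   1,   0])

        def kaplan_meier(time, event):
            """Product-limit estimate of the survival function S(t)."""
            s, steps = 1.0, []
            for t in np.unique(time[event == 1]):        # distinct failure times
                d = np.sum((time == t) & (event == 1))   # failures at t
                n = np.sum(time >= t)                    # at risk just before t
                s *= 1.0 - d / n
                steps.append((t, s))
            return steps

        for t, s in kaplan_meier(time, event):
            print(f"t = {t:3d} h   S(t) = {s:.3f}")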

  13. Non-Parametric Estimation of Diffusion-Paths Using Wavelet Scaling Methods

    DEFF Research Database (Denmark)

    Høg, Esben

    2003-01-01

    In continuous time, diffusion processes have been used for modelling financial dynamics for a long time. For example the Ornstein-Uhlenbeck process (the simplest mean-reverting process) has been used to model non-speculative price processes. We discuss non-parametric estimation of these processes...

  14. A non-parametric method for correction of global radiation observations

    DEFF Research Database (Denmark)

    Bacher, Peder; Madsen, Henrik; Perers, Bengt;

    2013-01-01

    in the observations are corrected. These are errors such as: tilt in the leveling of the sensor, shadowing from surrounding objects, clipping and saturation in the signal processing, and errors from dirt and wear. The method is based on a statistical non-parametric clear-sky model which is applied to both...

  15. Non-parametric production analysis of pesticides use in the Netherlands

    NARCIS (Netherlands)

    Oude Lansink, A.G.J.M.; Silva, E.

    2004-01-01

    Many previous empirical studies on the productivity of pesticides suggest that pesticides are under-utilized in agriculture, despite the generally held belief that these inputs are substantially over-utilized. This paper uses data envelopment analysis (DEA) to calculate non-parametric measures of the

  16. Low default credit scoring using two-class non-parametric kernel density estimation

    CSIR Research Space (South Africa)

    Rademeyer, E

    2016-12-01

    This paper investigates the performance of two-class classification on credit scoring data sets with low default ratios. The standard two-class parametric Gaussian and non-parametric Parzen classifiers are extended, using Bayes’ rule, to include either...

  17. Measuring the influence of networks on transaction costs using a non-parametric regression technique

    DEFF Research Database (Denmark)

    Henningsen, Géraldine; Henningsen, Arne; Henning, Christian H.C.A.

We empirically analyse the effect of networks on productivity using a cross-validated local linear non-parametric regression technique and a data set of 384 farms in Poland. Our empirical study generally supports our hypothesis that networks affect productivity. Large and dense trading networks...

  18. Bayesian Semi- and Non-Parametric Models for Longitudinal Data with Multiple Membership Effects in R

    Directory of Open Access Journals (Sweden)

    Terrance Savitsky

    2014-03-01

    We introduce growcurves for R, which performs analysis of repeated measures multiple membership (MM) data. This data structure arises in studies under which an intervention is delivered to each subject through the subject's participation in a set of multiple elements that characterize the intervention. In our motivating study design, under which subjects receive a group cognitive behavioral therapy (CBT) treatment, an element is a group CBT session and each subject attends multiple sessions that, together, comprise the treatment. The sets of elements, or group CBT sessions, attended by subjects will partly overlap with some of those from other subjects, inducing a dependence in their responses. The growcurves package offers two alternative sets of hierarchical models: 1. Separate terms are specified for multivariate subject and MM element random effects, where the subject effects are modeled under a Dirichlet process prior to produce a semi-parametric construction; 2. A single term is employed to model joint subject-by-MM effects. A fully non-parametric dependent Dirichlet process formulation allows exploration of differences in subject responses across different MM elements. This model allows for borrowing information among subjects who express similar longitudinal trajectories for flexible estimation. growcurves deploys estimation functions to perform posterior sampling under a suite of prior options. An accompanying set of plot functions allows the user to readily extract by-subject growth curves. The design approach intends to anticipate inferential goals with tools that fully extract information from repeated measures data. Computational efficiency is achieved by performing the sampling for estimation functions using compiled C++ code.

  1. Non-parametric Bayesian human motion recognition using a single MEMS tri-axial accelerometer.

    Science.gov (United States)

    Ahmed, M Ejaz; Song, Ju Bin

    2012-09-27

    In this paper, we propose a non-parametric clustering method to recognize the number of human motions using features which are obtained from a single microelectromechanical system (MEMS) accelerometer. Since the number of human motions under consideration is not known a priori and because of the unsupervised nature of the proposed technique, there is no need to collect training data for the human motions. The infinite Gaussian mixture model (IGMM) and collapsed Gibbs sampler are adopted to cluster the human motions using extracted features. From the experimental results, we show that the unanticipated human motions are detected and recognized with significant accuracy, as compared with the parametric Fuzzy C-Mean (FCM) technique, the unsupervised K-means algorithm, and the non-parametric mean-shift method.
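
    The same "let the data pick the number of clusters" behaviour can be approximated off the shelf. The sketch below (ours) uses scikit-learn's truncated Dirichlet-process mixture, which relies on variational inference rather than the collapsed Gibbs sampler used in the paper; the 2-D features are synthetic stand-ins for accelerometer features:

        import numpy as np
        from sklearn.mixture import BayesianGaussianMixture

        # Three unknown "motions" in a 2-D feature space.
        rng = np.random.default_rng(5)
        X = np.vstack([rng.normal(mu, 0.15, (100, 2))
                       for mu in ([0.0, 0.0], [1.0, 1.0], [0.0, 1.5])])

        # Truncated Dirichlet-process mixture: surplus components get weights
        # near zero, so the number of motions need not be fixed in advance.
        dpgmm = BayesianGaussianMixture(
            n_components=10,                    # truncation level, not the answer
            weight_concentration_prior_type="dirichlet_process",
            covariance_type="full", max_iter=500, random_state=0,
        ).fit(X)

        labels = dpgmm.predict(X)
        print("motions discovered:", np.unique(labels).size)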

  2. Non-Parametric Bayesian Human Motion Recognition Using a Single MEMS Tri-Axial Accelerometer

    Directory of Open Access Journals (Sweden)

    M. Ejaz Ahmed

    2012-09-01

    In this paper, we propose a non-parametric clustering method to recognize the number of human motions using features which are obtained from a single microelectromechanical system (MEMS) accelerometer. Since the number of human motions under consideration is not known a priori and because of the unsupervised nature of the proposed technique, there is no need to collect training data for the human motions. The infinite Gaussian mixture model (IGMM) and collapsed Gibbs sampler are adopted to cluster the human motions using extracted features. From the experimental results, we show that the unanticipated human motions are detected and recognized with significant accuracy, as compared with the parametric Fuzzy C-Mean (FCM) technique, the unsupervised K-means algorithm, and the non-parametric mean-shift method.

  3. Non-Parametric Tests of Structure for High Angular Resolution Diffusion Imaging in Q-Space

    CERN Document Server

    Olhede, Sofia C

    2010-01-01

    High angular resolution diffusion imaging data is the observed characteristic function for the local diffusion of water molecules in tissue. This data is used to infer structural information in brain imaging. Non-parametric scalar measures are proposed to summarize such data, and to locally characterize spatial features of the diffusion probability density function (PDF), relying on the geometry of the characteristic function. Summary statistics are defined so that their distributions are, to first order, both independent of nuisance parameters and also analytically tractable. The dominant direction of the diffusion at a spatial location (voxel) is determined, and a new set of axes are introduced in Fourier space. Variation quantified in these axes determines the local spatial properties of the diffusion density. Non-parametric hypothesis tests for determining whether the diffusion is unimodal, isotropic or multi-modal are proposed. More subtle characteristics of white-matter microstructure, such as the degre...

  4. Variable selection in identification of a high dimensional nonlinear non-parametric system

    Institute of Scientific and Technical Information of China (English)

    Er-Wei BAI; Wenxiao ZHAO; Weixing ZHENG

    2015-01-01

    The problem of variable selection in system identification of a high dimensional nonlinear non-parametric system is described. The inherent difficulty, the curse of dimensionality, is introduced. Then its connections to various topics and research areas are briefly discussed, including order determination, pattern recognition, data mining, machine learning, statistical regression and manifold embedding. Finally, some results of variable selection in system identification in the recent literature are presented.

  5. Measuring the influence of information networks on transaction costs using a non-parametric regression technique

    DEFF Research Database (Denmark)

    Henningsen, Geraldine; Henningsen, Arne; Henning, Christian H. C. A.

    All business transactions, as well as the achievement of innovations, take up resources, subsumed under the concept of transaction costs (TAC). One of the major factors in TAC theory is information. Information networks can catalyse the interpersonal information exchange and hence increase the access to nonpublic information. Many resources that are sacrificed for TAC are inputs that also enter the technical production process. As most production data do not separate between these two usages of inputs, high transaction costs are unveiled by reduced productivity. A cross-validated local linear non-parametric regression shows that good information networks increase the productivity of farms. A bootstrapping procedure confirms that this result is statistically significant.

  6. A Non Parametric Study of the Volatility of the Economy as a Country Risk Predictor

    CERN Document Server

    Costanzo, Sabatino; Dominguez, Ramses; Moreno, William

    2007-01-01

    This paper intends to explain Venezuela's country spread behavior through a neural network analysis of a monthly general index of economic activity constructed by the Central Bank of Venezuela, a measure of the shocks affecting the country risk of emerging markets, and the U.S. short-term interest rate. The use of non-parametric methods allowed us to find a non-linear relationship between these inputs and the country risk. The networks' performance was evaluated using the method of excess predictability.

  7. t-tests, non-parametric tests, and large studies—a paradox of statistical practice?

    Directory of Open Access Journals (Sweden)

    Fagerland Morten W

    2012-06-01

    Background: During the last 30 years, the median sample size of research studies published in high-impact medical journals has increased manyfold, while the use of non-parametric tests has increased at the expense of t-tests. This paper explores this paradoxical practice and illustrates its consequences. Methods: A simulation study is used to compare the rejection rates of the Wilcoxon-Mann-Whitney (WMW) test and the two-sample t-test for increasing sample size. Samples are drawn from skewed distributions with equal means and medians but with a small difference in spread. A hypothetical case study is used for illustration and motivation. Results: The WMW test produces, on average, smaller p-values than the t-test. This discrepancy increases with increasing sample size, skewness, and difference in spread. For heavily skewed data, the proportion of p... Conclusions: Non-parametric tests are most useful for small studies. Using non-parametric tests in large studies may provide answers to the wrong question, thus confusing readers. For studies with a large sample size, t-tests and their corresponding confidence intervals can and should be used even for heavily skewed data.
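
    The paradox is easy to reproduce. In this sketch (ours, mirroring the paper's setting only loosely) both samples are lognormal with equal population means but unequal spread; the WMW rejection rate climbs with n while the Welch t-test stays near its nominal level:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(6)

        def rejection_rates(n, reps=2000):
            rej_t = rej_w = 0
            for _ in range(reps):
                # Equal population means (lognormal mean = exp(sigma^2 / 2)).
                a = rng.lognormal(0.0, 1.0, n) - np.exp(0.5)
                b = rng.lognormal(0.0, 1.2, n) - np.exp(1.2**2 / 2)
                rej_t += stats.ttest_ind(a, b, equal_var=False).pvalue < 0.05
                rej_w += stats.mannwhitneyu(a, b, alternative="two-sided").pvalue < 0.05
            return rej_t / reps, rej_w / reps

        for n in (25, 100, 1000):
            t_rate, w_rate = rejection_rates(n)
            print(f"n = {n:4d}   t-test: {t_rate:.2f}   WMW: {w_rate:.2f}")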

  8. Non-parametric foreground subtraction for 21cm epoch of reionization experiments

    CERN Document Server

    Harker, Geraint; Bernardi, Gianni; Brentjens, Michiel A; De Bruyn, A G; Ciardi, Benedetta; Jelic, Vibor; Koopmans, Leon V E; Labropoulos, Panagiotis; Mellema, Garrelt; Offringa, Andre; Pandey, V N; Schaye, Joop; Thomas, Rajat M; Yatawatta, Sarod

    2009-01-01

    An obstacle to the detection of redshifted 21cm emission from the epoch of reionization (EoR) is the presence of foregrounds which exceed the cosmological signal in intensity by orders of magnitude. We argue that in principle it would be better to fit the foregrounds non-parametrically - allowing the data to determine their shape - rather than selecting some functional form in advance and then fitting its parameters. Non-parametric fits often suffer from other problems, however. We discuss these before suggesting a non-parametric method, Wp smoothing, which seems to avoid some of them. After outlining the principles of Wp smoothing we describe an algorithm used to implement it. We then apply Wp smoothing to a synthetic data cube for the LOFAR EoR experiment. The performance of Wp smoothing, measured by the extent to which it is able to recover the variance of the cosmological signal and to which it avoids leakage of power from the foregrounds, is compared to that of a parametric fit, and to another non-parame...

  9. Democracy and Leadership: Does Every Vote Count?

    Science.gov (United States)

    Helterbran, Valeri R.

    2008-01-01

    Good judgment and ethics in decision making are important criteria in all areas of teaching and administration. This case study chronicles the actions of a class sponsor whose decisions modify the results of a high school's senior class elections and the fallout that results. The principal must assess the problem, take appropriate action to…

  10. Non-parametric change-point method for differential gene expression detection.

    Directory of Open Access Journals (Sweden)

    Yao Wang

    BACKGROUND: We proposed a non-parametric method, named Non-Parametric Change Point Statistic (NPCPS for short), which uses a single equation for detecting differential gene expression (DGE) in microarray data. NPCPS is based on change point theory to provide effective DGE detecting ability. METHODOLOGY: NPCPS uses the data distribution of the normal samples as input, and detects DGE in the cancer samples by locating the change point of the gene expression profile. An estimate of the change point position generated by NPCPS enables the identification of the samples containing DGE. A Monte Carlo simulation and an ROC study were applied to examine the detection accuracy of NPCPS, and an experiment on real breast cancer microarray data was carried out to compare NPCPS with other methods. CONCLUSIONS: The simulation study indicated that NPCPS was more effective for detecting DGE in the cancer subset than five parametric methods and one non-parametric method. When there were more than 8 cancer samples containing DGE, the type I error of NPCPS was below 0.01. Experiment results showed both good accuracy and reliability of NPCPS. Out of the 30 top genes ranked by NPCPS, 16 genes were reported as relevant to cancer. Correlations between the detection results of NPCPS and those of the compared methods were less than 0.05, while among the other methods the values ranged from 0.20 to 0.84. This indicates that NPCPS works on different features and thus provides DGE identification from a distinct perspective compared with the other mean- or median-based methods.

  11. Two new non-parametric tests to the distance duality relation with galaxy clusters

    CERN Document Server

    Costa, S S; Holanda, R F L

    2015-01-01

    The cosmic distance duality relation is a milestone of cosmology involving the luminosity and angular diameter distances. Any departure from the relation points to new physics or systematic errors in the observations; therefore, tests of the relation are extremely important for building a consistent cosmological framework. Here, two new tests are proposed based on galaxy cluster observations (angular diameter distance and gas mass fraction) and $H(z)$ measurements. By applying Gaussian Processes, a non-parametric method, we are able to derive constraints on departures from the relation, where no evidence of deviation is found in either method, reinforcing the cosmological and astrophysical hypotheses adopted so far.

  12. Non-parametric trend analysis of water quality data of rivers in Kansas

    Science.gov (United States)

    Yu, Y.-S.; Zou, S.; Whittemore, D.

    1993-01-01

    Surface water quality data for 15 sampling stations in the Arkansas, Verdigris, Neosho, and Walnut river basins inside the state of Kansas were analyzed to detect trends (or lack of trends) in 17 major constituents by using four different non-parametric methods. The results show that concentrations of specific conductance, total dissolved solids, calcium, total hardness, sodium, potassium, alkalinity, sulfate, chloride, total phosphorus, ammonia plus organic nitrogen, and suspended sediment generally have downward trends. Some of the downward trends are related to increases in discharge, while others could be caused by decreases in pollution sources. Homogeneity tests show that both station-wide trends and basinwide trends are non-homogeneous. © 1993.
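
    For reference, one of the standard non-parametric trend tools used in studies like this is the Mann–Kendall test paired with Sen's slope; a small sketch follows (our synthetic chloride series, not the Kansas data):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)
        t = np.arange(120)                       # 120 monthly samples
        conc = 80.0 - 0.05 * t + rng.normal(0.0, 4.0, t.size)   # mg/L, mild downtrend

        # Mann-Kendall S: sum of signs over all pairs (assumes no ties here).
        s = np.sign(conc[None, :] - conc[:, None])[np.triu_indices(t.size, 1)].sum()
        n = t.size
        var_s = n * (n - 1) * (2 * n + 5) / 18.0
        z = (s - np.sign(s)) / np.sqrt(var_s)    # continuity correction
        p = 2.0 * stats.norm.sf(abs(z))

        # Sen's (Theil-Sen) slope quantifies the trend magnitude.
        slope = stats.theilslopes(conc, t)[0]
        print(f"MK z = {z:.2f}, p = {p:.4g}, Sen slope = {slope:.3f} mg/L per month")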

  13. Non-parametric analysis of infrared spectra for recognition of glass and glass ceramic fragments in recycling plants.

    Science.gov (United States)

    Farcomeni, Alessio; Serranti, Silvia; Bonifazi, Giuseppe

    2008-01-01

    Glass ceramic detection in glass recycling plants represents a still unsolved problem, as glass ceramic material looks like normal glass and is usually detected only by specialized personnel. The presence of glass-like contaminants inside waste glass products, resulting from both industrial and differentiated urban waste collection, increases production costs and reduces final product quality. In this paper an innovative approach for glass ceramic recognition, based on the non-parametric analysis of infrared spectra, is proposed and investigated. The work specifically addresses the spectral classification of glass and glass ceramic fragments collected in an actual recycling plant from three different production lines: flat glass, colored container-glass and white container-glass. The analyses, carried out in the near- and mid-infrared (NIR-MIR) spectral region (1280-4480 nm), show that glass ceramic and glass fragments can be recognized by applying a wavelet transform, with a small classification error. Moreover, a method for selecting only a small subset of relevant wavelength ratios is suggested, allowing fast recognition of the two classes of materials. The results show how the proposed approach can be utilized to develop a classification engine to be integrated inside a hardware and software sorting architecture for fast "on-line" ceramic glass recognition and separation.

  14. Detecting correlation changes in multivariate time series: A comparison of four non-parametric change point detection methods.

    Science.gov (United States)

    Cabrieto, Jedelyn; Tuerlinckx, Francis; Kuppens, Peter; Grassmann, Mariel; Ceulemans, Eva

    2017-06-01

    Change point detection in multivariate time series is a complex task since, next to the mean, the correlation structure of the monitored variables may also alter when change occurs. DeCon was recently developed to detect such changes in mean and/or correlation by combining a moving windows approach and robust PCA. However, in the literature, several other methods have been proposed that employ other non-parametric tools: E-divisive, Multirank, and KCP. Since these methods use different statistical approaches, two issues need to be tackled. First, applied researchers may find it hard to appraise the differences between the methods. Second, a direct comparison of the relative performance of all these methods for capturing change points signaling correlation changes is still lacking. Therefore, we present the basic principles behind DeCon, E-divisive, Multirank, and KCP and the corresponding algorithms, to make them more accessible to readers. We further compared their performance through extensive simulations using the settings of Bulteel et al. (Biological Psychology, 98 (1), 29-42, 2014), implying changes in mean and in correlation structure, and those of Matteson and James (Journal of the American Statistical Association, 109 (505), 334-345, 2014), implying different numbers of (noise) variables. KCP emerged as the best method in almost all settings. However, in the case of more than two noise variables, only DeCon performed adequately in detecting correlation changes.
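
    To make the task concrete, here is a bare-bones moving-window statistic in the spirit of these methods (our toy, far simpler than DeCon, E-divisive, Multirank, or KCP): the correlations of adjacent windows are compared and the largest jump is taken as the change point:

        import numpy as np

        rng = np.random.default_rng(8)
        n, w = 400, 50                          # series length, window width
        z = rng.normal(size=(n, 2))
        # Correlation jumps from 0 to 0.8 at t = 200; means stay unchanged.
        z[200:, 1] = 0.8 * z[200:, 0] + np.sqrt(1 - 0.8**2) * z[200:, 1]

        def corr(win):
            return np.corrcoef(win, rowvar=False)[0, 1]

        stat = np.array([abs(corr(z[t - w:t]) - corr(z[t:t + w]))
                         for t in range(w, n - w)])
        print("estimated change point:", int(np.argmax(stat)) + w)   # ~200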

  15. A web application for evaluating Phase I methods using a non-parametric optimal benchmark.

    Science.gov (United States)

    Wages, Nolan A; Varhegyi, Nikole

    2017-06-01

    In evaluating the performance of Phase I dose-finding designs, simulation studies are typically conducted to assess how often a method correctly selects the true maximum tolerated dose under a set of assumed dose-toxicity curves. A necessary component of the evaluation process is to have some concept for how well a design can possibly perform. The notion of an upper bound on the accuracy of maximum tolerated dose selection is often omitted from the simulation study, and the aim of this work is to provide researchers with accessible software to quickly evaluate the operating characteristics of Phase I methods using a benchmark. The non-parametric optimal benchmark is a useful theoretical tool for simulations that can serve as an upper limit for the accuracy of maximum tolerated dose identification based on a binary toxicity endpoint. It offers researchers a sense of the plausibility of a Phase I method's operating characteristics in simulation. We have developed an R shiny web application for simulating the benchmark. The web application has the ability to quickly provide simulation results for the benchmark and requires no programming knowledge. The application is free to access and use on any device with an Internet browser. The application provides the percentage of correct selection of the maximum tolerated dose and an accuracy index, operating characteristics typically used in evaluating the accuracy of dose-finding designs. We hope this software will facilitate the use of the non-parametric optimal benchmark as an evaluation tool in dose-finding simulation.
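
    The benchmark itself fits in a dozen lines. In this sketch (ours; the dose-toxicity curve, target, and sample size are arbitrary), each simulated patient carries a latent uniform tolerance, so their outcome is known at every dose ("complete information"), and the dose whose empirical toxicity is closest to the target is selected:

        import numpy as np

        rng = np.random.default_rng(9)
        true_tox = np.array([0.05, 0.12, 0.25, 0.40, 0.55])   # assumed curve
        target, n, sims = 0.25, 24, 10000
        true_mtd = np.argmin(np.abs(true_tox - target))

        correct = 0
        for _ in range(sims):
            u = rng.random(n)                        # latent tolerance per patient
            # Complete information: binary outcome of every patient at every dose.
            phat = (u[:, None] <= true_tox).mean(axis=0)
            correct += np.argmin(np.abs(phat - target)) == true_mtd
        print("benchmark accuracy:", correct / sims)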

  16. Robust non-parametric one-sample tests for the analysis of recurrent events.

    Science.gov (United States)

    Rebora, Paola; Galimberti, Stefania; Valsecchi, Maria Grazia

    2010-12-30

    One-sample non-parametric tests are proposed here for inference on recurring events. The focus is on the marginal mean function of events and the basis for inference is the standardized distance between the observed and the expected number of events under a specified reference rate. Different weights are considered in order to account for various types of alternative hypotheses on the mean function of the recurrent events process. A robust version and a stratified version of the test are also proposed. The performance of these tests was investigated through simulation studies under various underlying event generation processes, such as homogeneous and nonhomogeneous Poisson processes, autoregressive and renewal processes, with and without frailty effects. The robust versions of the test have been shown to be suitable in a wide variety of event generating processes. The motivating context is a study on gene therapy in a very rare immunodeficiency in children, where a major end-point is the recurrence of severe infections. Robust non-parametric one-sample tests for recurrent events can be useful to assess efficacy and especially safety in non-randomized studies or in epidemiological studies for comparison with a standard population.

  17. MEASURING DARK MATTER PROFILES NON-PARAMETRICALLY IN DWARF SPHEROIDALS: AN APPLICATION TO DRACO

    Energy Technology Data Exchange (ETDEWEB)

    Jardel, John R.; Gebhardt, Karl [Department of Astronomy, The University of Texas, 2515 Speedway, Stop C1400, Austin, TX 78712-1205 (United States); Fabricius, Maximilian H.; Williams, Michael J. [Max-Planck Institut fuer extraterrestrische Physik, Giessenbachstrasse, D-85741 Garching bei Muenchen (Germany); Drory, Niv, E-mail: jardel@astro.as.utexas.edu [Instituto de Astronomia, Universidad Nacional Autonoma de Mexico, Avenida Universidad 3000, Ciudad Universitaria, C.P. 04510 Mexico D.F. (Mexico)

    2013-02-15

    We introduce a novel implementation of orbit-based (or Schwarzschild) modeling that allows dark matter density profiles to be calculated non-parametrically in nearby galaxies. Our models require no assumptions to be made about velocity anisotropy or the dark matter profile. The technique can be applied to any dispersion-supported stellar system, and we demonstrate its use by studying the Local Group dwarf spheroidal galaxy (dSph) Draco. We use existing kinematic data at larger radii and also present 12 new radial velocities within the central 13 pc obtained with the VIRUS-W integral field spectrograph on the 2.7 m telescope at McDonald Observatory. Our non-parametric Schwarzschild models find strong evidence that the dark matter profile in Draco is cuspy for 20 ≤ r ≤ 700 pc. The profile for r ≥ 20 pc is well fit by a power law with slope α = -1.0 ± 0.2, consistent with predictions from cold dark matter simulations. Our models confirm that, despite its low baryon content relative to other dSphs, Draco lives in a massive halo.

  18. A Bayesian non-parametric Potts model with application to pre-surgical FMRI data.

    Science.gov (United States)

    Johnson, Timothy D; Liu, Zhuqing; Bartsch, Andreas J; Nichols, Thomas E

    2013-08-01

    The Potts model has enjoyed much success as a prior model for image segmentation. Given the individual classes in the model, the data are typically modeled as Gaussian random variates or as random variates from some other parametric distribution. In this article, we present a non-parametric Potts model and apply it to a functional magnetic resonance imaging study for the pre-surgical assessment of peritumoral brain activation. In our model, we assume that the Z-score image from a patient can be segmented into activated, deactivated, and null classes, or states. Conditional on the class, or state, the Z-scores are assumed to come from some generic distribution which we model non-parametrically using a mixture of Dirichlet process priors within the Bayesian framework. The posterior distribution of the model parameters is estimated with a Markov chain Monte Carlo algorithm, and Bayesian decision theory is used to make the final classifications. Our Potts prior model includes two parameters, the standard spatial regularization parameter and a parameter that can be interpreted as the a priori probability that each voxel belongs to the null, or background, state, conditional on the lack of spatial regularization. We assume that both of these parameters are unknown, and jointly estimate them along with other model parameters. We show through simulation studies that our model performs on par, in terms of posterior expected loss, with parametric Potts models when the parametric model is correctly specified, and outperforms parametric models when the parametric model is misspecified.

  19. When the Single Matters more than the Group (II): Addressing the Problem of High False Positive Rates in Single Case Voxel Based Morphometry Using Non-parametric Statistics.

    Science.gov (United States)

    Scarpazza, Cristina; Nichols, Thomas E; Seramondi, Donato; Maumet, Camille; Sartori, Giuseppe; Mechelli, Andrea

    2016-01-01

    In recent years, an increasing number of studies have used Voxel Based Morphometry (VBM) to compare a single patient with a psychiatric or neurological condition of interest against a group of healthy controls. However, the validity of this approach critically relies on the assumption that the single patient is drawn from a hypothetical population with a normal distribution and variance equal to that of the control group. In a previous investigation, we demonstrated that the family-wise false positive error rate (i.e., the proportion of statistical comparisons yielding at least one false positive) in single case VBM is much higher than expected (Scarpazza et al., 2013). Here, we examine whether the use of non-parametric statistics, which do not rely on the assumptions of normal distribution and equal variance, would enable the investigation of single subjects with good control of the false positive risk. We empirically estimated false positive rates (FPRs) in single case non-parametric VBM by performing 400 statistical comparisons between a single disease-free individual and a group of 100 disease-free controls. The impact of smoothing (4, 8, and 12 mm) and type of pre-processing (modulated, unmodulated) was also examined, as these factors have been found to influence FPRs in previous investigations using parametric statistics. The 400 statistical comparisons were repeated using two independent, freely available data sets in order to maximize the generalizability of the results. We found that the family-wise error rate was 5% for increases and 3.6% for decreases in one data set, and 5.6% for increases and 6.3% for decreases in the other data set (5% nominal). Further, these results were not dependent on the level of smoothing and modulation. Therefore, the present study provides empirical evidence that single case VBM studies with non-parametric statistics are not susceptible to high false positive rates. The critical implication of this finding is that VBM can be used

  20. Spatial Modeling of Rainfall Patterns over the Ebro River Basin Using Multifractality and Non-Parametric Statistical Techniques

    Directory of Open Access Journals (Sweden)

    José L. Valencia

    2015-11-01

    Rainfall, one of the most important climate variables, is commonly studied due to its great heterogeneity, which occasionally causes negative economic, social, and environmental consequences. Modeling the spatial distributions of rainfall patterns over watersheds has become a major challenge for water resources management. Multifractal analysis can be used to reproduce the scale invariance and intermittency of rainfall processes. To identify which factors are the most influential on the variability of multifractal parameters and, consequently, on the spatial distribution of rainfall patterns for different time scales, in this study universal multifractal (UM) analysis—the C1, α, and γs UM parameters—was combined with non-parametric statistical techniques that allow spatial-temporal comparisons of distributions by gradients. The proposed combined approach was applied to a daily rainfall dataset of 132 time series from 1931 to 2009, homogeneously spatially distributed across a 25 km × 25 km grid covering the Ebro River Basin. A homogeneous increase in C1 over the watershed and a decrease in α, mainly in the western regions, were detected, suggesting an increase in the frequency of dry periods at different scales and an increase in the occurrence of rainfall process variability over the last decades.

  1. Inferring the three-dimensional distribution of dust in the Galaxy with a non-parametric method: Preparing for Gaia

    CERN Document Server

    Kh., S Rezaei; Hanson, R J; Fouesneau, M

    2016-01-01

    We present a non-parametric model for inferring the three-dimensional (3D) distribution of dust density in the Milky Way. Our approach uses the extinction measured towards stars at different locations in the Galaxy at approximately known distances. Each extinction measurement is proportional to the integrated dust density along its line-of-sight. Making simple assumptions about the spatial correlation of the dust density, we can infer the most probable 3D distribution of dust across the entire observed region, including along sight lines which were not observed. This is possible because our model employs a Gaussian Process to connect all lines-of-sight. We demonstrate the capability of our model to capture detailed dust density variations using mock data as well as simulated data from the Gaia Universe Model Snapshot. We then apply our method to a sample of giant stars observed by APOGEE and Kepler to construct a 3D dust map over a small region of the Galaxy. Due to our smoothness constraint and its isotropy,...

  2. Cliff´s Delta Calculator: A non-parametric effect size program for two groups of observations

    Directory of Open Access Journals (Sweden)

    Guillermo Macbeth

    2011-05-01

    The Cliff's Delta statistic is an effect size measure that quantifies the amount of difference between two non-parametric variables beyond p-value interpretation. This measure can be understood as a useful complementary analysis for the corresponding hypothesis testing. During the last two decades the use of effect size measures has been strongly encouraged by methodologists and leading institutions of behavioral sciences. The aim of this contribution is to introduce the Cliff's Delta Calculator software, which performs such analysis and offers some interpretation tips. Differences and similarities with the parametric case are analysed and illustrated. The implementation of this free program is fully described and compared with other calculators. Alternative algorithmic approaches are mathematically analysed and a basic linear algebra proof of their equivalence is formally presented. Two worked examples in cognitive psychology are commented on. A visual interpretation of Cliff's Delta is suggested. Availability, installation and applications of the program are presented and discussed.
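
    The statistic itself is simple to compute; a minimal sketch follows (ours, with invented scores), alongside commonly cited magnitude thresholds:

        import numpy as np

        def cliffs_delta(x, y):
            """delta = P(X > Y) - P(X < Y), estimated over all pairs."""
            diff = np.subtract.outer(np.asarray(x, float), np.asarray(y, float))
            return np.sign(diff).mean()

        a = [12, 15, 16, 18, 20, 21]      # e.g. scores of group A
        b = [10, 11, 14, 15, 17, 18]      # e.g. scores of group B
        d = cliffs_delta(a, b)
        # Commonly cited thresholds: |d| < 0.147 negligible, < 0.33 small,
        # < 0.474 medium, otherwise large.
        print(f"Cliff's delta = {d:.3f}")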

  3. Non-parametric star formation histories for 5 dwarf spheroidal galaxies of the local group

    CERN Document Server

    Hernández, X; Valls-Gabaud, D; Gilmore, Gerard; Valls-Gabaud, David

    2000-01-01

    We use recent HST colour-magnitude diagrams of the resolved stellar populations of a sample of local dSph galaxies (Carina, LeoI, LeoII, Ursa Minor and Draco) to infer the star formation histories of these systems, $SFR(t)$. Applying a new variational calculus maximum likelihood method which includes a full Bayesian analysis and allows a non-parametric estimate of the function one is solving for, we infer the star formation histories of the systems studied. This method has the advantage of yielding an objective answer, as one need not assume {\\it a priori} the form of the function one is trying to recover. The results are checked independently using Saha's $W$ statistic. The total luminosities of the systems are used to normalize the results into physical units and derive SN type II rates. We derive the luminosity weighted mean star formation history of this sample of galaxies.

  4. Non-parametric co-clustering of large scale sparse bipartite networks on the GPU

    DEFF Research Database (Denmark)

    Hansen, Toke Jansen; Mørup, Morten; Hansen, Lars Kai

    2011-01-01

    Co-clustering is a problem of both theoretical and practical importance, e.g., market basket analysis and collaborative filtering, and in web scale text processing. We state the co-clustering problem in terms of non-parametric generative models which can address the issue of estimating the number of row and column clusters from a hypothesis space of an infinite number of clusters. To reach large scale applications of co-clustering we exploit that parameter inference for co-clustering is well suited for parallel computing. We develop a generic GPU framework for efficient inference on large scale... real-life large scale collaborative filtering data and web scale text corpora, demonstrating that latent mesoscale structures extracted by the co-clustering problem as formulated by the Infinite Relational Model (IRM) are consistent across consecutive runs with different initializations and also relevant

  5. A non-parametric method for correction of global radiation observations

    DEFF Research Database (Denmark)

    Bacher, Peder; Madsen, Henrik; Perers, Bengt;

    2013-01-01

    This paper presents a method for correction and alignment of global radiation observations based on information obtained from calculated global radiation; in the present study a one-hour forecast of global radiation from a numerical weather prediction (NWP) model is used. Systematic errors detected...... in the observations are corrected. These are errors such as: tilt in the leveling of the sensor, shadowing from surrounding objects, clipping and saturation in the signal processing, and errors from dirt and wear. The method is based on a statistical non-parametric clear-sky model which is applied to both...... University. The method can be useful for optimized use of solar radiation observations for forecasting, monitoring, and modeling of energy production and load which are affected by solar radiation.

  6. Non-parametric Reconstruction of Cluster Mass Distribution from Strong Lensing Modelling Abell 370

    CERN Document Server

    Abdel-Salam, H M; Williams, L L R

    1997-01-01

    We describe a new non-parametric technique for reconstructing the mass distribution in galaxy clusters with strong lensing, i.e., from multiple images of background galaxies. The observed positions and redshifts of the images are considered as rigid constraints and through the lens (ray-trace) equation they provide us with linear constraint equations. These constraints confine the mass distribution to some allowed region, which is then found by linear programming. Within this allowed region we study in detail the mass distribution with minimum mass-to-light variation; also some others, such as the smoothest mass distribution. The method is applied to the extensively studied cluster Abell 370, which hosts a giant luminous arc and several other multiply imaged background galaxies. Our mass maps are constrained by the observed positions and redshifts (spectroscopic or model-inferred by previous authors) of the giant arc and multiple image systems. The reconstructed maps obtained for A370 reveal a detailed mass d...

  7. Measuring the influence of information networks on transaction costs using a non-parametric regression technique

    DEFF Research Database (Denmark)

    Henningsen, Geraldine; Henningsen, Arne; Henning, Christian H. C. A.

    All business transactions, as well as efforts to achieve innovations, take up resources, subsumed under the concept of transaction costs (TAC). One of the major factors in TAC theory is information. Information networks can catalyse the interpersonal information exchange and hence, increase the access...... to nonpublic information. Our analysis shows that information networks have an impact on the level of TAC. Many resources that are sacrificed for TAC are inputs that also enter the technical production process. As most production data do not distinguish between these two usages of inputs, high transaction costs...... are revealed by reduced productivity. A cross-validated local linear non-parametric regression shows that good information networks increase the productivity of farms. A bootstrapping procedure confirms that this result is statistically significant....
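
    A minimal one-regressor sketch of the kind of cross-validated local linear kernel regression named above (the study itself uses several inputs; the data, bandwidth grid, and the tiny ridge term are my own choices):

    ```python
    import numpy as np

    def local_linear(x0, x, y, h):
        """Local linear fit at x0 with a Gaussian kernel of bandwidth h."""
        w = np.exp(-0.5 * ((x - x0) / h) ** 2)
        X = np.column_stack([np.ones_like(x), x - x0])
        A = X.T @ (w[:, None] * X) + 1e-8 * np.eye(2)   # ridge for stability
        beta = np.linalg.solve(A, X.T @ (w * y))
        return beta[0]                                   # fitted value at x0

    def loo_cv(x, y, h):
        """Leave-one-out cross-validation score for bandwidth h."""
        errs = [y[i] - local_linear(x[i], np.delete(x, i), np.delete(y, i), h)
                for i in range(len(x))]
        return np.mean(np.square(errs))

    rng = np.random.default_rng(1)
    x = np.sort(rng.uniform(0, 3, 80))
    y = np.sin(2 * x) + rng.normal(0, 0.2, 80)

    h_best = min([0.05, 0.1, 0.2, 0.4, 0.8], key=lambda h: loo_cv(x, y, h))
    print("chosen bandwidth:", h_best)
    print("fit at x=1.5:", round(local_linear(1.5, x, y, h_best), 3),
          "(truth:", round(np.sin(3.0), 3), ")")
    ```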

  8. Depth Transfer: Depth Extraction from Video Using Non-Parametric Sampling.

    Science.gov (United States)

    Karsch, Kevin; Liu, Ce; Kang, Sing Bing

    2014-11-01

    We describe a technique that automatically generates plausible depth maps from videos using non-parametric depth sampling. We demonstrate our technique in cases where past methods fail (non-translating cameras and dynamic scenes). Our technique is applicable to single images as well as videos. For videos, we use local motion cues to improve the inferred depth maps, while optical flow is used to ensure temporal depth consistency. For training and evaluation, we use a Kinect-based system to collect a large data set containing stereoscopic videos with known depths. We show that our depth estimation technique outperforms the state-of-the-art on benchmark databases. Our technique can be used to automatically convert a monoscopic video into stereo for 3D visualization, and we demonstrate this through a variety of visually pleasing results for indoor and outdoor scenes, including results from the feature film Charade.

  9. The application of non-parametric statistical techniques to an ALARA programme.

    Science.gov (United States)

    Moon, J H; Cho, Y H; Kang, C S

    2001-01-01

    For the cost-effective reduction of occupational radiation dose (ORD) at nuclear power plants, it is necessary to identify the maintenance and repair processes that repeatedly incur high ORD. To identify these processes, point values such as the mean and median are generally used, but they sometimes lead to misjudgment since they cannot show other important characteristics such as dose distributions and frequencies of radiation jobs. As an alternative, a non-parametric analysis method is proposed, which effectively identifies the processes of repetitive high ORD. As a case study, the method is applied to ORD data from maintenance and repair processes at Kori Units 3 and 4, pressurised water reactors in Korea with 950 MWe capacity each, operating since 1986 and 1987 respectively; the method is demonstrated to be an efficient way of analysing the data.

  10. Comparison of non-parametric methods for ungrouping coarsely aggregated data

    DEFF Research Database (Denmark)

    Rizzi, Silvia; Thinggaard, Mikael; Engholm, Gerda

    2016-01-01

    Background Histograms are a common tool to estimate densities non-parametrically. They are extensively encountered in health sciences to summarize data in a compact format. Examples are age-specific distributions of death or onset of diseases grouped in 5-years age classes with an open-ended age...... methods for ungrouping count data. We compare the performance of two spline interpolation methods, two kernel density estimators and a penalized composite link model first via a simulation study and then with empirical data obtained from the NORDCAN Database. All methods analyzed can be used to estimate...... composite link model performs the best. Conclusion We give an overview and test different methods to estimate detailed distributions from grouped count data. Health researchers can benefit from these versatile methods, which are ready for use in the statistical software R. We recommend using the penalized...

  11. Developing two non-parametric performance models for higher learning institutions

    Science.gov (United States)

    Kasim, Maznah Mat; Kashim, Rosmaini; Rahim, Rahela Abdul; Khan, Sahubar Ali Muhamed Nadhar

    2016-08-01

    Measuring the performance of higher learning institutions (HLIs) is essential if these institutions are to improve their excellence. This paper focuses on the formulation of two performance models, an efficiency and an effectiveness model, by utilizing a non-parametric method, Data Envelopment Analysis (DEA). The proposed models are validated by measuring the performance of 16 public universities in Malaysia for the year 2008. However, since data for one of the variables were unavailable, an estimate was used as a proxy to represent the real data. The results show that the average efficiency and effectiveness scores were 0.817 and 0.900 respectively, while six universities were fully efficient and eight universities were fully effective. A total of six universities were both efficient and effective. It is suggested that the two proposed performance models would work as complementary methods to the existing performance appraisal method or as alternative methods in monitoring the performance of HLIs, especially in Malaysia.
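
    A minimal sketch of the DEA building block (an input-oriented, constant-returns-to-scale efficiency score solved as a linear program); the study's actual variables and orientation are not given here, so the data below are illustrative:

    ```python
    import numpy as np
    from scipy.optimize import linprog

    def dea_efficiency(X, Y, o):
        """CCR efficiency of unit o. X: (m inputs, n units), Y: (s outputs, n units)."""
        m, n = X.shape
        s = Y.shape[0]
        c = np.zeros(1 + n)                 # decision vars: [theta, lambda_1..n]
        c[0] = 1.0                          # minimise theta
        # Inputs:  X @ lam <= theta * X[:, o]
        A_in = np.hstack([-X[:, [o]], X])
        # Outputs: Y @ lam >= Y[:, o]  ->  -Y @ lam <= -Y[:, o]
        A_out = np.hstack([np.zeros((s, 1)), -Y])
        res = linprog(c,
                      A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.concatenate([np.zeros(m), -Y[:, o]]),
                      bounds=[(0, None)] * (1 + n))
        return res.fun

    # Illustrative: 2 inputs (staff, budget), 1 output (graduates), 5 units.
    X = np.array([[20., 30., 40., 20., 10.],
                  [50., 40., 60., 70., 30.]])
    Y = np.array([[100., 120., 150., 90., 60.]])
    for o in range(X.shape[1]):
        print(f"unit {o}: efficiency = {dea_efficiency(X, Y, o):.3f}")
    ```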

  12. Non-parametric method for separating domestic hot water heating spikes and space heating

    DEFF Research Database (Denmark)

    Bacher, Peder; de Saint-Aubain, Philip Anton; Christiansen, Lasse Engbo;

    2016-01-01

    In this paper a method for separating spikes from a noisy data series, where the data change and evolve over time, is presented. The method is applied to measurements of the total heat load for a single family house. It relies on the fact that the domestic hot water heating is a process generating...... short-lived spikes in the time series, while the space heating changes in slower patterns during the day dependent on the climate and user behavior. The challenge is to separate the domestic hot water heating spikes from the space heating without affecting the natural noise in the space heating...... measurements. The assumption behind the developed method is that the space heating can be estimated by a non-parametric kernel smoother, such that every value significantly above this kernel smoother estimate is identified as a domestic hot water heating spike. First, it is shown how a basic kernel smoothing...
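
    A minimal sketch of the separation idea (the kernel width and the robust threshold are my own choices, not the paper's tuning): smooth the total load with a Gaussian kernel and flag points far above the smooth estimate as hot-water spikes.

    ```python
    import numpy as np

    def kernel_smooth(t, y, h):
        """Nadaraya-Watson smoother with a Gaussian kernel of bandwidth h."""
        W = np.exp(-0.5 * ((t[:, None] - t[None, :]) / h) ** 2)
        return (W @ y) / W.sum(axis=1)

    rng = np.random.default_rng(2)
    t = np.arange(0.0, 48.0, 0.25)                  # two days of 15-min readings
    space = 2.0 + np.sin(2 * np.pi * t / 24)        # slow space-heating pattern
    y = space + rng.normal(0, 0.1, t.size)
    y[rng.choice(t.size, 8, replace=False)] += 3.0  # short hot-water spikes

    smooth = kernel_smooth(t, y, h=1.5)
    resid = y - smooth
    sigma = np.median(np.abs(resid)) / 0.6745       # robust noise estimate
    is_spike = resid > 3 * sigma
    print("flagged spikes:", int(is_spike.sum()))   # expect about 8
    ```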

  13. LICORS: Light Cone Reconstruction of States for Non-parametric Forecasting of Spatio-Temporal Systems

    CERN Document Server

    Goerg, Georg M

    2012-01-01

    We present a new, non-parametric forecasting method for data where continuous values are observed discretely in space and time. Our method, "light-cone reconstruction of states" (LICORS), uses physical principles to identify predictive states which are local properties of the system, both in space and time. LICORS discovers the number of predictive states and their predictive distributions automatically, and consistently, under mild assumptions on the data source. We provide an algorithm to implement our method, along with a cross-validation scheme to pick control settings. Simulations show that CV-tuned LICORS outperforms standard methods in forecasting challenging spatio-temporal dynamics. Our work provides applied researchers with a new, highly automatic method to analyze and forecast spatio-temporal data.

  14. Non-parametric PSF estimation from celestial transit solar images using blind deconvolution

    Science.gov (United States)

    González, Adriana; Delouille, Véronique; Jacques, Laurent

    2016-01-01

    Context: Characterization of instrumental effects in astronomical imaging is important in order to extract accurate physical information from the observations. The measured image in a real optical instrument is usually represented by the convolution of an ideal image with a Point Spread Function (PSF). Additionally, the image acquisition process is also contaminated by other sources of noise (read-out, photon-counting). The problem of estimating both the PSF and a denoised image is called blind deconvolution and is ill-posed. Aims: We propose a blind deconvolution scheme that relies on image regularization. Contrary to most methods presented in the literature, our method does not assume a parametric model of the PSF and can thus be applied to any telescope. Methods: Our scheme uses a wavelet analysis prior model on the image and weak assumptions on the PSF. We use observations from a celestial transit, where the occulting body can be assumed to be a black disk. These constraints allow us to retain meaningful solutions for the filter and the image, eliminating trivial, translated, and interchanged solutions. Under an additive Gaussian noise assumption, they also enforce noise canceling and avoid reconstruction artifacts by promoting the whiteness of the residual between the blurred observations and the cleaned data. Results: Our method is applied to synthetic and experimental data. The PSF is estimated for the SECCHI/EUVI instrument using the 2007 Lunar transit, and for SDO/AIA using the 2012 Venus transit. Results show that the proposed non-parametric blind deconvolution method is able to estimate the core of the PSF with a similar quality to parametric methods proposed in the literature. We also show that, if these parametric estimations are incorporated in the acquisition model, the resulting PSF outperforms both the parametric and non-parametric methods.

  15. Non-parametric PSF estimation from celestial transit solar images using blind deconvolution

    Directory of Open Access Journals (Sweden)

    González Adriana

    2016-01-01

    Full Text Available Context: Characterization of instrumental effects in astronomical imaging is important in order to extract accurate physical information from the observations. The measured image in a real optical instrument is usually represented by the convolution of an ideal image with a Point Spread Function (PSF). Additionally, the image acquisition process is also contaminated by other sources of noise (read-out, photon-counting). The problem of estimating both the PSF and a denoised image is called blind deconvolution and is ill-posed. Aims: We propose a blind deconvolution scheme that relies on image regularization. Contrary to most methods presented in the literature, our method does not assume a parametric model of the PSF and can thus be applied to any telescope. Methods: Our scheme uses a wavelet analysis prior model on the image and weak assumptions on the PSF. We use observations from a celestial transit, where the occulting body can be assumed to be a black disk. These constraints allow us to retain meaningful solutions for the filter and the image, eliminating trivial, translated, and interchanged solutions. Under an additive Gaussian noise assumption, they also enforce noise canceling and avoid reconstruction artifacts by promoting the whiteness of the residual between the blurred observations and the cleaned data. Results: Our method is applied to synthetic and experimental data. The PSF is estimated for the SECCHI/EUVI instrument using the 2007 Lunar transit, and for SDO/AIA using the 2012 Venus transit. Results show that the proposed non-parametric blind deconvolution method is able to estimate the core of the PSF with a similar quality to parametric methods proposed in the literature. We also show that, if these parametric estimations are incorporated in the acquisition model, the resulting PSF outperforms both the parametric and non-parametric methods.

  16. Semi-automatic liver tumor segmentation with hidden Markov measure field model and non-parametric distribution estimation.

    Science.gov (United States)

    Häme, Yrjö; Pollari, Mika

    2012-01-01

    A novel liver tumor segmentation method for CT images is presented. The aim of this work was to reduce the manual labor and time required in the treatment planning of radiofrequency ablation (RFA) by reliably providing accurate and automated tumor segmentations. The developed method is semi-automatic, requiring only minimal user interaction. The segmentation is based on non-parametric intensity distribution estimation and a hidden Markov measure field model, with application of a spherical shape prior. A post-processing operation is also presented to remove the overflow to adjacent tissue. In addition to the conventional approach of using a single image as input data, an approach using images from multiple contrast phases was developed. The accuracy of the method was validated with two sets of patient data, and artificially generated samples. The patient data included preoperative RFA images and a public data set from "3D Liver Tumor Segmentation Challenge 2008". The method achieved very high accuracy with the RFA data, and outperformed other methods evaluated with the public data set, receiving an average overlap error of 30.3%, which represents an improvement of 2.3 percentage points over the previously best performing semi-automatic method. The average volume difference was 23.5%, and the average, the RMS, and the maximum surface distance errors were 1.87, 2.43, and 8.09 mm, respectively. The method produced good results even for tumors with very low contrast and ambiguous borders, and the performance remained high with noisy image data.

  17. Further Empirical Results on Parametric Versus Non-Parametric IRT Modeling of Likert-Type Personality Data

    Science.gov (United States)

    Maydeu-Olivares, Albert

    2005-01-01

    Chernyshenko, Stark, Chan, Drasgow, and Williams (2001) investigated the fit of Samejima's logistic graded model and Levine's non-parametric MFS model to the scales of two personality questionnaires and found that the graded model did not fit well. We attribute the poor fit of the graded model to small amounts of multidimensionality present in…

  18. Wind speed forecasting at different time scales: a non parametric approach

    CERN Document Server

    D'Amico, Guglielmo; Prattico, Flavio

    2013-01-01

    The prediction of wind speed is one of the most important aspects when dealing with renewable energy. In this paper we present a new non-parametric model, based on semi-Markov chains, to predict wind speed. In particular, we use an indexed semi-Markov model, which accurately reproduces the statistical behavior of wind speed, to forecast wind speed one step ahead at different time scales and over very long time horizons while maintaining the goodness of prediction. In order to check the main features of the model we report, as an indicator of goodness, the root mean square error between real and predicted data, and we compare our forecasting results with those of a persistence model.

  19. Comparing and optimizing land use classification in a Himalayan area using parametric and non parametric approaches

    NARCIS (Netherlands)

    Sterk, G.; Sameer Saran,; Raju, P.L.N.; Amit, Bharti

    2007-01-01

    Supervised classification is one of the important tasks in remote sensing image interpretation, in which the image pixels are classified to various predefined land use/land cover classes based on the spectral reflectance values in different bands. In reality some classes may have very close spectral reflectance...

  20. Performance in European Higher Education: A Non-Parametric Production Frontier Approach

    Science.gov (United States)

    Joumady, Othman; Ris, Catherine

    2005-01-01

    This study examines technical efficiency in European higher education (HE) institutions. To measure efficiency, we consider the capacity of each HE institution, on one hand, to provide competencies to graduates and, on the other hand, to match competencies provided during education to competencies required in the job. We use a large sample of…

  1. A multitemporal and non-parametric approach for assessing the impacts of drought on vegetation greenness

    DEFF Research Database (Denmark)

    Carrao, Hugo; Sepulcre, Guadalupe; Horion, Stéphanie Marie Anne F

    2013-01-01

    This study evaluates the relationship between the frequency and duration of meteorological droughts and the subsequent temporal changes on the quantity of actively photosynthesizing biomass (greenness) estimated from satellite imagery on rainfed croplands in Latin America. An innovative non...... Full Data Reanalysis precipitation time-series product, which ranges from January 1901 to December 2010 and is interpolated at the spatial resolution of 1° (decimal degree, DD). Vegetation greenness composites are derived from 10-daily SPOT-VEGETATION images at the spatial resolution of 1/112° DD...

  2. A non-parametric statistical test to compare clusters with applications in functional magnetic resonance imaging data.

    Science.gov (United States)

    Fujita, André; Takahashi, Daniel Y; Patriota, Alexandre G; Sato, João R

    2014-12-10

    Statistical inference of functional magnetic resonance imaging (fMRI) data is an important tool in neuroscience investigation. One major hypothesis in neuroscience is that the presence or absence of a psychiatric disorder can be explained by differences in how neurons cluster in the brain. Therefore, it is of interest to verify whether the properties of the clusters change between groups of patients and controls. The usual method to show group differences in brain imaging is to carry out a voxel-wise univariate analysis for a difference between the mean group responses using an appropriate test and to assemble the resulting 'significantly different voxels' into clusters, testing again at cluster level. In this approach, of course, the primary voxel-level test is blind to any cluster structure. Direct assessments of differences between groups at the cluster level seem to be missing in brain imaging. For this reason, we introduce a novel non-parametric statistical test called analysis of cluster structure variability (ANOCVA), which statistically tests whether two or more populations are equally clustered. The proposed method allows us to compare the clustering structure of multiple groups simultaneously and also to identify features that contribute to the differential clustering. We illustrate the performance of ANOCVA through simulations and an application to an fMRI dataset composed of children with attention deficit hyperactivity disorder (ADHD) and controls. Results show that there are several differences in the clustering structure of the brain between the two groups. Furthermore, we identify some brain regions previously not described to be involved in the ADHD pathophysiology, generating new hypotheses to be tested. The proposed method is general enough to be applied to other types of datasets, not limited to fMRI, where comparison of clustering structures is of interest. Copyright © 2014 John Wiley & Sons, Ltd.

  3. Comparative study of species sensitivity distributions based on non-parametric kernel density estimation for some transition metals.

    Science.gov (United States)

    Wang, Ying; Feng, Chenglian; Liu, Yuedan; Zhao, Yujie; Li, Huixian; Zhao, Tianhui; Guo, Wenjing

    2017-02-01

    Transition metals in the fourth period of the periodic table of the elements are widespread in aquatic environments. They often occur at concentrations sufficient to cause adverse effects on aquatic life and human health. Generally, parametric models are mostly used to construct species sensitivity distributions (SSDs), which means that comparisons of water quality criteria (WQC) for elements in the same period or group of the periodic table might be inaccurate and the results biased. To address this inadequacy, non-parametric kernel density estimation (NPKDE), with its optimal bandwidths and testing methods, was developed for establishing SSDs. The NPKDE fitted better, was more robust, and predicted better than conventional normal and logistic parametric density estimations for constructing SSDs and deriving acute HC5 and WQC for transition metals in the fourth period of the periodic table. The decreasing sequence of HC5 values for the transition metals in the fourth period was Ti > Mn > V > Ni > Zn > Cu > Fe > Co > Cr(VI), which is not proportional to atomic number in the periodic table, and the relatively sensitive species differed between metals. The results indicated that factors other than physical and chemical properties affect the toxicity mechanisms of transition metals. The proposed method enriched the methodological foundation for WQC. Meanwhile, it also provided a relatively innovative, accurate approach for WQC derivation and risk assessment of same-group and same-period metals in aquatic environments, to support the protection of aquatic organisms.
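
    A minimal sketch of the kernel-density SSD step (the paper's optimal-bandwidth selection and testing machinery are omitted; the data and the default Scott-rule bandwidth are illustrative): fit a KDE to log-transformed toxicity values and read HC5 off the 5th percentile.

    ```python
    import numpy as np
    from scipy.stats import gaussian_kde

    # Hypothetical acute endpoints for one metal (ug/L), one value per species.
    tox = np.array([12., 35., 58., 90., 150., 230., 410., 780., 1500., 3000.])
    log_tox = np.log10(tox)

    kde = gaussian_kde(log_tox)                    # bandwidth: Scott's rule
    grid = np.linspace(log_tox.min() - 1, log_tox.max() + 1, 2000)
    cdf = np.cumsum(kde(grid))
    cdf /= cdf[-1]                                 # numerical CDF of the SSD

    hc5 = 10 ** grid[np.searchsorted(cdf, 0.05)]   # conc. protecting 95% of species
    print(f"HC5 = {hc5:.1f} ug/L")
    ```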

  4. Trend Analysis of Golestan's Rivers Discharges Using Parametric and Non-parametric Methods

    Science.gov (United States)

    Mosaedi, Abolfazl; Kouhestani, Nasrin

    2010-05-01

    Climate change and its consequences constitute one of the major problems for human life, and climate change will alter river discharges. The aim of this research is to analyse trends in the seasonal and annual river discharges of Golestan province (Iran). Four trend analysis methods, including conjunction point, linear regression, Wald-Wolfowitz and Mann-Kendall, were applied to river discharges over seasonal and annual periods at significance levels of 95% and 99%. First, daily discharge data from 12 hydrometric stations covering 42 years (1965-2007) were selected; after some common statistical tests, such as homogeneity tests (G-B and M-W), the four trend analysis tests were applied. Results show that at all stations the summer time series exhibit decreasing trends at the 99% significance level according to the Mann-Kendall (M-K) test. For the autumn time series, all four methods give similar results. For the other periods, the results of the four tests were broadly similar, although for some stations they differed. Keywords: Trend Analysis, Discharge, Non-parametric methods, Wald-Wolfowitz, The Mann-Kendall test, Golestan Province.
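
    For reference, a minimal sketch of the Mann-Kendall test, one of the four methods named above (no tie or autocorrelation corrections):

    ```python
    import numpy as np
    from scipy.stats import norm

    def mann_kendall(x):
        """Return (S, Z, two-sided p) for a monotonic trend; no tie correction."""
        x = np.asarray(x, dtype=float)
        n = x.size
        d = np.sign(x[None, :] - x[:, None])       # d[i, j] = sign(x_j - x_i)
        s = np.triu(d, k=1).sum()                  # sum over all pairs i < j
        var_s = n * (n - 1) * (2 * n + 5) / 18.0
        z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
        return s, z, 2 * (1 - norm.cdf(abs(z)))

    rng = np.random.default_rng(3)
    flow = 100 - 0.8 * np.arange(40) + rng.normal(0, 5, 40)  # declining series
    s, z, p = mann_kendall(flow)
    print(f"S = {s:.0f}, Z = {z:.2f}, p = {p:.4f}")
    ```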

  5. Non-parametric determination of H and He IS fluxes from cosmic-ray data

    CERN Document Server

    Ghelfi, A; Derome, L; Maurin, D

    2015-01-01

    Top-of-atmosphere (TOA) cosmic-ray (CR) fluxes from satellites and balloon-borne experiments are snapshots of the solar activity imprinted on the interstellar (IS) fluxes. Given a series of snapshots, the unknown IS flux shape and the level of modulation (for each snapshot) can be recovered. We wish (i) to provide the most accurate determination of the IS H and He fluxes from TOA data only, (ii) to obtain the associated modulation levels (and uncertainties) fully accounting for the correlations with the IS flux uncertainties, and (iii) to inspect whether the minimal Force-Field approximation is sufficient to explain all the data at hand. Using H and He TOA measurements, including the recent high precision AMS, BESS-Polar and PAMELA data, we perform a non-parametric fit of the IS fluxes $J^{\\rm IS}_{\\rm H,~He}$ and modulation level $\\phi_i$ for each data taking period. We rely on a Markov Chain Monte Carlo (MCMC) engine to extract the PDF and correlations (hence the credible intervals) of the sought parameters...

  6. Comparison of non-parametric methods for ungrouping coarsely aggregated data

    Directory of Open Access Journals (Sweden)

    Silvia Rizzi

    2016-05-01

    Full Text Available Abstract Background Histograms are a common tool to estimate densities non-parametrically. They are extensively encountered in health sciences to summarize data in a compact format. Examples are age-specific distributions of death or onset of diseases grouped in 5-years age classes with an open-ended age group at the highest ages. When histogram intervals are too coarse, information is lost and comparison between histograms with different boundaries is arduous. In these cases it is useful to estimate detailed distributions from grouped data. Methods From an extensive literature search we identify five methods for ungrouping count data. We compare the performance of two spline interpolation methods, two kernel density estimators and a penalized composite link model first via a simulation study and then with empirical data obtained from the NORDCAN Database. All methods analyzed can be used to estimate differently shaped distributions; can handle unequal interval length; and allow stretches of 0 counts. Results The methods show similar performance when the grouping scheme is relatively narrow, i.e. 5-years age classes. With coarser age intervals, i.e. in the presence of open-ended age groups, the penalized composite link model performs the best. Conclusion We give an overview and test different methods to estimate detailed distributions from grouped count data. Health researchers can benefit from these versatile methods, which are ready for use in the statistical software R. We recommend using the penalized composite link model when data are grouped in wide age classes.
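
    A minimal sketch of the simplest family compared above (spline interpolation; the recommended penalized composite link model needs more machinery): interpolate the cumulative counts at the group boundaries with a monotone spline, then difference to get single-year counts. The data below are illustrative.

    ```python
    import numpy as np
    from scipy.interpolate import PchipInterpolator

    edges = np.array([0, 5, 10, 15, 20, 25, 30])          # 5-year age classes
    counts = np.array([30., 80., 140., 170., 120., 60.])  # events per class

    cum = np.concatenate([[0.0], np.cumsum(counts)])      # cumulative at edges
    spline = PchipInterpolator(edges, cum)                # monotone spline

    ages = np.arange(0, 31)
    fine = np.diff(spline(ages))                          # single-year counts
    print(fine.round(1))
    print("total preserved:", np.isclose(fine.sum(), counts.sum()))
    ```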

  7. Non-parametric method for measuring gas inhomogeneities from X-ray observations of galaxy clusters

    CERN Document Server

    Morandi, Andrea; Cui, Wei

    2013-01-01

    We present a non-parametric method to measure inhomogeneities in the intracluster medium (ICM) from X-ray observations of galaxy clusters. Analyzing mock Chandra X-ray observations of simulated clusters, we show that our new method enables the accurate recovery of the 3D gas density and gas clumping factor profiles out to large radii of galaxy clusters. We then apply this method to Chandra X-ray observations of Abell 1835 and present the first determination of the gas clumping factor from the X-ray cluster data. We find that the gas clumping factor in Abell 1835 increases with radius and reaches ~2-3 at r=R_{200}. This is in good agreement with the predictions of hydrodynamical simulations, but it is significantly below the values inferred from recent Suzaku observations. We further show that the radially increasing gas clumping factor causes flattening of the derived entropy profile of the ICM and affects physical interpretation of the cluster gas structure, especially at the large cluster-centric radii. Our...

  8. Non-parametric three-way mixed ANOVA with aligned rank tests.

    Science.gov (United States)

    Oliver-Rodríguez, Juan C; Wang, X T

    2015-02-01

    Research problems that require a non-parametric analysis of multifactor designs with repeated measures arise in the behavioural sciences. There is, however, a lack of available procedures in commonly used statistical packages. In the present study, a generalization of the aligned rank test for the two-way interaction is proposed for the analysis of the typical sources of variation in a three-way analysis of variance (ANOVA) with repeated measures. It can be implemented in the usual statistical packages. Its statistical properties are tested by using simulation methods with two sample sizes (n = 30 and n = 10) and three distributions (normal, exponential and double exponential). Results indicate substantial increases in power for non-normal distributions in comparison with the usual parametric tests. Similar levels of Type I error for both parametric and aligned rank ANOVA were obtained with non-normal distributions and large sample sizes. Degrees-of-freedom adjustments for Type I error control in small samples are proposed. The procedure is applied to a case study with 30 participants per group where it detects gender differences in linguistic abilities in blind children not shown previously by other methods.
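
    A minimal sketch of the underlying aligned-rank idea for a balanced two-way between-subjects interaction (the record's three-way mixed extension adds further alignment steps): strip the main effects, rank the aligned values, then run an ordinary ANOVA on the ranks.

    ```python
    import numpy as np
    from scipy.stats import rankdata, f

    rng = np.random.default_rng(4)
    a, b, n = 2, 3, 10                         # factor levels and cell size
    y = rng.exponential(1.0, size=(a, b, n))   # skewed, non-normal responses
    y[1, 2] += 1.0                             # inject an interaction effect

    # Align: remove both main effects so only interaction (+ error) remains.
    row = y.mean(axis=(1, 2), keepdims=True)
    col = y.mean(axis=(0, 2), keepdims=True)
    aligned = y - row - col + y.mean()

    # Rank the aligned data and compute the interaction F on the ranks.
    r = rankdata(aligned).reshape(y.shape)
    cell = r.mean(axis=2, keepdims=True)
    rrow = r.mean(axis=(1, 2), keepdims=True)
    rcol = r.mean(axis=(0, 2), keepdims=True)
    ss_int = n * np.sum((cell - rrow - rcol + r.mean()) ** 2)
    ss_err = np.sum((r - cell) ** 2)
    df_int, df_err = (a - 1) * (b - 1), a * b * (n - 1)
    F = (ss_int / df_int) / (ss_err / df_err)
    print("interaction F =", round(F, 2),
          " p =", round(1 - f.cdf(F, df_int, df_err), 4))
    ```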

  9. Modeling the World Health Organization Disability Assessment Schedule II using non-parametric item response models.

    Science.gov (United States)

    Galindo-Garre, Francisca; Hidalgo, María Dolores; Guilera, Georgina; Pino, Oscar; Rojo, J Emilio; Gómez-Benito, Juana

    2015-03-01

    The World Health Organization Disability Assessment Schedule II (WHO-DAS II) is a multidimensional instrument developed for measuring disability. It comprises six domains (understanding and communicating, getting around, self-care, getting along with others, life activities and participation in society). The main purpose of this paper is the evaluation of the psychometric properties of each domain of the WHO-DAS II with parametric and non-parametric Item Response Theory (IRT) models. A secondary objective is to assess whether the WHO-DAS II items within each domain form a hierarchy of invariantly ordered severity indicators of disability. A sample of 352 patients with a schizophrenia spectrum disorder is used in this study. The 36-item WHO-DAS II was administered during the consultation. Partial Credit and Mokken scale models are used to study the psychometric properties of the questionnaire. The psychometric properties of the WHO-DAS II scale are satisfactory for all the domains. However, we identify a few items that do not discriminate satisfactorily between different levels of disability and cannot be invariantly ordered in the scale. In conclusion, the WHO-DAS II can be used to assess overall disability in patients with schizophrenia, but some domains are too general to assess functionality in these patients because they contain items that are not applicable to this pathology.

  10. Non-parametric reconstruction of the galaxy-lens in PG1115+080

    CERN Document Server

    Saha, P; Saha, Prasenjit; Williams, Liliya L. R.

    1997-01-01

    We describe a new, non-parametric method for reconstructing lensing mass distributions in multiple-image systems, and apply it to PG1115, for which time delays have recently been measured. It turns out that the image positions and the ratio of time delays between different pairs of images constrain the mass distribution in a linear fashion. Since observational errors on image positions and time delay ratios are constantly improving, we use these data as a rigid constraint in our modelling. In addition, we require the projected mass distributions to be inversion-symmetric and to have inward-pointing density gradients. With these realistic yet non-restrictive conditions it is very easy to produce mass distributions that fit the data precisely. We then present models, for $H_0=42$, 63 and 84 km/s/Mpc, that in each case minimize mass-to-light variations while strictly obeying the lensing constraints. (Only a very rough light distribution is available at present.) All three values of $H_0$ are consistent with the ...

  11. Comparison between linear and non-parametric regression models for genome-enabled prediction in wheat.

    Science.gov (United States)

    Pérez-Rodríguez, Paulino; Gianola, Daniel; González-Camacho, Juan Manuel; Crossa, José; Manès, Yann; Dreisigacker, Susanne

    2012-12-01

    In genome-enabled prediction, parametric, semi-parametric, and non-parametric regression models have been used. This study assessed the predictive ability of linear and non-linear models using dense molecular markers. The linear models were linear on marker effects and included the Bayesian LASSO, Bayesian ridge regression, Bayes A, and Bayes B. The non-linear models (this refers to non-linearity on markers) were reproducing kernel Hilbert space (RKHS) regression, Bayesian regularized neural networks (BRNN), and radial basis function neural networks (RBFNN). These statistical models were compared using 306 elite wheat lines from CIMMYT genotyped with 1717 diversity array technology (DArT) markers and two traits, days to heading (DTH) and grain yield (GY), measured in each of 12 environments. It was found that the three non-linear models had better overall prediction accuracy than the linear regression specification. Results showed a consistent superiority of RKHS and RBFNN over the Bayesian LASSO, Bayesian ridge regression, Bayes A, and Bayes B models.
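
    A minimal sketch contrasting a linear (ridge) specification with an RKHS-type kernel regression, echoing the record's comparison; the simulated markers and hyperparameters below are stand-ins for the CIMMYT data, and the paper's Bayesian machinery is replaced by plain cross-validation.

    ```python
    import numpy as np
    from sklearn.linear_model import Ridge
    from sklearn.kernel_ridge import KernelRidge
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(5)
    n, p = 300, 500
    X = rng.choice([0., 1., 2.], size=(n, p))       # biallelic marker codes
    beta = rng.normal(0, 0.1, p)
    y = X @ beta + 0.5 * X[:, 0] * X[:, 1] + rng.normal(0, 1, n)  # epistasis

    models = [("linear ridge", Ridge(alpha=10.0)),
              ("RKHS-style (RBF kernel)",
               KernelRidge(kernel="rbf", alpha=1.0, gamma=1e-3))]
    for name, model in models:
        r2 = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
        print(f"{name}: mean CV R^2 = {r2:.3f}")
    ```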

  12. Decision making in coal mine planning using a non-parametric technique of indicator kriging

    Energy Technology Data Exchange (ETDEWEB)

    Mamurekli, D. [Hacettepe University, Ankara (Turkey). Mining Engineering Dept.

    1997-03-01

    In countries where low calorific value coal reserves are abundant and oil reserves are scarce or absent, energy production is mainly supported by coal-fired power stations. Consequently, planning the mining of low calorific value coal deposits gains much importance, considering the technical and environmental restrictions. One such mine, in Kangal Town of Sivas City, delivers run-of-mine coal directly to the power station built in the region. If the calorific value of the extracted coal falls below the required limit of 1300 kcal/kg, or the ash content exceeds 21%, the power station may apply penalties to the coal producing company. Since delivery is continuous and relies on in situ determination of pre-estimated values, assessments made without defined confidence levels are inevitably subject to inaccuracy. Thus, the company should be aware of uncertainties in making decisions and avoid conceivable risks. In this study, valuable information is provided in the form of conditional distributions to be used during the planning process. Indicator variograms corresponding to the calorific value of 1300 kcal/kg and the ash content of 21% are mapped, and the non-parametric technique of indicator kriging is applied to estimate the conditional probabilities that the true ash contents are below, and the calorific values above, the critical limits. In addition, the areas that are most uncertain for decision making are outlined. 4 refs., 8 figs., 3 tabs.

  13. Patterns of trunk muscle activation during walking and pole walking using statistical non-parametric mapping.

    Science.gov (United States)

    Zoffoli, Luca; Ditroilo, Massimiliano; Federici, Ario; Lucertini, Francesco

    2017-09-09

    This study used surface electromyography (EMG) to investigate the regions and patterns of activity of the external oblique (EO), erector spinae longissimus (ES), multifidus (MU) and rectus abdominis (RA) muscles during walking (W) and pole walking (PW) performed at different speeds and grades. Eighteen healthy adults undertook W and PW on a motorized treadmill at 60% and 100% of their walk-to-run preferred transition speed at 0% and 7% treadmill grade. The Teager-Kaiser energy operator was employed to improve the muscle activity detection, and statistical non-parametric mapping based on paired t-tests was used to highlight statistical differences in the EMG patterns corresponding to different trials. The activation amplitude of all trunk muscles increased at high speed, while no differences were recorded at 7% treadmill grade. ES and MU appeared to support the upper body at the heel-strike during both W and PW, with the latter resulting in elevated recruitment of EO and RA as required to control for the longer stride and the push of the pole. Accordingly, the greater activity of the abdominal muscles and the comparable intervention of the spine extensors support the use of poles by walkers seeking higher engagement of the lower trunk region. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. COLOR IMAGE RETRIEVAL BASED ON NON-PARAMETRIC STATISTICAL TESTS OF HYPOTHESIS

    Directory of Open Access Journals (Sweden)

    R. Shekhar

    2016-09-01

    Full Text Available A novel method for color image retrieval, based on statistical non-parametric tests such as the two-sample Wald test for equality of variance and the Mann-Whitney U test, is proposed in this paper. The proposed method tests the deviation, i.e. the distance in terms of variance, between the query and target images; if the images pass the test, it proceeds to test the spectrum of energy, i.e. the distance between the mean values of the two images; otherwise, the test is dropped. If the query and target images pass the tests, then it is inferred that the two images belong to the same class, i.e. both images are the same; otherwise, it is assumed that the images belong to different classes, i.e. the images are different. The proposed method is robust to scaling and rotation, since it adjusts itself and treats either the query image or the target image as a sample of the other.

  15. Non-parametric mass reconstruction of A1689 from strong lensing data with SLAP

    CERN Document Server

    Diego-Rodriguez, J M; Protopapas, P; Tegmark, M; Benítez, N; Broadhurst, T J

    2004-01-01

    We present the mass distribution in the central area of the cluster A1689 by fitting over 100 multiply lensed images with the non-parametric Strong Lensing Analysis Package (SLAP, Diego et al. 2004). The surface mass distribution is obtained in a robust way, finding a total mass of 0.25×10^15 M_sun/h within a 70'' radius circle from the central peak. Our reconstructed density profile fits well an NFW profile with small perturbations due to substructure and is compatible with the more model dependent analysis of Broadhurst et al. (2004a) based on the same data. Our estimated mass does not rely on any prior information about the distribution of dark matter in the cluster. The peak of the mass distribution falls very close to the central cD and there is substructure near the center suggesting that the cluster is not fully relaxed. We also examine the effect on the recovered mass when we include the uncertainties in the redshift of the sources and in the original shape of the sources. Using simulations designed to mi...

  16. Parametric and Non-Parametric Vibration-Based Structural Identification Under Earthquake Excitation

    Science.gov (United States)

    Pentaris, Fragkiskos P.; Fouskitakis, George N.

    2014-05-01

    The problem of modal identification in civil structures is of crucial importance, and thus has been receiving increasing attention in recent years. Vibration-based methods are quite promising as they are capable of identifying the structure's global characteristics, they are relatively easy to implement and they tend to be time effective and less expensive than most alternatives [1]. This paper focuses on the off-line structural/modal identification of civil (concrete) structures subjected to low-level earthquake excitations, under which they remain within their linear operating regime. Earthquakes and their details are recorded and provided by the seismological network of Crete [2], which monitors the broad region of the south Hellenic arc, an active seismic region which functions as a natural laboratory for earthquake engineering of this kind. A sufficient number of seismic events are analyzed in order to reveal the modal characteristics of the structures under study, which consist of the two concrete buildings of the School of Applied Sciences, Technological Education Institute of Crete, located in Chania, Crete, Hellas. Both buildings are equipped with high-sensitivity and high-accuracy seismographs - providing acceleration measurements - established at the basement (structure's foundation), presently considered as the ground's acceleration (excitation), and at all levels (ground floor, 1st floor, 2nd floor and terrace). Further details regarding the instrumentation setup and data acquisition may be found in [3]. The present study invokes both non-parametric (frequency-based) and parametric stochastic methods for structural/modal identification (natural frequencies and/or damping ratios). Non-parametric methods include Welch-based spectrum and Frequency Response Function (FRF) estimation, while parametric methods include AutoRegressive (AR), AutoRegressive with eXogenous input (ARX) and AutoRegressive Moving-Average with eXogenous input (ARMAX) models [4, 5
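
    A minimal sketch of the non-parametric (Welch spectrum) step on simulated data, not the Chania recordings; the sampling rate, mode frequencies, and peak threshold are assumptions:

    ```python
    import numpy as np
    from scipy.signal import welch, find_peaks

    fs = 100.0                                   # sampling rate [Hz], assumed
    t = np.arange(0, 60, 1 / fs)
    rng = np.random.default_rng(6)
    # Two (assumed) structural modes at 2.2 Hz and 6.8 Hz buried in noise.
    acc = (np.sin(2 * np.pi * 2.2 * t) + 0.4 * np.sin(2 * np.pi * 6.8 * t)
           + rng.normal(0, 0.5, t.size))

    f_axis, pxx = welch(acc, fs=fs, nperseg=1024)   # Welch PSD estimate
    peaks, _ = find_peaks(pxx, height=0.1 * pxx.max())
    print("identified modal frequencies [Hz]:", f_axis[peaks].round(2))
    ```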

  17. Revisiting the Distance Duality Relation using a non-parametric regression method

    Science.gov (United States)

    Rana, Akshay; Jain, Deepak; Mahajan, Shobhit; Mukherjee, Amitabha

    2016-07-01

    The interdependence of the luminosity distance, DL, and the angular diameter distance, DA, given by the distance duality relation (DDR) is very significant in observational cosmology. It is very closely tied to the temperature-redshift relation of the Cosmic Microwave Background (CMB) radiation. Any deviation from η(z) ≡ DL/[DA(1+z)^2] = 1 indicates a possible emergence of new physics. Our aim in this work is to check the consistency of these relations using a non-parametric regression method, namely LOESS with SIMEX. This technique avoids dependency on the cosmological model and works with a minimal set of assumptions. Further, to analyze the efficiency of the methodology, we simulate a dataset of 020 points of η(z) data based on a phenomenological model η(z) = (1+z)^ε. The error on the simulated data points is obtained by using the temperature of the CMB radiation at various redshifts. For testing the distance duality relation, we use the JLA SNe Ia data for luminosity distances, while the angular diameter distances are obtained from radio galaxy datasets. Since the DDR is linked with the CMB temperature-redshift relation, we also use the CMB temperature data to reconstruct η(z). It is important to note that with CMB data we are able to study the evolution of the DDR up to a very high redshift, z = 2.418. In this analysis, we find no evidence of deviation from η = 1 within the 1σ region over the entire redshift range used in this analysis (0 < z ≤ 2.418).
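
    A minimal sketch of the model-independent reconstruction step (LOESS only; the paper's SIMEX measurement-error correction is omitted, and the mock data are generated around eta = 1):

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(7)
    z = np.sort(rng.uniform(0.0, 2.4, 100))
    eta = 1.0 + rng.normal(0, 0.05, z.size)          # mock eta(z) measurements

    fit = sm.nonparametric.lowess(eta, z, frac=0.4)  # columns: z, smoothed eta
    z_fit, eta_fit = fit[:, 0], fit[:, 1]
    i = np.searchsorted(z_fit, 1.0)
    print("reconstructed eta at z ~ 1:", round(eta_fit[i], 3))
    ```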

  18. A non-parametric peak calling algorithm for DamID-Seq.

    Directory of Open Access Journals (Sweden)

    Renhua Li

    Full Text Available Protein-DNA interactions play a significant role in gene regulation and expression. In order to identify transcription factor binding sites (TFBS) of doublesex (DSX) - an important transcription factor in sex determination - we applied the DNA adenine methylation identification (DamID) technology to the fat body tissue of Drosophila, followed by deep sequencing (DamID-Seq). One feature of DamID-Seq data is that induced adenine methylation signals are not assured to be symmetrically distributed at TFBS, which renders the existing peak calling algorithms for ChIP-Seq, including SPP and MACS, inappropriate for DamID-Seq data. This challenged us to develop a new algorithm for peak calling. A challenge in peak calling based on sequence data is estimating the averaged behavior of background signals. We applied a bootstrap resampling method to short sequence reads in the control (Dam only). After data quality checks and mapping reads to a reference genome, the peak calling procedure comprises the following steps: 1) reads resampling; 2) reads scaling (normalization) and computing signal-to-noise fold changes; 3) filtering; 4) calling peaks based on a statistically significant threshold. This is a non-parametric method for peak calling (NPPC). We also used irreproducible discovery rate (IDR) analysis, as well as ChIP-Seq data, to compare the peaks called by the NPPC. We identified approximately 6,000 peaks for DSX, which point to 1,225 genes related to the fat body tissue difference between female and male Drosophila. Statistical evidence from IDR analysis indicated that these peaks are reproducible across biological replicates. In addition, these peaks are comparable to those identified by use of ChIP-Seq on S2 cells, in terms of peak number, location, and peak width.

  19. A non-parametric peak calling algorithm for DamID-Seq.

    Science.gov (United States)

    Li, Renhua; Hempel, Leonie U; Jiang, Tingbo

    2015-01-01

    Protein-DNA interactions play a significant role in gene regulation and expression. In order to identify transcription factor binding sites (TFBS) of doublesex (DSX) - an important transcription factor in sex determination - we applied the DNA adenine methylation identification (DamID) technology to the fat body tissue of Drosophila, followed by deep sequencing (DamID-Seq). One feature of DamID-Seq data is that induced adenine methylation signals are not assured to be symmetrically distributed at TFBS, which renders the existing peak calling algorithms for ChIP-Seq, including SPP and MACS, inappropriate for DamID-Seq data. This challenged us to develop a new algorithm for peak calling. A challenge in peak calling based on sequence data is estimating the averaged behavior of background signals. We applied a bootstrap resampling method to short sequence reads in the control (Dam only). After data quality checks and mapping reads to a reference genome, the peak calling procedure comprises the following steps: 1) reads resampling; 2) reads scaling (normalization) and computing signal-to-noise fold changes; 3) filtering; 4) calling peaks based on a statistically significant threshold. This is a non-parametric method for peak calling (NPPC). We also used irreproducible discovery rate (IDR) analysis, as well as ChIP-Seq data, to compare the peaks called by the NPPC. We identified approximately 6,000 peaks for DSX, which point to 1,225 genes related to the fat body tissue difference between female and male Drosophila. Statistical evidence from IDR analysis indicated that these peaks are reproducible across biological replicates. In addition, these peaks are comparable to those identified by use of ChIP-Seq on S2 cells, in terms of peak number, location, and peak width.

  20. Does sunspot numbers cause global temperatures? A reconsideration using non-parametric causality tests

    Science.gov (United States)

    Hassani, Hossein; Huang, Xu; Gupta, Rangan; Ghodsi, Mansi

    2016-10-01

    In a recent paper, Gupta et al. (2015) analyzed whether sunspot numbers cause global temperatures based on monthly data covering the period 1880:1-2013:9. The authors find that the standard time domain Granger causality test fails to reject the null hypothesis that sunspot numbers do not cause global temperatures, for both the full sample and the sub-samples 1880:1-1936:2, 1936:3-1986:11 and 1986:12-2013:9 (identified based on tests of structural breaks). However, a frequency domain causality test detects predictability for the full sample at short (2-2.6 months) cycle lengths, but not for the sub-samples. But since full-sample causality cannot be relied upon due to structural breaks, Gupta et al. (2015) conclude that the evidence of causality running from sunspot numbers to global temperatures is weak and inconclusive. Given the importance of the issue of global warming, our current paper revisits the question of whether sunspot numbers cause global temperatures, using the same data set and sub-samples as Gupta et al. (2015), based on a non-parametric Singular Spectrum Analysis (SSA)-based causality test. Based on this test, however, we show that sunspot numbers have predictive ability for global temperatures in the three sub-samples, over and above the full sample. Thus, generally speaking, our non-parametric SSA-based causality test outperformed both the time domain and frequency domain causality tests and highlighted that sunspot numbers have always been important in predicting global temperatures.

  1. Assessment of water quality trends in the Minnesota River using non-parametric and parametric methods

    Science.gov (United States)

    Johnson, H.O.; Gupta, S.C.; Vecchia, A.V.; Zvomuya, F.

    2009-01-01

    Excessive loading of sediment and nutrients to rivers is a major problem in many parts of the United States. In this study, we tested the non-parametric Seasonal Kendall (SEAKEN) trend model and the parametric USGS Quality of Water trend program (QWTREND) to quantify trends in water quality of the Minnesota River at Fort Snelling from 1976 to 2003. Both methods indicated decreasing trends in flow-adjusted concentrations of total suspended solids (TSS), total phosphorus (TP), and orthophosphorus (OP) and a generally increasing trend in flow-adjusted nitrate plus nitrite-nitrogen (NO3-N) concentration. The SEAKEN results were strongly influenced by the length of the record as well as extreme years (dry or wet) earlier in the record. The QWTREND results, though influenced somewhat by the same factors, were more stable. The magnitudes of trends between the two methods were somewhat different and appeared to be associated with conceptual differences between the flow-adjustment processes used and with data processing methods. The decreasing trends in TSS, TP, and OP concentrations are likely related to conservation measures implemented in the basin. However, dilution effects from wet climate or additional tile drainage cannot be ruled out. The increasing trend in NO3-N concentrations was likely due to increased drainage in the basin. Since the Minnesota River is the main source of sediments to the Mississippi River, this study also addressed the rapid filling of Lake Pepin on the Mississippi River and found the likely cause to be increased flow due to recent wet climate in the region. Copyright © 2009 by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America. All rights reserved.
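
    A minimal Seasonal Kendall sketch in the spirit of SEAKEN (no serial-correlation or tie corrections): compute the Mann-Kendall statistic and its variance within each season, then pool across seasons.

    ```python
    import numpy as np
    from scipy.stats import norm

    def mk_s_var(x):
        """Mann-Kendall S and its null variance for one season's series."""
        n = x.size
        d = np.sign(x[None, :] - x[:, None])
        return np.triu(d, k=1).sum(), n * (n - 1) * (2 * n + 5) / 18.0

    def seasonal_kendall(values, seasons):
        s_tot = var_tot = 0.0
        for m in np.unique(seasons):
            s, v = mk_s_var(values[seasons == m])
            s_tot, var_tot = s_tot + s, var_tot + v
        z = (s_tot - np.sign(s_tot)) / np.sqrt(var_tot) if s_tot else 0.0
        return z, 2 * (1 - norm.cdf(abs(z)))

    rng = np.random.default_rng(8)
    months = np.tile(np.arange(12), 20)                        # 20 years, monthly
    conc = 10 - 0.01 * np.arange(240) + rng.normal(0, 1, 240)  # slow decline
    z, p = seasonal_kendall(conc, months)
    print(f"Z = {z:.2f}, p = {p:.4f}")
    ```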

  2. A Non-Parametric Item Response Theory Evaluation of the CAGE Instrument Among Older Adults.

    Science.gov (United States)

    Abdin, Edimansyah; Sagayadevan, Vathsala; Vaingankar, Janhavi Ajit; Picco, Louisa; Chong, Siow Ann; Subramaniam, Mythily

    2017-08-04

    The validity of the CAGE using item response theory (IRT) has not yet been examined in the older adult population. This study aims to investigate the psychometric properties of the CAGE using both non-parametric and parametric IRT models, assess whether there is any differential item functioning (DIF) by age, gender and ethnicity, and examine the measurement precision at the cut-off scores. We used data from the Well-being of the Singapore Elderly study to conduct Mokken scaling analysis (MSA), dichotomous Rasch and 2-parameter logistic IRT models. The measurement precision at the cut-off scores was evaluated using classification accuracy (CA) and classification consistency (CC). The MSA showed an overall scalability H index of 0.459, indicating a medium performing instrument. All items were found to be homogeneous, measuring the same construct, and able to discriminate well between respondents with high levels of the construct and those with lower levels. The item discrimination ranged from 1.07 to 6.73 while the item difficulty ranged from 0.33 to 2.80. Significant DIF was found for two items across ethnic groups. More than 90% (CC and CA ranged from 92.5% to 94.3%) of the respondents were consistently and accurately classified by the CAGE cut-off scores of 2 and 3. The current study provides new evidence on the validity of the CAGE from the IRT perspective. It provides valuable information on each item in the assessment of the overall severity of alcohol problems and on the precision of the cut-off scores in the older adult population.

  3. Transit Timing Observations From Kepler: Ii. Confirmation of Two Multiplanet Systems via a Non-Parametric Correlation Analysis

    OpenAIRE

    Ford, Eric B.; Fabrycky, Daniel C.; Steffen, Jason H.; Carter, Joshua A.; Fressin, Francois; Holman, Matthew Jon; Lissauer, Jack J.; Moorhead, Althea V.; Morehead, Robert C.; Ragozzine, Darin; Rowe, Jason F.; Welsh, William F.; Allen, Christopher; Batalha, Natalie M.; Borucki, William J.

    2012-01-01

    We present a new method for confirming transiting planets based on the combination of transit timing variations (TTVs) and dynamical stability. Correlated TTVs provide evidence that the pair of bodies is in the same physical system. Orbital stability provides upper limits for the masses of the transiting companions that are in the planetary regime. This paper describes a non-parametric technique for quantifying the statistical significance of TTVs based on the correlation of two TTV data se...

  4. Statistic Non-Parametric Methods of Measurement and Interpretation of Existing Statistic Connections within Seaside Hydro Tourism

    OpenAIRE

    MIRELA SECARĂ

    2008-01-01

    Tourism represents an important field of economic and social life in our country, and the main sector of the economy of Constanta County is the balneary touristic exploitation of the Romanian seaside. In order to statistically analyze hydro tourism on the Romanian seaside, we have applied non-parametric methods for measuring and interpreting the statistical connections within seaside hydro tourism. The major objective of this research is the re-establishment of hydro tourism on the Romanian ...

  5. Non-parametric determination of H and He interstellar fluxes from cosmic-ray data

    Science.gov (United States)

    Ghelfi, A.; Barao, F.; Derome, L.; Maurin, D.

    2016-06-01

    Context. Top-of-atmosphere (TOA) cosmic-ray (CR) fluxes from satellites and balloon-borne experiments are snapshots of the solar activity imprinted on the interstellar (IS) fluxes. Given a series of snapshots, the unknown IS flux shape and the level of modulation (for each snapshot) can be recovered. Aims: We wish (i) to provide the most accurate determination of the IS H and He fluxes from TOA data alone; (ii) to obtain the associated modulation levels (and uncertainties) while fully accounting for the correlations with the IS flux uncertainties; and (iii) to inspect whether the minimal force-field approximation is sufficient to explain all the data at hand. Methods: Using H and He TOA measurements, including the recent high-precision AMS, BESS-Polar, and PAMELA data, we performed a non-parametric fit of the IS fluxes J^IS_{H, He} and of the modulation level φ_i for each data-taking period. We relied on a Markov chain Monte Carlo (MCMC) engine to extract the probability density function and correlations (hence the credible intervals) of the sought parameters. Results: Although H and He are the most abundant and best measured CR species, several datasets had to be excluded from the analysis because of inconsistencies with other measurements. From the subset of data passing our consistency cut, we provide ready-to-use best-fit and credible intervals for the H and He IS fluxes from MeV/n to PeV/n energy (with a relative precision in the range [2-10%] at 1σ). Given the strong correlation between the J^IS and φ_i parameters, the uncertainties on J^IS translate into Δφ ≈ ±30 MV (at 1σ) for all experiments. We also find that the presence of 3He in He data biases φ towards higher values by ~30 MV. The force-field approximation, despite its limitations, gives an excellent (χ2/d.o.f. = 1.02) description of the recent high-precision TOA H and He fluxes. Conclusions: The analysis must be extended to different charge species and more realistic modulation models. It would benefit
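
    For reference, a minimal sketch of the force-field approximation the record tests: a single modulation level phi maps the interstellar flux to the top of the atmosphere. The power-law IS flux below is a stand-in, not the paper's fitted non-parametric flux.

    ```python
    import numpy as np

    M_P = 0.9383                                  # proton mass [GeV]

    def j_is(ek):
        """Toy interstellar proton flux vs kinetic energy [GeV] (illustrative)."""
        p = np.sqrt(ek * (ek + 2 * M_P))          # momentum [GeV]
        return 1.9e4 * p ** -2.7                  # arbitrary normalisation

    def force_field(ek_toa, phi, z=1, a=1):
        """TOA flux at modulation phi [GV]; per-nucleon energy, charge z, mass number a."""
        ek_is = ek_toa + abs(z) / a * phi         # energy lost in the heliosphere
        factor = ek_toa * (ek_toa + 2 * M_P) / (ek_is * (ek_is + 2 * M_P))
        return j_is(ek_is) * factor

    ek = np.logspace(-1, 2, 5)                    # 0.1 - 100 GeV
    print("E_k [GeV]  :", ek.round(2))
    print("J_TOA/J_IS :", (force_field(ek, phi=0.6) / j_is(ek)).round(3))
    ```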

  6. Validation of two (parametric vs non-parametric) daily weather generators

    Science.gov (United States)

    Dubrovsky, M.; Skalak, P.

    2015-12-01

    As the climate models (GCMs and RCMs) fail to satisfactorily reproduce the real-world surface weather regime, various statistical methods are applied to downscale GCM/RCM outputs into site-specific weather series. Stochastic weather generators are among the most favoured downscaling methods, capable of producing realistic (observed-like) meteorological inputs for agricultural, hydrological and other impact models used in assessing the sensitivity of various ecosystems to climate change/variability. To name their advantages, the generators may (i) produce arbitrarily long multi-variate synthetic weather series representing both present and changed climates (in the latter case, the generators are commonly modified by GCM/RCM-based climate change scenarios), (ii) be run in various time steps and for multiple weather variables (the generators reproduce the correlations among variables), and (iii) be interpolated (and run also for sites where no weather data are available to calibrate the generator). This contribution will compare two stochastic daily weather generators in terms of their ability to reproduce various features of daily weather series. M&Rfi is a parametric generator: a Markov chain model is used to model precipitation occurrence, precipitation amount is modelled by the Gamma distribution, and a first-order autoregressive model is used to generate non-precipitation surface weather variables. The non-parametric GoMeZ generator is based on the nearest-neighbours resampling technique, making no assumptions about the distribution of the variables being generated. Various settings of both weather generators will be assumed in the present validation tests. The generators will be validated in terms of (a) extreme temperature and precipitation characteristics (annual and 30-year extremes and maxima of duration of hot/cold/dry/wet spells); (b) selected validation statistics developed within the frame of the VALUE project. The tests will be based on observational weather series
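
    As a concrete illustration of the parametric precipitation component described above (Markov-chain occurrence plus Gamma amounts), the following sketch generates a synthetic daily series; the transition probabilities and Gamma parameters are invented for the example, not calibrated M&Rfi values.

```python
import numpy as np

rng = np.random.default_rng(42)

def generate_precip(n_days, p_wd=0.3, p_ww=0.6, shape=0.7, scale=8.0):
    """First-order Markov chain for wet/dry occurrence plus Gamma amounts.
    p_wd: P(wet | dry); p_ww: P(wet | wet); amounts in mm on wet days."""
    wet = np.zeros(n_days, dtype=bool)
    for t in range(1, n_days):
        p = p_ww if wet[t - 1] else p_wd
        wet[t] = rng.random() < p
    return np.where(wet, rng.gamma(shape, scale, n_days), 0.0)

series = generate_precip(365)
print(f"wet-day fraction: {np.mean(series > 0):.2f}, annual total: {series.sum():.0f} mm")
```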

  7. Scaling of preferential flow in biopores by parametric or non parametric transfer functions

    Science.gov (United States)

    Zehe, E.; Hartmann, N.; Klaus, J.; Palm, J.; Schroeder, B.

    2009-04-01

    finally assign the measured hydraulic capacities to these pores. By combining this population of macropores with observed data on soil hydraulic properties we obtain a virtual reality. Flow and transport are simulated for different rainfall forcings comparing two models, Hydrus 3d and Catflow. The simulated cumulative travel depth distributions for different forcings will be linked to the cumulative depth distribution of connected flow paths. The latter describes the fraction of connected paths - where flow resistance is always below a selected threshold - that link the surface to a certain critical depth. Systematic variation of the average number of macropores and their depth distributions will show whether a clear link between the simulated travel depth distributions and the depth distribution of connected paths may be identified. The third essential step is to derive a non-parametric transfer function that predicts travel depth distributions of tracers and, in the long term, pesticides, based on easy-to-assess subsurface characteristics (mainly density and depth distribution of worm burrows, soil matrix properties), initial conditions and rainfall forcing. Such a transfer function is independent of scale - as long as we stay in the same ensemble, i.e. worm population and soil properties stay the same. Shipitalo, M.J. and Butt, K.R. (1999): Occupancy and geometrical properties of Lumbricus terrestris L. burrows affecting infiltration. Pedobiologia 43:782-794. Zehe, E. and Fluehler, H. (2001b): Slope scale distribution of flow patterns in soil profiles. J. Hydrol. 247: 116-132.

  8. Non-parametric Estimation in Contaminated Linear Model

    Institute of Scientific and Technical Information of China (English)

    柴根象; 孙燕; 杨筱菡

    2001-01-01

    In this paper, the following contaminated linear model is considered: y_i = (1-ε) x_i^τ β + z_i, 1 ≤ i ≤ n, where the random variables {y_i} are contaminated with errors {z_i}, and the errors are assumed to have finite moments of order 2 only. Non-parametric estimators of the contamination coefficient ε and the regression parameter β are established, and the strong consistency and almost-sure convergence rates of the estimators are obtained. A simulated example is also given to show the visual performance of the estimations.

  9. On The Robustness of z=0-1 Galaxy Size Measurements Through Model and Non-Parametric Fits

    CERN Document Server

    Mosleh, Moein; Franx, Marijn

    2013-01-01

    We present the size-stellar mass relations of nearby (z=0.01-0.02) SDSS galaxies, for samples selected by color, morphology, Sersic index n, and specific star formation rate. Several commonly employed size measurement techniques are used, including single Sersic fits, two-component Sersic models and a non-parametric method. Through simple simulations we show that the non-parametric and two-component Sersic methods provide the most robust effective radius measurements, while those based on single Sersic profiles are often overestimates, especially for massive red/early-type galaxies. Using our robust sizes, we show that for all sub-samples, the mass-size relations are shallow at low stellar masses and steepen above ~3-4 x 10^10 M_sun. The mass-size relations for galaxies classified as late-type, low-n, and star-forming are consistent with each other, while blue galaxies follow a somewhat steeper relation. The mass-size relations of early-type, high-n, red, and quiescent galaxies all agree with each other but ...

  10. Further Empirical Results on Parametric Versus Non-Parametric IRT Modeling of Likert-Type Personality Data.

    Science.gov (United States)

    Maydeu-Olivares, Albert

    2005-04-01

    Chernyshenko, Stark, Chan, Drasgow, and Williams (2001) investigated the fit of Samejima's logistic graded model and Levine's non-parametric MFS model to the scales of two personality questionnaires and found that the graded model did not fit well. We attribute the poor fit of the graded model to small amounts of multidimensionality present in their data. To verify this conjecture, we compare the fit of these models to the Social Problem Solving Inventory-Revised, whose scales were designed to be unidimensional. A calibration sample and a cross-validation sample of new observations were used. We also included the following parametric models in the comparison: Bock's nominal model, Masters' partial credit model, and Thissen and Steinberg's extension of the latter. All models were estimated using full-information maximum likelihood. We also included in the comparison a normal ogive version of Samejima's model estimated using limited-information estimation. We found that for all scales Samejima's model outperformed all other parametric IRT models in both samples, regardless of the estimation method employed. The non-parametric model outperformed all parametric models in the calibration sample. However, the graded model outperformed MFS in the cross-validation sample for some of the scales. We advocate employing the graded model estimated using limited-information methods in modeling Likert-type data, as these methods are more versatile than full-information methods in capturing the multidimensionality that is generally present in personality data.
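
    For orientation, Samejima's graded model gives the probability of responding in category k or above as a logistic function of the latent trait, and category probabilities follow by differencing. A minimal sketch (the discrimination and threshold values below are arbitrary):

```python
import numpy as np

def graded_category_probs(theta, a, b):
    """Samejima's graded response model for one item.
    theta: latent trait; a: discrimination; b: ordered thresholds (length K-1).
    Returns the probabilities of the K ordered response categories."""
    # Cumulative probabilities P(X >= k), with P(X >= 0) = 1 and P(X >= K) = 0.
    cum = 1.0 / (1.0 + np.exp(-a * (theta - np.asarray(b, dtype=float))))
    cum = np.concatenate(([1.0], cum, [0.0]))
    return cum[:-1] - cum[1:]

probs = graded_category_probs(theta=0.5, a=1.4, b=[-1.0, 0.0, 1.2])
print(probs, probs.sum())  # four category probabilities, summing to 1
```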

  11. Climatic, parametric and non-parametric analysis of energy performance of double-glazed windows in different climates

    Directory of Open Access Journals (Sweden)

    Saeed Banihashemi

    2015-12-01

    In line with the growing global trend toward energy efficiency in buildings, this paper aims, first, to investigate the energy performance of double-glazed windows in different climates and, second, to analyze the most commonly used parametric and non-parametric tests for dimension reduction in simulating this component. A four-story building representing the conventional type of residential apartment was selected and simulated for four climates: cold, temperate, hot-arid and hot-humid. Ten variables - U-factor, SHGC, emissivity, visible transmittance, monthly average dry-bulb temperature, monthly average percent humidity, monthly average wind speed, monthly average direct solar radiation, monthly average diffuse solar radiation and orientation - constituted the parameters considered in the calculation of the cooling and heating loads of the case. Design of Experiments and Principal Component Analysis methods were applied to find the most significant factors and to reduce the dimension of the initial variables. It was observed that in the two climates of temperate and hot-arid, using double-glazed windows was beneficial in both cold and hot months, whereas in cold and hot-humid climates, where heating and cooling loads are dominant respectively, they were advantageous only in those dominant months. Furthermore, an inconsistency was revealed between the parametric and non-parametric tests in terms of identifying the most significant variables.
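
    For readers who want to reproduce the dimension-reduction step, a principal component analysis of the predictor matrix can be sketched with plain NumPy; the random stand-in data below replace the study's simulated heating/cooling-load dataset.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 10))   # stand-in for the 10 window/climate predictors

# Standardize before PCA so variables with large units do not dominate.
Xc = (X - X.mean(axis=0)) / X.std(axis=0)

# SVD-based PCA: the rows of Vt are the principal axes.
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)

k = np.searchsorted(np.cumsum(explained), 0.90) + 1  # components for 90% variance
scores = Xc @ Vt[:k].T                               # reduced representation
print(f"{k} components explain {np.cumsum(explained)[k-1]:.1%} of the variance")
```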

  12. 'nparACT' package for R: A free software tool for the non-parametric analysis of actigraphy data.

    Science.gov (United States)

    Blume, Christine; Santhi, Nayantara; Schabus, Manuel

    2016-01-01

    For many studies, participants' sleep-wake patterns are monitored and recorded prior to, during and following an experimental or clinical intervention using actigraphy, i.e. the recording of data generated by movements. Often, these data are merely inspected visually without computation of descriptive parameters, in part due to the lack of user-friendly software. To address this deficit, we developed a package for R (R Core Team) that allows computing several non-parametric measures from actigraphy data. Specifically, it computes the interdaily stability (IS), intradaily variability (IV) and relative amplitude (RA) of activity and gives the start times and average activity values of M10 (i.e. the ten hours with maximal activity) and L5 (i.e. the five hours with least activity). Two functions compute these 'classical' parameters and handle either single or multiple files. Two other functions additionally allow computing an L-value (i.e. the least activity value) for a user-defined time span termed the 'Lflex' value. A plotting option is included in all functions. The package can be downloaded from the Comprehensive R Archive Network (CRAN). • The package 'nparACT' for R serves the non-parametric analysis of actigraphy data. • Computed parameters include interdaily stability (IS), intradaily variability (IV) and relative amplitude (RA), as well as start times and average activity during the 10 h with maximal and the 5 h with minimal activity (i.e. M10 and L5).
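
    The three 'classical' parameters have standard closed-form definitions; the sketch below restates them in plain Python for an hourly activity series (it follows the commonly published formulas, not the nparACT source).

```python
import numpy as np

def non_parametric_measures(hourly):
    """IS, IV and RA from an hourly activity series spanning whole days."""
    x = np.asarray(hourly, dtype=float)
    n = x.size
    profile = x.reshape(-1, 24).mean(axis=0)      # average 24-h profile

    # Interdaily stability: variance of the 24-h profile over total variance.
    IS = (n * np.sum((profile - x.mean())**2)) / (24 * np.sum((x - x.mean())**2))

    # Intradaily variability: mean squared successive difference over variance.
    IV = (n * np.sum(np.diff(x)**2)) / ((n - 1) * np.sum((x - x.mean())**2))

    # Relative amplitude from the most active 10 h and least active 5 h.
    wrapped = np.concatenate([profile, profile])  # allow windows over midnight
    m10 = max(wrapped[i:i + 10].mean() for i in range(24))
    l5 = min(wrapped[i:i + 5].mean() for i in range(24))
    RA = (m10 - l5) / (m10 + l5)
    return IS, IV, RA
```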

  13. A novel non-parametric method for uncertainty evaluation of correlation-based molecular signatures: its application on PAM50 algorithm.

    Science.gov (United States)

    Fresno, Cristóbal; González, Germán Alexis; Merino, Gabriela Alejandra; Flesia, Ana Georgina; Podhajcer, Osvaldo Luis; Llera, Andrea Sabina; Fernández, Elmer Andrés

    2017-03-01

    The PAM50 classifier is used to assign patients to the highest-correlated breast cancer subtype, irrespective of the obtained correlation value. Nonetheless, all subtype correlations are required to build the risk of recurrence (ROR) score, currently used in therapeutic decisions. Present subtype uncertainty estimations are not accurate, seldom considered, or require a population-based approach in this context. Here we present a novel single-subject non-parametric uncertainty estimation based on permutations of PAM50's gene labels. Simulation results (n = 5228) showed that only 61% of subjects can be reliably 'Assigned' to a PAM50 subtype, whereas 33% should be 'Not Assigned' (NA), leaving the rest to tight 'Ambiguous' correlations between subtypes. Excluding the NA subjects from the analysis improved the discrimination of the survival subtype curves, yielding a higher proportion of low and high ROR values. Conversely, all NA subjects showed similar survival behaviour regardless of the original PAM50 assignment. We propose incorporating our PAM50 uncertainty estimation to support therapeutic decisions. Source code can be found in the 'pbcmc' R package at Bioconductor. cristobalfresno@gmail.com or efernandez@bdmg.com.ar. Supplementary data are available at Bioinformatics online.
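
    Conceptually, the label-permutation test can be sketched as follows; the subtype centroids and expression profile below are toy stand-ins, and the actual pbcmc package implements the procedure in R with the real PAM50 centroids and additional decision rules.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(7)

n_genes = 50
centroids = {s: rng.normal(size=n_genes) for s in
             ["LumA", "LumB", "Her2", "Basal", "Normal"]}
sample = centroids["LumA"] + rng.normal(0, 1.0, n_genes)  # toy expression profile

def best_subtype(profile):
    """Return the highest-correlated subtype and its correlation."""
    cors = {s: spearmanr(profile, c)[0] for s, c in centroids.items()}
    label = max(cors, key=cors.get)
    return label, cors[label]

label, obs_cor = best_subtype(sample)

# Null distribution: permute the gene labels of the profile and re-correlate.
null = np.array([best_subtype(rng.permutation(sample))[1] for _ in range(1000)])
p_value = np.mean(null >= obs_cor)
print(f"assigned: {label}, r = {obs_cor:.2f}, permutation p = {p_value:.3f}")
```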

  14. Application of non-parametric bootstrap methods to estimate confidence intervals for QTL location in a beef cattle QTL experimental population.

    Science.gov (United States)

    Jongjoo, Kim; Davis, Scott K; Taylor, Jeremy F

    2002-06-01

    Empirical confidence intervals (CIs) for the estimated quantitative trait locus (QTL) location from selective and non-selective non-parametric bootstrap resampling methods were compared for a genome scan involving an Angus x Brahman reciprocal fullsib backcross population. Genetic maps, based on 357 microsatellite markers, were constructed for 29 chromosomes using CRI-MAP V2.4. Twelve growth, carcass composition and beef quality traits (n = 527-602) were analysed to detect QTLs utilizing (composite) interval mapping approaches. CIs were investigated for 28 likelihood ratio test statistic (LRT) profiles for the one-QTL-per-chromosome model. The CIs from the non-selective bootstrap method were largest (87.7 cM on average, or 79.2% coverage of test chromosomes). The Selective II procedure produced the smallest CI size (42.3 cM on average). However, CI sizes from the Selective II procedure were more variable than those produced by the two-LOD drop method. CI ranges from the Selective II procedure were also asymmetrical (relative to the most likely QTL position) due to the bias caused by the tendency for the estimated QTL position to be at a marker position in the bootstrap samples, and due to the monotonicity and asymmetry of the LRT curve in the original sample.
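
    The non-parametric bootstrap CI idea itself is generic: resample with replacement, re-estimate, and read off percentiles. A minimal percentile-bootstrap sketch for an arbitrary 1-D estimator (the median of synthetic data stands in for the estimated QTL position):

```python
import numpy as np

rng = np.random.default_rng(3)
data = rng.gamma(4.0, 10.0, size=200)   # stand-in sample

def bootstrap_ci(sample, estimator, n_boot=2000, level=0.95):
    """Percentile bootstrap confidence interval for a 1-D statistic."""
    stats = np.array([estimator(rng.choice(sample, sample.size, replace=True))
                      for _ in range(n_boot)])
    alpha = (1.0 - level) / 2.0
    return np.quantile(stats, [alpha, 1.0 - alpha])

lo, hi = bootstrap_ci(data, np.median)
print(f"95% CI for the median: [{lo:.1f}, {hi:.1f}]")
```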

  15. Productivity improvement in Korean rice farming: parametric and non-parametric analysis

    OpenAIRE

    Kwon, Oh Sang; Lee, Hyunok

    2004-01-01

    The published empirical literature on frontier production functions is dominated by two broadly defined estimation approaches – parametric and non‐parametric. Using panel data on Korean rice production, parametric and non‐parametric production frontiers are estimated and compared with estimated productivity. The non‐parametric approach employs two alternative measures based on the Malmquist index and the Luenberger indicator, while the parametric approach is closely related to the time‐varian...

  16. A note on the use of the non-parametric Wilcoxon-Mann-Whitney test in the analysis of medical studies

    Directory of Open Access Journals (Sweden)

    Kühnast, Corinna

    2008-04-01

    Background: Although non-normal data are widespread in biomedical research, parametric tests unnecessarily predominate in statistical analyses. Methods: We surveyed five biomedical journals and - for all studies which contain at least the unpaired t-test or the non-parametric Wilcoxon-Mann-Whitney test - investigated the relationship between the choice of a statistical test and other variables such as type of journal, sample size, randomization, sponsoring etc. Results: The non-parametric Wilcoxon-Mann-Whitney test was used in 30% of the studies. In a multivariable logistic regression, the type of journal, the test object, the scale of measurement and the statistical software were significant. The non-parametric test was more common in the case of non-continuous data, in high-impact journals, in studies in humans, and when the statistical software was specified, in particular when SPSS was used.
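
    For reference, the test itself is a one-liner in most statistical environments; a Python example with illustrative, skewed data:

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(5)
treatment = rng.lognormal(mean=0.3, sigma=0.8, size=30)  # skewed, non-normal data
control = rng.lognormal(mean=0.0, sigma=0.8, size=30)

# Two-sided Wilcoxon-Mann-Whitney test of whether one distribution tends
# to produce larger values than the other.
stat, p = mannwhitneyu(treatment, control, alternative="two-sided")
print(f"U = {stat:.0f}, p = {p:.4f}")
```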

  17. The Non-Parametric Model for Linking Galaxy Luminosity with Halo/Subhalo Mass: Are First Brightest Galaxies Special?

    CERN Document Server

    Vale, A

    2007-01-01

    We revisit the longstanding question of whether first brightest cluster galaxies are statistically drawn from the same distribution as other cluster galaxies or are "special", using the new non-parametric, empirically based model presented in Vale & Ostriker (2006) for associating galaxy luminosity with halo/subhalo masses. We introduce scatter in galaxy luminosity at fixed halo mass into this model, building a conditional luminosity function (CLF) by considering two possible models: a simple lognormal and a model based on the distribution of concentration in haloes of a given mass. We show that this model naturally allows an identification of halo/subhalo systems with groups and clusters of galaxies, giving rise to a clear central/satellite galaxy distinction. We then use these results to build up the dependence of brightest cluster galaxy (BCG) magnitudes on cluster luminosity, focusing on two statistical indicators, the dispersion in BCG magnitude and the magnitude difference between first and second bri...

  18. Singular Value Decomposition, Hessian Errors, and Linear Algebra of Non-parametric Extraction of Partons from DIS

    CERN Document Server

    Goshtasbpour, Mehrdad

    2014-01-01

    By singular value decomposition (SVD) of a numerically singular Hessian matrix and a numerically singular system of linear equations for the experimental data (accumulated in the respective $\chi^2$ function) and constraints, least-squares solutions and their propagated errors for the non-parametric extraction of partons from $F_2$ are obtained. SVD and its physical application are phenomenologically described in the two cases. Among the subjects covered are: identification and properties of the boundary between the two subsets of ordered eigenvalues corresponding to the range and null space, and the eigenvalue structure of the null space of the singular matrix, including a second boundary separating the smallest eigenvalues carrying essentially no information, in a particular case. The eigenvector-eigenvalue structure of the "redundancy and smallness" of the errors of two pdf sets, in our simplified Hessian model, is described by a secondary manifestation of the deeper null space, in the context of SVD.

  19. Detection of Bistability in Phase Space of a Real Galaxy, using a New Non-parametric Bayesian Test of Hypothesis

    CERN Document Server

    Chakrabarty, Dalia

    2013-01-01

    In lieu of direct detection of dark matter, estimation of the distribution of the gravitational mass in distant galaxies is of crucial importance in astrophysics. Typically, such estimation is performed using small samples of noisy, partially missing measurements - only some of the three components of the velocity and location vectors of individual particles that live in the galaxy are measurable. Such limitations of the available data in turn demand that simplifying model assumptions be made. Thus, assuming that the phase space of a galaxy manifests simple symmetries - such as isotropy - allows for the learning of the density of the gravitational mass in galaxies. This is equivalent to assuming that the phase-space pdf from which the velocity and location vectors of galactic particles are sampled is an isotropic function of these vectors. We present a new non-parametric test of hypothesis that tests for relative support in two or more measured data sets of disparate sizes, for the undertaken m...

  20. Transit Timing Observations from Kepler: II. Confirmation of Two Multiplanet Systems via a Non-parametric Correlation Analysis

    CERN Document Server

    Ford, Eric B; Steffen, Jason H; Carter, Joshua A; Fressin, Francois; Holman, Matthew J; Lissauer, Jack J; Moorhead, Althea V; Morehead, Robert C; Ragozzine, Darin; Rowe, Jason F; Welsh, William F; Allen, Christopher; Batalha, Natalie M; Borucki, William J; Bryson, Stephen T; Buchhave, Lars A; Burke, Christopher J; Caldwell, Douglas A; Charbonneau, David; Clarke, Bruce D; Cochran, William D; Désert, Jean-Michel; Endl, Michael; Everett, Mark E; Fischer, Debra A; Gautier, Thomas N; Gilliland, Ron L; Jenkins, Jon M; Haas, Michael R; Horch, Elliott; Howell, Steve B; Ibrahim, Khadeejah A; Isaacson, Howard; Koch, David G; Latham, David W; Li, Jie; Lucas, Philip; MacQueen, Phillip J; Marcy, Geoffrey W; McCauliff, Sean; Mullally, Fergal R; Quinn, Samuel N; Quintana, Elisa; Shporer, Avi; Still, Martin; Tenenbaum, Peter; Thompson, Susan E; Torres, Guillermo; Twicken, Joseph D; Wohler, Bill

    2012-01-01

    We present a new method for confirming transiting planets based on the combination of transit timing variations (TTVs) and dynamical stability. Correlated TTVs provide evidence that the pair of bodies are in the same physical system. Orbital stability provides upper limits for the masses of the transiting companions that are in the planetary regime. This paper describes a non-parametric technique for quantifying the statistical significance of TTVs based on the correlation of two TTV data sets. We apply this method to an analysis of the transit timing variations of two stars with multiple transiting planet candidates identified by Kepler. We confirm four transiting planets in two multiple planet systems based on their TTVs and the constraints imposed by dynamical stability. An additional three candidates in these same systems are not confirmed as planets, but are likely to be validated as real planets once further observations and analyses are possible. If all were confirmed, these systems would be near 4:6:...
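
    The correlation statistic described here lends itself to a simple permutation scheme. The sketch below is a minimal illustration of that idea rather than the authors' pipeline: the two TTV series, their common epochs and the anti-correlated signal are synthetic stand-ins.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

# Synthetic O-C (observed minus calculated) transit timing residuals, in
# minutes, for two planet candidates observed over the same set of epochs.
signal = np.sin(np.linspace(0, 6 * np.pi, 40))
ttv_a = signal + rng.normal(0, 1, 40)
ttv_b = -signal + rng.normal(0, 1, 40)   # dynamically coupled: anti-correlated

obs, _ = spearmanr(ttv_a, ttv_b)

# Permutation null: shuffling one series destroys any physical correlation.
n_perm = 10000
null = np.array([spearmanr(ttv_a, rng.permutation(ttv_b))[0]
                 for _ in range(n_perm)])
p_value = np.mean(np.abs(null) >= np.abs(obs))
print(f"rho = {obs:.3f}, permutation p = {p_value:.4f}")
```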

  1. A non-parametric conditional bivariate reference region with an application to height/weight measurements on normal girls

    DEFF Research Database (Denmark)

    Petersen, Jørgen Holm

    2009-01-01

    A conceptually simple two-dimensional conditional reference curve is described. The curve gives a decision basis for determining whether a bivariate response from an individual is "normal" or "abnormal" when taking into account that a third (conditioning) variable may influence the bivariate response. The reference curve is not only characterized analytically but also by geometric properties that are easily communicated to medical doctors - the users of such curves. The reference curve estimator is completely non-parametric, so no distributional assumptions are needed about the two-dimensional response. An example that will serve to motivate and illustrate the reference is the study of the height/weight distribution of 7-8-year-old Danish school girls born in 1930, 1950, or 1970.

  2. Adaptive ILC algorithms of nonlinear continuous systems with non-parametric uncertainties for non-repetitive trajectory tracking

    Science.gov (United States)

    Li, Xiao-Dong; Lv, Mang-Mang; Ho, John K. L.

    2016-07-01

    In this article, two adaptive iterative learning control (ILC) algorithms are presented for nonlinear continuous systems with non-parametric uncertainties. Unlike general ILC techniques, the proposed adaptive ILC algorithms allow both the initial error at each iteration and the reference trajectory to be iteration-varying in the ILC process, and they can achieve non-repetitive trajectory tracking beyond a small initial time interval. Compared to the neural network or fuzzy system-based adaptive ILC schemes and the classical ILC methods, in which the number of iterative variables is generally larger than or equal to the number of control inputs, the first adaptive ILC algorithm proposed in this paper uses just two iterative variables, while the second uses even a single iterative variable, provided that some bound information on the system dynamics is known. As a result, the memory space required in real-time ILC implementations is greatly reduced.

  3. A Non-Parametric and Entropy Based Analysis of the Relationship between the VIX and S&P 500

    Directory of Open Access Journals (Sweden)

    Abhay K. Singh

    2013-10-01

    This paper features an analysis of the relationship between the S&P 500 Index and the VIX using daily data obtained from the CBOE website and SIRCA (The Securities Industry Research Centre of the Asia Pacific). We explore the relationship between the S&P 500 daily return series and a similar series for the VIX in terms of a long sample drawn from the CBOE, running from 1990 to mid-2011, and a set of returns from SIRCA's TRTH datasets, running from March 2005 to date. This shorter sample, which captures the behavior of the new VIX introduced in 2003, is divided into four sub-samples which permit the exploration of the impact of the Global Financial Crisis. We apply a series of non-parametric tests utilizing entropy-based metrics. These suggest that the PDFs and CDFs of these two return distributions change shape in the various sub-sample periods. The entropy and MI statistics suggest that the degree of uncertainty attached to these distributions changes through time and, using the S&P 500 return as the dependent variable, that the amount of information obtained from the VIX changes with time and reaches a relative maximum in the most recent period, from 2011 to 2012. The entropy-based non-parametric tests of the equivalence of the two distributions and of their symmetry all strongly reject their respective nulls. The results suggest that parametric techniques do not adequately capture the complexities displayed in the behavior of these series. This has practical implications for hedging utilizing derivatives written on the VIX.
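
    A plug-in estimate of the mutual information between two return series, of the kind underlying such entropy-based metrics, can be computed from a joint histogram; the sketch below uses synthetic returns, not the CBOE/SIRCA data.

```python
import numpy as np

rng = np.random.default_rng(11)
sp500 = rng.normal(0, 0.01, 2000)
vix = -0.8 * sp500 + rng.normal(0, 0.01, 2000)  # VIX moves against the index

def mutual_information(x, y, bins=20):
    """Plug-in mutual information estimate (in nats) from a 2-D histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x, shape (bins, 1)
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y, shape (1, bins)
    mask = pxy > 0
    return np.sum(pxy[mask] * np.log(pxy[mask] / (px @ py)[mask]))

print(f"MI(S&P 500, VIX) ~ {mutual_information(sp500, vix):.3f} nats")
```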

  4. The Retrospect and Prospect of Non-parametric Item Response Theory

    Institute of Scientific and Technical Information of China (English)

    陈婧; 康春花; 钟晓玲

    2013-01-01

    Compared with parametric item response theory, non-parametric item response theory provides a theoretical framework that better matches practical testing situations. Current non-parametric item response theory research focuses on parameter estimation methods and their comparison, and on verifying data-model fit; its applied research concentrates on scale revision, personality data and differential item functioning analysis. Non-parametric cognitive diagnostic theory, developed on the basis of cognitive diagnostic theory, further highlights the applied advantages of the non-parametric approach. Future studies should emphasize the practical application of non-parametric item response theory, and research on non-parametric cognitive diagnosis also deserves attention, so that the advantages of non-parametric methods can be fully exploited in practice.

  5. Technical Topic 3.2.2.d Bayesian and Non-Parametric Statistics: Integration of Neural Networks with Bayesian Networks for Data Fusion and Predictive Modeling

    Science.gov (United States)

    2016-05-31

    Final report (report date 31-05-2016, covering 15-Apr-2014 to 14-Jan-2015) for Technical Topic 3.2.2.d, Bayesian and Non-parametric Statistics: Integration of Neural Networks with Bayesian Networks for Data Fusion and Predictive Modeling. Distribution unlimited.

  6. Bayesian Non-parametric model to Target Gamification Notifications Using Big Data

    OpenAIRE

    Nia, Meisam Hejazi; Ratchford, Brian

    2016-01-01

    I suggest an approach that helps online marketers target their gamification elements to users by modifying the order of the list of tasks that they send to users. The approach is realistic and flexible, as it allows the model to learn more parameters as the online marketers collect more data. The targeting approach is scalable and quick, and it can be used over streaming data.

  7. Applying a non-parametric efficiency analysis to measure conversion efficiency in Great Britain

    NARCIS (Netherlands)

    Binder, M.; Broekel, T.

    2011-01-01

    In the literature on Sen's capability approach, studies focusing on the empirical measurement of conversion factors are comparatively rare. We add to this field by adopting a measure of 'conversion efficiency' that captures the efficiency with which individuals convert their resources into achieved

  8. Predicting students’ grades using fuzzy non-parametric regression method and ReliefF-based algorithm

    Directory of Open Access Journals (Sweden)

    Javad Ghasemian

    In this paper we introduce two new approaches to predicting the grades that university students will obtain in the final exam of a course, and we improve the results obtained using features extracted from logged data in an educational web-based system. First w ...

  10. "Happiness in Life Domains: Evidence from Bangladesh Based on Parametric and Non-Parametric Models"

    OpenAIRE

    Minhaj Mahmud; Yasuyuki Sawada

    2015-01-01

    This paper applies a two-layer approach to explain overall happiness both as a function of happiness in different life domains and of conventional explanatory variables such as income, education and health. It then tests the happiness-income relationship in different happiness domains. Overall, the results suggest that income explains a large part of the variation in total happiness and that income is closely related to domain-specific happiness, even in non-economic domains. This is als...

  11. Non-parametric Bayesian mixture of sparse regressions with application towards feature selection for statistical downscaling

    Directory of Open Access Journals (Sweden)

    D. Das

    2014-04-01

    Climate projections simulated by Global Climate Models (GCMs) are often used for assessing the impacts of climate change. However, the relatively coarse resolution of GCM outputs often precludes their application to accurately assessing the effects of climate change on finer, regional-scale phenomena. Downscaling of climate variables from coarser to finer regional scales using statistical methods is often performed for regional climate projections. Statistical downscaling (SD) is based on the understanding that the regional climate is influenced by two factors - the large-scale climatic state and the regional or local features. A transfer-function approach to SD involves learning a regression model which relates these features (predictors) to a climatic variable of interest (predictand), based on past observations. However, often a single regression model is not sufficient to describe complex dynamic relationships between the predictors and the predictand. We focus on the covariate selection part of the transfer-function approach and propose a non-parametric Bayesian mixture of sparse regression models based on the Dirichlet Process (DP), for simultaneous clustering and discovery of covariates within the clusters while automatically finding the number of clusters. Sparse linear models are parsimonious and hence relatively more generalizable than non-sparse alternatives, and they lend themselves to domain-relevant interpretation. Applications to synthetic data demonstrate the value of the new approach, and preliminary results related to feature selection for statistical downscaling show that our method can lead to new insights.

  12. A new measure for gene expression biclustering based on non-parametric correlation.

    Science.gov (United States)

    Flores, Jose L; Inza, Iñaki; Larrañaga, Pedro; Calvo, Borja

    2013-12-01

    One of the emerging techniques for the analysis of DNA microarray data, known as biclustering, is the search for subsets of genes and conditions which are coherently expressed. These subgroups provide clues about the main biological processes. Until now, different approaches to this problem have been proposed. Most of them use the mean squared residue as a quality measure, but relevant and interesting patterns, such as shifting or scaling patterns, cannot be detected. Furthermore, recent papers show that there exist new coherence patterns involved in different kinds of cancer and tumors, such as inverse relationships between genes, which cannot be captured. The proposed measure is called the Spearman's biclustering measure (SBM), which estimates the quality of a bicluster based on the non-linear correlation among genes and conditions simultaneously. The search for biclusters is performed using an evolutionary technique called estimation of distribution algorithms, which uses the SBM measure as its fitness function. This approach has been examined from different points of view using artificial and real microarrays. The assessment process has involved the use of quality indexes, a set of reference bicluster patterns including new patterns, and a set of statistical tests. Performance has also been examined using real microarrays and compared with different algorithmic approaches such as Bimax, CC, OPSM, Plaid and xMotifs. SBM shows several advantages, such as the ability to recognize more complex coherence patterns (shifting, scaling and inversion) and the capability to selectively marginalize genes and conditions depending on statistical significance. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
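
    The core idea, scoring a bicluster by non-parametric correlation across both genes and conditions, can be sketched as follows; this is a simplified stand-in inspired by SBM (the smaller of the mean absolute Spearman correlations over row pairs and over column pairs), not the authors' exact definition.

```python
import numpy as np
from itertools import combinations
from scipy.stats import spearmanr

def spearman_bicluster_score(submatrix):
    """Smaller of the mean absolute Spearman correlations over all row pairs
    and all column pairs. Absolute values let inverted (negatively correlated)
    patterns score highly as well."""
    def mean_abs_rho(m):
        return np.mean([abs(spearmanr(m[i], m[j])[0])
                        for i, j in combinations(range(m.shape[0]), 2)])
    return min(mean_abs_rho(submatrix), mean_abs_rho(submatrix.T))

# A shifting-and-scaling pattern: rows are affine transforms of a base profile.
base = np.array([1.0, 3.0, 2.0, 5.0, 4.0])
bic = np.vstack([2 * base + 1, 0.5 * base - 3, base])
print(spearman_bicluster_score(bic))  # -> 1.0, a perfectly coherent bicluster
```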

  13. Non-parametric Least Squares Estimation of Distribution Function

    Institute of Scientific and Technical Information of China (English)

    柴根象; 花虹; 尚汉冀

    2002-01-01

    Using the non-parametric least squares method, strongly consistent estimators of the distribution function and the failure function are established, where the distribution function F(x), after a logit transformation, is assumed to be approximated by a polynomial. Simulations show that the estimators perform highly satisfactorily.
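
    The construction can be paraphrased in a few lines: logit-transform the empirical CDF, fit a polynomial by least squares, and map back. The sketch below uses a cubic on synthetic data; the paper's estimator and its consistency results are more careful about boundary behaviour.

```python
import numpy as np

rng = np.random.default_rng(9)
x = np.sort(rng.normal(loc=2.0, scale=1.5, size=500))

# Empirical CDF evaluated at the order statistics, kept away from 0 and 1
# so the logit transform stays finite.
n = x.size
F_emp = (np.arange(1, n + 1) - 0.5) / n
logit = np.log(F_emp / (1.0 - F_emp))

# Least squares fit of a cubic polynomial to the logit-transformed CDF.
coef = np.polyfit(x, logit, deg=3)

def F_hat(t):
    """Smoothed CDF estimate: inverse logit of the fitted polynomial."""
    return 1.0 / (1.0 + np.exp(-np.polyval(coef, t)))

print(F_hat(2.0))  # should be close to 0.5 for this symmetric example
```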

  14. The binned bispectrum estimator: template-based and non-parametric CMB non-Gaussianity searches

    CERN Document Server

    Bucher, Martin; van Tent, Bartjan

    2015-01-01

    We describe the details of the binned bispectrum estimator as used for the official 2013 and 2015 analyses of the temperature and polarization CMB maps from the ESA Planck satellite. The defining aspect of this estimator is the determination of a map bispectrum (3-point correlator) that has been binned in harmonic space. For a parametric determination of the non-Gaussianity in the map (the so-called fNL parameters), one takes the inner product of this binned bispectrum with theoretically motivated templates. However, as a complementary approach one can also smooth the binned bispectrum using a variable smoothing scale in order to suppress noise and make coherent features stand out above the noise. This allows one to look in a model-independent way for any statistically significant bispectral signal. This approach is useful for characterizing the bispectral shape of the galactic foreground emission, for which a theoretical prediction of the bispectral anisotropy is lacking, and for detecting a serendipitous pr...

  15. Modular autopilot design and development featuring Bayesian non-parametric adaptive control

    Science.gov (United States)

    Stockton, Jacob

    Over the last few decades, Unmanned Aircraft Systems, or UAS, have become a critical part of the defense of our nation and the growth of the aerospace sector. UAS have great potential for the agricultural industry, first response, and ecological monitoring. However, the wide range of applications requires many mission-specific vehicle platforms. These platforms must operate reliably in a range of environments and in the presence of significant uncertainties. The accepted practice for enabling autonomously flying UAS today relies on extensive manual tuning of the UAS autopilot parameters or time-consuming approximate modeling of the dynamics of the UAS. These methods may lead to overly conservative controllers or excessive development times. A comprehensive approach to the development of an adaptive, airframe-independent controller is presented. The control algorithm leverages a non-parametric, Bayesian approach to adaptation and is used as a cornerstone for the development of a new modular autopilot. Promising simulation results are presented for the adaptive controller, as well as flight test results for the modular autopilot.

  16. Non parametric denoising methods based on wavelets: Application to electron microscopy images in low exposure time

    Energy Technology Data Exchange (ETDEWEB)

    Soumia, Sid Ahmed, E-mail: samasoumia@hotmail.fr [Science and Technology Faculty, El Bachir El Ibrahimi University, BordjBouArreridj (Algeria); Messali, Zoubeida, E-mail: messalizoubeida@yahoo.fr [Laboratory of Electrical Engineering (LGE), University of M'sila (Algeria); Ouahabi, Abdeldjalil, E-mail: abdeldjalil.ouahabi@univ-tours.fr [Polytechnic School, University of Tours (EPU - PolytechTours), EPU - Energy and Electronics Department (France); Trepout, Sylvain, E-mail: sylvain.trepout@curie.fr; Messaoudi, Cedric, E-mail: cedric.messaoudi@curie.fr; Marco, Sergio, E-mail: sergio.marco@curie.fr [INSERM U759, University Campus Orsay, 91405 Orsay Cedex (France)

    2015-01-13

    The 3D reconstruction of Cryo-Transmission Electron Microscopy (Cryo-TEM) and Energy-Filtering TEM (EFTEM) images is hampered by the noisy nature of these images, which makes their alignment difficult. This noise arises from the interaction between the frozen hydrated biological samples and the electron beam when the specimen is exposed to radiation with a high exposure time. This sensitivity to the electron beam led specialists to acquire the specimen projection images at very low exposure time, which in turn gives rise to a new problem: an extremely low signal-to-noise ratio (SNR). This paper investigates the problem of denoising TEM images acquired at very low exposure time. Our main objective is to enhance the quality of TEM images in order to improve the alignment process, which will in turn improve the three-dimensional tomographic reconstructions. We ran multiple tests on TEM images acquired at different exposure times (0.5 s, 0.2 s, 0.1 s and 1 s, i.e. with different values of SNR), equipped with gold beads to help in the assessment step. We propose a structure to combine multiple noisy copies of the TEM images. The structure is based on four different denoising methods: Soft and Hard wavelet thresholding, the Bilateral Filter as a non-linear technique able to preserve edges neatly, and a Bayesian approach in the wavelet domain, in which context modeling is used to estimate the parameter for each coefficient. To ensure a high signal-to-noise ratio, we verified that we were using the appropriate wavelet family at the appropriate level, choosing the 'sym8' wavelet at level 3 as the most appropriate parameters. For the bilateral filtering, many tests were run in order to determine the proper filter parameters, represented by the size of the filter, the range parameter and the
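
    The wavelet-thresholding step can be sketched with PyWavelets; the 'sym8' family at level 3 follows the choice reported above, while the universal threshold with a MAD noise estimate is a common default rather than the authors' exact setting.

```python
import numpy as np
import pywt

def wavelet_denoise(img, wavelet="sym8", level=3, mode="soft"):
    """Denoise a 2-D image by thresholding its wavelet detail coefficients."""
    coeffs = pywt.wavedec2(img, wavelet, level=level)

    # Noise sigma from the median absolute deviation of the finest diagonal band.
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    thresh = sigma * np.sqrt(2.0 * np.log(img.size))  # universal threshold

    denoised = [coeffs[0]]  # keep the approximation band untouched
    for detail_bands in coeffs[1:]:
        denoised.append(tuple(pywt.threshold(band, thresh, mode=mode)
                              for band in detail_bands))
    return pywt.waverec2(denoised, wavelet)

noisy = 1.0 + np.random.default_rng(2).normal(0, 0.1, (256, 256))
clean = wavelet_denoise(noisy)
```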

  17. Zero- vs. one-dimensional, parametric vs. non-parametric, and confidence interval vs. hypothesis testing procedures in one-dimensional biomechanical trajectory analysis.

    Science.gov (United States)

    Pataky, Todd C; Vanrenterghem, Jos; Robinson, Mark A

    2015-05-01

    Biomechanical processes are often manifested as one-dimensional (1D) trajectories. It has been shown that 1D confidence intervals (CIs) are biased when based on 0D statistical procedures, and the non-parametric 1D bootstrap CI has emerged in the Biomechanics literature as a viable solution. The primary purpose of this paper was to clarify that, for 1D biomechanics datasets, the distinction between 0D and 1D methods is much more important than the distinction between parametric and non-parametric procedures. A secondary purpose was to demonstrate that a parametric equivalent to the 1D bootstrap exists in the form of a random field theory (RFT) correction for multiple comparisons. To emphasize these points we analyzed six datasets consisting of force and kinematic trajectories in one-sample, paired, two-sample and regression designs. Results showed, first, that the 1D bootstrap and other 1D non-parametric CIs were qualitatively identical to RFT CIs, and all were very different from 0D CIs. Second, 1D parametric and 1D non-parametric hypothesis testing results were qualitatively identical for all six datasets. Last, we highlight the limitations of 1D CIs by demonstrating that they are complex, design-dependent, and thus non-generalizable. These results suggest that (i) analyses of 1D data based on 0D models of randomness are generally biased unless one explicitly identifies 0D variables before the experiment, and (ii) parametric and non-parametric 1D hypothesis testing provide an unambiguous framework for analysis when one's hypothesis explicitly or implicitly pertains to whole 1D trajectories.
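
    The 0D/1D distinction can be made concrete: a simultaneous (1D) CI is built from the bootstrap distribution of the maximum deviation along the whole trajectory, rather than from pointwise (0D) quantiles. The sketch below shows one such construction on synthetic trajectories; it is not the specific procedure of this paper.

```python
import numpy as np

rng = np.random.default_rng(4)
n_subj, n_time = 20, 101
t = np.linspace(0, 1, n_time)
# Synthetic force-like trajectories: a common shape plus subject-level noise.
Y = np.sin(np.pi * t) + rng.normal(0, 0.2, (n_subj, n_time))

mean = Y.mean(axis=0)
n_boot = 2000
max_dev = np.empty(n_boot)
for b in range(n_boot):
    idx = rng.integers(0, n_subj, n_subj)        # resample subjects
    max_dev[b] = np.max(np.abs(Y[idx].mean(axis=0) - mean))

h = np.quantile(max_dev, 0.95)  # simultaneous half-width over the 1D domain
band = (mean - h, mean + h)     # covers the entire mean trajectory at ~95%
```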

  18. Out-of-Sample Extensions for Non-Parametric Kernel Methods.

    Science.gov (United States)

    Pan, Binbin; Chen, Wen-Sheng; Chen, Bo; Xu, Chen; Lai, Jianhuang

    2017-02-01

    Choosing suitable kernels plays an important role in the performance of kernel methods. Recently, a number of studies were devoted to developing nonparametric kernels. Without assuming any parametric form of the target kernel, nonparametric kernel learning offers a flexible scheme to utilize the information of the data, which may potentially characterize the data similarity better. The kernel methods using nonparametric kernels are referred to as nonparametric kernel methods. However, many nonparametric kernel methods are restricted to transductive learning, where the prediction function is defined only over the data points given beforehand. They have no straightforward extension for the out-of-sample data points, and thus cannot be applied to inductive learning. In this paper, we show how to make the nonparametric kernel methods applicable to inductive learning. The key problem of out-of-sample extension is how to extend the nonparametric kernel matrix to the corresponding kernel function. A regression approach in the hyper reproducing kernel Hilbert space is proposed to solve this problem. Empirical results indicate that the out-of-sample performance is comparable to the in-sample performance in most cases. Experiments on face recognition demonstrate the superiority of our nonparametric kernel method over the state-of-the-art parametric kernel methods.

  19. Alternative methods of marginal abatement cost estimation: Non- parametric distance functions

    Energy Technology Data Exchange (ETDEWEB)

    Boyd, G.; Molburg, J. [Argonne National Lab., IL (United States). Decision and Information Sciences Div.; Prince, R. [USDOE Office of Environmental Analysis, Washington, DC (United States)

    1996-12-31

    This project implements an economic methodology to measure the marginal abatement costs of pollution by measuring the lost revenue implied by an incremental reduction in pollution. It utilizes the observed performance, or 'best practice', of facilities to infer the marginal abatement cost. The initial stage of the project is to use data from an earlier published study on productivity trends and pollution in electric utilities to test this approach and to provide insights on its implementation for the cost-benefit analysis studies needed by the Department of Energy. The basis for this marginal abatement cost estimation is a relationship between the outputs and the inputs of a firm or plant. Given a fixed set of input resources, including quasi-fixed inputs like plant and equipment and variable inputs like labor and fuel, a firm is able to produce a mix of outputs. This paper uses this theoretical view of the joint production process to implement a methodology and obtain empirical estimates of marginal abatement costs. These estimates are compared to engineering estimates.

  20. Non-parametric reconstruction of an inflaton potential from Einstein–Cartan–Sciama–Kibble gravity with particle production

    Directory of Open Access Journals (Sweden)

    Shantanu Desai

    2016-04-01

    The coupling between spin and torsion in the Einstein–Cartan–Sciama–Kibble theory of gravity generates gravitational repulsion at very high densities, which prevents a singularity in a black hole and may create there a new universe. We show that quantum particle production in such a universe near the last bounce, which represents the Big Bang, gives the dynamics that solves the horizon, flatness, and homogeneity problems in cosmology. For a particular range of the particle production coefficient, we obtain a nearly constant Hubble parameter that gives an exponential expansion of the universe with more than 60 e-folds, which lasts about ∼10^−42 s. This scenario can thus explain cosmic inflation without requiring a fundamental scalar field and reheating. From the obtained time dependence of the scale factor, we follow the prescription of Ellis and Madsen to reconstruct in a non-parametric way a scalar field potential which gives the same dynamics of the early universe. This potential gives the slow-roll parameters of cosmic inflation, from which we calculate the tensor-to-scalar ratio, the scalar spectral index of density perturbations, and its running as functions of the production coefficient. We find that these quantities do not significantly depend on the scale factor at the Big Bounce. Our predictions for these quantities are consistent with the Planck 2015 observations.

  1. Non-parametric reconstruction of an inflaton potential from Einstein-Cartan-Sciama-Kibble gravity with particle production

    Science.gov (United States)

    Desai, Shantanu; Popławski, Nikodem J.

    2016-04-01

    The coupling between spin and torsion in the Einstein-Cartan-Sciama-Kibble theory of gravity generates gravitational repulsion at very high densities, which prevents a singularity in a black hole and may create there a new universe. We show that quantum particle production in such a universe near the last bounce, which represents the Big Bang, gives the dynamics that solves the horizon, flatness, and homogeneity problems in cosmology. For a particular range of the particle production coefficient, we obtain a nearly constant Hubble parameter that gives an exponential expansion of the universe with more than 60 e-folds, which lasts about ∼10^-42 s. This scenario can thus explain cosmic inflation without requiring a fundamental scalar field and reheating. From the obtained time dependence of the scale factor, we follow the prescription of Ellis and Madsen to reconstruct in a non-parametric way a scalar field potential which gives the same dynamics of the early universe. This potential gives the slow-roll parameters of cosmic inflation, from which we calculate the tensor-to-scalar ratio, the scalar spectral index of density perturbations, and its running as functions of the production coefficient. We find that these quantities do not significantly depend on the scale factor at the Big Bounce. Our predictions for these quantities are consistent with the Planck 2015 observations.

  2. Non-parametric reconstruction of an inflaton potential from Einstein-Cartan-Sciama-Kibble gravity with particle production

    CERN Document Server

    Desai, Shantanu

    2015-01-01

    The coupling between spin and torsion in the Einstein-Cartan-Sciama-Kibble theory of gravity generates gravitational repulsion at very high densities, which prevents a singularity in a black hole and may create there a new universe. We show that quantum particle production in such a universe near the last bounce, which represents the Big Bang gives the dynamics that solves the horizon, flatness, and homogeneity problems in cosmology. For a particular range of the particle production coefficient, we obtain a nearly constant Hubble parameter that gives an exponential expansion of the universe with more than 60 $e$-folds, which lasts about $\\sim 10^{-42}$ s. This scenario can thus explain cosmic inflation without requiring a fundamental scalar field and reheating. From the obtained time dependence of the scale factor, we follow the prescription of Ellis and Madsen to reconstruct in a non-parametric way a scalar field potential which gives the same dynamics of the early universe. This potential gives the slow-rol...

  3. Super-resolution non-parametric deconvolution in modelling the radial response function of a parallel plate ionization chamber.

    Science.gov (United States)

    Kulmala, A; Tenhunen, M

    2012-11-07

    The signal of a dosimetric detector generally depends on the shape and size of the sensitive volume of the detector. In order to optimize the performance of the detector and the reliability of the output signal, the effect of the detector size should be corrected or, at least, taken into account. The response of the detector can be modelled using the convolution theorem, which connects the system input (actual dose), output (measured result) and the effect of the detector (response function) by a linear convolution operator. We have developed a super-resolution, non-parametric deconvolution method for determining the radial response function of a cylindrically symmetric ionization chamber. We have demonstrated that the presented deconvolution method is able to determine the radial response of the Roos parallel-plate ionization chamber to better than 0.5 mm correspondence with the physical measures of the chamber. In addition, the performance of the method was demonstrated by the excellent agreement between the output factors of the stereotactic conical collimators (4-20 mm diameter) measured by the Roos chamber, where the detector size is larger than the measured field, and by the reference detector (diode). The presented deconvolution method has potential for providing reference data for more accurate physical models of the ionization chamber, as well as for improving and enhancing the performance of detectors in specific dosimetric problems.
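
    The underlying convolution model, and one simple way to invert it, can be sketched in a few lines; this is a generic Tikhonov-regularized FFT deconvolution on a synthetic 1-D profile, not the paper's super-resolution non-parametric scheme.

```python
import numpy as np

# Synthetic 'true' dose profile and a detector response of finite width.
x = np.linspace(-20, 20, 401)                       # position [mm]
true = np.where(np.abs(x) < 5, 1.0, 0.0)            # 10 mm wide field
resp = np.exp(-x**2 / (2 * 2.0**2))                 # Gaussian response, sigma = 2 mm
resp /= resp.sum()

measured = np.convolve(true, resp, mode="same")     # detector broadens the profile

# Regularized (Tikhonov-style) deconvolution in Fourier space.
H = np.fft.fft(np.fft.ifftshift(resp))              # center the kernel at index 0
Y = np.fft.fft(measured)
lam = 1e-3                                          # regularization strength
estimate = np.real(np.fft.ifft(np.conj(H) * Y / (np.abs(H)**2 + lam)))
```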

  4. A sharper view of Pal 5's tails: Discovery of stream perturbations with a novel non-parametric technique

    CERN Document Server

    Erkal, Denis; Belokurov, Vasily

    2016-01-01

    Only in the Milky Way is it possible to conduct an experiment which uses stellar streams to detect low-mass dark matter subhaloes. In smooth and static host potentials, tidal tails of disrupting satellites appear highly symmetric. However, dark perturbers induce density fluctuations that destroy this symmetry. Motivated by the recent release of unprecedentedly deep and wide imaging data around the Pal 5 stellar stream, we develop a new probabilistic, adaptive and non-parametric technique which allows us to bring the cluster's tidal tails into clear focus. Strikingly, we uncover a stream whose density exhibits visible changes on a variety of angular scales. We detect significant bumps and dips, both narrow and broad: two peaks on either side of the progenitor, each only a fraction of a degree across, and two gaps, $\\sim2^{\\circ}$ and $\\sim9^{\\circ}$ wide, the latter accompanied by a gargantuan lump of debris. This largest density feature results in a pronounced inter-tail asymmetry which cannot be made consist...

  5. The merger fraction of active and inactive galaxies in the local Universe through an improved non-parametric classification

    CERN Document Server

    Cotini, Stefano; Caccianiga, Alessandro; Colpi, Monica; Della Ceca, Roberto; Mapelli, Michela; Severgnini, Paola; Segreto, Alberto; 10.1093/mnras/stt358

    2013-01-01

    We investigate the possible link between mergers and the enhanced activity of supermassive black holes (SMBHs) at the centres of galaxies by comparing the merger fraction of a local sample (0.003 ≤ z < 0.03) of active galaxies - 59 active galactic nuclei (AGN) host galaxies selected from the all-sky Swift BAT (Burst Alert Telescope) survey - with an appropriate control sample (247 sources extracted from the Hyperleda catalogue) that has the same redshift distribution as the BAT sample. We detect the interacting systems in the two samples on the basis of the non-parametric structural indexes of concentration (C), asymmetry (A), clumpiness (S), the Gini coefficient (G) and the second-order moment of light (M20). In particular, we propose a new morphological criterion, based on a combination of all these indexes, that improves the identification of interacting systems. We also present a new software package - PyCASSo (Python CAS Software) - for the automatic computation of the structural indexes. After correcting for the c...

  6. A new powerful non-parametric two-stage approach for testing multiple phenotypes in family-based association studies

    NARCIS (Netherlands)

    Lange, C; Lyon, H; DeMeo, D; Raby, B; Silverman, EK; Weiss, ST

    2003-01-01

    We introduce a new powerful nonparametric testing strategy for family-based association studies in which multiple quantitative traits are recorded and the phenotype with the strongest genetic component is not known prior to the analysis. In the first stage, using a population-based test based on the

  7. An alternative approach to the ground motion prediction problem by a non-parametric adaptive regression method

    Science.gov (United States)

    Yerlikaya-Özkurt, Fatma; Askan, Aysegul; Weber, Gerhard-Wilhelm

    2014-12-01

    Ground Motion Prediction Equations (GMPEs) are empirical relationships which are used for determining the peak ground response at a particular distance from an earthquake source. They relate the peak ground responses as a function of earthquake source type, distance from the source, local site conditions where the data are recorded and finally the depth and magnitude of the earthquake. In this article, a new prediction algorithm, called Conic Multivariate Adaptive Regression Splines (CMARS), is employed on an available dataset for deriving a new GMPE. CMARS is based on a special continuous optimization technique, conic quadratic programming. These convex optimization problems are very well-structured, resembling linear programs and, hence, permitting the use of interior point methods. The CMARS method is performed on the strong ground motion database of Turkey. Results are compared with three other GMPEs. CMARS is found to be effective for ground motion prediction purposes.

  8. Non-parametric convolution based image-segmentation of ill-posed objects applying context window approach

    OpenAIRE

    Kumar, Upendra; Lahiri, Tapobrata; Pal, Manoj Kumar

    2012-01-01

    Context-dependence in the human cognition process is a well-established fact. Following this, we introduce an image segmentation method that can use context to classify a pixel on the basis of its membership of a particular object class of the concerned image. In the broad methodological steps, each pixel was defined by the context window (CW) surrounding it, the size of which was fixed heuristically. The CW texture, defined by the intensities of its pixels, was convolved with weights optimized throu...
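
    The context-window feature construction can be sketched directly with NumPy: pad the image, then describe each pixel by the flattened intensities of the window around it; the window half-width below is the heuristic size parameter mentioned above.

```python
import numpy as np

def context_windows(img, half_width=2):
    """Return an (H*W, (2*half_width+1)**2) matrix: one flattened context
    window per pixel, using edge padding at the image borders."""
    k = 2 * half_width + 1
    padded = np.pad(img, half_width, mode="edge")
    h, w = img.shape
    feats = np.empty((h * w, k * k))
    for i in range(h):
        for j in range(w):
            feats[i * w + j] = padded[i:i + k, j:j + k].ravel()
    return feats

img = np.random.default_rng(8).random((32, 32))
X = context_windows(img)   # per-pixel features for a downstream classifier
```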

  9. The signaling petri net-based simulator: a non-parametric strategy for characterizing the dynamics of cell-specific signaling networks.

    Directory of Open Access Journals (Sweden)

    Derek Ruths

    2008-02-01

    Reconstructing cellular signaling networks and understanding how they work are major endeavors in cell biology. The scale and complexity of these networks, however, render their analysis using experimental biology approaches alone very challenging. As a result, computational methods have been developed and combined with experimental biology approaches, producing powerful tools for the analysis of these networks. These computational methods mostly fall on either end of a spectrum of model parameterization. On one end is a class of structural network analysis methods; these typically use the network connectivity alone to generate hypotheses about global properties. On the other end is a class of dynamic network analysis methods; these use, in addition to the connectivity, kinetic parameters of the biochemical reactions to predict the network's dynamic behavior. These predictions provide detailed insights into the properties that determine aspects of the network's structure and behavior. However, the difficulty of obtaining numerical values of kinetic parameters is widely recognized to limit the applicability of this latter class of methods. Several researchers have observed that the connectivity of a network alone can provide significant insights into its dynamics. Motivated by this fundamental observation, we present the signaling Petri net, a non-parametric model of cellular signaling networks, and the signaling Petri net-based simulator, a Petri net execution strategy for characterizing the dynamics of signal flow through a signaling network using token distribution and sampling. The result is a very fast method which can analyze large-scale networks and provide insights into the trends of molecules' activity levels in response to an external stimulus, based solely on the network's connectivity. We have implemented the signaling Petri net-based simulator in the PathwayOracle toolkit, which is publicly available at http

  10. A sharper view of Pal 5's tails: discovery of stream perturbations with a novel non-parametric technique

    Science.gov (United States)

    Erkal, Denis; Koposov, Sergey E.; Belokurov, Vasily

    2017-09-01

    Only in the Milky Way is it possible to conduct an experiment that uses stellar streams to detect low-mass dark matter subhaloes. In smooth and static host potentials, tidal tails of disrupting satellites appear highly symmetric. However, perturbations from dark subhaloes, as well as from GMCs and the Milky Way bar, can induce density fluctuations that destroy this symmetry. Motivated by the recent release of unprecedentedly deep and wide imaging data around the Pal 5 stellar stream, we develop a new probabilistic, adaptive and non-parametric technique that allows us to bring the cluster's tidal tails into clear focus. Strikingly, we uncover a stream whose density exhibits visible changes on a variety of angular scales. We detect significant bumps and dips, both narrow and broad: two peaks on either side of the progenitor, each only a fraction of a degree across, and two gaps, ∼2° and ∼9° wide, the latter accompanied by a gargantuan lump of debris. This largest density feature results in a pronounced intertail asymmetry which cannot be made consistent with an unperturbed stream according to a suite of simulations we have produced. We conjecture that the sharp peaks around Pal 5 are epicyclic overdensities, while the two dips are consistent with impacts by subhaloes. Assuming an age of 3.4 Gyr for Pal 5, these two gaps would correspond to the characteristic size of gaps created by subhaloes in the mass range of 10^6-10^7 M⊙ and 10^7-10^8 M⊙, respectively. In addition to dark substructure, we find that the bar of the Milky Way can plausibly produce the asymmetric density seen in Pal 5 and that GMCs could cause the smaller gap.

  11. Transit Timing Observations from Kepler: II. Confirmation of Two Multiplanet Systems via a Non-parametric Correlation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ford, Eric B.; /Florida U.; Fabrycky, Daniel C.; /Lick Observ.; Steffen, Jason H.; /Fermilab; Carter, Joshua A.; /Harvard-Smithsonian Ctr. Astrophys.; Fressin, Francois; /Harvard-Smithsonian Ctr. Astrophys.; Holman, Matthew J.; /Harvard-Smithsonian Ctr. Astrophys.; Lissauer, Jack J.; /NASA, Ames; Moorhead, Althea V.; /Florida U.; Morehead, Robert C.; /Florida U.; Ragozzine, Darin; /Harvard-Smithsonian Ctr. Astrophys.; Rowe, Jason F.; /NASA, Ames /SETI Inst., Mtn. View /San Diego State U., Astron. Dept.

    2012-01-01

    We present a new method for confirming transiting planets based on the combination of transit timing variations (TTVs) and dynamical stability. Correlated TTVs provide evidence that the pair of bodies are in the same physical system. Orbital stability provides upper limits for the masses of the transiting companions that are in the planetary regime. This paper describes a non-parametric technique for quantifying the statistical significance of TTVs based on the correlation of two TTV data sets. We apply this method to an analysis of the transit timing variations of two stars with multiple transiting planet candidates identified by Kepler. We confirm four transiting planets in two multiple planet systems based on their TTVs and the constraints imposed by dynamical stability. An additional three candidates in these same systems are not confirmed as planets, but are likely to be validated as real planets once further observations and analyses are possible. If all were confirmed, these systems would be near 4:6:9 and 2:4:6:9 period commensurabilities. Our results demonstrate that TTVs provide a powerful tool for confirming transiting planets, including low-mass planets and planets around faint stars for which Doppler follow-up is not practical with existing facilities. Continued Kepler observations will dramatically improve the constraints on the planet masses and orbits and provide sensitivity for detecting additional non-transiting planets. If Kepler observations were extended to eight years, then a similar analysis could likely confirm systems with multiple closely spaced, small transiting planets in or near the habitable zone of solar-type stars.
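
    The significance assessment can be illustrated with a short Python sketch that builds a permutation null for the rank correlation of two synthetic TTV series; Spearman's rho is used here as a stand-in for the paper's correlation statistic, and all data are invented.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)

        # Hypothetical TTV series (minutes) for two candidates at matched epochs;
        # anti-correlated by construction, as expected for interacting planets.
        epochs = np.linspace(0, 4 * np.pi, 30)
        ttv_a = 2.0 * np.sin(epochs) + 0.5 * rng.normal(size=30)
        ttv_b = -2.0 * np.sin(epochs) + 0.5 * rng.normal(size=30)

        obs = abs(stats.spearmanr(ttv_a, ttv_b)[0])

        # Null distribution by permutation: shuffling one series destroys any
        # physical association while preserving its marginal distribution.
        n_perm = 10000
        null = np.array([abs(stats.spearmanr(ttv_a, rng.permutation(ttv_b))[0])
                         for _ in range(n_perm)])
        p = (1 + np.sum(null >= obs)) / (n_perm + 1)
        print(f"|rho| = {obs:.2f}, permutation p = {p:.4f}")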

  12. Dependence between fusion temperatures and chemical components of a certain type of coal using classical, non-parametric and bootstrap techniques

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez-Manteiga, W.; Prada-Sanchez, J.M.; Fiestras-Janeiro, M.G.; Garcia-Jurado, I. (Universidad de Santiago de Compostela, Santiago de Compostela (Spain). Dept. de Estadistica e Investigacion Operativa)

    1990-11-01

    A statistical study of the dependence between various critical fusion temperatures of a certain kind of coal and its chemical components is carried out. As well as using classical dependence techniques (multiple, stepwise and PLS regression, principal components, canonical correlation, etc.) together with the corresponding inference on the parameters of interest, non-parametric regression and bootstrap inference are also performed. 11 refs., 3 figs., 8 tabs.
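
    As a minimal illustration of the bootstrap component, the Python sketch below resamples (x, y) cases to obtain a confidence interval for a regression slope; the data are invented stand-ins for one fusion temperature and one chemical component, not the coal data of the study.

        import numpy as np

        rng = np.random.default_rng(2)

        # Invented stand-ins: one chemical component (x, weight %) and one
        # critical fusion temperature (y, deg C).
        x = rng.uniform(0, 10, 40)
        y = 1200.0 + 15.0 * x + rng.normal(0, 20, 40)

        def slope(xs, ys):
            return np.polyfit(xs, ys, 1)[0]

        # Case-resampling bootstrap: refit the regression on resampled pairs.
        n_boot = 5000
        idx = rng.integers(0, len(x), size=(n_boot, len(x)))
        boot = np.array([slope(x[i], y[i]) for i in idx])
        lo, hi = np.percentile(boot, [2.5, 97.5])
        print(f"slope = {slope(x, y):.1f}, 95% bootstrap CI = ({lo:.1f}, {hi:.1f})")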

  13. Mokken scale analysis of mental health and well-being questionnaire item responses: a non-parametric IRT method in empirical research for applied health researchers

    Directory of Open Access Journals (Sweden)

    Stochl Jan

    2012-06-01

    Background Mokken scaling techniques are a useful tool for researchers who wish to construct unidimensional tests or use questionnaires that comprise multiple binary or polytomous items. The stochastic cumulative scaling model offered by this approach is ideally suited when the intention is to score an underlying latent trait by simple addition of the item response values. In our experience, the Mokken model appears to be less well-known than for example the (related) Rasch model, but is seeing increasing use in contemporary clinical research and public health. Mokken's method is a generalisation of Guttman scaling that can assist in the determination of the dimensionality of tests or scales, and enables consideration of reliability, without reliance on Cronbach's alpha. This paper provides a practical guide to the application and interpretation of this non-parametric item response theory method in empirical research with health and well-being questionnaires. Methods Scalability of data from 1) a cross-sectional health survey (the Scottish Health Education Population Survey) and 2) a general population birth cohort study (the National Child Development Study) illustrate the method and modeling steps for dichotomous and polytomous items respectively. The questionnaire data analyzed comprise responses to the 12 item General Health Questionnaire, under the binary recoding recommended for screening applications, and the ordinal/polytomous responses to the Warwick-Edinburgh Mental Well-being Scale. Results and conclusions After an initial analysis example in which we select items by phrasing (six positive versus six negatively worded items) we show that all items from the 12-item General Health Questionnaire (GHQ-12) – when binary scored – were scalable according to the double monotonicity model, in two short scales comprising six items each (Bech's "well-being" and "distress" clinical scales). An illustration of ordinal item analysis

  14. Mokken scale analysis of mental health and well-being questionnaire item responses: a non-parametric IRT method in empirical research for applied health researchers.

    Science.gov (United States)

    Stochl, Jan; Jones, Peter B; Croudace, Tim J

    2012-06-11

    Mokken scaling techniques are a useful tool for researchers who wish to construct unidimensional tests or use questionnaires that comprise multiple binary or polytomous items. The stochastic cumulative scaling model offered by this approach is ideally suited when the intention is to score an underlying latent trait by simple addition of the item response values. In our experience, the Mokken model appears to be less well-known than for example the (related) Rasch model, but is seeing increasing use in contemporary clinical research and public health. Mokken's method is a generalisation of Guttman scaling that can assist in the determination of the dimensionality of tests or scales, and enables consideration of reliability, without reliance on Cronbach's alpha. This paper provides a practical guide to the application and interpretation of this non-parametric item response theory method in empirical research with health and well-being questionnaires. Scalability of data from 1) a cross-sectional health survey (the Scottish Health Education Population Survey) and 2) a general population birth cohort study (the National Child Development Study) illustrate the method and modeling steps for dichotomous and polytomous items respectively. The questionnaire data analyzed comprise responses to the 12 item General Health Questionnaire, under the binary recoding recommended for screening applications, and the ordinal/polytomous responses to the Warwick-Edinburgh Mental Well-being Scale. After an initial analysis example in which we select items by phrasing (six positive versus six negatively worded items) we show that all items from the 12-item General Health Questionnaire (GHQ-12)--when binary scored--were scalable according to the double monotonicity model, in two short scales comprising six items each (Bech's "well-being" and "distress" clinical scales). An illustration of ordinal item analysis confirmed that all 14 positively worded items of the Warwick-Edinburgh Mental

  15. The relationship between multilevel models and non-parametric multilevel mixture models: Discrete approximation of intraclass correlation, random coefficient distributions, and residual heteroscedasticity.

    Science.gov (United States)

    Rights, Jason D; Sterba, Sonya K

    2016-11-01

    Multilevel data structures are common in the social sciences. Often, such nested data are analysed with multilevel models (MLMs) in which heterogeneity between clusters is modelled by continuously distributed random intercepts and/or slopes. Alternatively, the non-parametric multilevel regression mixture model (NPMM) can accommodate the same nested data structures through discrete latent class variation. The purpose of this article is to delineate analytic relationships between NPMM and MLM parameters that are useful for understanding the indirect interpretation of the NPMM as a non-parametric approximation of the MLM, with relaxed distributional assumptions. We define how seven standard and non-standard MLM specifications can be indirectly approximated by particular NPMM specifications. We provide formulas showing how the NPMM can serve as an approximation of the MLM in terms of intraclass correlation, random coefficient means and (co)variances, heteroscedasticity of residuals at level 1, and heteroscedasticity of residuals at level 2. Further, we discuss how these relationships can be useful in practice. The specific relationships are illustrated with simulated graphical demonstrations, and direct and indirect interpretations of NPMM classes are contrasted. We provide an R function to aid in implementing and visualizing an indirect interpretation of NPMM classes. An empirical example is presented and future directions are discussed. © 2016 The British Psychological Society.

  16. Non-parametric methods – Tree and P-CFA – for the ecological evaluation and assessment of suitable aquatic habitats: A contribution to fish psychology

    Directory of Open Access Journals (Sweden)

    Andreas H. Melcher

    2012-09-01

    This study analyses the multidimensional spawning habitat suitability of the fish species "Nase" (Latin: Chondrostoma nasus). This is the first time non-parametric methods were used to better understand biotic habitat use in theory and practice. In particular, we tested (1) the Decision Tree technique, Chi-squared Automatic Interaction Detectors (CHAID), to identify specific habitat types and (2) Prediction-Configural Frequency Analysis (P-CFA) to test for statistical significance. The combination of both non-parametric methods, CHAID and P-CFA, enabled the identification, prediction and interpretation of the most typical significant spawning habitats, and we were also able to determine non-typical habitat types, e.g., types in contrast to antitypes. The gradual combination of these two methods underlined three significant habitat types: shaded habitat, and fine and coarse substrate habitat depending on high flow velocity. The study affirmed the importance for fish species of shading and riparian vegetation along river banks. In addition, this method provides a weighting of interactions between specific habitat characteristics. The results demonstrate that efficient river restoration requires re-establishing riparian vegetation as well as the open river continuum and hydro-morphological improvements to habitats.

  17. A non-parametric method for automatic determination of P-wave and S-wave arrival times: application to local micro earthquakes

    Science.gov (United States)

    Rawles, Christopher; Thurber, Clifford

    2015-08-01

    We present a simple, fast, and robust method for automatic detection of P- and S-wave arrivals using a nearest neighbours-based approach. The nearest neighbour algorithm is one of the most popular time-series classification methods in the data mining community and has been applied to time-series problems in many different domains. Specifically, our method is based on the non-parametric time-series classification method developed by Nikolov. Instead of building a model by estimating parameters from the data, the method uses the data itself to define the model. Potential phase arrivals are identified based on their similarity to a set of reference data consisting of positive and negative sets, where the positive set contains examples of analyst identified P- or S-wave onsets and the negative set contains examples that do not contain P waves or S waves. Similarity is defined as the square of the Euclidean distance between vectors representing the scaled absolute values of the amplitudes of the observed signal and a given reference example in time windows of the same length. For both P waves and S waves, a single pass is done through the bandpassed data, producing a score function defined as the ratio of the sum of similarity to positive examples over the sum of similarity to negative examples for each window. A phase arrival is chosen as the centre position of the window that maximizes the score function. The method is tested on two local earthquake data sets, consisting of 98 known events from the Parkfield region in central California and 32 known events from the Alpine Fault region on the South Island of New Zealand. For P-wave picks, using a reference set containing two picks from the Parkfield data set, 98 per cent of Parkfield and 94 per cent of Alpine Fault picks are determined within 0.1 s of the analyst pick. For S-wave picks, 94 per cent and 91 per cent of picks are determined within 0.2 s of the analyst picks for the Parkfield and Alpine Fault data set
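
    A minimal Python sketch of the windowed scoring idea follows; the synthetic trace and reference windows are assumptions, and similarity is implemented as an inverse squared Euclidean distance so that larger values mean greater resemblance (the paper defines similarity through the squared distance itself).

        import numpy as np

        def features(window):
            a = np.abs(window)
            return a / (a.max() + 1e-12)       # scaled absolute amplitudes

        def score_trace(trace, positives, negatives, width):
            """Ratio of summed similarity to positive vs negative examples."""
            scores = np.zeros(len(trace) - width)
            for k in range(len(scores)):
                f = features(trace[k:k + width])
                sim_p = sum(1.0 / (np.sum((f - features(p)) ** 2) + 1e-6) for p in positives)
                sim_n = sum(1.0 / (np.sum((f - features(n)) ** 2) + 1e-6) for n in negatives)
                scores[k] = sim_p / sim_n
            return scores

        rng = np.random.default_rng(3)
        trace = rng.normal(0, 0.1, 400)
        # Synthetic arrival: a decaying oscillation starting at sample 200.
        trace[200:] += np.sin(np.linspace(0, 20, 200)) * np.exp(-np.linspace(0, 4, 200))

        width = 40
        positives = [trace[180:180 + width]]                   # window spanning the onset
        negatives = [trace[20:20 + width], trace[100:100 + width]]  # noise-only windows
        pick = np.argmax(score_trace(trace, positives, negatives, width)) + width // 2
        print("picked sample:", pick)                          # near the true onset at 200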

  18. Water quality analysis in rivers with non-parametric probability distributions and fuzzy inference systems: application to the Cauca River, Colombia.

    Science.gov (United States)

    Ocampo-Duque, William; Osorio, Carolina; Piamba, Christian; Schuhmacher, Marta; Domingo, José L

    2013-02-01

    The integration of water quality monitoring variables is essential in environmental decision making. Nowadays, advanced techniques to manage subjectivity, imprecision, uncertainty, vagueness, and variability are required in such a complex evaluation process. We here propose a probabilistic fuzzy hybrid model to assess river water quality. Fuzzy logic reasoning has been used to compute a water quality integrative index. By applying a Monte Carlo technique, based on non-parametric probability distributions, the randomness of the model inputs was estimated. Annual histograms of nine water quality variables were built with monitoring data systematically collected in the Colombian Cauca River, and probability density estimations using the kernel smoothing method were applied to fit the data. Several years were assessed, and river sectors upstream and downstream of the city of Santiago de Cali, a big city with basic wastewater treatment and high industrial activity, were analyzed. The probabilistic fuzzy water quality index was able to explain the reduction in water quality as the river receives a larger number of agricultural, domestic, and industrial effluents. The results of the hybrid model were compared to traditional water quality indexes. The main advantage of the proposed method is that it considers flexible boundaries between the linguistic qualifiers used to define the water status, with the membership of water quality in the various output fuzzy sets (classes) expressed through percentiles and histograms, which allows a better classification of the real water condition. The results of this study show that fuzzy inference systems integrated with stochastic non-parametric techniques may be used as complementary tools in water quality indexing methodologies.
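
    The kernel-smoothing and Monte Carlo ingredients can be sketched in a few lines of Python with scipy's gaussian_kde; the dissolved-oxygen series below is invented, and the fuzzy inference step itself is omitted.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(4)

        # One year of hypothetical dissolved-oxygen readings (mg/L); in the study,
        # each of nine monitored variables receives its own kernel density fit.
        do_obs = np.clip(rng.normal(6.5, 1.2, 365), 0.0, None)

        kde = stats.gaussian_kde(do_obs)     # non-parametric density estimate
        samples = kde.resample(10000)[0]     # Monte Carlo draws for the model

        # The draws would feed the fuzzy inference step; here we just summarise.
        p5, p50, p95 = np.percentile(samples, [5, 50, 95])
        print(f"median = {p50:.2f} mg/L, 5th-95th percentile = {p5:.2f}-{p95:.2f}")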

  19. Parametric and non-parametric masking of randomness in sequence alignments can be improved and leads to better resolved trees

    Directory of Open Access Journals (Sweden)

    von Reumont Björn M

    2010-03-01

    Background Methods of alignment masking, which refers to the technique of excluding alignment blocks prior to tree reconstructions, have been successful in improving the signal-to-noise ratio in sequence alignments. However, the lack of formally well defined methods to identify randomness in sequence alignments has prevented a routine application of alignment masking. In this study, we compared the effects on tree reconstructions of the most commonly used profiling method (GBLOCKS), which uses a predefined set of rules in combination with alignment masking, with a new profiling approach (ALISCORE), based on Monte Carlo resampling within a sliding window, using different data sets and alignment methods. While the GBLOCKS approach excludes variable sections above a certain threshold, whose choice is left arbitrary, the ALISCORE algorithm is free of a priori rating of parameter space and therefore more objective. Results ALISCORE was successfully extended to amino acids using a proportional model and empirical substitution matrices to score randomness in multiple sequence alignments. A complex bootstrap resampling leads to an even distribution of scores of randomly similar sequences to assess randomness of the observed sequence similarity. Testing performance on real data, both masking methods, GBLOCKS and ALISCORE, helped to improve tree resolution. The sliding window approach was less sensitive to different alignments of identical data sets and performed equally well on all data sets. Concurrently, ALISCORE is capable of dealing with different substitution patterns and heterogeneous base composition. ALISCORE and the most relaxed GBLOCKS gap parameter setting performed best on all data sets. Correspondingly, Neighbor-Net analyses showed the most decrease in conflict. Conclusions Alignment masking improves signal-to-noise ratio in multiple sequence alignments prior to phylogenetic reconstruction. Given the robust performance of alignment

  20. Identifiability and inference of non-parametric rates-across-sites models on large-scale phylogenies

    CERN Document Server

    Mossel, Elchanan

    2011-01-01

    Mutation rate variation across loci is well known to cause difficulties, notably identifiability issues, in the reconstruction of evolutionary trees from molecular sequences. Here we introduce a new approach for estimating general rates-across-sites models. Our results imply, in particular, that large phylogenies are typically identifiable under rate variation. We also derive sequence-length requirements for high-probability reconstruction. Our main contribution is a novel algorithm that clusters sites according to their mutation rate. Following this site clustering step, standard reconstruction techniques can be used to recover the phylogeny. Our results rely on a basic insight: that, for large trees, certain site statistics experience concentration-of-measure phenomena.

  1. Non-parametric Method Used in DNB Propagation Analysis

    Institute of Scientific and Technical Information of China (English)

    刘俊强; 黄禹

    2014-01-01

    Determining the probability distribution of the fuel rod internal pressure is a fundamental task in DNB (departure from nucleate boiling) propagation analysis with the Monte Carlo method. The traditional parametric approach assumes that the internal pressure of all rods follows a normal distribution, but this is not always the case; the real distribution is sometimes far from normal. To overcome this shortcoming, a non-parametric method was used to treat the rod internal pressure data, owing to its general applicability and its good precision for large samples. Applying it to internal pressure data from pressurized-water-reactor fuel rods yields the pressure probability distribution used in the DNB propagation analysis, and the results show that the non-parametric method is more conservative, and thus safer, than the parametric method.

  2. Non-parametric asymptotic statistics for the Palm mark distribution of \\beta-mixing marked point processes

    CERN Document Server

    Heinrich, Lothar; Schmidt, Volker

    2012-01-01

    We consider spatially homogeneous marked point patterns in an unboundedly expanding convex sampling window. Our main objective is to identify the distribution of the typical mark by constructing an asymptotic \\chi^2-goodness-of-fit test. The corresponding test statistic is based on a natural empirical version of the Palm mark distribution and a smoothed covariance estimator which turns out to be mean-square consistent. Our approach does not require independent marks and allows dependences between the mark field and the point pattern. Instead we impose a suitable \\beta-mixing condition on the underlying stationary marked point process which can be checked for a number of Poisson-based models and, in particular, in the case of geostatistical marking. Our method needs a central limit theorem for \\beta-mixing random fields which is proved by extending Bernstein's blocking technique to non-cubic index sets and seems to be of interest in its own right. By large-scale model-based simulations the performance of our t...

  3. SOPIE: an R package for the non-parametric estimation of the off-pulse interval of a pulsar light curve

    Science.gov (United States)

    Schutte, Willem D.; Swanepoel, Jan W. H.

    2016-09-01

    An automated tool to derive the off-pulse interval of a light curve originating from a pulsar is needed. First, we derive a powerful and accurate non-parametric sequential estimation technique to estimate the off-pulse interval of a pulsar light curve in an objective manner. This is in contrast to the subjective `eye-ball' (visual) technique, and complementary to the Bayesian Block method which is currently used in the literature. The second aim involves the development of a statistical package, necessary for the implementation of our new estimation technique. We develop a statistical procedure to estimate the off-pulse interval in the presence of noise. It is based on a sequential application of p-values obtained from goodness-of-fit tests for uniformity. The Kolmogorov-Smirnov, Cramér-von Mises, Anderson-Darling and Rayleigh test statistics are applied. The details of the newly developed statistical package SOPIE (Sequential Off-Pulse Interval Estimation) are discussed. The developed estimation procedure is applied to simulated and real pulsar data. Finally, the SOPIE estimated off-pulse intervals of two pulsars are compared to the estimates obtained with the Bayesian Block method and yield very satisfactory results. We provide the code to implement the SOPIE package, which is publicly available at http://CRAN.R-project.org/package=SOPIE (Schutte).
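
    A highly simplified Python stand-in for the sequential procedure (not the SOPIE code itself, which is an R package) scans candidate intervals and keeps the widest one whose rescaled phases pass a Kolmogorov-Smirnov goodness-of-fit test for uniformity; the pulsar phases are synthetic.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(6)

        # Synthetic phases: a pulse concentrated near phase 0.2 on top of a
        # uniform background; the off-pulse region is roughly [0.4, 1.0).
        phases = np.concatenate([rng.uniform(0, 1, 800),
                                 rng.normal(0.2, 0.03, 300) % 1.0])

        best = None
        grid = np.linspace(0, 1, 51)
        for a in grid:
            for b in grid:
                if b - a < 0.2:
                    continue
                sel = phases[(phases >= a) & (phases < b)]
                if len(sel) < 50:
                    continue
                # Rescale the interval to [0, 1) and test for uniformity.
                p = stats.kstest((sel - a) / (b - a), "uniform").pvalue
                if p > 0.1 and (best is None or b - a > best[1] - best[0]):
                    best = (a, b)
        print("estimated off-pulse interval:", best)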

  4. Non-parametric deprojection of NIKA SZ observations: pressure distribution in the Planck-discovered cluster PSZ1 G045.85+57.71

    CERN Document Server

    Ruppin, F; Comis, B; Ade, P; André, P; Arnaud, M; Beelen, A; Benoît, A; Bideaud, A; Billot, N; Bourrion, O; Calvo, M; Catalano, A; Coiffard, G; D'Addabbo, A; De Petris, M; Désert, F -X; Doyle, S; Goupy, J; Kramer, C; Leclercq, S; Macías-Pérez, J F; Mauskopf, P; Mayet, F; Monfardini, A; Pajot, F; Pascale, E; Perotto, L; Pisano, G; Pointecouteau, E; Ponthieu, N; Pratt, G W; Revéret, V; Ritacco, A; Rodriguez, L; Romero, C; Schuster, K; Sievers, A; Triqueneaux, S; Tucker, C; Zylka, R

    2016-01-01

    The determination of the thermodynamic properties of clusters of galaxies at intermediate and high redshift can bring new insights into the formation of large scale structures. It is essential for a robust calibration of the mass-observable scaling relations and their scatter, which are key ingredients for precise cosmology using cluster statistics. Here we illustrate an application of high-resolution $(< 20$ arcsec) thermal Sunyaev-Zel'dovich (tSZ) observations by probing the intracluster medium (ICM) of the Planck-discovered galaxy cluster PSZ1 G045.85+57.71 at redshift $z = 0.61$, using tSZ data obtained with the NIKA camera, a dual-band (150 and 260~GHz) instrument operated at the IRAM 30-meter telescope. We deproject jointly NIKA and Planck data to extract the electronic pressure distribution non-parametrically from the cluster core ($R \\sim 0.02\\, R_{500}$) to its outskirts ($R \\sim 3\\, R_{500}$), for the first time at intermediate redshift. The constraints on the resulting pressure profile allow us ...

  5. Determination of drug absorption rate in time-variant disposition by direct deconvolution using beta clearance correction and end-constrained non-parametric regression.

    Science.gov (United States)

    Neelakantan, S; Veng-Pedersen, P

    2005-11-01

    A novel numerical deconvolution method is presented that enables the estimation of drug absorption rates under time-variant disposition conditions. The method involves two components. (1) A disposition decomposition-recomposition (DDR) enabling exact changes in the unit impulse response (UIR) to be constructed based on centrally based clearance changes iteratively determined. (2) A non-parametric, end-constrained cubic spline (ECS) input response function estimated by cross-validation. The proposed DDR-ECS method compensates for disposition changes between the test and the reference administrations by using a "beta" clearance correction based on DDR analysis. The representation of the input response by the ECS method takes into consideration the complex absorption process and also ensures physiologically realistic approximations of the response. The stability of the new method to noisy data was evaluated by comprehensive simulations that considered different UIRs, various input functions, clearance changes and a novel scaling of the input function that includes the "flip-flop" absorption phenomena. The simulated input response was also analysed by two other methods and all three methods were compared for their relative performances. The DDR-ECS method provides better estimation of the input profile under significant clearance changes but tends to overestimate the input when there were only small changes in the clearance.

  6. Non-parametric study of the evolution of the cosmological equation of state with SNeIa, BAO and high redshift GRBs

    CERN Document Server

    Postnikov, Sergey; Hernandez, Xavier; Capozziello, Salvatore

    2014-01-01

    We study the dark energy equation of state as a function of redshift in a non-parametric way, without imposing any a priori $w(z)$ (ratio of pressure over energy density) functional form. As a check of the method, we test our scheme through the use of synthetic data sets produced from different input cosmological models which have the same relative errors and redshift distribution as the real data. Using the luminosity-time $L_{X}-T_{a}$ correlation for GRB X-ray afterglows (the Dainotti et al. correlation), we are able to utilize the GRB sample from the Swift satellite as probes of the expansion history of the Universe out to $z \approx 10$. Within the assumption of a flat FLRW universe and combining SNeIa data with BAO constraints, the resulting maximum likelihood solutions are close to a constant $w=-1$. If one imposes the restriction of a constant $w$, we obtain $w=-0.99 \pm 0.06$ (consistent with a cosmological constant) with the present day Hubble constant as $H_{0}=70.0 \pm 0.6$ km ...

  7. A Critical Look at the Mass-Metallicity-SFR Relation in the Local Universe: Non-parametric Analysis Framework and Confounding Systematics

    CERN Document Server

    Salim, Samir; Ly, Chun; Brinchmann, Jarle; Davé, Romeel; Dickinson, Mark; Salzer, John J; Charlot, Stéphane

    2014-01-01

    It has been proposed that the mass-metallicity relation of galaxies exhibits a secondary dependence on star formation rate (SFR), and that the resulting M-Z-SFR relation may be redshift-invariant, i.e., "fundamental." However, conflicting results on the character of the SFR dependence, and whether it exists, have been reported. To gain insight into the origins of the conflicting results, we (a) devise a non-parametric, astrophysically-motivated analysis framework based on the offset from the star-forming ("main") sequence at a given stellar mass (relative specific SFR), (b) apply this methodology and perform a comprehensive re-analysis of the local M-Z-SFR relation, based on SDSS, GALEX, and WISE data, and (c) study the impact of sample selection, and of using different metallicity and SFR indicators. We show that metallicity is anti-correlated with specific SFR regardless of the indicators used. We do not find that the relation is spurious due to correlations arising from biased metallicity measurements, or ...

  8. Non-parametric deprojection of NIKA SZ observations: Pressure distribution in the Planck-discovered cluster PSZ1 G045.85+57.71

    Science.gov (United States)

    Ruppin, F.; Adam, R.; Comis, B.; Ade, P.; André, P.; Arnaud, M.; Beelen, A.; Benoît, A.; Bideaud, A.; Billot, N.; Bourrion, O.; Calvo, M.; Catalano, A.; Coiffard, G.; D'Addabbo, A.; De Petris, M.; Désert, F.-X.; Doyle, S.; Goupy, J.; Kramer, C.; Leclercq, S.; Macías-Pérez, J. F.; Mauskopf, P.; Mayet, F.; Monfardini, A.; Pajot, F.; Pascale, E.; Perotto, L.; Pisano, G.; Pointecouteau, E.; Ponthieu, N.; Pratt, G. W.; Revéret, V.; Ritacco, A.; Rodriguez, L.; Romero, C.; Schuster, K.; Sievers, A.; Triqueneaux, S.; Tucker, C.; Zylka, R.

    2017-01-01

    The determination of the thermodynamic properties of clusters of galaxies at intermediate and high redshift can bring new insights into the formation of large-scale structures. It is essential for a robust calibration of the mass-observable scaling relations and their scatter, which are key ingredients for precise cosmology using cluster statistics. Here we illustrate an application of high-resolution (<20 arcsec) thermal Sunyaev-Zel'dovich (tSZ) observations by probing the intracluster medium (ICM) of the Planck-discovered galaxy cluster PSZ1 G045.85+57.71 at redshift z = 0.61, using tSZ data obtained with the NIKA camera, a dual-band (150 and 260 GHz) instrument operated at the IRAM 30-meter telescope. We deproject jointly NIKA and Planck data to extract the electronic pressure distribution from the cluster core (R ~ 0.02 R500) to its outskirts (R ~ 3 R500) non-parametrically, for the first time at intermediate redshift. The constraints on the resulting pressure profile allow us to reduce the relative uncertainty on the integrated Compton parameter by a factor of two compared to the Planck value. Combining the tSZ data and the deprojected electronic density profile from XMM-Newton allows us to undertake a hydrostatic mass analysis, for which we study the impact of a spherical model assumption on the total mass estimate. We also investigate the radial temperature and entropy distributions. These data indicate that PSZ1 G045.85+57.71 is a massive (M500 ~ 5.5 × 10^14 M⊙) cool-core cluster. This work is part of a pilot study aiming at optimizing the treatment of the NIKA2 tSZ large program dedicated to the follow-up of SZ-discovered clusters at intermediate and high redshifts. This study illustrates the potential of NIKA2 to put constraints on the thermodynamic properties and tSZ-scaling relations of these clusters, and demonstrates the excellent synergy between tSZ and X-ray observations of similar angular resolution.

  9. Estimation from PET data of transient changes in dopamine concentration induced by alcohol: support for a non-parametric signal estimation method

    Energy Technology Data Exchange (ETDEWEB)

    Constantinescu, C C; Yoder, K K; Normandin, M D; Morris, E D [Department of Radiology, Indiana University School of Medicine, Indianapolis, IN (United States); Kareken, D A [Department of Neurology, Indiana University School of Medicine, Indianapolis, IN (United States); Bouman, C A [Weldon School of Biomedical Engineering, Purdue University, West Lafayette, IN (United States); O' Connor, S J [Department of Psychiatry, Indiana University School of Medicine, Indianapolis, IN (United States)], E-mail: emorris@iupui.edu

    2008-03-07

    We previously developed a model-independent technique (non-parametric ntPET) for extracting the transient changes in neurotransmitter concentration from paired (rest and activation) PET studies with a receptor ligand. To provide support for our method, we introduced three hypotheses of validation based on work by Endres and Carson (1998 J. Cereb. Blood Flow Metab. 18 1196-210) and Yoder et al (2004 J. Nucl. Med. 45 903-11), and tested them on experimental data. All three hypotheses describe relationships between the estimated free (synaptic) dopamine curves (F^DA(t)) and the change in binding potential (ΔBP). The veracity of the F^DA(t) curves recovered by nonparametric ntPET is supported when the data adhere to the following hypothesized behaviors: (1) ΔBP should decline with increasing DA peak time, (2) ΔBP should increase as the strength of the temporal correlation between F^DA(t) and the free raclopride (F^RAC(t)) curve increases, (3) ΔBP should decline linearly with the effective weighted availability of the receptor sites. We analyzed regional brain data from 8 healthy subjects who received two [^11C]raclopride scans: one at rest, and one during which unanticipated IV alcohol was administered to stimulate dopamine release. For several striatal regions, nonparametric ntPET was applied to recover F^DA(t), and binding potential values were determined. Kendall rank-correlation analysis confirmed that the F^DA(t) data followed the expected trends for all three validation hypotheses. Our findings lend credence to our model-independent estimates of F^DA(t). Application of nonparametric ntPET may yield important insights into how alterations in timing of dopaminergic neurotransmission are involved in the pathologies of addiction and other psychiatric disorders.
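
    Hypothesis (1) above amounts to a rank-correlation check, which is a one-liner with scipy; the values below are invented for illustration and are not the study's estimates.

        import numpy as np
        from scipy import stats

        # Illustrative check of hypothesis (1): ΔBP should decline with
        # increasing dopamine peak time (all numbers invented).
        da_peak_time = np.array([3.0, 4.5, 5.0, 6.2, 7.1, 8.4, 9.0, 10.5])  # minutes
        delta_bp     = np.array([0.31, 0.28, 0.27, 0.22, 0.18, 0.15, 0.16, 0.10])

        tau, p = stats.kendalltau(da_peak_time, delta_bp)
        print(f"Kendall tau = {tau:.2f} (p = {p:.3f})")  # strongly negative, as hypothesized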

  10. Non-Parametric Statistical Analysis Research on College Students' Math Anxiety Generation Factors

    Institute of Scientific and Technical Information of China (English)

    范大付; 李春红

    2012-01-01

    Non-parametric statistics comprises test methods that involve no population parameters and do not depend on the form of the underlying distribution. Using the Wilcoxon rank-sum test, the Friedman test and the Mann-Whitney U test, we quantitatively analyse and evaluate the five main factors influencing college students' math anxiety, obtaining non-parametric statistical results on the factors that generate it. The aim is to counter the negative effect of math anxiety on learning and to raise students' academic achievement.
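
    For readers unfamiliar with these tests, the Python sketch below applies the Mann-Whitney U and Friedman tests to invented anxiety-score data; scipy's implementations stand in for the procedures named in the abstract.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)

        # Invented anxiety scores for two independent groups of students.
        group_a = rng.normal(50, 10, 30)
        group_b = rng.normal(55, 10, 32)
        u, p_u = stats.mannwhitneyu(group_a, group_b, alternative="two-sided")
        print(f"Mann-Whitney U = {u:.0f}, p = {p_u:.3f}")

        # Invented repeated measurements: 25 students rated on 3 related factors.
        cond = rng.normal(50, 10, (25, 3)) + [0, 2, 4]
        chi2, p_f = stats.friedmanchisquare(cond[:, 0], cond[:, 1], cond[:, 2])
        print(f"Friedman chi2 = {chi2:.2f}, p = {p_f:.3f}")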

  11. International comparisons of the technical efficiency of the hospital sector: panel data analysis of OECD countries using parametric and non-parametric approaches.

    Science.gov (United States)

    Varabyova, Yauheniya; Schreyögg, Jonas

    2013-09-01

    There is a growing interest in cross-country comparisons of the performance of national health care systems. The present work provides a comparison of the technical efficiency of the hospital sector using unbalanced panel data from OECD countries over the period 2000-2009. The estimation of the technical efficiency of the hospital sector is performed using non-parametric data envelopment analysis (DEA) and parametric stochastic frontier analysis (SFA). Internal and external validity of the findings is assessed by estimating the Spearman rank correlations between the results obtained in different model specifications. The panel-data analyses using two-step DEA and one-stage SFA show that countries that have higher health care expenditure per capita tend to have a more technically efficient hospital sector. Whether the expenditure is financed through private or public sources is not related to the technical efficiency of the hospital sector. On the other hand, the hospital sector in countries with higher income inequality and longer average hospital length of stay is less technically efficient. Copyright © 2013 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  12. Non-parametric linear regression of discrete Fourier transform convoluted chromatographic peak responses under non-ideal conditions of internal standard method.

    Science.gov (United States)

    Korany, Mohamed A; Maher, Hadir M; Galal, Shereen M; Fahmy, Ossama T; Ragab, Marwa A A

    2010-11-15

    This manuscript discusses the application of chemometrics to the handling of HPLC response data using the internal standard method (ISM). This was performed on a model mixture containing terbutaline sulphate, guaiphenesin, bromhexine HCl, sodium benzoate, and propylparaben as an internal standard. Derivative treatment of the chromatographic response data of analyte and internal standard was followed by convolution of the resulting derivative curves using 8-point sin x(i) polynomials (discrete Fourier functions). The response of each analyte signal, its corresponding derivative and convoluted derivative data were divided by that of the internal standard to obtain the corresponding ratio data. This was found beneficial in eliminating different types of interference. It was successfully applied to handle some of the most common chromatographic problems and non-ideal conditions, namely overlapping chromatographic peaks and very low analyte concentrations. For example, in the case of overlapping peaks, the correlation coefficient of sodium benzoate went from 0.9975 to 0.9998 on moving from the conventional peak-area method to the first-derivative-under-Fourier-functions method. A significant improvement in the precision and accuracy of the determination of synthetic mixtures and dosage forms in non-ideal cases was also achieved. For example, in the case of overlapping peaks, the guaiphenesin mean recovery% and RSD% went from 91.57 and 9.83 to 100.04 and 0.78 with the same two methods, respectively. This work also compares the application of Theil's method, a non-parametric regression method, in handling the response ratio data, with the least squares parametric regression method, which is considered the de facto standard method used for regression. Theil's method was found to be superior to the method of least squares as it assumes that errors could occur in both x- and y-directions and
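
    The contrast between Theil's method and least squares is easy to reproduce with scipy on invented calibration data containing one aberrant point:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(8)

        # Calibration-style data with one gross outlier in the response ratio,
        # mimicking a distorted chromatographic point (values are invented).
        conc = np.linspace(1, 10, 10)                  # analyte concentration
        ratio = 0.8 * conc + rng.normal(0, 0.05, 10)   # peak-area ratio vs IS
        ratio[7] += 2.0                                # one aberrant injection

        ls = stats.linregress(conc, ratio)             # ordinary least squares
        theil = stats.theilslopes(ratio, conc)         # median of pairwise slopes

        print(f"OLS slope   = {ls.slope:.3f}")
        print(f"Theil slope = {theil[0]:.3f} "
              f"(95% CI {theil[2]:.3f}-{theil[3]:.3f})")  # robust to the outlier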

  13. Quantitative Phylogenomics of Within-Species Mitogenome Variation: Monte Carlo and Non-Parametric Analysis of Phylogeographic Structure among Discrete Transatlantic Breeding Areas of Harp Seals (Pagophilus groenlandicus).

    Directory of Open Access Journals (Sweden)

    Steven M Carr

    -stepping-stone biogeographic models, but not a simple 1-step trans-Atlantic model. Plots of the cumulative pairwise sequence difference curves among seals in each of the four populations provide continuous proxies for phylogenetic diversification within each. Non-parametric Kolmogorov-Smirnov (K-S) tests of maximum pairwise differences between these curves indicate that the Greenland Sea population has a markedly younger phylogenetic structure than either the White Sea population or the two Northwest Atlantic populations, which are of intermediate age and homogeneous structure. The Monte Carlo and K-S assessments provide sensitive quantitative tests of within-species mitogenomic phylogeography. This is the first study to indicate that the White Sea and Greenland Sea populations have different population genetic histories. The analysis supports the hypothesis that Harp Seals comprise three genetically distinguishable breeding populations, in the White Sea, Greenland Sea, and Northwest Atlantic. Implications for an ice-dependent species during ongoing climate change are discussed.

  14. Decompounding random sums: A nonparametric approach

    DEFF Research Database (Denmark)

    Hansen, Martin Bøgsted; Pitts, Susan M.

    review a number of applications and consider the nonlinear inverse problem of inferring the cumulative distribution function of the components in the random sum. We review the existing literature on non-parametric approaches to the problem. The models amenable to the analysis are generalized considerably...

  15. The Stellar Initial Mass Function in Early-type Galaxies from Absorption Line Spectroscopy. IV. A Super-Salpeter IMF in the Center of NGC 1407 from Non-parametric Models

    Science.gov (United States)

    Conroy, Charlie; van Dokkum, Pieter G.; Villaume, Alexa

    2017-03-01

    It is now well-established that the stellar initial mass function (IMF) can be determined from the absorption line spectra of old stellar systems, and this has been used to measure the IMF and its variation across the early-type galaxy population. Previous work focused on measuring the slope of the IMF over one or more stellar mass intervals, implicitly assuming that this is a good description of the IMF and that the IMF has a universal low-mass cutoff. In this work we consider more flexible IMFs, including two-component power laws with a variable low-mass cutoff and a general non-parametric model. We demonstrate with mock spectra that the detailed shape of the IMF can be accurately recovered as long as the data are of high quality (S/N ≳ 300 Å^-1) and cover a wide wavelength range (0.4–1.0 μm). We apply these flexible IMF models to a high S/N spectrum of the center of the massive elliptical galaxy NGC 1407. Fitting the spectrum with non-parametric IMFs, we find that the IMF in the center shows a continuous rise extending toward the hydrogen-burning limit, with a behavior that is well-approximated by a power law with an index of -2.7. These results provide strong evidence for the existence of extreme (super-Salpeter) IMFs in the cores of massive galaxies.

  16. The Stellar Initial Mass Function in Early-Type Galaxies From Absorption Line Spectroscopy. IV. A Super-Salpeter IMF in the center of NGC 1407 from Non-Parametric Models

    CERN Document Server

    Conroy, Charlie; Villaume, Alexa

    2016-01-01

    It is now well-established that the stellar initial mass function (IMF) can be determined from the absorption line spectra of old stellar systems, and this has been used to measure the IMF and its variation across the early-type galaxy population. Previous work focused on measuring the slope of the IMF over one or more stellar mass intervals, implicitly assuming that this is a good description of the IMF and that the IMF has a universal low-mass cutoff. In this work we consider more flexible IMFs, including two-component power-laws with a variable low-mass cutoff and a general non-parametric model. We demonstrate with mock spectra that the detailed shape of the IMF can be accurately recovered as long as the data are of high quality (S/N$\gtrsim300$) and cover a wide wavelength range (0.4um-1.0um). We apply these flexible IMF models to a high S/N spectrum of the center of the massive elliptical galaxy NGC 1407. Fitting the spectrum with non-parametric IMFs, we find that the IMF in the center shows a continuous ri...

  17. A non-parametric CUSUM intrusion detection method based on an industrial control model

    Institute of Scientific and Technical Information of China (English)

    张云贵; 赵华; 王丽娜

    2012-01-01

    To address the increasingly serious information-security problems of industrial control systems (ICS), this paper presents a non-parametric cumulative sum (CUSUM) intrusion detection method for industrial control networks. Exploiting the fact that an ICS's output is determined by its input, a mathematical model of the ICS is established to predict the system output; once the sensors of the control system come under attack, the actual output changes. At every time step, the difference between the output predicted by the industrial control model and the signal measured by the sensors is calculated, forming a time-based statistical sequence, and the non-parametric CUSUM algorithm then detects intrusions online and raises alarms. Simulated detection experiments show that the proposed method offers good real-time behaviour and a low false-alarm rate. With appropriate choices of the algorithm's parameters T and β, the method can detect attacks before they cause substantial damage to the control system, and it also helps in monitoring misoperation within the ICS.
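
    A one-sided non-parametric CUSUM on the prediction residual can be sketched in a few lines of Python; the residual series, drift beta and threshold tau are illustrative assumptions, not the paper's model of a concrete ICS.

        import numpy as np

        rng = np.random.default_rng(9)

        # Residual between model-predicted output and sensor measurement; a
        # sensor attack injects a persistent bias from sample 400 onwards.
        residual = rng.normal(0, 0.2, 600)
        residual[400:] += 0.5

        beta, tau = 0.25, 3.0          # drift and alarm threshold (tuned values)
        s, alarm = 0.0, None
        for t, r in enumerate(np.abs(residual)):
            s = max(0.0, s + r - beta)  # accumulate only sustained excursions
            if s > tau:
                alarm = t
                break
        print("alarm raised at sample:", alarm)  # shortly after the change at t=400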

  18. Comparison of two atmospheric sampling methodologies with non-parametric statistical tools

    Directory of Open Access Journals (Sweden)

    Maria João Nunes

    2005-03-01

    In atmospheric aerosol sampling, the air that carries the particles is inevitably in motion, as a result both of externally driven wind and of the sucking action of the sampler itself. Sampling at too high or too low an air-flow speed may lead to significant particle-size bias. The objective of this work is the validation of measurements enabling the comparison of species concentrations obtained with the two air-flow sampling techniques. Several outliers and an increase of the residuals with concentration become evident, requiring non-parametric methods, which are recommended for handling data that may not be normally distributed. In this way, conversion factors are obtained for each of the species under study using Kendall regression.

  19. Influence Analysis of a Non-parametric Regression Model with Random Right Censorship

    Institute of Scientific and Technical Information of China (English)

    王淑玲; 冯予; 刘刚

    2012-01-01

    The randomly censored non-parametric fixed-design regression model is first transformed into a standard non-parametric regression model. Local influence analysis is then carried out on this model, yielding concise formulas for the influence matrix and for the direction of maximum influence curvature. Finally, a worked example verifies the effectiveness of the analysis method.

  20. Non-Parametric Cell-Based Photometric Proxies for Galaxy Morphology: Methodology and Application to the Morphologically-Defined Star Formation -- Stellar Mass Relation of Spiral Galaxies in the Local Universe

    CERN Document Server

    Grootes, M W; Popescu, C C; Robotham, A S G; Seibert, M; Kelvin, L S

    2013-01-01

    (Abridged) We present a non-parametric cell-based method of selecting highly pure and largely complete samples of spiral galaxies using photometric and structural parameters as provided by standard photometric pipelines and simple shape fitting algorithms, demonstrably superior to commonly used proxies. Furthermore, we find structural parameters derived using passbands longwards of the $g$ band and linked to older stellar populations, especially the stellar mass surface density $\\mu_*$ and the $r$ band effective radius $r_e$, to perform at least equally well as parameters more traditionally linked to the identification of spirals by means of their young stellar populations. In particular the distinct bimodality in the parameter $\\mu_*$, consistent with expectations of different evolutionary paths for spirals and ellipticals, represents an often overlooked yet powerful parameter in differentiating between spiral and non-spiral/elliptical galaxies. We investigate the intrinsic specific star-formation rate - ste...

  1. Application of the Statistical Software R in the Teaching of Non-Parametric Statistics

    Institute of Scientific and Technical Information of China (English)

    王志刚; 冯利英; 刘勇

    2012-01-01

    This paper introduces the application of the statistical software R in the teaching of non-parametric statistics, an important branch of statistics. In particular, it describes in detail the use of R in exploratory data analysis, inferential statistics and stochastic simulation. The flexible, open-source character of R makes data processing more efficient; the software can implement every method covered in the course and makes it convenient for learners to optimize and improve on previous work. R is therefore well suited to the teaching of non-parametric statistics.

  2. Which percentile-based approach should be preferred for calculating normalized citation impact values? An empirical comparison of five approaches including a newly developed citation-rank approach (P100)

    NARCIS (Netherlands)

    Bornmann, L.; Leydesdorff, L.; Wang, J.

    2013-01-01

    For comparisons of citation impacts across fields and over time, bibliometricians normalize the observed citation counts with reference to an expected citation value. Percentile-based approaches have been proposed as a non-parametric alternative to parametric central-tendency statistics. Percentiles

  3. AUTO-IK: a 2D indicator kriging program for the automated non-parametric modeling of local uncertainty in earth sciences.

    Science.gov (United States)

    Goovaerts, P

    2009-06-01

    Indicator kriging provides a flexible interpolation approach that is well suited for datasets where: 1) many observations are below the detection limit, 2) the histogram is strongly skewed, or 3) specific classes of attribute values are better connected in space than others (e.g. low pollutant concentrations). To apply indicator kriging at its full potential requires, however, the tedious inference and modeling of multiple indicator semivariograms, as well as the post-processing of the results to retrieve attribute estimates and associated measures of uncertainty. This paper presents a computer code that performs automatically the following tasks: selection of thresholds for binary coding of continuous data, computation and modeling of indicator semivariograms, modeling of probability distributions at unmonitored locations (regular or irregular grids), and estimation of the mean and variance of these distributions. The program also offers tools for quantifying the goodness of the model of uncertainty within cross-validation and jack-knife frameworks. The different functionalities are illustrated using heavy metal concentrations from the well-known soil Jura dataset. A sensitivity analysis demonstrates the benefit of using more thresholds when indicator kriging is implemented with a linear interpolation model, in particular for variables with positively skewed histograms.
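
    The indicator coding at the heart of indicator kriging is straightforward to illustrate in Python (AUTO-IK itself is a standalone program; the decile thresholds and lognormal data below are assumptions):

        import numpy as np

        rng = np.random.default_rng(10)

        # Skewed "concentration" data and a set of thresholds; AUTO-IK can
        # select such thresholds automatically (here: the nine deciles).
        z = rng.lognormal(mean=0.0, sigma=1.0, size=200)
        thresholds = np.percentile(z, np.arange(10, 100, 10))

        # indicators[i, k] = 1 if z_i <= z_k, else 0; one column per threshold.
        indicators = (z[:, None] <= thresholds[None, :]).astype(int)

        # Column means estimate the cdf F(z_k); a full IK run would instead
        # krige each indicator column with its own semivariogram model.
        print(np.round(indicators.mean(axis=0), 2))   # ~[0.1, 0.2, ..., 0.9]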

  4. Benchmark of the non-parametric Bayesian deconvolution method implemented in the SINBAD code for X/γ rays spectra processing

    Science.gov (United States)

    Rohée, E.; Coulon, R.; Carrel, F.; Dautremer, T.; Barat, E.; Montagu, T.; Normand, S.; Jammes, C.

    2016-11-01

    Radionuclide identification and quantification are a serious concern for many applications, such as in situ monitoring at nuclear facilities, laboratory analysis, special nuclear materials detection, environmental monitoring, and waste measurements. High-resolution gamma-ray spectrometry based on high-purity germanium diode detectors is the best solution available for isotopic identification. Over the last decades, methods have been developed to improve gamma-spectrum analysis. However, difficulties remain when full-energy peaks are folded together with a high ratio between their amplitudes, and when the Compton background is much larger than the signal of a single peak. In this context, this study compares a conventional analysis based on the "iterative peak fitting deconvolution" method with a "non-parametric Bayesian deconvolution" approach developed by the CEA LIST and implemented in the SINBAD code. Iterative peak-fit deconvolution is used here as a reference method, largely validated by industrial standards for unfolding complex spectra from HPGe detectors. Complex spectra are studied, both from IAEA benchmark protocol tests and from measurements. The SINBAD code shows promising deconvolution capabilities compared with the conventional method, without any expert parameter fine-tuning.

  5. Short-term monitoring of benzene air concentration in an urban area: a preliminary study of application of Kruskal-Wallis non-parametric test to assess pollutant impact on global environment and indoor.

    Science.gov (United States)

    Mura, Maria Chiara; De Felice, Marco; Morlino, Roberta; Fuselli, Sergio

    2010-01-01

    In step with the need to develop statistical procedures to manage small-size environmental samples, in this work we have used concentration values of benzene (C6H6), concurrently detected by seven outdoor and indoor monitoring stations over 12 000 minutes, in order to assess the representativeness of the collected data and the impact of the pollutant on the indoor environment. Clearly, the former issue is strictly connected to sampling-site geometry, which proves critical to correctly retrieving information from the analysis of pollutants of sanitary interest. Therefore, according to current criteria for network planning, the single stations have been interpreted as nodes of a set of adjoining triangles; then, a) node pairs have been taken into account in order to estimate pollutant stationarity on triangle sides, b) node triplets, to statistically associate data from air monitoring with the corresponding territory area, and c) node sextuplets, to assess the impact probability of the outdoor pollutant on the indoor environment for each area. Distributions from the various node combinations are all non-Gaussian; consequently, Kruskal-Wallis (KW) non-parametric statistics has been exploited to test variability on the continuous density function from each pair, triplet and sextuplet. Results from the above-mentioned statistical analysis have shown randomness of site selection, which has not allowed a reliable generalization of the monitoring data to the entire selected territory, except for a single "forced" case (70%); most important, they suggest a possible procedure to optimize network design.
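
    The Kruskal-Wallis comparison of node distributions can be reproduced with scipy on invented benzene concentrations for the three nodes of one triangle:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(11)

        # Benzene concentrations (ug/m3, invented) at the three nodes of one
        # triangle; KW tests whether the node distributions can be pooled,
        # i.e. whether the pollutant field is homogeneous over the triangle.
        node1 = rng.lognormal(1.0, 0.4, 120)
        node2 = rng.lognormal(1.1, 0.4, 120)
        node3 = rng.lognormal(1.5, 0.4, 120)

        h, p = stats.kruskal(node1, node2, node3)
        print(f"Kruskal-Wallis H = {h:.2f}, p = {p:.4f}")
        # small p -> at least one node differs; the area cannot be treated as one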

  8. Short-term monitoring of benzene air concentration in an urban area: a preliminary study of application of Kruskal-Wallis non-parametric test to assess pollutant impact on global environment and indoor

    Directory of Open Access Journals (Sweden)

    Maria Chiara Mura

    2010-12-01

    Full Text Available In step with the need to develop statistical procedures to manage small-size environmental samples, in this work we have used concentration values of benzene (C6H6), concurrently detected by seven outdoor and indoor monitoring stations over 12 000 minutes, in order to assess the representativeness of collected data and the impact of the pollutant on the indoor environment. Clearly, the former issue is strictly connected to sampling-site geometry, which proves critical to correctly retrieving information from analysis of pollutants of sanitary interest. Therefore, according to current criteria for network-planning, single stations have been interpreted as nodes of a set of adjoining triangles; then, a) node pairs have been taken into account in order to estimate pollutant stationarity on triangle sides, as well as b) node triplets, to statistically associate data from air-monitoring with the corresponding territory area, and c) node sextuplets, to assess the impact probability of the outdoor pollutant on the indoor environment for each area. Distributions from the various node combinations are all non-Gaussian; consequently, Kruskal-Wallis (KW) non-parametric statistics have been exploited to test variability of the continuous density function from each pair, triplet and sextuplet. Results from the above-mentioned statistical analysis have shown randomness of site selection, which has not allowed a reliable generalization of monitoring data to the entire selected territory, except for a single "forced" case (70%); most important, they suggest a possible procedure to optimize network design.

  9. Parametric estimation in the wave buoy analogy - an elaborated approach based on energy considerations

    DEFF Research Database (Denmark)

    Montazeri, Najmeh; Nielsen, Ulrik Dam

    2014-01-01

    the ship’s wave-induced responses based on different statistical inferences including parametric and non-parametric approaches. This paper considers a concept to improve the estimate obtained by the parametric method for sea state estimation. The idea is illustrated by an analysis made on full-scale...

  10. Corporate failure: a non parametric method

    Directory of Open Access Journals (Sweden)

    Ben Jabeur Sami

    2013-07-01

    Full Text Available A number of authors have suggested that macroeconomic factors affect the incidence of financial distress and, subsequently, the failure of companies. However, macroeconomic factors rarely, if ever, appear as variables in predictive models that seek to identify distress and failure; modellers generally suggest that the impact of macroeconomic factors has already been taken into account by financial ratio variables. This article presents a systematic study of this domain, examining the link between the failure of companies and macroeconomic factors for French companies, in order to identify the most important variables and to estimate their utility in a predictive context. The results of the study suggest that several macroeconomic variables are strongly associated with failure and have predictive value, specifying the relation between financial distress and failure.

  11. Parametric and Non-Parametric System Modelling

    DEFF Research Database (Denmark)

    Nielsen, Henrik Aalborg

    1999-01-01

    ...... networks is included. In this paper, neural networks are used for predicting the electricity production of a wind farm. The results are compared with results obtained using an adaptively estimated ARX-model. Finally, two papers on stochastic differential equations are included. In the first paper, among other aspects, the properties of a method for parameter estimation in stochastic differential equations are considered within the field of heat dynamics of buildings. In the second paper a lack-of-fit test for stochastic differential equations is presented. The test can be applied to both linear and non-linear stochastic differential equations. Some applications are presented in the papers. In the summary report, references are made to a number of other applications. Resumé (in Danish): The present thesis consists of ten papers published in the period 1996-1999, together with a summary and a perspective thereof.

  12. Non-Parametric Bayesian Areal Linguistics

    CERN Document Server

    Daumé, Hal

    2009-01-01

    We describe a statistical model over linguistic areas and phylogeny. Our model recovers known areas and identifies a plausible hierarchy of areal features. The use of areas improves genetic reconstruction of languages both qualitatively and quantitatively according to a variety of metrics. We model linguistic areas by a Pitman-Yor process and linguistic phylogeny by Kingman's coalescent.
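
    For readers unfamiliar with the Pitman-Yor process, the following sketch samples a partition from it via its Chinese-restaurant representation; it is a generic illustration of the prior, not the authors' full areal-phylogenetic model, and the parameter values are arbitrary.

      import numpy as np

      def pitman_yor_partition(n, d=0.5, alpha=1.0, seed=0):
          """Sample a partition of n items from a Pitman-Yor process via its
          Chinese-restaurant representation; d: discount in [0, 1), alpha > -d."""
          rng = np.random.default_rng(seed)
          tables = []                        # cluster sizes (linguistic areas, say)
          labels = []
          for i in range(n):
              k = len(tables)
              # Existing cluster c has weight (size_c - d); a new one, (alpha + d*k).
              probs = np.array([c - d for c in tables] + [alpha + d * k])
              probs /= probs.sum()
              choice = rng.choice(k + 1, p=probs)
              if choice == k:
                  tables.append(1)
              else:
                  tables[choice] += 1
              labels.append(choice)
          return labels, tables

      labels, tables = pitman_yor_partition(100)
      print(len(tables), "clusters; sizes:", tables)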

  13. Non-Parametric Model Drift Detection

    Science.gov (United States)

    2016-07-01

    [Fragmentary abstract; the standard report-documentation boilerplate (security classification, page count, etc.) has been omitted.] Analysis Division, Information Directorate. This report is published in the interest of scientific and technical information exchange. Experiments took place on datasets made up of text documents; the difference between datasets is used to estimate the potential error (drop in accuracy) of the model. Related subject fragment: extraction of executable rules from regulatory text.

  14. Correlated Non-Parametric Latent Feature Models

    CERN Document Server

    Doshi-Velez, Finale

    2012-01-01

    We are often interested in explaining data through a set of hidden factors or features. When the number of hidden features is unknown, the Indian Buffet Process (IBP) is a non-parametric latent feature model that does not bound the number of active features in a dataset. However, the IBP assumes that all latent features are uncorrelated, making it inadequate for many real-world problems. We introduce a framework for correlated non-parametric feature models, generalising the IBP. We use this framework to generate several specific models and demonstrate applications on real-world datasets.
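
    As background, this is a minimal generative sampler for the standard (uncorrelated) IBP that the paper generalises; alpha and the number of customers are arbitrary illustration values.

      import numpy as np

      def sample_ibp(num_customers, alpha=2.0, seed=0):
          """Draw a binary feature matrix Z from the Indian Buffet Process."""
          rng = np.random.default_rng(seed)
          dish_counts = []                     # customers per existing feature
          rows = []
          for n in range(1, num_customers + 1):
              # Existing feature k is sampled with probability m_k / n.
              row = [rng.random() < m / n for m in dish_counts]
              for k, taken in enumerate(row):
                  dish_counts[k] += taken
              # Poisson(alpha / n) brand-new features are introduced.
              new = rng.poisson(alpha / n)
              dish_counts.extend([1] * new)
              rows.append(row + [True] * new)
          Z = np.zeros((num_customers, len(dish_counts)), dtype=int)
          for i, row in enumerate(rows):
              Z[i, :len(row)] = row
          return Z

      print(sample_ibp(10))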

  15. Non Parametric Classification Using Learning Vector Quantization

    Science.gov (United States)

    1990-08-21

    [Excerpt, reconstructed from a garbled scan:] θ̄(0) = a. Then, for every finite T and η > 0, lim P{ sup_{t_n ≤ T} |θ_n − θ̄(t_n)| > η } = 0. (2.18) This result is proved in Section 2.3. [...] References: A. Benveniste, M. Métivier & P. Priouret (1987), Algorithmes Adaptatifs et Approximations Stochastiques, Masson, Paris; P. Billingsley [...]

  16. Temporal Expression of Peripheral Blood Leukocyte Biomarkers in a Macaca fascicularis Infection Model of Tuberculosis; Comparison with Human Datasets and Analysis with Parametric/Non-parametric Tools for Improved Diagnostic Biomarker Identification.

    Directory of Open Access Journals (Sweden)

    Sajid Javed

    Full Text Available A temporal study of gene expression in peripheral blood leukocytes (PBLs) from a Mycobacterium tuberculosis primary, pulmonary challenge model in Macaca fascicularis has been conducted. PBL samples were taken prior to challenge and at one, two, four and six weeks post-challenge, and labelled, purified RNAs were hybridised to Operon Human Genome AROS V4.0 slides. Data analyses revealed a large number of differentially regulated gene entities, which exhibited temporal profiles of expression across the time course study. Further data refinements identified groups of key markers showing group-specific expression patterns, with a substantial reprogramming event evident at the four to six week interval. Selected statistically significant gene entities from this study and other immune and apoptotic markers were validated using qPCR, which confirmed many of the results obtained using microarray hybridisation. These showed evidence of a step-change in gene expression from an 'early' FOS-associated response to a 'late' predominantly type I interferon-driven response, with coincident reduction of expression of other markers. Loss of T-cell-associated marker expression was observed in responsive animals, with concordant elevation of markers which may be associated with a myeloid suppressor cell phenotype, e.g. CD163. The animals in the study were of different lineages, and these Chinese and Mauritian cynomolgus macaque lines showed clear evidence of differing susceptibilities to tuberculosis challenge. We determined a number of key differences in response profiles between the groups, particularly in expression of T-cell and apoptotic markers, amongst others. These have provided interesting insights into innate susceptibility related to different host phenotypes. Using a combination of parametric and non-parametric artificial neural network analyses, we have identified key genes and regulatory pathways which may be important in early and adaptive responses to TB.

  17. Statistical Approaches to Aerosol Dynamics for Climate Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Zhu, Wei

    2014-09-02

    In this work, we introduce two general non-parametric regression analysis methods for errors-in-variables (EIV) models: the compound regression and the constrained regression. It is shown that these approaches are equivalent to each other and to the general parametric structural modeling approach. The advantages of these methods lie in their intuitive geometric representations, their distribution-free nature, and their ability to offer a practical solution when the ratio of the error variances is unknown. Each includes the classic regression methods of ordinary least squares, geometric mean regression, and orthogonal regression as special cases. Both methods can be readily generalized to multiple linear regression with two or more random regressors.
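
    As a concrete anchor for two of those special cases, the sketch below computes the geometric mean regression slope and the orthogonal/Deming regression slope for simulated EIV data; the formulas are the standard textbook ones and the data are made up, so this is an illustration rather than the compound/constrained estimators themselves.

      import numpy as np

      def gm_slope(x, y):
          # Geometric mean regression: slope = sign(cov) * sd(y) / sd(x).
          sxy = np.cov(x, y, ddof=1)[0, 1]
          return np.sign(sxy) * np.std(y, ddof=1) / np.std(x, ddof=1)

      def deming_slope(x, y, delta=1.0):
          # Deming regression; delta = var(err_y) / var(err_x).
          # delta = 1 gives orthogonal regression.
          sxx = np.var(x, ddof=1); syy = np.var(y, ddof=1)
          sxy = np.cov(x, y, ddof=1)[0, 1]
          t = syy - delta * sxx
          return (t + np.sqrt(t * t + 4 * delta * sxy ** 2)) / (2 * sxy)

      rng = np.random.default_rng(2)
      xt = rng.normal(0, 2, 300)                 # latent truth
      x = xt + rng.normal(0, 0.5, 300)           # regressor observed with error
      y = 1.5 * xt + rng.normal(0, 0.5, 300)     # response observed with error
      print(gm_slope(x, y), deming_slope(x, y))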

  18. Establishment of a non-parametric probabilistic model for evaluation of Chinese dietary exposure

    Institute of Scientific and Technical Information of China (English)

    孙金芳; 刘沛; 陈炳为; 陈启光; 余小金; 王灿楠; 李靖欣

    2010-01-01

    Objective: To establish a non-parametric probabilistic model for evaluation of Chinese dietary exposure, in order to improve assessment accuracy and align with international food safety risk assessment techniques. Methods: Empirical distributions of food consumption and chemical contaminant concentrations were built from Chinese dietary survey data, contaminant monitoring data and the corresponding demographic data; variability and uncertainty of population dietary exposure were obtained through Monte Carlo simulation and Bootstrap resampling. Consumption data came from the 2002 Chinese national nutrition and health survey, collected by the 24-hour recall method over three consecutive days, covering 66 172 individuals in 22 567 households, for a total of 193 814 person-days and 1 810 703 records. Contaminant data came from the national food contamination monitoring network covering 14 provinces or regions during 2000-2006 and from customs monitoring of exported agricultural products during 2005-2006, comprising 135 contaminants (heavy metals, pesticides, and mycotoxins such as aflatoxin) in 499 foods, for a total of 487 819 records. Results: A non-parametric probabilistic dietary exposure model covering heavy metals, pesticides and some toxins was established for the Chinese population, yielding summary statistics and 95% confidence intervals of the exposure distributions for the different contaminants. For acephate exposure in children aged 7-10 years, the median dietary exposures of urban and rural children were 1.77 and 2.48 μg·kg⁻¹·d⁻¹, with 95% confidence intervals of 1.59-2.06 and 2.33-2.80 μg·kg⁻¹·d⁻¹, respectively. Conclusion: The non-parametric probabilistic model quantifies the variability and uncertainty in exposure assessment and improves the accuracy of dietary exposure assessment.

  19. A Non-Parametric Approach to Efficiency for Mining in Zacatecas, Mexico

    Directory of Open Access Journals (Sweden)

    Rodallegas Portillo, Mayra C.

    2012-01-01

    Full Text Available The aim of the paper is to use data envelopment analysis (DEA) for mining in the state of Zacatecas and provide efficiency indicators for the years 1998, 2003, and 2008. We compare the performance of the state of Zacatecas with the other mining states in Mexico; the empirical analysis then extends to specific types of mining production. The study is an important source of information about the performance of the mining industry in the state, making it possible to identify, through the inefficient activity classes, the products that are susceptible to technical improvement.
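
    The sketch below shows the kind of linear program behind such efficiency indicators: an input-oriented, constant-returns (CCR) DEA score computed with scipy. The units, inputs and outputs are invented for illustration and are not the paper's data.

      import numpy as np
      from scipy.optimize import linprog

      def ccr_efficiency(X, Y, o):
          """Input-oriented CCR efficiency of unit o.
          X: (n, m) inputs, Y: (n, s) outputs; decision vars [theta, lambda_1..n]."""
          n, m = X.shape
          s = Y.shape[1]
          c = np.zeros(n + 1); c[0] = 1.0              # minimize theta
          # sum_j lambda_j x_j <= theta * x_o   ->   X^T lambda - theta x_o <= 0
          A1 = np.hstack([-X[o].reshape(m, 1), X.T])
          b1 = np.zeros(m)
          # sum_j lambda_j y_j >= y_o           ->   -Y^T lambda <= -y_o
          A2 = np.hstack([np.zeros((s, 1)), -Y.T])
          b2 = -Y[o]
          res = linprog(c, A_ub=np.vstack([A1, A2]), b_ub=np.concatenate([b1, b2]),
                        bounds=[(None, None)] + [(0, None)] * n)
          return res.x[0]

      # Hypothetical mining units: inputs (labour, capital), output (production value).
      X = np.array([[20., 300.], [30., 250.], [40., 400.], [25., 350.]])
      Y = np.array([[100.], [90.], [120.], [80.]])
      for o in range(len(X)):
          print(o, round(ccr_efficiency(X, Y, o), 3))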

  20. Entry Mode for FDI and Firm Productivity: A Non-parametric Approach

    Institute of Scientific and Technical Information of China (English)

    赵万东

    2014-01-01

    This paper examines total factor productivity differences between wholly owned subsidiaries and joint ventures, documented on the basis of the concept of first-order stochastic dominance and a sample of foreign-invested Chinese manufacturing firms over the period 1998-2007, using unbalanced panel data. Our results show that wholly owned subsidiaries do not, overall, have greater productivity than joint ventures; however, as the institutional distance, especially the economic-system distance, between the multinational corporation's home country and China narrows, the productivity of wholly owned subsidiaries improves continually. Operating time in China has a significant positive effect on the productivity of wholly owned subsidiaries, which gradually overcome the unfavourable factors of the initial entry period, so that their productivity becomes clearly higher than that of joint ventures. Large and medium-sized joint ventures perform better than wholly owned subsidiaries, perhaps because they can adapt well to China's culture and management characteristics and because of synergistic effects in operation.

  1. Choosing the best non-parametric richness estimator for benthic macroinvertebrates databases

    Directory of Open Access Journals (Sweden)

    Carola V. Basualdo

    2011-06-01

    Full Text Available Non-parametric estimators allow comparison of richness estimates among data sets from heterogeneous sources. However, since estimator performance depends on the species-abundance distribution of the sample, preference for one or another is a difficult issue. The present study recovers and revalues some criteria already present in the literature in order to choose the most suitable estimator for stream macroinvertebrates, and provides some tools to apply them. Two abundance and four incidence estimators were applied to a regional database at family and genus level. They were evaluated under four criteria: sub-sample size required to estimate the observed richness; constancy of the sub-sample size; lack of erratic behavior; and similarity in curve shape across data sets. Among incidence estimators, Jack1 had the best performance. Between abundance estimators, ACE was the best when the observed richness was small, and Chao1 when the observed richness was high. The uniformity of curve shapes allowed describing general sequences of curve behavior that can act as references to compare estimates from small databases and to infer the probable behavior of the curve (i.e. the expected richness if the sample were larger). These results can be very useful for environmental management, and they update the state of knowledge of regional macroinvertebrates.

  2. Robust statistical approaches to assess the degree of agreement of clinical data

    Science.gov (United States)

    Grilo, Luís M.; Grilo, Helena L.

    2016-06-01

    To analyze the blood of patients who took vitamin B12 for a period of time, two different measurement methods were used (one is the established method, with more human intervention, and the other relies essentially on machines). Given the non-normality of the differences between the two measurement methods, the limits of agreement are also estimated using a non-parametric approach to assess the degree of agreement of the clinical data. The bootstrap resampling method is applied in order to obtain robust confidence intervals for the mean and median of the differences. The approaches used are easy to apply with user-friendly software, and their outputs are easy to interpret. In this case study, the results obtained with the parametric and non-parametric approaches lead to different statistical conclusions, but the decision whether agreement is acceptable or not is always a clinical judgment.
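
    A minimal sketch of the resampling step, assuming the paired differences between the two methods are available as a single array; the data below are simulated heavy-tailed values, not the study's measurements.

      import numpy as np

      def bootstrap_ci(diffs, stat=np.mean, n_boot=10_000, alpha=0.05, seed=0):
          """Percentile bootstrap CI for a statistic of paired differences."""
          rng = np.random.default_rng(seed)
          reps = np.array([stat(rng.choice(diffs, size=len(diffs), replace=True))
                           for _ in range(n_boot)])
          return np.quantile(reps, [alpha / 2, 1 - alpha / 2])

      # Hypothetical paired differences between the two B12 measurement methods.
      rng = np.random.default_rng(3)
      diffs = rng.standard_t(df=3, size=40) * 15 + 5    # heavy-tailed, non-normal
      print("mean CI:  ", bootstrap_ci(diffs, np.mean))
      print("median CI:", bootstrap_ci(diffs, np.median))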

  3. Assessment of the stability of a sugarcane (Saccharum spp.) cultivar in different environments by a non-parametric test in Tucumán, Argentina

    Directory of Open Access Journals (Sweden)

    Santiago Ostengo

    2011-12-01

    considered in a breeding program. It is for that reason that in sugarcane breeding, multi-environment trials (MET) are conducted at the last stage of the selection process. There exist different approaches to study genotype-environment interaction. One of these is the non-parametric technique, a valid and useful tool which allows an initial, easily interpreted exploration. The non-parametric technique called relative consistency of performance enables the classification of genotypes into the following four categories: (i) consistently superior; (ii) inconsistently superior; (iii) inconsistently inferior; and (iv) consistently inferior. This work aims to evaluate the consistency of performance of the TUC 95-10 variety across different agro-ecological environments in the province of Tucumán (Argentina), as regards the variable tons of sugar per hectare and considering different crop ages. Data were obtained from MET of the Sugarcane Breeding Program of Estación Experimental Agroindustrial Obispo Colombres (EEAOC) from Tucumán (Argentina), conducted at six sites through four crop ages. Results showed that TUC 95-10, recently released by EEAOC, can be labeled as consistently superior at all ages, i.e. it held the top position in sugar production in all tested environments. Therefore, it can be concluded that TUC 95-10 shows an excellent performance and good adaptation to different agro-ecological environments in Tucumán, at all crop ages.

  4. Identifying resident care areas for a quality improvement intervention in long-term care: a collaborative approach

    Directory of Open Access Journals (Sweden)

    Cranley Lisa A

    2012-09-01

    Full Text Available Abstract Background In Canada, healthcare aides (also referred to as nurse aides, personal support workers, nursing assistants) are unregulated personnel who provide 70-80% of direct care to residents living in nursing homes. Although they are an integral part of the care team, their contributions to the resident care planning process are not always acknowledged in the organization. The purpose of the Safer Care for Older Persons [in residential] Environments (SCOPE) project was to evaluate the feasibility of engaging front-line staff (primarily healthcare aides) to use quality improvement methods to integrate best practices into resident care. This paper describes the process used by teams participating in the SCOPE project to select clinical improvement areas. Methods The study employed a collaborative approach to identify clinical areas, and through consensus teams selected one of three areas. To select the clinical areas we recruited two nursing homes not involved in the SCOPE project and sampled healthcare providers and decision-makers within them. A vote-counting method was used to determine the top five ranked clinical areas for improvement. Results Responses received from stakeholder groups included gerontology experts, decision-makers, registered nurses, managers, and healthcare aides. The top-ranked areas from highest to lowest were pain/discomfort management, behaviour management, depression, skin integrity, and assistance with eating. Conclusions Involving staff in selecting areas that they perceive as needing improvement may facilitate staff engagement in the quality improvement process.

  5. Bayesian and Geostatistical Approaches to Combining Categorical Data Derived from Visual and Digital Processing of Remotely Sensed Images

    Institute of Scientific and Technical Information of China (English)

    ZHANG Jingxiong; LI Deren

    2005-01-01

    This paper seeks a synthesis of Bayesian and geostatistical approaches to combining categorical data in the context of remote sensing classification. In experiments with aerial photographs and Landsat TM data, the accuracy of spectral, spatial, and combined classification results was evaluated. It was confirmed that the incorporation of spatial information in spectral classification increases accuracy significantly. Secondly, through tests with a 5-class and a 3-class classification scheme, it was revealed that setting a proper semantic framework for classification is fundamental to any endeavor of categorical mapping and is the most important factor affecting accuracy. Lastly, this paper promotes non-parametric methods both for the definition of class membership profiles based on band-specific histograms of image intensities and for the derivation of spatial probability via indicator kriging, a non-parametric geostatistical technique.

  6. A combined superiority and non-inferiority approach to multiple endpoints in clinical trials.

    Science.gov (United States)

    Bloch, Daniel A; Lai, Tze Leung; Su, Zheng; Tubert-Bitter, Pascale

    2007-03-15

    Treatment comparisons in clinical trials often involve multiple endpoints. By making use of bootstrap tests, we develop a new non-parametric approach to multiple-endpoint testing that can be used to demonstrate non-inferiority of a new treatment for all endpoints and superiority for some endpoint when it is compared to an active control. It is shown that this approach does not incur a large multiplicity cost in sample size to achieve reasonable power and that it can incorporate complex dependencies in the multivariate distributions of all outcome variables for the two treatments via bootstrap resampling. Copyright (c) 2006 John Wiley & Sons, Ltd.
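
    To make the testing idea tangible, here is a much-simplified bootstrap sketch in the same spirit: a treatment is declared successful when every endpoint's bootstrap lower confidence bound clears a non-inferiority margin and at least one clears zero. It is a schematic on invented data and margins, not the authors' exact procedure, and it ignores the finer multiplicity control discussed in the paper.

      import numpy as np

      def noninf_sup_bootstrap(trt, ctl, margins, n_boot=5000, alpha=0.05, seed=0):
          """trt, ctl: (n_subjects, K) endpoint matrices (higher = better).
          Success if every endpoint is non-inferior (lower bound > -margin)
          and some endpoint is superior (lower bound > 0)."""
          rng = np.random.default_rng(seed)
          K = trt.shape[1]
          lows = np.empty(K)
          for k in range(K):
              reps = [rng.choice(trt[:, k], len(trt)).mean()
                      - rng.choice(ctl[:, k], len(ctl)).mean()
                      for _ in range(n_boot)]
              lows[k] = np.quantile(reps, alpha)     # one-sided lower bound
          return bool(np.all(lows > -margins) and np.any(lows > 0)), lows

      rng = np.random.default_rng(4)
      trt = rng.normal([1.0, 0.2], 1.0, size=(80, 2))
      ctl = rng.normal([0.5, 0.3], 1.0, size=(80, 2))
      ok, lows = noninf_sup_bootstrap(trt, ctl, margins=np.array([0.5, 0.5]))
      print(ok, lows)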

  7. Non-parametric causal inference for bivariate time series

    CERN Document Server

    McCracken, James M

    2015-01-01

    We introduce new quantities for exploratory causal inference between bivariate time series. The quantities, called penchants and leanings, are computationally straightforward to apply, follow directly from assumptions of probabilistic causality, do not depend on any assumed models for the time series generating process, and do not rely on any embedding procedures; these features may provide a clearer interpretation of the results than those from existing time series causality tools. The penchant and leaning are computed based on a structured method for computing probabilities.

  8. Further Research into a Non-Parametric Statistical Screening System.

    Science.gov (United States)

    1979-12-14

    Let X1 = 1 if birth weight is high and 0 if it is low, and X2 = 1 if gestation length is long and 0 if it is short. Normal babies have high birth weight and long gestation length or low birth weight and short gestation length, i.e. (X1, X2) = (1, 1) or (0, 0). Abnormal babies have either of the other two combinations ((0, 1) or (1, 0)). The LDF...

  9. Using non-parametric methods in econometric production analysis

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    Econometric estimation of production functions is one of the most common methods in applied economic production analysis. These studies usually apply parametric estimation techniques, which obligate the researcher to specify the functional form of the production function. Most often, the Cobb-Douglas or the Translog production function is used. However, the specification of a functional form for the production function involves the risk of specifying a functional form that is not similar to the "true" relationship between the inputs and the output. This misspecification might result in biased estimation results, including measures that are of interest to applied economists, such as elasticities. Therefore, we propose to use non-parametric econometric methods. First, they can be applied to verify the functional form used in parametric estimations of production functions. Second, they can be directly used......

  10. Non-Parametric Bayesian State Space Estimator for Negative Information

    Directory of Open Access Journals (Sweden)

    Guillaume de Chambrier

    2017-09-01

    Full Text Available Simultaneous Localization and Mapping (SLAM) is concerned with the development of filters to accurately and efficiently infer the state parameters (position, orientation, etc.) of an agent and aspects of its environment, commonly referred to as the map. A mapping system is necessary for the agent to achieve situatedness, which is a precondition for planning and reasoning. In this work, we consider an agent who is given the task of finding a set of objects. The agent has limited perception and can only sense the presence of objects if direct contact is made; as a result, most of the sensing is negative information. In the absence of recurrent sightings or direct measurements of objects, there are no correlations from the measurement errors that can be exploited. This renders ineffective those SLAM estimators, such as EKF-SLAM, for which such correlations are the backbone. In addition, for our setting, no assumptions are made with respect to the marginals (beliefs) of either the agent or the objects (map). Given the loose assumptions we stipulate regarding the marginals and measurements, we adopt a histogram parametrization. We introduce a Bayesian State Space Estimator (BSSE), which we name Measurement Likelihood Memory Filter (MLMF), in which the values of the joint distribution are not parametrized; instead, changes from the measurement integration step are applied directly to the marginals. This is achieved by keeping track of the history of the likelihood functions' parameters. We demonstrate that the MLMF gives the same filtered marginals as a histogram filter and show two implementations, MLMF and scalable-MLMF, that both have linear space complexity. The original MLMF retains an exponential time complexity (although an order of magnitude smaller than the histogram filter), while the scalable-MLMF introduces an independence assumption so as to achieve linear time complexity. We further quantitatively demonstrate the scalability of our algorithm with 25 beliefs having up to 10,000,000 states each. In an Active-SLAM setting, we evaluate the impact that the size of the memory's history has on the decision-theoretic process in a search scenario with a one-step look-ahead information-gain planner. We report on both 1D and 2D experiments.
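
    The negative-information update at the heart of such a histogram filter is easy to sketch: probing a cell and sensing nothing multiplies that cell's belief by the miss probability and renormalizes. The grid size and detection probability below are arbitrary, and this toy update conveys only the flavour of the MLMF, not its memory mechanism.

      import numpy as np

      def update_negative(belief, sensed_cell, p_detect=0.9):
          """Histogram Bayes update for 'no contact' at sensed_cell:
          likelihood is (1 - p_detect) there and 1 elsewhere."""
          lik = np.ones_like(belief)
          lik[sensed_cell] = 1.0 - p_detect
          post = belief * lik
          return post / post.sum()

      belief = np.full(10, 0.1)          # uniform prior over 10 cells
      for cell in [0, 1, 2, 3]:          # agent probes cells and feels nothing
          belief = update_negative(belief, cell)
      print(belief.round(3))             # mass shifts toward unvisited cells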

  11. A non-parametric 2D deformable template classifier

    DEFF Research Database (Denmark)

    Schultz, Nette; Nielsen, Allan Aasbjerg; Conradsen, Knut;

    2005-01-01

    We introduce an interactive segmentation method for a sea floor survey. The method is based on a deformable template classifier and is developed to segment data from an echo sounder post-processor called RoxAnn. RoxAnn collects two different measures for each observation point, and in this 2D feature space the ship-master will be able to interactively define a segmentation map, which is refined and optimized by the deformable template algorithms. The deformable templates are defined as two-dimensional vector-cycles. Local random transformations are applied to the vector-cycles, and stochastic relaxation in a Bayesian scheme is used. In the Bayesian likelihood, a class density function and an estimate hereof are introduced, which is designed to separate the feature space. The method is verified on data collected in Øresund, Scandinavia. The data come from four geographically different areas. Two......

  12. A Non-parametric Analysis of Morbidity/Mortality Data

    Science.gov (United States)

    1998-11-01

    [Fragment of SAS macro code from the report appendix, lightly cleaned and de-duplicated:]
      %sumstd(covden, 2, 1, &rownum, &colnum, denjcov);
      %sumstd(numjcov, cov, numout);
      %sumstd(denjcov, cov, denout);
      /** The files numout and denout contain one observation each,
          with variables sum and stdev **/
      data _null_;
        set numout;
        call symput('numsum', compress(sum));
        stop;
      run;
      data _null_;
        set denout;
        sig = stdev*stdev;
        call symput('densig', compress(sig));
        call symput('densum', ...

  13. Quantifying Audit Expectation Gap: A New approach to Measuring Expectation Gap

    Directory of Open Access Journals (Sweden)

    Salehi Mahdi

    2016-05-01

    Full Text Available The main objectives of the study are, first, to identify the expectation gap regarding audit responsibility and, second, to quantify that gap in Iran. In order to collect data, a questionnaire was designed and administered to auditors and investors. The collected data were analyzed using non-parametric statistical tests. The results show that there is an expectation gap between auditors and investors in Iran. The current study employed a new approach to quantifying the expectation gap. It may encourage other researchers to measure the audit expectation gap elsewhere in the world.

  14. MEASURING THE MARKET RISK OF FREIGHT RATES: A VALUE-AT-RISK APPROACH

    OpenAIRE

    TIMOTHEOS ANGELIDIS; GEORGE SKIADOPOULOS

    2008-01-01

    The fluctuation of shipping freight rates (freight rate risk) is an important source of market risk for all participants in the freight markets including hedge funds, commodity and energy producers. We measure the freight rate risk by the Value-at-Risk (VaR) approach. A range of parametric and non-parametric VaR methods is applied to various popular freight markets for dry and wet cargoes. Backtesting is conducted in two stages by means of statistical tests and a subjective loss function that...
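
    As a pointer to the simplest member of that VaR family, the sketch below computes a one-day 99% VaR by historical simulation on simulated, fat-tailed freight-rate changes; the data and confidence level are illustrative only.

      import numpy as np

      def historical_var(returns, level=0.99):
          """One-period Value-at-Risk by historical simulation:
          the (1 - level) quantile of the return distribution, sign-flipped."""
          return -np.quantile(returns, 1.0 - level)

      rng = np.random.default_rng(5)
      # Hypothetical daily log-changes of a freight rate index.
      rets = rng.standard_t(df=4, size=1000) * 0.02
      print(f"99% VaR: {historical_var(rets):.3%}")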

  15. A Bayesian Sampling Approach to Exploration in Reinforcement Learning

    CERN Document Server

    Asmuth, John; Littman, Michael L; Nouri, Ali; Wingate, David

    2012-01-01

    We present a modular approach to reinforcement learning that uses a Bayesian representation of the uncertainty over models. The approach, BOSS (Best of Sampled Set), drives exploration by sampling multiple models from the posterior and selecting actions optimistically. It extends previous work by providing a rule for deciding when to resample and how to combine the models. We show that our algorithm achieves near-optimal reward with high probability with a sample complexity that is low relative to the speed at which the posterior distribution converges during learning. We demonstrate that BOSS performs quite favorably compared to state-of-the-art reinforcement-learning approaches and illustrate its flexibility by pairing it with a non-parametric model that generalizes across states.

  16. Non-parametric analysis of seed sanity and elimination and ranking indices of soybean genotypes

    Directory of Open Access Journals (Sweden)

    Edmar Soares de Vasconcelos

    2008-03-01

    Full Text Available The objective of this work was to assess soybean genotypes for seed sanity, using an analysis method by which sanity indices (elimination and classification) are obtained based on non-parametric analysis. These indices consist in eliminating the genotypes with pathogen incidence above a given value, established by the researcher, and then ranking the non-eliminated genotypes in order of incidence of those pathogens. To verify its effectiveness, a simulation study was carried out comparing the proposed method with others, and the method was applied to germination and sanity data of seeds from soybean lines and cultivars of the final trials of the Soybean Breeding Program of the Departamento de Fitotecnia, Universidade Federal de Viçosa, conducted in the 2002/2003 growing season. The variable weights and cut-off limits used in the indices were established taking into account studies relating seed sanity to germination. The proposed indices make it possible to rank soybean genotypes for seed sanitary quality and to eliminate from the analyses the genotypes that do not reach the minimum required levels.

  17. Selection of diploid (AA) banana hybrids using three non-parametric indices

    Directory of Open Access Journals (Sweden)

    Lauro Saraiva Lessa

    2010-01-01

    Full Text Available The objective of the present study was to select diploid (AA) banana hybrids based on three non-parametric indices, in order to guide selection and increase the use of the variability present in the Banana Germplasm Bank of Embrapa Cassava and Tropical Fruits. Eleven hybrids were evaluated in a randomized block design with four replicates. The plots consisted of six plants, spaced 2.5 m x 2.5 m, with border rows of the Pacovan cultivar. Data were recorded for the following traits: plant height, pseudostem diameter, number of suckers at flowering, number of leaves at flowering, crop cycle from planting to bunch emergence, pollen presence, number of hands, number of fruits, fruit length, and resistance to yellow Sigatoka. The means of these 10 traits were used to compute the multiplicative index, the rank-sum index, and the genotype-ideotype distance index. The two hybrids with the best overall performance, SH3263 and 1318-01, were ranked first and second, respectively, by the multiplicative and rank-sum indices, while the genotype-ideotype distance index ranked them first and fourth, respectively. Although the three indices showed good correspondence between the hybrids' overall performance and their ranking, the multiplicative and rank-sum indices provided the more adequate ranking of these hybrids.

  18. Illiquidity premium and expected stock returns in the UK: A new approach

    Science.gov (United States)

    Chen, Jiaqi; Sherif, Mohamed

    2016-09-01

    This study examines the relative importance of liquidity risk for the time series and cross-section of stock returns in the UK. We propose a simple way to capture the multidimensionality of illiquidity. Our analysis indicates that existing illiquidity measures have considerable asset-specific components, which justifies our new approach. Further, we use an alternative test of the Amihud (2002) measure, together with parametric and non-parametric methods, to investigate whether liquidity risk is priced in the UK. We find that the inclusion of the illiquidity factor in the capital asset pricing model plays a significant role in explaining the cross-sectional variation in stock returns, in particular with the Fama-French three-factor model. Further, using Hansen-Jagannathan non-parametric bounds, we find that the illiquidity-augmented capital asset pricing models yield a small distance error, whereas other, non-liquidity-based models fail to yield economically plausible distance values. Our findings have important implications for managing the liquidity risk of equity portfolios.

  19. Multi-frequency image reconstruction for radio interferometry. A regularized inverse problem approach

    CERN Document Server

    Ferrari, André; Ferrari, Chiara; Mary, David; Schutz, Antony; Smirnov, Oleg

    2015-01-01

    We describe a "spatio-spectral" deconvolution algorithm for wide-band imaging in radio interferometry. In contrast with the existing multi-frequency reconstruction algorithms, the proposed method does not rely on a model of the sky-brightness spectral distribution. This non-parametric approach can be of particular interest for the new generation of low frequency radiotelescopes. The proposed solution formalizes the reconstruction problem as a convex optimization problem with spatial and spectral regularizations. The efficiency of this approach has been already proven for narrow-band image reconstruction and the present contribution can be considered as its extension to the multi-frequency case. Because the number of frequency bands multiplies the size of the inverse problem, particular attention is devoted to the derivation of an iterative large scale optimization algorithm. It is shown that the main computational bottleneck of the approach, which lies in the resolution of a linear system, can be efficiently ...

  20. Modeling critical episodes of air pollution by PM10 in Santiago, Chile: Comparison of the predictive efficiency of parametric and non-parametric statistical models

    Directory of Open Access Journals (Sweden)

    Sergio A. Alvarado

    2010-12-01

    Full Text Available Objective: To evaluate the predictive efficiency of two statistical models (one parametric and the other non-parametric) for predicting critical episodes of air pollution that exceed daily air quality standards in Santiago, Chile, using the next day's PM10 maximum 24 h value. Accurate prediction of such episodes would allow restrictive measures to be applied by health authorities to reduce their seriousness and protect the community's health. Methods: We used the PM10 concentrations registered by a station of the MACAM-2 Air Quality Monitoring Network (152 daily observations of 14 variables) and meteorological information gathered from 2001 to 2004. To construct predictive models, we fitted a parametric Gamma model using STATA v11 software and a non-parametric MARS model using a demo version of MARS v2.0 distributed by Salford Systems. Results: Both modeling methods show a high correlation between observed and predicted values. The Gamma models perform better than MARS for PM10 concentrations with values...

  1. Generation of scenarios from calibrated ensemble forecasts with a dynamic ensemble copula coupling approach

    CERN Document Server

    Bouallegue, Zied Ben; Theis, Susanne E; Pinson, Pierre

    2015-01-01

    Probabilistic forecasts in the form of ensembles of scenarios are required for complex decision-making processes. Ensemble forecasting systems provide such products, but the spatio-temporal structure of the forecast uncertainty is lost when statistical calibration of the ensemble forecasts is applied for each lead time and location independently. Non-parametric approaches allow the reconstruction of spatio-temporal joint probability distributions at a low computational cost. For example, the ensemble copula coupling (ECC) method consists in rebuilding the multivariate aspect of the forecast from the original ensemble forecasts. Based on the assumption of error stationarity, parametric methods aim to fully describe the forecast dependence structures. In this study, the concept of ECC is combined with past data statistics in order to account for the autocorrelation of the forecast error. The new approach, which preserves the dynamical development of the ensemble members, is called dynamic ensemble copula coupling (...
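
    The core ECC step is compact enough to show directly: at each location and lead time, the calibrated quantiles are reordered to follow the rank order of the raw ensemble, so the raw members' space-time structure is inherited. The numbers below are invented.

      import numpy as np

      def ecc(raw_ens, calibrated_quantiles):
          """Ensemble copula coupling: reorder calibrated quantiles (shape (m,))
          according to the ranks of the raw ensemble (shape (m,))."""
          ranks = np.argsort(np.argsort(raw_ens))
          return np.sort(calibrated_quantiles)[ranks]

      raw = np.array([3.1, 2.4, 5.0, 4.2])     # raw ensemble at one point/lead time
      cal = np.array([2.0, 3.0, 4.0, 6.5])     # calibrated quantiles
      print(ecc(raw, cal))                     # -> [3.  2.  6.5 4. ]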

  2. A state machine approach in modelling the heating process of a building

    Energy Technology Data Exchange (ETDEWEB)

    Pakanen, Jouko [Helsinki University of Technology, P.O. Box 3300, FI-02015 TKK Espoo (Finland); Karjalainen, Sami [VTT, P.O. Box 1000, FI-02044 VTT Espoo (Finland)

    2009-05-15

    Process models and their applications have gradually become an integral part of the design, maintenance and automation of modern buildings. The following state machine model outlines a new approach in this area. The heating power described by the model is based on the recent inputs as well as on the past inputs and outputs of the process, thus also representing the states of the system. Identifying the model means collecting, sorting and storing observations, but also effectively utilizing their inherent relationships and nearest neighbours. The last aspect makes it possible to create a uniform set of data, which forms the characteristic, dynamic behaviour of the HVAC process. The state machine model is non-parametric and needs no sophisticated algorithm for identification. It is therefore suitable for small microprocessor devices equipped with a larger memory capacity. The first test runs, performed in a simulated environment, were encouraging and showed good prediction capability. (author)

  3. Generation of scenarios from calibrated ensemble forecasts with a dynamic ensemble copula coupling approach

    DEFF Research Database (Denmark)

    Ben Bouallègue, Zied; Heppelmann, Tobias; Theis, Susanne E.

    2015-01-01

    Probabilistic forecasts in the form of ensembles of scenarios are required for complex decision-making processes. Ensemble forecasting systems provide such products, but the spatio-temporal structure of the forecast uncertainty is lost when statistical calibration of the ensemble forecasts is applied for each lead time and location independently. Non-parametric approaches allow the reconstruction of spatio-temporal joint probability distributions at a low computational cost. For example, the ensemble copula coupling (ECC) method consists in rebuilding the multivariate aspect of the forecast from the original ensemble forecasts. Based on the assumption of error stationarity, parametric methods aim to fully describe the forecast dependence structures. In this study, the concept of ECC is combined with past data statistics in order to account for the autocorrelation of the forecast error...

  4. Generation of scenarios from calibrated ensemble forecasts with a dual ensemble copula coupling approach

    DEFF Research Database (Denmark)

    Ben Bouallègue, Zied; Heppelmann, Tobias; Theis, Susanne E.

    2016-01-01

    Probabilistic forecasts in the form of ensembles of scenarios are required for complex decision-making processes. Ensemble forecasting systems provide such products, but the spatio-temporal structure of the forecast uncertainty is lost when statistical calibration of the ensemble forecasts is applied for each lead time and location independently. Non-parametric approaches allow the reconstruction of spatio-temporal joint probability distributions at a low computational cost. For example, the ensemble copula coupling (ECC) method rebuilds the multivariate aspect of the forecast from the original ensemble forecasts. Based on the assumption of error stationarity, parametric methods aim to fully describe the forecast dependence structures. In this study, the concept of ECC is combined with past data statistics in order to account for the autocorrelation of the forecast error. The new......

  5. [Approximation of Time Series of Paramecia caudatum Dynamics by Verhulst and Gompertz Models: Non-traditional Approach].

    Science.gov (United States)

    Nedorezov, L V

    2015-01-01

    For the approximation of some well-known time series of Paramecium caudatum population dynamics (G. F. Gause, The Struggle for Existence, 1934), the Verhulst and Gompertz models were used. The parameters were estimated for each of the models in two different ways: with the least squares method (global fitting) and with a non-traditional approach (the method of extreme points). The results obtained were compared with each other and with those presented by G. F. Gause. Deviations of the theoretical (model) trajectories from the experimental time series were tested using various non-parametric statistical tests. It was shown that the least-squares estimates lead to results that do not always meet the requirements imposed on a "fine" model. In some cases, however, a small modification of the least-squares estimates is possible, allowing a satisfactory representation of the experimental data set.
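
    For orientation, the sketch below fits both growth curves to a made-up logistic-like abundance series with scipy's global least squares (the paper's "method of extreme points" is not reproduced here); initial guesses and data are illustrative.

      import numpy as np
      from scipy.optimize import curve_fit

      def verhulst(t, K, r, n0):
          # Logistic growth: solution of dN/dt = r N (1 - N/K).
          return K / (1 + (K / n0 - 1) * np.exp(-r * t))

      def gompertz(t, K, b, c):
          # Gompertz growth curve.
          return K * np.exp(-b * np.exp(-c * t))

      # Hypothetical daily Paramecium abundances (individuals per sample).
      t = np.arange(0, 16)
      n = np.array([2, 3, 6, 10, 20, 36, 60, 90, 120, 150,
                    165, 175, 180, 182, 183, 184.])

      p_v, _ = curve_fit(verhulst, t, n, p0=[185, 0.7, 2], maxfev=10000)
      p_g, _ = curve_fit(gompertz, t, n, p0=[185, 4, 0.4], maxfev=10000)
      print("Verhulst SSE:", ((verhulst(t, *p_v) - n) ** 2).sum())
      print("Gompertz SSE:", ((gompertz(t, *p_g) - n) ** 2).sum())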

  6. Historical and current approaches of research synthesis in plant sciences Aproximaciones históricas y actuales en la síntesis de investigaciones en ciencias vegetales

    Directory of Open Access Journals (Sweden)

    Ramiro Aguilar

    2009-07-01

    Full Text Available The progress of science has depended to a great extent on the ability to reach general conclusions from a body of published research, making reviews essential to scientific development. Research reviews have provided the bases for conceptual syntheses and for the development of general theories in many research areas of the natural sciences, as in the case of plant sciences. Historical approaches include narrative and "vote-counting" types of reviews, both of which present serious flaws that limit their ability to obtain robust generalizations. Currently, quantitative reviews such as meta-analyses are able to synthesize results from independent studies in a manner that is both objective and statistically defensible. Meta-analysis has been used to synthesize disparate research findings and identify patterns, arriving at conclusions unavailable to the researchers of the primary studies. Here I outline some of the most important features of meta-analysis, provide a state of the art of meta-analyses in plant sciences, and point out some future perspectives on its use. Hopefully, readers will gain interest in and appreciation of meta-analytical techniques for arriving at meaningful generalizations in different areas of plant sciences.

  7. The Effects of Minimum Wages on the Distribution of Family Incomes: A Non-Parametric Analysis

    OpenAIRE

    David Neumark; Mark Schweitzer; William Wascher

    1998-01-01

    The primary goal of a national minimum wage floor is to raise the incomes of poor or near-poor families with members in the work force. However, estimates of employment effects of minimum wages tell us relatively little about whether minimum wages are likely to achieve this goal; even if the disemployment effects of minimum wages are modest, minimum wage increases could result in net income losses for poor and low-income families. In this paper, we present evidence on the effects of minimum w...

  8. A Java program for non-parametric statistic comparison of community structure

    Directory of Open Access Journals (Sweden)

    WenJun Zhang

    2011-09-01

    Full Text Available A Java algorithm to statistically compare the structural difference between two communities is presented in this study. Euclidean distance, Manhattan distance, Pearson correlation, point correlation, quadratic correlation and the Jaccard coefficient are included in the algorithm. The algorithm was used to compare rice arthropod communities in the Pearl River Delta, China, and the results showed that the family composition of arthropods for Guangzhou, Zhongshan, Zhuhai, and Dongguan is not significantly different.
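
    The measures named above are simple to compute; the sketch below evaluates four of them for two hypothetical abundance vectors (the original program is in Java, but Python is used here for consistency with the other sketches).

      import numpy as np

      def community_measures(a, b):
          """Distance/similarity measures between two community abundance
          vectors (taxa aligned); Jaccard uses presence/absence."""
          a, b = np.asarray(a, float), np.asarray(b, float)
          euclid = np.sqrt(((a - b) ** 2).sum())
          manhattan = np.abs(a - b).sum()
          pearson = np.corrcoef(a, b)[0, 1]
          pa, pb = a > 0, b > 0
          jaccard = (pa & pb).sum() / (pa | pb).sum()
          return {"euclidean": euclid, "manhattan": manhattan,
                  "pearson": pearson, "jaccard": jaccard}

      site1 = [12, 0, 7, 3, 1, 0]        # hypothetical family abundances
      site2 = [10, 2, 5, 0, 1, 1]
      print(community_measures(site1, site2))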

  9. Inferential, non-parametric statistics to assess the quality of probabilistic forecast systems

    NARCIS (Netherlands)

    Maia, A.H.N.; Meinke, H.B.; Lennox, S.; Stone, R.C.

    2007-01-01

    Many statistical forecast systems are available to interested users. To be useful for decision making, these systems must be based on evidence of underlying mechanisms. Once causal connections between the mechanism and its statistical manifestation have been firmly established, the forecasts must al

  10. Applications of non-parametric statistics and analysis of variance on sample variances

    Science.gov (United States)

    Myers, R. H.

    1981-01-01

    Nonparametric methods that are available for NASA-type applications are discussed. An attempt is made here to survey what can be used, to offer recommendations as to when each method is applicable, and to compare the methods, when possible, with the usual normal-theory procedures that are available for the Gaussian analog. It is important here to point out the hypotheses being tested, the assumptions being made, and the limitations of the nonparametric procedures. The appropriateness of performing analysis of variance on sample variances is also discussed and studied. This procedure is followed in several NASA simulation projects. On the surface this would appear to be a reasonably sound procedure. However, the difficulties involved center around the normality problem and the basic homogeneous-variance assumption that is made in usual analysis of variance problems. These difficulties are discussed and guidelines are given for using the methods.

  11. Non-parametric Bayesian graph models reveal community structure in resting state fMRI

    DEFF Research Database (Denmark)

    Andersen, Kasper Winther; Madsen, Kristoffer H.; Siebner, Hartwig Roman

    2014-01-01

    Modeling of resting state functional magnetic resonance imaging (rs-fMRI) data using network models is of increasing interest. It is often desirable to group nodes into clusters to interpret the communication patterns between nodes. In this study we consider three different nonparametric Bayesian...

  12. Testing the Non-Parametric Conditional CAPM in the Brazilian Stock Market

    Directory of Open Access Journals (Sweden)

    Daniel Reed Bergmann

    2014-04-01

    Full Text Available This paper seeks to analyze whether the variations of returns and systematic risks of Brazilian portfolios can be explained by the non-parametric conditional Capital Asset Pricing Model (CAPM) of Wang (2002). There are four informational variables available to investors: (i) the Brazilian industrial production level; (ii) the broad money supply M4; (iii) inflation, represented by the Índice de Preços ao Consumidor Amplo (IPCA); and (iv) the real-dollar exchange rate, given by the PTAX dollar quotation. This study comprised the shares listed on the BOVESPA from January 2002 to December 2009. The test methodology developed by Wang (2002), and applied to the Mexican context by Castillo-Spíndola (2006), was used. The observed results indicate that the non-parametric conditional model is relevant in explaining the portfolios' returns of the considered sample for two of the four tested variables, M4 and the PTAX dollar, at the 5% level of significance.

  13. Comparing non-parametric methods for ungrouping coarsely aggregated age-specific distributions

    DEFF Research Database (Denmark)

    Rizzi, Silvia; Thinggaard, Mikael; Vaupel, James W.

    2016-01-01

    Demographers often have access to vital statistics that are less than ideal for the purpose of their research. In many instances demographic data are reported in coarse histograms, where the values given are only the summation of true latent values, thereby making detailed analysis troublesome. ...

  14. Non-parametric estimation of the availability in a general repairable system

    Energy Technology Data Exchange (ETDEWEB)

    Gamiz, M.L. [Departamento de Estadistica e I.O., Facultad de Ciencias, Universidad de Granada, Granada 18071 (Spain)], E-mail: mgamiz@ugr.es; Roman, Y. [Departamento de Estadistica e I.O., Facultad de Ciencias, Universidad de Granada, Granada 18071 (Spain)

    2008-08-15

    This work deals with repairable systems with unknown failure and repair time distributions. We focus on the estimation of the instantaneous availability, that is, the probability that the system is functioning at a given time, which we consider the most significant measure for evaluating the effectiveness of a repairable system. The estimation of the availability function is not, in general, an easy task, i.e., analytical techniques are difficult to apply. We propose a smooth estimate of the availability based on kernel estimators of the cumulative distribution functions (CDFs) of the failure and repair times, for which the bandwidth parameters are obtained by bootstrap procedures. The consistency properties of the availability estimator are established by using techniques based on the Laplace transform.
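
    The building block of such an estimator, a kernel-smoothed CDF, is shown below for hypothetical failure times; turning two such CDFs into the availability function (via renewal equations or Laplace transforms) is beyond this sketch, and the bandwidth here is fixed rather than bootstrap-selected.

      import numpy as np
      from scipy.stats import norm

      def kernel_cdf(sample, x, bandwidth):
          """Smooth (kernel) estimate of a CDF: average of Gaussian kernels
          integrated up to each point of x."""
          sample = np.asarray(sample)
          return norm.cdf((x[:, None] - sample[None, :]) / bandwidth).mean(axis=1)

      rng = np.random.default_rng(6)
      failures = rng.weibull(1.5, size=60) * 100     # hypothetical failure times
      grid = np.linspace(0, 300, 7)
      print(kernel_cdf(failures, grid, bandwidth=15.0).round(3))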

  15. Non-parametric group-level statistics for source-resolved ERP analysis.

    Science.gov (United States)

    Lee, Clement; Miyakoshi, Makoto; Delorme, Arnaud; Cauwenberghs, Gert; Makeig, Scott

    2015-01-01

    We have developed a new statistical framework for group-level event-related potential (ERP) analysis in EEGLAB. The framework calculates the variance of scalp channel signals accounted for by the activity of homogeneous clusters of sources found by independent component analysis (ICA). When ICA data decomposition is performed on each subject's data separately, functionally equivalent ICs can be grouped into EEGLAB clusters. Here, we report a new addition (statPvaf) to the EEGLAB plug-in std_envtopo to enable inferential statistics on main effects and interactions in event related potentials (ERPs) of independent component (IC) processes at the group level. We demonstrate the use of the updated plug-in on simulated and actual EEG data.

  16. [Non-parametric Bootstrap estimation on the intraclass correlation coefficient generated from quantitative hierarchical data].

    Science.gov (United States)

    Liang, Rong; Zhou, Shu-dong; Li, Li-xia; Zhang, Jun-guo; Gao, Yan-hui

    2013-09-01

    This paper aims to implement bootstrapping for hierarchical data and to provide a method for estimating the confidence interval (CI) of the intraclass correlation coefficient (ICC). First, we use the mixed-effects model to estimate the ICC from repeated-measurement data and from two-stage sampling data. Then, we use the bootstrap method to estimate the CIs of the related ICCs. Finally, the influences of different bootstrapping strategies on the ICC's CIs are compared. The repeated-measurement example shows that the CI from cluster bootstrapping contains the true ICC value, whereas the random bootstrapping method, which ignores the hierarchical structure of the data, yields an invalid CI. Results from the two-stage example show bias among the cluster-bootstrapped ICC means; the ICC of the original sample is the smallest, but has a wide CI. It is necessary to take the structure of the data into account when hierarchical data are resampled; bootstrapping at higher levels appears to perform better than at lower levels.
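
    A minimal sketch of the cluster-bootstrap idea, assuming balanced two-level data and the one-way ANOVA ICC estimator; whole clusters, not individual rows, are resampled. The data and true ICC below are simulated, not from the paper.

      import numpy as np

      def icc_oneway(data):
          # ANOVA estimator of ICC(1) for a balanced clusters-x-members array.
          k = data.shape[1]
          msb = k * data.mean(axis=1).var(ddof=1)   # between-cluster mean square
          msw = data.var(axis=1, ddof=1).mean()     # within-cluster mean square
          return (msb - msw) / (msb + (k - 1) * msw)

      def cluster_bootstrap_ci(data, n_boot=2000, alpha=0.05, seed=0):
          # Percentile CI for the ICC, resampling whole clusters.
          rng = np.random.default_rng(seed)
          n = data.shape[0]
          stats = [icc_oneway(data[rng.integers(0, n, n)]) for _ in range(n_boot)]
          return np.percentile(stats, [100 * alpha / 2, 100 * (1 - alpha / 2)])

      # Hypothetical hierarchical data: 30 clusters, 5 measurements each.
      rng = np.random.default_rng(1)
      u = rng.normal(0, 1, size=(30, 1))            # cluster effects
      y = u + rng.normal(0, 1, size=(30, 5))        # true ICC = 0.5
      print(icc_oneway(y), cluster_bootstrap_ci(y))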

  17. Choosing the best non-parametric richness estimator for benthic macroinvertebrates databases

    Directory of Open Access Journals (Sweden)

    Carola V. BASUALDO

    2011-01-01

    Full Text Available Non-parametric estimators make it possible to compare the estimated richness of data sets of diverse origin. However, since their behavior depends on the abundance distribution of the data set, the preference for one of them represents a difficult decision. This work draws together criteria present in the literature for choosing the most suitable estimator for benthic macroinvertebrates of rivers, and offers some tools for their application. Four incidence-based and two abundance-based estimators were applied to a regional inventory at the family and genus levels. Their evaluation considered: the subsample size needed to estimate the observed richness, the constancy of that subsample size, the absence of erratic behavior, and the similarity of the curve shape across the different data sets. Among the incidence-based estimators, the best was Jack1; among the abundance-based ones, ACE for low-richness samples and Chao1 for high-richness samples. The uniform shape of the curves made it possible to describe general behavior sequences, which can be used as references for comparing curves from small samples and inferring their probable behavior, and richness, if the sample were larger. These results can be very useful for environmental management, and they update the state of regional knowledge of macroinvertebrates.
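
    For reference, the two winning estimators have simple closed forms; the sketch below implements the standard Chao1 and first-order jackknife formulas on toy counts (the data are invented, not from the regional inventory).

      import numpy as np

      def chao1(abundances):
          # Chao1 richness estimate from species abundance counts.
          a = np.asarray(abundances)
          s_obs = (a > 0).sum()
          f1, f2 = (a == 1).sum(), (a == 2).sum()
          if f2 == 0:                        # bias-corrected form, no doubletons
              return s_obs + f1 * (f1 - 1) / 2.0
          return s_obs + f1 ** 2 / (2.0 * f2)

      def jackknife1(incidence):
          # First-order jackknife from a samples-x-species presence matrix.
          inc = np.asarray(incidence, dtype=bool)
          m = inc.shape[0]                   # number of samples
          s_obs = inc.any(axis=0).sum()
          q1 = (inc.sum(axis=0) == 1).sum()  # species seen in exactly one sample
          return s_obs + q1 * (m - 1) / m

      # Toy data: counts of 6 taxa, and presence/absence over 3 samples.
      print(chao1([12, 5, 2, 1, 1, 0]))
      print(jackknife1([[1, 0, 0, 1], [0, 1, 0, 1], [1, 1, 0, 0]]))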

  18. Non-parametric PSF estimation from celestial transit solar images using blind deconvolution

    CERN Document Server

    Gonzalez, Adriana; Jacques, Laurent

    2016-01-01

    Context: Characterization of instrumental effects in astronomical imaging is important in order to extract accurate physical information from the observations. Optics are never perfect and the non-ideal path through the telescope is usually represented by the convolution of an ideal image with a Point Spread Function (PSF). Other sources of noise (read-out, photon) also contaminate the image acquisition process. The problem of estimating both the PSF filter and a denoised image is called blind deconvolution and is ill-posed. Aims: We propose a blind deconvolution scheme that relies on image regularization. Contrary to most methods presented in the literature, it does not assume a parametric model of the PSF and can thus be applied to any telescope. Methods: Our scheme uses a wavelet analysis image prior model and weak assumptions on the PSF filter's response. We use the observations from a celestial body transit where such an object can be assumed to be a black disk. Such constraints limit the interchangeabil...

  19. Improving Adaptive Importance Sampling Simulation of Markovian Queueing Models using Non-parametric Smoothing

    NARCIS (Netherlands)

    Woudt, Edwin; de Boer, Pieter-Tjerk; van Ommeren, Jan C.W.

    2007-01-01

    Previous work on state-dependent adaptive importance sampling techniques for the simulation of rare events in Markovian queueing models used either no smoothing or a parametric smoothing technique, which was known to be non-optimal. In this paper, we introduce the use of kernel smoothing in this con

  20. A new non-parametric stationarity test of time series in the time domain

    KAUST Repository

    Jin, Lei

    2014-11-07

    © 2015 The Royal Statistical Society and Blackwell Publishing Ltd. We propose a new double-order selection test for checking second-order stationarity of a time series. To develop the test, a sequence of systematic samples is defined via Walsh functions. Then the deviations of the autocovariances based on these systematic samples from the corresponding autocovariances of the whole time series are calculated and the uniform asymptotic joint normality of these deviations over different systematic samples is obtained. With a double-order selection scheme, our test statistic is constructed by combining the deviations at different lags in the systematic samples. The null asymptotic distribution of the statistic proposed is derived and the consistency of the test is shown under fixed and local alternatives. Simulation studies demonstrate well-behaved finite sample properties of the method proposed. Comparisons with some existing tests in terms of power are given both analytically and empirically. In addition, the method proposed is applied to check the stationarity assumption of a chemical process viscosity readings data set.
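
    The full double-order selection test is involved, but its underlying quantity, deviations between subsample and whole-series autocovariances, can be illustrated with a crude split-sample analogue. The toy below uses the two halves of the series rather than the paper's Walsh-function systematic samples, and gives only an informal diagnostic, not the test statistic.

      import numpy as np

      def autocov(x, lag):
          # Sample autocovariance at a given lag.
          xc = np.asarray(x, dtype=float)
          xc = xc - xc.mean()
          return (xc[: xc.size - lag] * xc[lag:]).mean()

      def split_deviations(x, lags=(0, 1, 2, 3)):
          # Under second-order stationarity, autocovariances of the two halves
          # estimate the same values; large half-vs-whole deviations hint at
          # non-stationarity.
          halves = np.array_split(np.asarray(x, dtype=float), 2)
          return np.array([[autocov(h, k) - autocov(x, k) for k in lags]
                           for h in halves])

      rng = np.random.default_rng(2)
      stationary = rng.normal(size=2000)
      drifting = rng.normal(size=2000) * np.linspace(0.5, 2.0, 2000)  # variance drifts
      print(np.abs(split_deviations(stationary)).max())   # small
      print(np.abs(split_deviations(drifting)).max())     # clearly larger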

  1. Non-parametric Bayesian networks for parameter estimation in reservoir engineering

    NARCIS (Netherlands)

    Zilko, A.A.; Hanea, A.M.; Hanea, R.G.

    2013-01-01

    The ultimate goal in reservoir engineering is to optimize hydrocarbon recovery from a reservoir. To achieve the goal, good knowledge of the subsurface properties is crucial. One of these properties is the permeability. Ensemble Kalman Filter (EnKF) is the most common tool used to deal with this

  2. Non-Parametric, Closed-Loop Testing of Autonomy in Unmanned Aircraft Systems Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed Phase I program aims to develop new methods to support safety testing for integration of Unmanned Aircraft Systems into the National Airspace (NAS) with...

  3. Non-parametric classification of esophagus motility by means of neural networks

    DEFF Research Database (Denmark)

    Thøgersen, C; Rasmussen, C; Rutz, K

    1997-01-01

    ...The aim of the present work has been to test the ability of neural networks to identify abnormal contraction patterns in patients with non-obstructive dysphagia (NOBD). Nineteen volunteers and 22 patients with NOBD underwent simultaneous recordings of four pressures in the esophagus for at least 23 hours...

  4. On Non-Parametric Field Estimation using Randomly Deployed, Noisy, Binary Sensors

    CERN Document Server

    Wang, Ye

    2007-01-01

    We consider the problem of reconstructing a deterministic data field from binary quantized noisy observations of sensors randomly deployed over the field domain. Our focus is on the extremes of lack of control in the sensor deployment, arbitrariness and lack of knowledge of the noise distribution, and low-precision and unreliability in the sensors. These adverse conditions are motivated by possible real-world scenarios where a large collection of low-cost, crudely manufactured sensors are mass-deployed in an environment where little can be assumed about the ambient noise. We propose a simple estimator that reconstructs the entire data field from these unreliable, binary quantized, noisy observations. Under the assumption of a bounded amplitude field, we prove almost sure and mean-square convergence of the estimator to the actual field as the number of sensors tends to infinity. For fields with bounded-variation, Sobolev differentiable, or finite-dimensionality properties, we derive specific mean squared error...

  5. The Efficiency Change of Italian Public Universities in the New Millennium: A Non-Parametric Analysis

    Science.gov (United States)

    Guccio, Calogero; Martorana, Marco Ferdinando; Mazza, Isidoro

    2017-01-01

    The paper assesses the evolution of efficiency of Italian public universities for the period 2000-2010. It aims at investigating whether their levels of efficiency showed signs of convergence, and if the well-known disparity between northern and southern regions decreased. For this purpose, we use a refinement of data envelopment analysis, namely…

  6. Tremor Detection Using Parametric and Non-Parametric Spectral Estimation Methods : A Comparison with Clinical Assessment

    NARCIS (Netherlands)

    Martinez Manzanera, Octavio; Elting, Jan Willem; van der Hoeven, Johannes H; Maurits, Natasha M

    2016-01-01

    In the clinic, tremor is diagnosed during a time-limited process in which patients are observed and the characteristics of tremor are visually assessed. For some tremor disorders, a more detailed analysis of these characteristics is needed. Accelerometry and electromyography can be used to obtain a

  7. Sparse Representations for Image Classification: Learning Discriminative and Reconstructive Non-Parametric Dictionaries

    Science.gov (United States)

    2008-06-01

  8. Bayesian analysis of energy balance data from growing cattle using parametric and non-parametric modelling

    NARCIS (Netherlands)

    Moraes, L.E.; Kebreab, E.; Strathe, A.B.; France, J.; Dijkstra, J.; Casper, D.; Fadel, J.G.

    2014-01-01

    Linear and non-linear models have been extensively utilised for the estimation of net and metabolisable energy requirements and for the estimation of the efficiencies of utilising dietary energy for maintenance and tissue gain. In growing animals, biological principles imply that energy retention ra

  9. Afrika Statistika ISSN 2316-090X Bidimensional non-parametric ...

    African Journals Online (AJOL)

    for example, income, expenditures, caloric consumption, life expectancy, height, body weight, extent of ... K(x) is a Borel function satisfying the following hypotheses: (H1) ... considered Gaussian kernels that fulfil H_i, i = 1, ..., 6, with bandwidths h_j = 1/√...

  10. Measuring the influence of networks on transaction costs using a non-parametric regression technique

    DEFF Research Database (Denmark)

    Henningsen, Géraldine; Henningsen, Arne; Henning, Christian H.C.A.

    All business transactions as well as achieving innovations take up resources, subsumed under the concept of transaction costs. One of the major factors in transaction costs theory is information. Firm networks can catalyse the interpersonal information exchange and hence, increase the access to n...

  11. Comparison of non-parametric methods for ungrouping coarsely aggregated data

    DEFF Research Database (Denmark)

    Rizzi, Silvia; Thinggaard, Mikael; Engholm, Gerda;

    2016-01-01

    group at the highest ages. When histogram intervals are too coarse, information is lost and comparison between histograms with different boundaries is arduous. In these cases it is useful to estimate detailed distributions from grouped data. Methods From an extensive literature search we identify five...

  12. Inferential, non-parametric statistics to assess the quality of probabilistic forecast systems

    NARCIS (Netherlands)

    Maia, A.H.N.; Meinke, H.B.; Lennox, S.; Stone, R.C.

    2007-01-01

    Many statistical forecast systems are available to interested users. To be useful for decision making, these systems must be based on evidence of underlying mechanisms. Once causal connections between the mechanism and its statistical manifestation have been firmly established, the forecasts must al

  13. A Non Parametric Model for the Forecasting of the Venezuelan Oil Prices

    CERN Document Server

    Costanzo, Sabatino; Dehne, Wafaa; Prato, Hender

    2007-01-01

    A neural net model for forecasting the prices of Venezuelan crude oil is proposed. The inputs of the neural net are selected by reference to a dynamic system model of oil prices by Mashayekhi (1995, 2001) and its performance is evaluated using two criteria: the Excess Profitability test by Anatoliev and Gerko (2005) and the characteristics of the equity curve generated by a trading strategy based on the neural net predictions. ----- A non-parametric model for forecasting Venezuelan oil prices is introduced, whose inputs are selected on the basis of a dynamic system that explains prices in terms of those inputs. The data collection and pre-processing and the running of the network are described, and its forecasts are evaluated through a statistical test of predictability and through the characteristics of the equity curve induced by the trading strategy generated by these forecasts.

  14. Non-parametric causality detection: An application to social media and financial data

    Science.gov (United States)

    Tsapeli, Fani; Musolesi, Mirco; Tino, Peter

    2017-10-01

    According to behavioral finance, stock market returns are influenced by emotional, social and psychological factors. Several recent works support this theory by providing evidence of correlation between stock market prices and collective sentiment indexes measured using social media data. However, a pure correlation analysis is not sufficient to prove that stock market returns are influenced by such emotional factors, since both stock market prices and collective sentiment may be driven by a third unmeasured factor. Controlling for factors that could influence the study by applying multivariate regression models is challenging given the complexity of stock market data. False assumptions about the linearity or non-linearity of the model and inaccuracies in model specification may result in misleading conclusions. In this work, we propose a novel framework for causal inference that does not require any assumption about a particular parametric form of the model expressing statistical relationships among the variables of the study and can effectively control a large number of observed factors. We apply our method to estimate the causal impact that information posted on social media may have on the stock market returns of four big companies. Our results indicate that social media data not only correlate with stock market returns but also influence them.

  15. Non-parametric probabilistic forecasts of wind power: required properties and evaluation

    DEFF Research Database (Denmark)

    Pinson, Pierre; Nielsen, Henrik Aalborg; Møller, Jan Kloppenborg;

    2007-01-01

    Predictions of wind power production for horizons up to 48-72 hours ahead comprise a highly valuable input to the methods for the daily management or trading of wind generation. Today, users of wind power predictions are provided not only with point predictions, which are estimates of the conditional expectation of future generation for each look-ahead time, but also with uncertainty estimates given by probabilistic forecasts. In order to avoid assumptions on the shape of predictive distributions, these probabilistic predictions are produced from nonparametric methods, and then take the form of a single or a set of quantile forecasts. The required and desirable properties of such probabilistic forecasts are defined and a framework for their evaluation is proposed. This framework is applied for evaluating the quality of two statistical methods producing full predictive distributions from point...

  16. A Probabilistic, Non-parametric Framework for Inter-modality Label Fusion

    DEFF Research Database (Denmark)

    Iglesias, Juan Eugenio; Sabuncu, Mert Rory; Van Leemput, Koen

    2013-01-01

    Multi-atlas techniques are commonplace in medical image segmentation due to their high performance and ease of implementation. Locally weighting the contributions from the different atlases in the label fusion process can improve the quality of the segmentation. However, how to define these weights in a principled way in inter-modality scenarios remains an open problem. Here we propose a label fusion scheme that does not require voxel intensity consistency between the atlases and the target image to segment. The method is based on a generative model of image data in which each intensity in the atlases has an associated conditional distribution of corresponding intensities in the target. The segmentation is computed using variational expectation maximization (VEM) in a Bayesian framework. The method was evaluated with a dataset of eight proton density weighted brain MRI scans with nine labeled structures...

  17. Distributed Non-Parametric Representations for Vital Filtering: UW at TREC KBA 2014

    Science.gov (United States)

    2014-11-01

  18. How are teachers teaching? A nonparametric approach

    NARCIS (Netherlands)

    de Witte, K.; van Klaveren, C.

    2010-01-01

    This paper examines which configuration of teaching activities (expressed in, e.g., problem solving, homework, lecturing) maximizes student performance. To do so, it formulates a non-parametric efficiency model that is rooted in the Data Envelopment Analysis literature. In the model, we account for

  20. A robust approach for tree segmentation in deciduous forests using small-footprint airborne LiDAR data

    Science.gov (United States)

    Hamraz, Hamid; Contreras, Marco A.; Zhang, Jun

    2016-10-01

    This paper presents a non-parametric approach for segmenting trees from airborne LiDAR data in deciduous forests. Based on the LiDAR point cloud, the approach collects crown information such as steepness and height on-the-fly to delineate crown boundaries, and most importantly, does not require a priori assumptions about crown shape and size. The approach segments trees iteratively, starting from the tallest within a given area down to the smallest, until all trees have been segmented. To evaluate its performance, the approach was applied to the University of Kentucky Robinson Forest, a deciduous closed-canopy forest with complex terrain and vegetation conditions. The approach identified 94% of dominant and co-dominant trees with a false detection rate of 13%. About 62% of intermediate, overtopped, and dead trees were also detected, with a false detection rate of 15%. The overall segmentation accuracy was 77%. Correlations of the segmentation scores of the proposed approach with local terrain and stand metrics were not significant, which is likely an indication of the robustness of the approach, as results are not sensitive to differences in terrain and stand structures.
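
    The published point-cloud algorithm is more elaborate, but the tallest-first iteration can be caricatured on a rasterized canopy height model; the 4-neighbour monotonic-descent growth rule and the min_height floor below are assumptions of this toy, not the paper's method.

      import numpy as np
      from collections import deque

      def segment_tallest_first(chm, min_height=2.0):
          # Toy tallest-first segmentation of a canopy height model (2-D array):
          # seed a crown at the global maximum, grow it to 4-neighbours whose
          # height decreases monotonically away from the apex, remove it, repeat.
          h = np.array(chm, dtype=float)
          labels, tree_id = np.zeros(h.shape, dtype=int), 0
          while h.max() >= min_height:
              tree_id += 1
              seed = np.unravel_index(np.argmax(h), h.shape)
              queue, crown = deque([seed]), {seed}
              while queue:
                  r, c = queue.popleft()
                  for nb in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                      if (0 <= nb[0] < h.shape[0] and 0 <= nb[1] < h.shape[1]
                              and nb not in crown
                              and min_height <= h[nb] <= h[r, c]):
                          crown.add(nb)
                          queue.append(nb)
              for cell in crown:
                  labels[cell] = tree_id
                  h[cell] = 0.0          # remove segmented crown, then iterate
          return labels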

  1. Sensor fusion approaches for EMI and GPR-based subsurface threat identification

    Science.gov (United States)

    Torrione, Peter; Morton, Kenneth, Jr.; Besaw, Lance E.

    2011-06-01

    Despite advances in both electromagnetic induction (EMI) and ground penetrating radar (GPR) sensing and related signal processing, neither sensor alone provides a perfect tool for detecting the myriad of possible buried objects that threaten the lives of Soldiers and civilians. However, while neither GPR nor EMI sensing alone can provide optimal detection across all target types, the two approaches are highly complementary. As a result, many landmine systems seek to make use of both sensing modalities simultaneously and fuse the results from both sensors to improve detection performance for targets with widely varying metal content and GPR responses. Despite this, little work has focused on large-scale comparisons of different approaches to sensor fusion and machine learning for combining data from these highly orthogonal phenomenologies. In this work we explore a wide array of pattern recognition techniques for algorithm development and sensor fusion. Results with the ARA Nemesis landmine detection system suggest that nonlinear and non-parametric classification algorithms provide significant performance benefits for single-sensor algorithm development, and that fusion of multiple algorithms can be performed satisfactorily using basic parametric approaches, such as logistic discriminant classification, for the targets under consideration in our data sets.
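
    As a toy illustration of the basic parametric fusion step mentioned above, the sketch below trains a logistic discriminant over hypothetical per-alarm confidence scores from an EMI and a GPR algorithm; the scores and labels are simulated, not from the ARA Nemesis data.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)
      n = 500
      y = rng.integers(0, 2, n)                                  # 1 = buried target
      emi = y * rng.normal(1.0, 1.0, n) + rng.normal(0, 1, n)    # metal-sensitive score
      gpr = y * rng.normal(1.5, 1.0, n) + rng.normal(0, 1, n)    # radar score
      X = np.column_stack([emi, gpr])

      fusion = LogisticRegression().fit(X, y)
      fused_confidence = fusion.predict_proba(X)[:, 1]           # single fused score
      print(fusion.coef_, fused_confidence[:5])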

  2. Do trend extraction approaches affect causality detection in climate change studies?

    Science.gov (United States)

    Huang, Xu; Hassani, Hossein; Ghodsi, Mansi; Mukherjee, Zinnia; Gupta, Rangan

    2017-03-01

    Various scientific studies have investigated the causal link between solar activity (SS) and the earth's temperature (GT). Results from the literature indicate that both the detected structural breaks and the existing trend have significant effects on the causality detection outcomes. In this paper, we contribute to this literature by evaluating and comparing seven trend extraction methods covering various aspects of trend extraction studies to date. In addition, we extend previous work by using Convergent Cross Mapping (CCM), an advanced non-parametric causality detection technique, to provide evidence on the effect of the existing trend in global temperature on the causality detection outcome. This paper illustrates the use of a method to find the most reliable trend extraction approach for data preprocessing, and provides detailed analyses of the causality detection of each component by this approach, to achieve a better understanding of the causal link between SS and GT. Furthermore, the corresponding CCM results indicate an increasing significance of the causal effect from SS to GT from 1880 to recent years, which provides solid evidence that may contribute to explaining the escalating global warming tendency of recent decades.

  3. Quality assessment of Isfahan Medical Faculty web site electronic services and prioritizing solutions using analytic hierarchy process approach.

    Science.gov (United States)

    Hajrahimi, Nafiseh; Dehaghani, Sayed Mehdi Hejazi; Hajrahimi, Nargess; Sarmadi, Sima

    2014-01-01

    Implementing information technology in the best possible way can bring many advantages, such as providing electronic services and facilitating tasks. Assessment of service-providing systems is therefore a way to improve their quality, including e-commerce, e-government, e-banking, and e-learning systems. This study aimed to evaluate the electronic services of the Isfahan University of Medical Sciences website in order to propose solutions to improve them, and to rank the solutions based on the factors that enhance the quality of electronic services, using the analytic hierarchy process (AHP) method. A non-parametric test was used to assess the quality of the electronic services. The assessment of the propositions was based on the Aqual model, and they were prioritized using the AHP approach, which directly applies experts' judgments in the model and leads to more objective results in analyzing and prioritizing the risks. After evaluating the quality of the electronic services, a multi-criteria decision-making framework was used to prioritize the proposed solutions, applying non-parametric tests and the AHP approach in Expert Choice software. The results showed that students were satisfied with most of the indicators; only a few indicators received low satisfaction from students, including design attractiveness, the amount of explanation and detail of information, honesty and responsiveness of authorities, and the role of e-services in the user's relationship with the university. After interviews with Information and Communications Technology (ICT) experts at the university, measurement criteria and solutions to improve the quality were collected, and the best solutions were selected with the Expert Choice software. According to the results, the solution "controlling and improving the process of handling user complaints" is of the utmost importance, and authorities should implement it on the website and place great importance on keeping this process updated.
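
    For reference, the heart of AHP is extracting priority weights from a pairwise-comparison matrix via its principal eigenvector; the sketch below shows this on a hypothetical 3-criterion matrix (Saaty 1-9 scale), not on the study's Expert Choice model.

      import numpy as np

      def ahp_weights(pairwise):
          # Priority weights from an AHP pairwise-comparison matrix via its
          # principal eigenvector, plus the consistency index CI.
          A = np.asarray(pairwise, dtype=float)
          vals, vecs = np.linalg.eig(A)
          k = np.argmax(vals.real)
          w = np.abs(vecs[:, k].real)
          w /= w.sum()
          ci = (vals.real[k] - len(A)) / (len(A) - 1)
          return w, ci

      # Hypothetical 3-criterion comparison matrix, for illustration only.
      A = [[1, 3, 5],
           [1 / 3, 1, 2],
           [1 / 5, 1 / 2, 1]]
      w, ci = ahp_weights(A)
      print(w, ci)  # weights sum to 1; CI near 0 means consistent judgments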

  4. An Integrated and Multivariate Model along with Designing Experiments Approach for Assessment of Micro- and Macro- Ergonomic Factors: The Case of a Gas Refinery.

    Science.gov (United States)

    Azadeh, A; Mohammadfam, I; Sadjadi, M; Hamidi, Y; Kianfar, A

    2008-12-28

    The objectives of this paper are threefold. First, an integrated framework for the design and development of an integrated health, safety and environment (HSE) model is presented. Second, it is implemented and tested for a large gas refinery in Iran. Third, it is shown whether the total ergonomics model is superior to the conventional ergonomics approach. This study is among the first to examine total ergonomics components in a manufacturing system. It was conducted in the Sarkhoon & Qeshm gas refinery, Iran, in 2006. To achieve the above objectives, an integrated approach based on total ergonomics factors was developed. Second, it was applied to the refinery and the advantages of the total ergonomics approach are discussed. Third, the impacts of total ergonomics factors on local factors are examined through non-parametric statistical analysis. It was shown that the total ergonomics model is much more beneficial than the conventional approach. It should be noted that the traditional ergonomics methodology is not capable of locating the findings of the total ergonomics model. The distinguishing aspect of this study is the employment of a total system approach based on the integration of conventional ergonomics factors with HSE factors.

  5. A novel data-driven approach to model error estimation in Data Assimilation

    Science.gov (United States)

    Pathiraja, Sahani; Moradkhani, Hamid; Marshall, Lucy; Sharma, Ashish

    2016-04-01

    Error characterisation is a fundamental component of Data Assimilation (DA) studies. Effectively describing model error statistics has been a challenging area, with many traditional methods requiring some level of subjectivity (for instance in defining the error covariance structure). Recent advances have focused on removing the need for tuning of error parameters, although some outstanding issues remain. Many methods focus only on the first and second moments, and rely on assuming multivariate Gaussian statistics. We propose a non-parametric, data-driven framework to estimate the full distributional form of model error, i.e. the transition density p(xt|xt-1). All sources of uncertainty associated with the model simulations are considered, without needing to assign error characteristics or devise stochastic perturbations for individual components of model uncertainty (e.g. input, parameter and structural). A training period is used to derive the error distribution of observed variables, conditioned on (potentially hidden) states. Errors in hidden states are estimated from the conditional distribution of observed variables using non-linear optimization. The framework is discussed in detail, and an application to a hydrologic case study with hidden states for one-day-ahead streamflow prediction is presented. Results demonstrate improved predictions and more realistic uncertainty bounds compared to a standard tuning approach.
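
    A minimal sketch of estimating a transition density p(xt|xt-1) non-parametrically, using the conditional kernel density estimator in statsmodels; the toy AR(1) series below stands in for the hydrologic training period, and the bandwidth rule is a default, not the paper's choice.

      import numpy as np
      from statsmodels.nonparametric.kernel_density import KDEMultivariateConditional

      rng = np.random.default_rng(0)
      x = np.zeros(1000)
      for t in range(1, x.size):
          x[t] = 0.8 * x[t - 1] + rng.normal(scale=0.5)  # toy AR(1) "model error" data

      kde = KDEMultivariateConditional(endog=[x[1:]], exog=[x[:-1]],
                                       dep_type='c', indep_type='c',
                                       bw='normal_reference')

      # Density of x_t values given that the previous state was 1.0:
      grid = np.linspace(-1, 3, 5)
      print(kde.pdf(endog_predict=grid, exog_predict=np.full(5, 1.0)))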

  6. SECONDARY SCHOOL HEAD TEACHERS’ JOB SATISFACTION IN SAUDI ARABIA: THE RESULTS OF A MIXED METHODS APPROACH

    Directory of Open Access Journals (Sweden)

    AHMED MOHAMED ALZAIDI

    2008-11-01

    Full Text Available This paper aims to identify the factors which might affect secondary school head teachers' job satisfaction in the city of Jeddah, Saudi Arabia. The study adopts a sequential exploratory strategy based on a mixed methods approach. The qualitative data generated identified the factors leading to job satisfaction and dissatisfaction. The factors fall into eight major themes: relationship with the educational administration, head teachers' practices, the school environment, relationships with students and parents, head teachers' authority, relationship with educational supervision and relationships with teachers. The quantitative data reveal that the factors causing dissatisfaction are: lack of authority to transfer underperforming teachers, lack of finance and manpower for the cleaning of school buildings, lack of financial resources to improve school buildings, salary, poor revenue from school meals as a financial resource, and lack of financial reward. To explore the relationship between job satisfaction and the selected variables, a Kruskal-Wallis (non-parametric) statistical test revealed significant differences in job satisfaction in terms of morale, relationship with the educational administration, the school environment, head teachers' authority and overall job satisfaction according to educational supervision centers. In addition, a Kruskal-Wallis test revealed significant differences in job satisfaction in head teachers' practices according to completion of the head teachers' training programme. However, there were no significant differences in job satisfaction related to experience, student numbers, head teachers' qualification, age and school building type. The paper identifies the highly centralised educational system in Saudi Arabia and the lack of autonomy as factors that affect job satisfaction.
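
    For reference, the Kruskal-Wallis test used above is a one-liner in scipy; the satisfaction scores below are invented to show the call, not survey data.

      from scipy import stats

      # Hypothetical job-satisfaction scores grouped by supervision centre.
      centre_a = [62, 71, 58, 66, 70]
      centre_b = [55, 49, 60, 52, 57]
      centre_c = [73, 77, 68, 80, 75]

      h, p = stats.kruskal(centre_a, centre_b, centre_c)
      print(f"H = {h:.2f}, p = {p:.4f}")  # small p: satisfaction differs by centre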

  7. A novel scan statistics approach for clustering identification and comparison in binary genomic data.

    Science.gov (United States)

    Pellin, Danilo; Di Serio, Clelia

    2016-09-22

    In biomedical research a relevant issue is to identify time intervals or portions of an n-dimensional support where a particular event of interest is more likely to occur than expected. Algorithms that require the number, dimension or length of clusters assumed for the data to be specified a priori suffer from a high degree of arbitrariness whenever no precise information is available, and this may strongly affect the final parameter estimates. Within this framework, spatial scan statistics have been proposed in the literature, representing a valid non-parametric alternative. We adapt the so-called Bernoulli-model scan statistic to the genomic field and propose a multivariate extension, named the Relative Scan Statistic, for the comparison of two series of Bernoulli random variables defined over a common support, with the final goal of highlighting unshared event rate variations. Using a probabilistic approach based on success probability estimates and their comparison (likelihood based), we exploit a hypothesis testing procedure to identify clusters and relative clusters. Both the univariate and the novel multivariate extension of the scan statistic confirm previously published findings. The method described in the paper represents a challenging application of the scan statistics framework to problems related to genomic data. From a biological perspective, these tools offer clinicians and researchers the possibility to improve their knowledge of the viral vector integration process, allowing them to focus their attention on restricted, over-targeted portions of the genome.
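
    A toy one-dimensional Bernoulli scan illustrating the core likelihood-ratio computation (univariate case only; the Relative Scan Statistic and the Monte Carlo significance step of the paper are omitted, and the planted cluster is simulated).

      import numpy as np

      def bernoulli_scan(events, window):
          # Slide a fixed-width window over a binary sequence and score it with
          # the log-likelihood ratio of "rate inside differs from rate outside"
          # vs "one common rate"; return the best window start and its LLR.
          x = np.asarray(events, dtype=float)
          N, n_tot = x.size, x.sum()

          def loglik(successes, trials):
              p = successes / trials
              if p in (0.0, 1.0):
                  return 0.0
              return successes * np.log(p) + (trials - successes) * np.log(1 - p)

          best = (None, -np.inf)
          for s in range(N - window + 1):
              n_in = x[s:s + window].sum()
              llr = (loglik(n_in, window) + loglik(n_tot - n_in, N - window)
                     - loglik(n_tot, N))
              if n_in / window > n_tot / N and llr > best[1]:
                  best = (s, llr)
          return best

      rng = np.random.default_rng(3)
      seq = rng.random(500) < 0.05
      seq[200:230] |= rng.random(30) < 0.4      # planted cluster of events
      print(bernoulli_scan(seq, window=30))     # should locate the window near 200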

  8. Extreme storm surges: a comparative study of frequency analysis approaches

    Directory of Open Access Journals (Sweden)

    Y. Hamdi

    2013-11-01

    Full Text Available In France, nuclear facilities were designed to very low probabilities of failure. Nevertheless, exceptional climatic events have given rise to surges much larger than the observations (outliers) and have clearly illustrated the potential to underestimate the extreme water levels calculated with the current statistical methods. The objective of the present work is to conduct a comparative study of three approaches, including the Annual Maxima (AM), the Peaks-Over-Threshold (POT) and the r-Largest Order Statistics (r-LOS). These methods are illustrated in a real analysis case study. All the data sets were screened for outliers. Non-parametric tests for randomness, homogeneity and stationarity of time series were used. The shape and scale parameter stability plots, the mean excess residual life plot and the stability of the standard errors of return levels were used to select optimal thresholds and r values for the POT and r-LOS methods, respectively. The comparison of methods was based on: (i) the uncertainty degrees, (ii) the adequacy criteria and tests and (iii) the visual inspection. It was found that the r-LOS and POT methods have reduced the uncertainty on the distribution parameters and return level estimates and have systematically shown values of the 100 and 500 yr return levels smaller than those estimated with the AM method. Results have also shown that none of the compared methods has allowed a good fit at the right tail of the distribution in the presence of outliers. As a perspective, the use of historical information was proposed in order to increase the representativeness of outliers in data sets. Findings are of practical relevance not only to nuclear energy operators in France, for applications in storm surge hazard analysis and flood management, but also for the optimal planning and design of facilities to withstand extreme environmental conditions, with an appropriate level of risk.

  9. Extreme storm surges: a comparative study of frequency analysis approaches

    Science.gov (United States)

    Hamdi, Y.; Bardet, L.; Duluc, C.-M.; Rebour, V.

    2014-08-01

    In France, nuclear facilities were designed around very low probabilities of failure. Nevertheless, some extreme climatic events have given rise to exceptional observed surges (outliers) much larger than other observations, and have clearly illustrated the potential to underestimate the extreme water levels calculated with the current statistical methods. The objective of the present work is to conduct a comparative study of three approaches to extreme value analysis, including the annual maxima (AM), the peaks-over-threshold (POT) and the r-largest order statistics (r-LOS). These methods are illustrated in a real analysis case study. All data sets were screened for outliers. Non-parametric tests for randomness, homogeneity and stationarity of time series were used. The shape and scale parameter stability plots, the mean excess residual life plot and the stability of the standard errors of return levels were used to select optimal thresholds and r values for the POT and r-LOS method, respectively. The comparison of methods was based on (i) the uncertainty degrees, (ii) the adequacy criteria and tests, and (iii) the visual inspection. It was found that the r-LOS and POT methods have reduced the uncertainty on the distribution parameters and return level estimates and have systematically shown values of the 100 and 500-year return levels smaller than those estimated with the AM method. Results have also shown that none of the compared methods has allowed a good fit at the right tail of the distribution in the presence of outliers. As a perspective, the use of historical information was proposed in order to increase the representativeness of outliers in data sets. Findings are of practical relevance, not only to nuclear energy operators in France, for applications in storm surge hazard analysis and flood management, but also for the optimal planning and design of facilities to withstand extreme environmental conditions, with an appropriate level of risk.
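
    A minimal peaks-over-threshold sketch of the kind of analysis compared here, assuming synthetic daily surge data, an arbitrary 98th-percentile threshold, and a GPD fit with scipy; all numbers are illustrative placeholders, not results from the study.

      import numpy as np
      from scipy.stats import genpareto

      rng = np.random.default_rng(4)
      surges = rng.gumbel(loc=0.6, scale=0.25, size=40 * 365)   # 40 years, daily
      threshold = np.quantile(surges, 0.98)                     # ad hoc threshold
      excesses = surges[surges > threshold] - threshold

      shape, _, scale = genpareto.fit(excesses, floc=0.0)       # fix location at 0
      events_per_year = excesses.size / 40.0

      def return_level(T):
          # T-year return level from the fitted GPD exceedance model.
          p = 1.0 / (T * events_per_year)   # exceedance probability per peak event
          return threshold + genpareto.ppf(1 - p, shape, loc=0.0, scale=scale)

      print(return_level(100), return_level(500))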

  10. Genes and longevity: a genetic-demographic approach reveals sex- and age-specific gene effects not shown by the case-control approach (APOE and HSP70.1 loci).

    Science.gov (United States)

    Dato, S; Carotenuto, L; De Benedictis, G

    2007-02-01

    Association analyses between gene variability and human longevity carried out by comparing gene frequencies between population samples of different ages (case/control design) may provide information on genes and pathways playing a role in modulating survival at old ages. However, by dealing with cross-sectional data, the gene-frequency (GF) approach ignores cohort effects in population mortality changes. The genetic-demographic (GD) approach adds demographic information to genetic data and allows the estimation of hazard rates and survival functions for candidate alleles and genotypes. Thus mortality changes in the cohort to which the cross-sectional sample belongs are taken into account. In this work, we applied the GD method to a dataset relevant to two genes, APOE and HSP70.1, previously shown to be related to longevity by the GF method. We show that the GD method reveals sex- and age-specific allelic effects not shown by the GF analysis. In addition, we provide an algorithm for the implementation of a non-parametric GD analysis.

  11. MULTI-TEMPORAL LAND USE ANALYSIS OF AN EPHEMERAL RIVER AREA USING AN ARTIFICIAL NEURAL NETWORK APPROACH ON LANDSAT IMAGERY

    Directory of Open Access Journals (Sweden)

    M. Aquilino

    2014-01-01

    The historical archive of LANDSAT imagery dating back to the launch of ERTS in 1972 provides a comprehensive and permanent data source for tracking change on the planet's land surface. In this case study, imagery acquisition dates of 1987, 2002 and 2011 were selected to cover a time trend of 24 years. Land cover categories were based on classes outlined by the Curve Number method, with the aim of characterizing land use according to the level of surface imperviousness. After comparing two land use classification methods, i.e. the Maximum Likelihood Classifier (MLC) and the Multi-Layer Perceptron (MLP) neural network, the Artificial Neural Network (ANN) approach was found to be the most reliable and efficient method in the absence of ground reference data. The ANN approach has a distinct advantage over statistical classification methods in that it is non-parametric and requires little or no a priori knowledge of the distribution model of the input data. The results quantify land cover change patterns in the river basin area under study and demonstrate the potential of multitemporal LANDSAT data to provide an accurate and cost-effective means to map and analyse land cover changes over time that can be used as input in land management and policy decision-making.

  12. Analysing latent constructs via a derived metric paired comparison approach: An application to students’ emotions in mathematics

    Directory of Open Access Journals (Sweden)

    Alexandra Grand

    2014-09-01

    Full Text Available In this article we suggest an approach for the analysis of sets of items using the method of paired comparisons. We applied the proposed approach to a survey of students' emotions typically experienced while learning mathematics, focusing on the relative dominance of these emotions. The emotions of interest were enjoyment, pride, anger, anxiety, boredom and shame, each measured by a set of items, for which we want to obtain an ordering on a continuum. In a first step we evaluated the quality of the items by using a method of non-parametric Rasch model tests. The item sets for the emotions enjoyment, anxiety and boredom met the properties of a Rasch model. As a result of fitting Rasch models, we obtained person "emotion" parameter estimates. We then derived, for each individual, metric paired comparison responses from the obtained person parameter estimates and directly modelled these derived relative responses by fitting a beta regression model, which is similar to generalized linear models (GLMs). The proposed model accounts for bounded metric paired comparison data in (0,1), where subject covariates and object-specific covariates can also be incorporated. We found a clear tendency: the higher the positive discrepancy between students' self-concept of maths ability and their averaged perceived maths ability, the more enjoyment and the less anxiety is typically experienced while learning mathematics.

  13. From concepts, theory, and evidence of heterogeneity of treatment effects to methodological approaches: a primer.

    Science.gov (United States)

    Willke, Richard J; Zheng, Zhiyuan; Subedi, Prasun; Althin, Rikard; Mullins, C Daniel

    2012-12-13

    Implicit in the growing interest in patient-centered outcomes research is a growing need for better evidence regarding how responses to a given intervention or treatment may vary across patients, referred to as heterogeneity of treatment effect (HTE). A variety of methods are available for exploring HTE, each associated with unique strengths and limitations. This paper reviews a selected set of methodological approaches to understanding HTE, focusing largely but not exclusively on their uses with randomized trial data. It is oriented for the "intermediate" outcomes researcher, who may already be familiar with some methods, but would value a systematic overview of both more and less familiar methods with attention to when and why they may be used. Drawing from the biomedical, statistical, epidemiological and econometrics literature, we describe the steps involved in choosing an HTE approach, focusing on whether the intent of the analysis is for exploratory, initial testing, or confirmatory testing purposes. We also map HTE methodological approaches to data considerations as well as the strengths and limitations of each approach. Methods reviewed include formal subgroup analysis, meta-analysis and meta-regression, various types of predictive risk modeling including classification and regression tree analysis, series of n-of-1 trials, latent growth and growth mixture models, quantile regression, and selected non-parametric methods. In addition to an overview of each HTE method, examples and references are provided for further reading. By guiding the selection of the methods and analysis, this review is meant to better enable outcomes researchers to understand and explore aspects of HTE in the context of patient-centered outcomes research.

  14. From concepts, theory, and evidence of heterogeneity of treatment effects to methodological approaches: a primer

    Directory of Open Access Journals (Sweden)

    Willke Richard J

    2012-12-01

    Full Text Available Abstract Implicit in the growing interest in patient-centered outcomes research is a growing need for better evidence regarding how responses to a given intervention or treatment may vary across patients, referred to as heterogeneity of treatment effect (HTE. A variety of methods are available for exploring HTE, each associated with unique strengths and limitations. This paper reviews a selected set of methodological approaches to understanding HTE, focusing largely but not exclusively on their uses with randomized trial data. It is oriented for the “intermediate” outcomes researcher, who may already be familiar with some methods, but would value a systematic overview of both more and less familiar methods with attention to when and why they may be used. Drawing from the biomedical, statistical, epidemiological and econometrics literature, we describe the steps involved in choosing an HTE approach, focusing on whether the intent of the analysis is for exploratory, initial testing, or confirmatory testing purposes. We also map HTE methodological approaches to data considerations as well as the strengths and limitations of each approach. Methods reviewed include formal subgroup analysis, meta-analysis and meta-regression, various types of predictive risk modeling including classification and regression tree analysis, series of n-of-1 trials, latent growth and growth mixture models, quantile regression, and selected non-parametric methods. In addition to an overview of each HTE method, examples and references are provided for further reading. By guiding the selection of the methods and analysis, this review is meant to better enable outcomes researchers to understand and explore aspects of HTE in the context of patient-centered outcomes research.

  15. Characterizing Ipomopsis rubra (Polemoniaceae) germination under various thermal scenarios with non-parametric and semi-parametric statistical methods.

    Science.gov (United States)

    Pérez, Hector E; Kettner, Keith

    2013-10-01

    Time-to-event analysis represents a collection of relatively new, flexible, and robust statistical techniques for investigating the incidence and timing of transitions from one discrete condition to another. Plant biology is replete with examples of such transitions occurring from the cellular to population levels. However, application of these statistical methods has been rare in botanical research. Here, we demonstrate the use of non- and semi-parametric time-to-event and categorical data analyses to address questions regarding seed to seedling transitions of Ipomopsis rubra propagules exposed to various doses of constant or simulated seasonal diel temperatures. Seeds were capable of germinating rapidly to >90 % at 15-25 or 22/11-29/19 °C. Optimum temperatures for germination occurred at 25 or 29/19 °C. Germination was inhibited and seed viability decreased at temperatures ≥30 or 33/24 °C. Kaplan-Meier estimates of survivor functions indicated highly significant differences in temporal germination patterns for seeds exposed to fluctuating or constant temperatures. Extended Cox regression models specified an inverse relationship between temperature and the hazard of germination. Moreover, temperature and the temperature × day interaction had significant effects on germination response. Comparisons to reference temperatures and linear contrasts suggest that summer temperatures (33/24 °C) play a significant role in differential germination responses. Similarly, simple and complex comparisons revealed that the effects of elevated temperatures predominate in terms of components of seed viability. In summary, the application of non- and semi-parametric analyses provides appropriate, powerful data analysis procedures to address various topics in seed biology and more widespread use is encouraged.
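
    The sketch below reproduces the Kaplan-Meier and log-rank portion of such an analysis with the lifelines package on invented germination data (seeds not germinated by day 14 are right-censored); the extended Cox regression models of the study are omitted.

      import pandas as pd
      from lifelines import KaplanMeierFitter
      from lifelines.statistics import logrank_test

      # Toy germination times at two temperature regimes; event = 0 marks
      # right-censored seeds. Values are invented, not taken from the study.
      df = pd.DataFrame({
          "day":   [2, 3, 3, 4, 5, 14, 5, 7, 9, 12, 14, 14],
          "event": [1, 1, 1, 1, 1, 0, 1, 1, 1, 1, 0, 0],
          "temp":  ["25C"] * 6 + ["33/24C"] * 6,
      })

      kmf = KaplanMeierFitter()
      for temp, grp in df.groupby("temp"):
          kmf.fit(grp["day"], event_observed=grp["event"], label=temp)
          print(temp, kmf.median_survival_time_)

      a, b = (df[df.temp == t] for t in ("25C", "33/24C"))
      res = logrank_test(a.day, b.day,
                         event_observed_A=a.event, event_observed_B=b.event)
      print(res.p_value)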

  16. Study on non-parametric methods for fast pattern recognition with emphasis on neural networks and cascade classifiers

    OpenAIRE

    Ludwig, Oswaldo

    2012-01-01

    Doctoral thesis in Electrical and Computer Engineering, specialization in Automation and Robotics, presented to the Faculty of Sciences and Technology of the Universidade de Coimbra. This thesis focuses on pattern recognition, with particular emphasis on the trade-off between generalization capacity and computational cost, in order to provide support for real-time applications. In this context, methodological and analytical contributions are presented...

  17. Selecting variables in non-parametric regression models for binary response. An application to the computerized detection of breast cancer.

    Science.gov (United States)

    Roca-Pardiñas, Javier; Cadarso-Suárez, Carmen; Tahoces, Pablo G; Lado, María J

    2009-01-30

    In many biomedical applications, interest lies in being able to distinguish between two possible states of a given response variable, depending on the values of certain continuous predictors. If the number of predictors, p, is high, or if there is redundancy among them, it becomes important to select the best subset of predictors able to yield the models with the greatest discrimination capacity. With this aim in mind, logistic generalized additive models were considered and receiver operating characteristic (ROC) curves were applied in order to determine and compare the discriminatory capacity of such models. This study sought to develop bootstrap-based tests that allow the following to be ascertained: (a) the optimal number q ≤ p of predictors; and (b) the model or models including q predictors which display the largest AUC (area under the ROC curve). A simulation study was conducted to verify the behaviour of these tests. Finally, the proposed method was applied to a computer-aided diagnostic system dedicated to the early detection of breast cancer.
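
    To make the subset-selection mechanics concrete, here is a minimal stand-in that scores every q-predictor subset by cross-validated AUC of a plain logistic model; the paper instead uses logistic generalized additive models and bootstrap-based tests, and the data below are simulated.

      import numpy as np
      from itertools import combinations
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score
      from sklearn.model_selection import cross_val_predict

      rng = np.random.default_rng(5)
      n, p = 300, 5
      X = rng.normal(size=(n, p))
      y = (X[:, 0] + 0.8 * X[:, 2] + rng.normal(size=n) > 0).astype(int)  # 2 true predictors

      def best_subset(q):
          # Exhaustively score each q-column subset by cross-validated AUC.
          scored = []
          for cols in combinations(range(p), q):
              probs = cross_val_predict(LogisticRegression(), X[:, list(cols)], y,
                                        cv=5, method="predict_proba")[:, 1]
              scored.append((roc_auc_score(y, probs), cols))
          return max(scored)

      for q in (1, 2, 3):
          print(q, best_subset(q))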

  18. Non-parametric representation and prediction of single- and multi-shell diffusion-weighted MRI data using Gaussian processes

    Science.gov (United States)

    Andersson, Jesper L.R.; Sotiropoulos, Stamatios N.

    2015-01-01

    Diffusion MRI offers great potential in studying the human brain microstructure and connectivity. However, diffusion images are marred by technical problems, such as image distortions and spurious signal loss. Correcting for these problems is non-trivial and relies on having a mechanism that predicts what to expect. In this paper we describe a novel way to represent and make predictions about diffusion MRI data. It is based on a Gaussian process on one or several spheres similar to the Geostatistical method of “Kriging”. We present a choice of covariance function that allows us to accurately predict the signal even from voxels with complex fibre patterns. For multi-shell data (multiple non-zero b-values) the covariance function extends across the shells which means that data from one shell is used when making predictions for another shell. PMID:26236030
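
    A minimal sketch of the prediction idea, assuming a covariance that depends only on the angle between gradient directions; the exponential-in-angle kernel, its hyper-parameters and the toy single-voxel signal below are illustrative choices, not the covariance derived in the paper.

      import numpy as np

      def sphere_kernel(U, V, ell=0.8, sig2=1.0):
          # Covariance depending only on the angle between unit gradient
          # directions; abs() encodes the antipodal symmetry of dMRI signals.
          cosang = np.abs(np.clip(U @ V.T, -1.0, 1.0))
          return sig2 * np.exp(-np.arccos(cosang) / ell)

      rng = np.random.default_rng(6)
      bvecs = rng.normal(size=(60, 3))
      bvecs /= np.linalg.norm(bvecs, axis=1, keepdims=True)   # unit directions
      signal = np.exp(-2.0 * bvecs[:, 0] ** 2) + rng.normal(0, 0.02, 60)  # toy voxel

      # Predict the signal along a held-out direction from the other 59.
      train, test = np.arange(59), np.array([59])
      K = sphere_kernel(bvecs[train], bvecs[train]) + 0.02 ** 2 * np.eye(59)
      k_star = sphere_kernel(bvecs[test], bvecs[train])
      pred = k_star @ np.linalg.solve(K, signal[train])       # GP posterior mean
      print(pred, signal[test])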

  19. A non-parametric mixture model for genome-enabled prediction of genetic value for a quantitative trait.

    Science.gov (United States)

    Gianola, Daniel; Wu, Xiao-Lin; Manfredi, Eduardo; Simianer, Henner

    2010-10-01

    A Bayesian nonparametric form of regression based on Dirichlet process priors is adapted to the analysis of quantitative traits possibly affected by cryptic forms of gene action, and to the context of SNP-assisted genomic selection, where the main objective is to predict a genomic signal on phenotype. The procedure clusters unknown genotypes into groups with distinct genetic values, but in a setting in which the number of clusters is unknown a priori, so that standard methods for finite mixture analysis do not work. The central assumption is that genetic effects follow an unknown distribution with some "baseline" family, which is a normal process in the cases considered here. A Bayesian analysis based on the Gibbs sampler produces estimates of the number of clusters, posterior means of genetic effects, a measure of credibility in the baseline distribution, as well as estimates of parameters of the latter. The procedure is illustrated with a simulation representing two populations. In the first one, there are 3 unknown QTL, with additive, dominance and epistatic effects; in the second, there are 10 QTL with additive, dominance and additive × additive epistatic effects. In the two populations, baseline parameters are inferred correctly. The Dirichlet process model infers the number of unique genetic values correctly in the first population, but it produces an underestimate in the second one; here, the true number of clusters is over 900, and the model gives a posterior mean estimate of about 140, probably because more replication of genotypes is needed for correct inference. The impact on inferences of the prior distribution of a key parameter (M), and of the extent of replication, was examined via an analysis of mean body weight in 192 paternal half-sib families of broiler chickens, where each sire was genotyped for nearly 7,000 SNPs. In this small sample, it was found that inference about the number of clusters was affected by the prior distribution of M. For a set of combinations of parameters of a given prior distribution, the effects of the prior dissipated when the number of replicate samples per genotype was increased. Thus, the Dirichlet process model seems to be useful for gauging the number of QTLs affecting the trait: if the number of clusters inferred is small, probably just a few QTLs code for the trait. If the number of clusters inferred is large, this may imply that standard parametric models based on the baseline distribution may suffice. However, priors may be influential, especially if sample size is not large and if only a few genotypic configurations have replicate phenotypes in the sample.
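
    The clustering idea can be sketched with an off-the-shelf truncated Dirichlet-process mixture, scikit-learn's variational implementation standing in for the paper's Gibbs sampler; weight_concentration_prior plays a role loosely analogous to the M parameter discussed above, and the genetic values are simulated.

      import numpy as np
      from sklearn.mixture import BayesianGaussianMixture

      # Simulated genetic values from three latent genotype clusters.
      rng = np.random.default_rng(7)
      values = np.concatenate([rng.normal(-2, 0.3, 150),
                               rng.normal(0, 0.3, 150),
                               rng.normal(3, 0.3, 150)])[:, None]

      dpm = BayesianGaussianMixture(
          n_components=15,                                   # truncation level
          weight_concentration_prior_type="dirichlet_process",
          weight_concentration_prior=1.0,                    # ~ the M parameter
          random_state=0,
      ).fit(values)

      occupied = np.unique(dpm.predict(values))
      print(len(occupied), dpm.means_[occupied].ravel())     # ~3 clusters recovered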

  20. Non-parametric Data Analysis of Low-latitude Auroras and Naked-eye Sunspots in the Medieval Epoch

    Science.gov (United States)

    Bekli, Mohamed Reda; Zougab, Nabil; Belabbas, Abdelmoumene; Chadou, Ilhem

    2017-04-01

    We have studied solar activity by analyzing naked-eye sunspot observations and aurorae borealis observed at latitudes below 45°. We focused on the medieval epoch by considering the non-telescopic observations of sunspots from AD 974 to 1278 and aurorae borealis from AD 965 to 1273 that are reported in several Far East historical sources, primarily in China and Korea. After setting selection rules, we analyzed the distribution of these individual events following the months of the Gregorian calendar. In December, an unusual peak is observed with data recorded in both China and Japan, but not within Korean data.

  1. SPECIES-SPECIFIC FOREST VARIABLE ESTIMATION USING NON-PARAMETRIC MODELING OF MULTI-SPECTRAL PHOTOGRAMMETRIC POINT CLOUD DATA

    Directory of Open Access Journals (Sweden)

    J. Bohlin

    2012-07-01

    Full Text Available The recent development of software for automatic photogrammetric processing of multispectral aerial imagery, and the growing nation-wide availability of Digital Elevation Model (DEM) data, are about to revolutionize data capture for forest management planning in Scandinavia. Using only already available aerial imagery and ALS-assessed DEM data, raster estimates of the forest variables mean tree height, basal area, total stem volume, and species-specific stem volumes were produced and evaluated. The study was conducted at a coniferous hemi-boreal test site in southern Sweden (lat. 58° N, long. 13° E). Digital aerial images from the Zeiss/Intergraph Digital Mapping Camera system were used to produce 3D point-cloud data with spectral information. Metrics were calculated for 696 field plots (10 m radius) from the point-cloud data and used in k-MSN to estimate forest variables. For these stands, tree height ranged from 1.4 to 33.0 m (18.1 m mean), stem volume from 0 to 829 m3 ha-1 (249 m3 ha-1 mean) and basal area from 0 to 62.2 m2 ha-1 (26.1 m2 ha-1 mean), with a mean stand size of 2.8 ha. Estimates made using digital aerial images corresponding to the standard acquisition of the Swedish National Land Survey (Lantmäteriet) showed RMSEs (in percent of the surveyed stand mean) of 7.5% for tree height, 11.4% for basal area, 13.2% for total stem volume, 90.6% for pine stem volume, 26.4% for spruce stem volume, and 72.6% for deciduous stem volume. The results imply that photogrammetric matching of digital aerial images has significant potential for operational use in forestry.
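
    An imputation sketch in the spirit of k-MSN, with plain Euclidean k-nearest-neighbour regression standing in for the most-similar-neighbour distance; the plot metrics and stand heights below are simulated, not the Swedish field data.

      import numpy as np
      from sklearn.neighbors import KNeighborsRegressor
      from sklearn.model_selection import cross_val_predict

      rng = np.random.default_rng(8)
      n_plots = 300
      metrics = rng.normal(size=(n_plots, 6))   # e.g. height percentiles, density
      height = 18 + 4 * metrics[:, 0] + rng.normal(0, 1.5, n_plots)  # toy heights

      knn = KNeighborsRegressor(n_neighbors=5)
      pred = cross_val_predict(knn, metrics, height, cv=10)
      rmse = np.sqrt(np.mean((pred - height) ** 2))
      print(f"RMSE: {rmse:.2f} m ({100 * rmse / height.mean():.1f}% of mean)")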

  3. A cross-country non parametric estimation of the returns to factors of production and the elasticity of scale

    Directory of Open Access Journals (Sweden)

    Adalmir Marquetti

    2007-04-01

Full Text Available This paper employs local regression to estimate the output elasticity with respect to labor, human capital and physical capital, and the elasticity of scale, for 90 countries in 1985 and 1995. The results support the hypotheses of constant returns to scale to all factors and decreasing returns to accumulable factors. The low capital-labor ratio countries show important differences in factor elasticities in relation to other countries. Augmenting the production function with human capital did not reduce the elasticity of physical capital as suggested by Mankiw, Romer and Weil (1992). Moreover, it is investigated whether factor shares are really equal to their output elasticities. The wage share rises with the capital-labor ratio, and the sum of the output elasticities of labor and human capital is below the wage share for high capital-labor ratio countries, with the opposite holding for low capital-labor ratio countries. This indicates the presence of externalities or imperfect competition, or that the marginal theory of distribution is inaccurate.
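
As a hedged illustration of the core technique, the sketch below estimates a point-wise output elasticity by locally weighted (kernel) linear regression on synthetic log-log data; the single regressor, bandwidth and data are simplifying assumptions, not the paper's specification.

```python
import numpy as np

def local_elasticity(logx, logy, x0, bandwidth=0.5):
    """Slope of a locally weighted linear fit at log-point x0
    (Gaussian kernel); for log-log data the slope is the elasticity."""
    w = np.exp(-0.5 * ((logx - x0) / bandwidth) ** 2)
    X = np.column_stack([np.ones_like(logx), logx - x0])
    sw = np.sqrt(w)
    beta, *_ = np.linalg.lstsq(sw[:, None] * X, sw * logy, rcond=None)
    return beta[1]                          # local slope = elasticity at x0

rng = np.random.default_rng(1)
logk = rng.uniform(0, 4, 300)               # log capital per worker
logy = 1.0 + 0.6 * logk - 0.05 * logk**2 + rng.normal(0, 0.1, 300)

# True elasticity is 0.6 - 0.1*logk, so it should fall as logk grows.
for x0 in (0.5, 2.0, 3.5):
    print(f"elasticity at log k = {x0}: {local_elasticity(logk, logy, x0):.2f}")
```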

  4. Maternal and infant activity: Analytic approaches for the study of circadian rhythm.

    Science.gov (United States)

    Thomas, Karen A; Burr, Robert L; Spieker, Susan

    2015-11-01

The study of infant and mother circadian rhythm entails choice of instruments appropriate for use in the home environment as well as selection of an analytic approach that characterizes circadian rhythm. While actigraphy monitoring suits the needs of home study, limited studies have examined mother and infant rhythm derived from actigraphy. Among this existing research a variety of analyses have been employed to characterize 24-h rhythm, reducing the ability to evaluate and synthesize findings. Few studies have examined the correspondence of mother and infant circadian parameters for the most frequently cited approaches: cosinor, non-parametric circadian rhythm analysis (NPCRA), and autocorrelation function (ACF). The purpose of this research was to examine analytic approaches in the study of mother and infant circadian activity rhythm. Forty-three healthy mother and infant pairs were studied in the home environment over a 72-h period at infant age 4, 8, and 12 weeks. Activity was recorded continuously using actigraphy monitors and mothers completed a diary. Parameters of circadian rhythm were generated from cosinor analysis, NPCRA, and ACF. The correlation among measures of rhythm center (cosinor mesor, NPCRA mid level), strength or fit of the 24-h period (cosinor magnitude and R2, NPCRA amplitude and relative amplitude (RA)), phase (cosinor acrophase, NPCRA M10 and L5 midpoint), and rhythm stability and variability (NPCRA interdaily stability (IS) and intradaily variability (IV), ACF) was assessed, and additionally the effect size (eta2) for change over time was evaluated. Results suggest that cosinor analysis, NPCRA, and autocorrelation provide several comparable parameters of infant and maternal circadian rhythm center, fit, and phase. IS and IV were strongly correlated with the 24-h cycle fit. The circadian parameters analyzed offer separate insight into rhythm and differing effect size for the detection of change over time. Findings inform selection of analysis and
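
A minimal cosinor fit can be written as an ordinary least-squares problem via the linearisation a*cos(wt) + b*sin(wt); the sketch below (illustrative, not the study's code) recovers mesor, amplitude and acrophase from synthetic 72-h actigraphy-like data.

```python
import numpy as np

t = np.arange(0, 72, 0.5)                       # hours, 72-h recording
rng = np.random.default_rng(2)
activity = 50 + 20 * np.cos(2 * np.pi * (t - 14) / 24) + rng.normal(0, 5, t.size)

w = 2 * np.pi / 24
X = np.column_stack([np.ones_like(t), np.cos(w * t), np.sin(w * t)])
coef, *_ = np.linalg.lstsq(X, activity, rcond=None)
mesor, a, b = coef
amplitude = np.hypot(a, b)
acrophase_h = (np.arctan2(b, a) / w) % 24       # clock time of the daily peak

print(f"mesor={mesor:.1f}, amplitude={amplitude:.1f}, acrophase={acrophase_h:.1f} h")
```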

  5. An information-theoretic approach to assess practical identifiability of parametric dynamical systems.

    Science.gov (United States)

    Pant, Sanjay; Lombardi, Damiano

    2015-10-01

A new approach for assessing parameter identifiability of dynamical systems in a Bayesian setting is presented. The concept of Shannon entropy is employed to measure the inherent uncertainty in the parameters. The expected reduction in this uncertainty is seen as the amount of information one expects to gain about the parameters due to the availability of noisy measurements of the dynamical system. Such expected information gain is interpreted in terms of the variance of a hypothetical measurement device that can measure the parameters directly, and is related to practical identifiability of the parameters. If the individual parameters are unidentifiable, correlation between parameter combinations is assessed through conditional mutual information to determine which sets of parameters can be identified together. The information-theoretic quantities of entropy and information are evaluated numerically through a combination of Monte Carlo and k-nearest neighbour methods in a non-parametric fashion. Unlike many methods to evaluate identifiability proposed in the literature, the proposed approach takes the measurement noise into account and is not restricted to any particular noise structure. Whilst computationally intensive for large dynamical systems, it is easily parallelisable and is non-intrusive as it does not necessitate re-writing of the numerical solvers of the dynamical system. The application of such an approach is presented for a variety of dynamical systems, ranging from systems governed by ordinary differential equations to partial differential equations, and, where possible, validated against results previously published in the literature. Copyright © 2015 Elsevier Inc. All rights reserved.
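
The k-nearest-neighbour entropy estimation such approaches rely on is commonly the Kozachenko-Leonenko estimator; the sketch below is a generic textbook implementation (not the paper's), checked against the known entropy of a standard normal, 0.5*log(2*pi*e) = 1.419 nats.

```python
import numpy as np
from scipy.special import digamma, gammaln
from sklearn.neighbors import NearestNeighbors

def knn_entropy(samples, k=3):
    """Kozachenko-Leonenko differential entropy (nats) of d-dim samples:
    H = psi(n) - psi(k) + log(c_d) + (d/n) * sum(log eps_i)."""
    n, d = samples.shape
    nn = NearestNeighbors(n_neighbors=k + 1).fit(samples)
    dist, _ = nn.kneighbors(samples)          # dist[:, 0] is the point itself
    eps = dist[:, k]                          # distance to the k-th neighbour
    log_cd = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)  # log unit-ball volume
    return digamma(n) - digamma(k) + log_cd + d * np.mean(np.log(eps))

rng = np.random.default_rng(3)
x = rng.normal(0, 1, size=(5000, 1))
print("estimated entropy:", knn_entropy(x))   # should be close to 1.419
```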

  6. [Severe idiopathic scoliosis. Does the approach and the instruments used modify the results?].

    Science.gov (United States)

    Sánchez-Márquez, J M; Sánchez Pérez-Grueso, F J; Pérez Martín-Buitrago, M; Fernández-Baíllo, N; García-Fernández, A; Quintáns-Rodríguez, J

    2014-01-01

The aim of this work is to evaluate and compare the radiographic results and complications of the surgical treatment of adolescents with idiopathic scoliosis greater than 75 degrees, using a double approach (DA), an isolated posterior approach with hybrid instrumentation (posterior hybrid [PH]), or an isolated posterior approach with «all-pedicle screws» (posterior screws [PS]). A retrospective review was performed on 69 patients with idiopathic scoliosis greater than 75°, with a follow-up of more than 2 years, to analyze the flexibility of the curves, the correction obtained, and the complications depending on the type of surgery. The Kruskal-Wallis test for non-parametric variables was used for the statistical analysis. There were no statistically significant differences between the 3 patient groups in the pre-surgical Cobb angle values (DA=89°, PH=83°, PS=83°), in the immediate post-surgical values (DA=34°, PH=33°, PS=30°), or at the end of follow-up (DA=36°, PH=36°, PS=33°) (P>.05). The percentage correction (DA=60%, PH=57%, PS=60%) was similar between groups (P>.05). The percentage of complications associated with the procedure was 20.8% in DA, 10% in PH and 20% in PS. Two patients in the PS group showed changes in spinal cord monitoring without neurological injury, and one patient in the same group suffered a delayed and transient incomplete lesion. No significant differences were observed in the correction of severe idiopathic scoliosis between patients operated on using the double or the isolated posterior approach, regardless of the type of instrumentation used. Copyright © 2013 SECOT. Published by Elsevier España. All rights reserved.

  7. Measuring multi-joint stiffness during single movements: numerical validation of a novel time-frequency approach.

    Directory of Open Access Journals (Sweden)

    Davide Piovesan

Full Text Available This study presents and validates a time-frequency technique for measuring 2-dimensional multijoint arm stiffness throughout a single planar movement as well as during static posture. It is proposed as an alternative to current regressive methods, which require numerous repetitions to obtain average stiffness on a small segment of the hand trajectory. The method is based on the analysis of the reassigned spectrogram of the arm's response to impulsive perturbations and can estimate arm stiffness on a trial-by-trial basis. Analytic and empirical methods are first derived and tested through modal analysis on synthetic data. The technique's accuracy and robustness are assessed by modeling the estimation of stiffness time profiles changing at different rates and affected by different noise levels. Our method obtains results comparable with two well-known regressive techniques. We also test how the technique can identify the viscoelastic component of non-linear and higher-than-second-order systems with a non-parametric approach. The technique proposed here is robust to noise and can be used easily for both postural and movement tasks. Estimation of stiffness profiles is possible with only one perturbation, making our method a useful tool for estimating limb stiffness during motor learning and adaptation tasks, and for understanding the modulation of stiffness in individuals with neurodegenerative diseases.
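
A toy version of the underlying idea (assuming a known effective mass, a single dominant mode and negligible damping, none of which the full method requires): track the instantaneous resonant frequency of the perturbation response in a spectrogram and convert it to stiffness via k = m*(2*pi*f)^2. The paper uses a reassigned spectrogram and full modal analysis; this sketch only conveys the principle.

```python
import numpy as np
from scipy.signal import spectrogram

fs = 1000.0                                   # Hz
t = np.arange(0, 2.0, 1 / fs)
m = 1.0                                       # kg (assumed known)
f_true = 4.0 + 2.0 * t                        # resonance drifting 4 -> 8 Hz
rng = np.random.default_rng(4)
resp = np.cos(2 * np.pi * np.cumsum(f_true) / fs) + 0.05 * rng.standard_normal(t.size)

f, tt, Sxx = spectrogram(resp, fs=fs, nperseg=512, noverlap=448)
f_peak = f[np.argmax(Sxx, axis=0)]            # dominant frequency per time slice
k_est = m * (2 * np.pi * f_peak) ** 2         # stiffness time profile, N/m

print(np.round(k_est[:5], 1))
```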

  8. ADVANCED EARTH OBSERVATION APPROACH FOR MULTISCALE FOREST ECOSYSTEM SERVICES MODELING AND MAPPING (MIMOSE

    Directory of Open Access Journals (Sweden)

    G. Chirici

    2014-04-01

Full Text Available In the last decade ecosystem services (ES) have been proposed as a method for quantifying the multifunctional role of forest ecosystems. Their spatial distribution over large areas is frequently limited by the lack of information, because field data collection with traditional methods requires much effort in terms of time and cost. In this contribution we propose a methodology (namely, MultIscale Mapping Of ecoSystem servicEs - MIMOSE) based on the integration of remotely sensed images and field observations to produce a wall-to-wall geodatabase of forest parcels accompanied by several pieces of information useful as a basis for future trade-off analysis of different ES. Here, we present the application of the MIMOSE approach to a study area of 443,758 hectares coincident with the administrative Molise Region in Central Italy. The procedure is based on a local high-resolution forest types map integrated with information on the main forest management approaches. Through the non-parametric k-Nearest Neighbors technique, we produced a growing stock volume map integrating a local forest inventory with multispectral IRS LISS III satellite imagery. From the growing stock volume map we derived a forest age map for even-aged forest types. This information was later used to automatically create a vector forest parcel map by multidimensional image segmentation, which was finally populated with information useful for ES spatial estimation. The contribution briefly introduces the MIMOSE methodology and presents the preliminary results we achieved, which constitute the basis for a future implementation of ES modeling.

  9. Skin Detection Based on Color Model and Low Level Features Combined with Explicit Region and Parametric Approaches

    Directory of Open Access Journals (Sweden)

    HARPREET KAUR SAINI

    2014-10-01

Full Text Available Skin detection is an active research area in the field of computer vision which can be applied to applications such as face detection and eye detection. These detections help in applications such as driver fatigue monitoring systems and surveillance systems. In computer vision applications, the color model and the representation of the human image in that color model form one of the major modules used to detect skin pixels. The mainstream technology is based on individual pixels and on the selection of pixels to detect the skin part of the whole image. In this work, we present a novel technique for skin color detection that combines an explicit region-based approach with a parametric approach, which gives better efficiency and performance in terms of skin detection in human images. Color models and an image quantization technique are used to extract the regions of the images and to represent the image in a particular color model such as RGB or HSV, and then the parametric approach is applied by selecting low-level skin features to separate the skin and non-skin pixels of the images. In the first step, our technique uses the state-of-the-art non-parametric approach, which we call the template-based or explicitly defined skin regions technique. Then low-level features of the human skin, such as edges and corners, are extracted, which is also known as the parametric method. The experimental results show the improvement in the detection rate of skin pixels achieved by this novel approach, and we discuss the experimental results to demonstrate the algorithmic improvements.
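
A minimal sketch of the "explicitly defined skin region" step: a widely cited rule-based RGB classifier (the uniform-daylight variant attributed to Peer et al.); the thresholds are the commonly quoted ones and would be tuned in practice, and this is not necessarily the rule used in the work above.

```python
import numpy as np

def skin_mask_rgb(img):
    """img: H x W x 3 uint8 RGB array -> boolean skin mask."""
    img = img.astype(np.int16)               # avoid uint8 overflow in differences
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    mx = img.max(axis=-1)
    mn = img.min(axis=-1)
    return ((r > 95) & (g > 40) & (b > 20)
            & (mx - mn > 15)                 # enough colour spread
            & (np.abs(r - g) > 15) & (r > g) & (r > b))

# Example: classify a single orange-ish pixel.
pixel = np.array([[[220, 150, 120]]], dtype=np.uint8)
print(skin_mask_rgb(pixel))   # [[ True]]
```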

  10. The Effects of Learning Activities Corresponding with Students’ Learning Styles on Academic Success and Attitude within the Scope of Constructivist Learning Approach: The Case of the Concepts of Function and Derivative

    Directory of Open Access Journals (Sweden)

    Kemal Özgen

    2014-04-01

Full Text Available The aim of this study was to identify the effects of learning activities matched to students' learning styles on students' academic success and attitude towards mathematics within the scope of the constructivist learning approach. The study had a semi-experimental research design based on the pre-test/post-test model with a control group. The participants were students studying at a state high school in the 2010-2011 academic year. As part of the study, activities suited to the students' learning styles were developed within the scope of the constructivist learning approach, in line with McCarthy's 4MAT system with 8 steps of learning, and used for learning the concepts of function and derivative. Data were collected using a personal information form, non-routine problems, and a mathematics attitude scale. Descriptive and non-parametric statistics were used for the analysis of the quantitative data. Data analysis indicated that a learning process using activities appropriate to students' learning styles contributed to an increase in students' academic success and problem-solving skills. Yet, there was no statistically significant difference in students' attitudes towards mathematics. Key words: constructivist learning approach, learning style, learning activity, success, attitude

  11. Segmenting Multiple Sclerosis Lesions using a Spatially Constrained K-Nearest Neighbour approach

    DEFF Research Database (Denmark)

    Lyksborg, Mark; Larsen, Rasmus; Sørensen, Per Soelberg

    2012-01-01

We propose a method for the segmentation of Multiple Sclerosis lesions. The method is based on probability maps derived from a K-Nearest Neighbours classification. These are used as a non-parametric likelihood in a Bayesian formulation with a prior that assumes connectivity of neighbouring voxels...

  12. Survival dimensionality reduction (SDR: development and clinical application of an innovative approach to detect epistasis in presence of right-censored data

    Directory of Open Access Journals (Sweden)

    Beretta Lorenzo

    2010-08-01

Full Text Available Background: Epistasis is recognized as a fundamental part of the genetic architecture of individuals. Several computational approaches have been developed to model gene-gene interactions in case-control studies; however, none of them is suitable for time-dependent analysis. Herein we introduce the Survival Dimensionality Reduction (SDR) algorithm, a non-parametric method specifically designed to detect epistasis in lifetime datasets. Results: The algorithm requires no specification of the underlying survival distribution or of the underlying interaction model, and proved satisfactorily powerful in detecting a set of causative genes in synthetic epistatic lifetime datasets with a limited number of samples and a high degree of right-censorship (up to 70%). The SDR method was then applied to a series of 386 Dutch patients with active rheumatoid arthritis that were treated with anti-TNF biological agents. Among a set of 39 candidate genes, none of which showed a detectable marginal effect on anti-TNF responses, the SDR algorithm did find that the rs1801274 SNP in the FcγRIIa gene and the rs10954213 SNP in the IRF5 gene non-linearly interact to predict clinical remission after anti-TNF biologicals. Conclusions: Simulation studies and application in a real-world setting support the capability of the SDR algorithm to model epistatic interactions in candidate-gene studies in the presence of right-censored data. Availability: http://sourceforge.net/projects/sdrproject/

  13. Narrative approaches

    DEFF Research Database (Denmark)

    Stelter, Reinhard

    2012-01-01

Narrative coaching is representative of the new wave (or third generation) of coaching practice. The theory and practice of narrative coaching take into account the social and cultural conditions of late modern society, and must be seen as intertwined with them. Some initial conceptualizations present narrative coaching as a narrative-collaborative practice, an approach that is based on phenomenology, social constructionism and narrative theory. Seeing narrative coaching as a collaborative practice also leads to reflecting in a new way on the relationship between coach and coachee(s), where both parties contribute to the dialogue. The conceptual framework will be tested by presenting central results of a research project. The ideas discussed in this chapter expand upon earlier concepts of the narrative approach (mainly formulated by White in 2007) by integrating ideas from phenomenology and experiential approaches...

  14. Capability approach

    DEFF Research Database (Denmark)

    Jensen, Niels Rosendal; Kjeldsen, Christian Christrup

The textbook is the first comprehensive Danish presentation of the Capability Approach developed by Amartya Sen and Martha Nussbaum. The book contains a presentation and discussion of Sen's and Nussbaum's theoretical platforms. It includes examples from education and education policy, pedagogy and care...

  15. Developing awareness of sustainability in nursing and midwifery using a scenario-based approach: Evidence from a pre and post educational intervention study.

    Science.gov (United States)

    Richardson, Janet; Grose, Jane; Bradbury, Martyn; Kelsey, Janet

    2017-07-01

The delivery of healthcare has an impact on the environment and contributes to climate change. As a consequence, the way in which nurses and midwives use and dispose of natural resources in clinical practice, and the subsequent impact on the environment, should be an integral component of nursing and midwifery education. Opportunities need to be found to embed such issues into nursing curricula, thus bringing sustainability issues 'closer to home' and making them more relevant for clinical practice. The study was designed to measure the impact of a sustainability-focussed, scenario-based learning educational intervention on the attitudes and knowledge of student nurses and midwives. It was a pre-test/post-test intervention study using scenario-based learning as the educational intervention, with the Sustainability Attitudes in Nursing Survey (SANS_2) as the outcome measure. The setting was a clinical skills session in a UK University School of Nursing and Midwifery, with 676 second-year undergraduate nursing and midwifery students. The 7-point SANS scale was completed before and after the teaching session; standard non-parametric analysis compared pre- and post-intervention scores. Changes were observed in attitude towards climate change and sustainability and towards the inclusion of these topics within the nursing curricula (p=0.000). Participants demonstrated greater knowledge of natural resource use and the cost of waste disposal following the session (p=0.000). Participants also reported that sessions were realistic, and levels of agreement with statements supporting the value of the session and the interactive nature of delivery were higher following the session. Using a scenario-based learning approach with nursing and midwifery students can change attitudes and knowledge towards sustainability and climate change. Embedding this approach in the context of clinical skills provides a novel and engaging approach that is both educationally sound and clinically relevant.

  16. Pedagogical approaches

    DEFF Research Database (Denmark)

    Lund Larsen, Lea

This paper is part of an on-going qualitative empirical research project, "Teachers of adults as learners. A study on teachers' experiences in practice". Data is collected at a Danish Adult Education Centre. The aim of the study is to understand teachers' learning experiences. Adult learners have particular needs and characteristics that their teachers must be able to address (cf. Knowles), and which teachers must be aware of and deal with. Secondly, I propose a combination of adult learners' characteristics with 'teaching orientations' as a basis for further research on the professional development of teachers of adults. Some of the competencies that teachers need can be taught in formal settings, but in most teaching settings the teachers act alone and develop their pedagogical approaches/teaching strategies with no synchronous sparring from a colleague.

  17. A statistical approach to bioclimatic trend detection in the airborne pollen records of Catalonia (NE Spain).

    Science.gov (United States)

    Fernández-Llamazares, Alvaro; Belmonte, Jordina; Delgado, Rosario; De Linares, Concepción

    2014-04-01

Airborne pollen records are a suitable indicator for the study of climate change. The present work focuses on the role of annual pollen indices in the detection of bioclimatic trends through the analysis of the aerobiological spectra of 11 taxa of great biogeographical relevance in Catalonia over an 18-year period (1994-2011), by means of different parametric and non-parametric statistical methods. Among others, two non-parametric rank-based statistical tests were performed for detecting monotonic trends in the time series of the selected airborne pollen types, and we observed that they have similar power in detecting trends. Except for those cases in which the pollen data can be well modeled by a normal distribution, it is better to apply non-parametric statistical methods in aerobiological studies. Our results provide a reliable representation of the pollen trends in the region and suggest that greater pollen quantities have been released into the atmosphere in recent years, especially by Mediterranean taxa such as Pinus, total Quercus and evergreen Quercus, although the trends may differ geographically. Longer aerobiological monitoring periods are required to corroborate these results and survey the increasing levels of certain pollen types that could have an impact on public health.
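
The abstract does not name the rank-based tests; the Mann-Kendall test is the standard choice for monotonic trends in such series, so the sketch below implements it generically (normal approximation, no tie correction) on synthetic annual pollen indices, which is an assumption rather than the paper's code.

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Two-sided Mann-Kendall trend test; returns (S, p-value)."""
    x = np.asarray(x, dtype=float)
    n = x.size
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0  # continuity correction
    return s, 2 * (1 - norm.cdf(abs(z)))

rng = np.random.default_rng(5)
annual_pollen_index = 100 + 5 * np.arange(18) + rng.normal(0, 20, 18)  # 18 years
print(mann_kendall(annual_pollen_index))
```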

  18. Approaches for the accurate definition of geological time boundaries

    Science.gov (United States)

    Schaltegger, Urs; Baresel, Björn; Ovtcharova, Maria; Goudemand, Nicolas; Bucher, Hugo

    2015-04-01

of the ash, therefore masking the true age of deposition. Trace element ratios such as Th/U and Yb/Gd, as well as Hf isotope analysis of dated zircon, can be used to decipher the temporal evolution of the magmatic system before the eruption and deposition of the studied ashes, and to resolve the complex system behaviour of the zircons. b) Changes in the source of the magma may happen between the deposition of two stratigraphically consecutive ash beds. They result in the modification of the trace element signature of zircon, but also of apatite (Ca5(PO4)3(F,Cl,OH)). Trace element characteristics in apatite (e.g. Mg, Mn, Fe, F, Cl, Ce, and Y) are a reliable tool for distinguishing chemically similar groups of apatite crystals to unravel the geochemical fingerprint of a single ash bed. By establishing this fingerprint, ash beds of geographically separated geologic sections can be correlated even if they have not all been dated by U-Pb techniques. c) The ultimate goal of quantitative stratigraphy is to establish an age model that predicts the age of a synchronous time line with an associated 95% confidence interval for any such line within a stratigraphic sequence. We show how a Bayesian, non-parametric interpolation approach can be applied to very complex data sets and leads to a well-defined age solution, possibly identifying changes in sedimentation rate. The age of a geological time boundary bracketed by dated samples in such an age model can be defined with an associated uncertainty.
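
A toy illustration of point c), under strong simplifying assumptions (two dated ash beds bracketing the boundary, normally distributed ages, constant sedimentation rate between the beds): Monte Carlo interpolation gives the boundary age with a 95% interval. The Bayesian non-parametric approach in the abstract is far more general; ages and uncertainties below are invented.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 100_000
age_lower = rng.normal(252.10, 0.03, n)       # Ma, bed below the boundary (older)
age_upper = rng.normal(251.90, 0.03, n)       # Ma, bed above the boundary (younger)

ok = age_lower > age_upper                    # keep draws in stratigraphic order
frac = 0.4                                    # boundary position between beds (0 = lower bed)
boundary = age_lower[ok] + frac * (age_upper[ok] - age_lower[ok])

lo, mid, hi = np.percentile(boundary, [2.5, 50, 97.5])
print(f"boundary age: {mid:.3f} Ma (95% interval {lo:.3f}-{hi:.3f})")
```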

  19. 1312-IJBCS-Article-Tchatchoua Dorothy

    African Journals Online (AJOL)

    pc

Methodological approaches for the selection of genotypes in a progeny trial of Dacryodes edulis (G. ... domesticating the high-value indigenous fruit ... weighed against the risk of future costs due to .... Spearman non-parametric correlation.

  20. Electrodiagnostic evaluation of median nerve conduction in Type II ...

    African Journals Online (AJOL)

    MJP

    2015-12-29

Dec 29, 2015 ... them at high risk of developing devastating hand and foot .... range in the case of parametric and non-parametric data ... The median value of median nerve velocities (motor) with ... clinical approaches are, to a large extent, ...

  1. Vitamin A status, other risk factors and acute respiratory infection ...

    African Journals Online (AJOL)

    1997-01-01

Jan 1, 1997 ... a comprehensive approach to public health programmes to address ARI. The role .... parametric and non-parametric tests. The frequencies of ... values and odds ratios to evaluate associations between the risk factors and the ...

  2. Limits to detectability of land degradation by trend analysis of vegetation index data

    CSIR Research Space (South Africa)

    Wessels, Konrad J

    2012-10-01

    Full Text Available This paper demonstrates a simulation approach for testing the sensitivity of linear and non-parametric trend analysis methods applied to remotely sensed vegetation index data for the detection of land degradation. The intensity, rate and timing...

  3. The development of a directed population approach to tackle inequalities in dental caries prevalence among secondary school children based on a small area profile.

    Science.gov (United States)

    Sagheri, Darius; Hahn, Petra; Hellwig, Elmar

    2008-06-01

It has been observed that the prevalence of dental caries among children has declined in the last decade in Germany. However, despite these improvements there is still a proportion of children suffering from dental decay. The aims of this study were to evaluate whether a social gradient in the prevalence of dental caries exists and, based on those findings, to develop a strategy to target those children at heightened risk of developing dental caries, in order to help oral health care professionals refocus the current uniform school-based dental health programme into a caries-preventive strategy based on a directed population approach. A representative, random sample of 12-year-olds in Freiburg (Germany) was examined and dental caries was recorded using WHO criteria. The educational attainment of the child's parents was used as an indicator of socio-economic status and classified using the CASMIN Educational Classification. A total of 322 children participated. An examination of the dental caries scores revealed that their distribution was positively skewed. For this reason this study provides summary analyses based on medians and a non-parametric rank-sum test. The Kruskal-Wallis H-test showed a significant difference between median scores across the different educational levels (p-value = 0.015), which was due to lower dental caries levels in children from a non-deprived social background. In order to reduce current social inequalities in child oral health, the current uniform school-based dental health programme at secondary school level should be developed into a targeted school-based screening and prevention programme.
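
A generic sketch of the kind of analysis described: comparing median caries scores across parental education levels with the Kruskal-Wallis H-test. Data, group sizes and effect sizes below are synthetic and illustrative only.

```python
import numpy as np
from scipy.stats import kruskal

rng = np.random.default_rng(7)
# Positively skewed DMFT-like caries scores for three CASMIN-style levels
low    = rng.poisson(2.0, 110)
middle = rng.poisson(1.5, 120)
high   = rng.poisson(1.0, 92)

h, p = kruskal(low, middle, high)
print(f"H = {h:.2f}, p = {p:.4f}")
print("medians:", np.median(low), np.median(middle), np.median(high))
```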

  4. Are Public-Private Partnerships a Source of Greater Efficiency in Water Supply? Results of a Non-Parametric Performance Analysis Relating to the Italian Industry

    Directory of Open Access Journals (Sweden)

    Corrado lo Storto

    2013-12-01

Full Text Available This article reports the outcome of a performance study of the water service provision industry in Italy. The study evaluates the efficiency of 21 "private or public-private" equity and 32 "public" equity water service operators and investigates controlling factors. In particular, the influence that the operator typology and the nature of service management (private vs. public) have on efficiency is assessed. The study employed a two-stage Data Envelopment Analysis methodology. In the first stage, the operational efficiency of water supply operators is calculated by implementing a conventional BCC DEA model that uses both physical infrastructure and financial input and output variables to explore economies of scale. In the second stage, bootstrapped DEA and Tobit regression are performed to estimate the influence that a number of environmental factors have on water supplier efficiency. The results show that the integrated water provision industry in Italy is characterized by operational inefficiencies of service operators, and that scale and agglomeration economies may have a non-negligible effect on efficiency. In addition, the operator typology and its geographical location affect efficiency.
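
A minimal input-oriented BCC (variable returns to scale) DEA model can be solved as one linear program per operator; the sketch below is a generic textbook formulation, not the study's own model, with made-up inputs and outputs for five water operators.

```python
import numpy as np
from scipy.optimize import linprog

X = np.array([[100, 12], [80, 10], [120, 15], [90, 14], [60, 9]], float)  # inputs
Y = np.array([[500], [450], [520], [400], [380]], float)                  # outputs
n, m = X.shape
s = Y.shape[1]

def bcc_efficiency(o):
    # decision variables z = [theta, lambda_1..lambda_n]; minimise theta
    c = np.r_[1.0, np.zeros(n)]
    # sum_j lambda_j * x_j - theta * x_o <= 0   (one row per input)
    A_in = np.hstack([-X[o][:, None], X.T])
    # -sum_j lambda_j * y_j <= -y_o             (one row per output)
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[o]]
    A_eq = np.r_[0.0, np.ones(n)][None, :]      # convexity: sum lambda = 1 (VRS)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

for o in range(n):
    print(f"operator {o}: efficiency = {bcc_efficiency(o):.3f}")
```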

5. Classification of thermal-hydraulic scenarios using non-parametric techniques

    Energy Technology Data Exchange (ETDEWEB)

    Villamizar, M.; Martorell, S.; Sanchez-Saez, F.; Villanueva, J. F.; Carlos, S.; Sanchez, A.

    2013-07-01

The objective of the study is to apply a probabilistic neural network (PNN) that allows cladding temperature trajectories to be classified, from the beginning of the accident to the stabilization of the plant, on the basis of a set of inputs, and, starting from these groups, to establish models that predict the peak cladding temperature.

6. Detection of SYN Flooding Attacks Based on Non-parametric CUSUM Algorithm

    Institute of Scientific and Technical Information of China (English)

    程军; 林白; 芦建芝; 李鸥

    2006-01-01

A new detection method is proposed against the highly damaging SYN flooding attack. The method monitors the balance between the SYN and FIN (RST) packets of the TCP traffic entering the network, and uses a non-parametric cumulative sum (CUSUM) algorithm to detect changes in the balance between the numbers of SYN and FIN (RST) packets. The method does not require detailed models of normal or attack traffic, improves detection accuracy and online detection speed, and reduces computational overhead.
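
A hedged sketch of non-parametric CUSUM change detection on the normalised SYN-FIN difference, in the spirit of the method described; the drift parameter a and threshold h are illustrative choices, not the paper's values.

```python
import numpy as np

def np_cusum(x, a, h):
    """x: per-interval statistic, e.g. (SYN - FIN) / FIN.
    a: drift subtracted so that x - a is negative under normal traffic.
    h: alarm threshold. Returns index of first alarm, or -1."""
    s = 0.0
    for i, xi in enumerate(x):
        s = max(0.0, s + xi - a)      # accumulate only positive deviations
        if s > h:
            return i
    return -1

rng = np.random.default_rng(8)
normal = rng.normal(0.1, 0.05, 200)   # balanced SYN/FIN traffic
attack = rng.normal(1.5, 0.3, 50)     # flood: many SYNs with no matching FINs
print("alarm at interval:", np_cusum(np.r_[normal, attack], a=0.3, h=3.0))
```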

7. Application of Non-parametric Statistics in Market Research

    Institute of Scientific and Technical Information of China (English)

    曹小敬

    2007-01-01

Market research takes the market as its object and consists of collecting, recording, collating and analyzing data and materials related to a firm's business activities. For an enterprise, market research is like a doctor diagnosing a patient: without market research there is no way to understand the state of the market, and hence no way to formulate a business strategy.

  8. The evolution of illness phases in schizophrenia: A non-parametric item response analysis of the Positive and Negative Syndrome Scale

    Directory of Open Access Journals (Sweden)

    Anzalee Khan

    2014-06-01

    Conclusion: Findings confirm differences in symptom presentation and predominance of particular domains in subpopulations of schizophrenia. Identifying symptom domains characteristic of subpopulations may be more useful in assessing efficacy endpoints than total or subscale scores.

  9. The Impact of ICT on Educational Performance and its Efficiency in Selected EU and OECD Countries: A Non-Parametric Analysis

    Science.gov (United States)

    Aristovnik, Aleksander

    2012-01-01

    The purpose of the paper is to review some previous researches examining ICT efficiency and the impact of ICT on educational output/outcome as well as different conceptual and methodological issues related to performance measurement. Moreover, a definition, measurements and the empirical application of a model measuring the efficiency of ICT use…

10. Non-parametric inferences for the generalized Lorenz curve

    Institute of Scientific and Technical Information of China (English)

    杨宝莹; 秦更生; BELINGA-HILL Nelly E.

    2012-01-01

In this paper, we discuss the empirical likelihood-based inferences for the generalized Lorenz (GL) curve. In the settings of simple random sampling, stratified random sampling and cluster random sampling, it is shown that the limiting distributions of the empirical likelihood ratio statistics for the GL ordinate are scaled chi-squared distributions with one degree of freedom. We also derive the limiting processes of the associated empirical likelihood-based GL processes. Various confidence intervals for the GL ordinate are proposed based on the bootstrap method and the newly developed empirical likelihood theory. Extensive simulation studies are conducted to compare the relative performance of various confidence intervals for GL ordinates in terms of coverage probability and average interval length. The finite-sample performance of the empirical likelihood-based confidence bands is also illustrated in simulation studies. Finally, a real example is used to illustrate the application of the recommended intervals.
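
For orientation, the quantity being interval-estimated is GL(p) = (1/n) times the sum of the np smallest incomes. The sketch below computes a point estimate and a simple bootstrap percentile interval on synthetic data; the paper's empirical-likelihood intervals are a different, more refined construction.

```python
import numpy as np

def gl_ordinate(x, p):
    """Generalized Lorenz ordinate: mean contribution of the poorest 100p%."""
    x = np.sort(x)
    k = int(np.floor(p * x.size))
    return x[:k].sum() / x.size

rng = np.random.default_rng(9)
income = rng.lognormal(mean=10, sigma=0.8, size=500)

p = 0.5
est = gl_ordinate(income, p)
boot = np.array([gl_ordinate(rng.choice(income, income.size, replace=True), p)
                 for _ in range(2000)])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"GL({p}) = {est:.0f}, 95% bootstrap CI ({lo:.0f}, {hi:.0f})")
```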

  11. Ecotoxicology is not normal: A comparison of statistical approaches for analysis of count and proportion data in ecotoxicology.

    Science.gov (United States)

    Szöcs, Eduard; Schäfer, Ralf B

    2015-09-01

Ecotoxicologists often encounter count and proportion data that are rarely normally distributed. To meet the assumptions of the linear model, such data are usually transformed, or non-parametric methods are used if the transformed data still violate the assumptions. Generalized linear models (GLMs) allow such data to be modeled directly, without the need for transformation. Here, we compare the performance of (1) the linear model (assuming normality of transformed data), (2) GLMs (assuming a Poisson, negative binomial, or binomially distributed response), and (3) non-parametric methods. We simulated typical data mimicking low-replicated ecotoxicological experiments for two common data types (counts and proportions from counts). We compared the performance of the different methods in terms of statistical power and Type I error for detecting a general treatment effect and for determining the lowest observed effect concentration (LOEC). In addition, we outlined differences on a real-world mesocosm data set. For count data, we found that the quasi-Poisson model yielded the highest power. The negative binomial GLM resulted in increased Type I errors, which could be fixed using the parametric bootstrap. For proportions, binomial GLMs performed better than the linear model, except in determining the LOEC at extremely low sample sizes. The compared non-parametric methods had generally lower power. We recommend that counts in one-factorial experiments be analyzed using quasi-Poisson models and proportions from counts by binomial GLMs. These methods should become standard in ecotoxicology.
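
A sketch of the recommended quasi-Poisson analysis for count data in Python's statsmodels: a Poisson GLM whose dispersion is estimated from the Pearson chi-square (scale='X2'). The design, replication and dose-response numbers below are synthetic, and statsmodels stands in for the R workflow the paper presumably used.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(10)
conc = np.repeat([0, 1, 2, 4, 8], 3)                  # 5 concentrations x 3 replicates
mu = 60 * np.exp(-0.3 * conc)                         # declining abundance
counts = rng.negative_binomial(n=5, p=5 / (5 + mu))   # overdispersed counts

X = sm.add_constant(conc)
fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit(scale='X2')
print(fit.summary().tables[1])
print("estimated dispersion:", round(float(fit.scale), 2))
```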

  12. Estimation of Esfarayen Farmers Risk Aversion Coefficient and Its Influencing Factors (Nonparametric Approach

    Directory of Open Access Journals (Sweden)

    Z. Nematollahi

    2016-03-01

Full Text Available Introduction: Due to the existence of risk and uncertainty in agriculture, risk management is crucial for agricultural management. The present study was therefore designed to determine the risk aversion coefficient of Esfarayen farmers. Materials and Methods: The following approaches have been utilized to assess risk attitudes: (1) direct elicitation of utility functions, (2) experimental procedures in which individuals are presented with hypothetical questionnaires regarding risky alternatives with or without real payments, and (3) inference from observation of economic behavior. In this paper, we focused on approach (3), inference from observation of economic behavior, based on the assumption that a relationship exists between the actual behavior of a decision maker and the behavior predicted from empirically specified models. A new non-parametric method and the QP method were used to calculate the coefficient of risk aversion. We maximized the decision maker's expected utility with the E-V formulation (Freund, 1956). Ideally, in constructing a QP model, the variance-covariance matrix should be formed for each individual farmer. For this purpose, a sample of 100 farmers was selected using random sampling and their data for 14 products over the years 2008-2012 were assembled. The lowlands of Esfarayen were used since, within this area, production possibilities are rather homogeneous. Results and Discussion: The results of this study showed that there was low correlation between some of the activities, which implies opportunities for income stabilization through diversification. With respect to transitory income, Ra varies from 0.000006 to 0.000361, and the absolute coefficient of risk aversion in our sample was 0.00005. The estimated Ra values vary considerably from farm to farm. The results showed that the estimated Ra for the subsample consisting of 'non-wealthy' farmers was 0.00010. The subsample with farmers in the 'wealthy' group had an
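
A sketch of the E-V (expected value-variance) program underlying the approach: choose activity levels x to maximise E[income] - (Ra/2) x'Σx subject to a resource constraint. All margins, covariances and the Ra value below are made up for illustration.

```python
import numpy as np
from scipy.optimize import minimize

mu = np.array([300.0, 220.0, 180.0])          # expected gross margin per ha, 3 crops
Sigma = np.array([[900.0, 120.0, 60.0],       # income covariance matrix
                  [120.0, 400.0, 80.0],
                  [60.0,  80.0, 250.0]])
Ra = 0.01                                     # illustrative absolute risk aversion
land = 10.0                                   # hectares available

neg_eu = lambda x: -(mu @ x - 0.5 * Ra * x @ Sigma @ x)
cons = [{"type": "ineq", "fun": lambda x: land - x.sum()}]  # land constraint
res = minimize(neg_eu, x0=np.full(3, land / 3), bounds=[(0, None)] * 3,
               constraints=cons, method="SLSQP")
print("optimal crop areas (ha):", np.round(res.x, 2))
```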

  13. A fully Bayesian approach to the parcel-based detection-estimation of brain activity in fMRI

    Energy Technology Data Exchange (ETDEWEB)

    Makni, S. [Univ Oxford, John Radcliffe Hosp, Oxford Ctr Funct Magnet Resonance Imaging Brain, Oxford OX3 9DU (United Kingdom); Idier, J. [IRCCyN CNRS, Nantes (France); Vincent, T.; Ciuciu, P. [CEA, NeuroSpin, Gif Sur Yvette (France); Vincent, T.; Dehaene-Lambertz, G.; Ciuciu, P. [Inst Imagerie Neurofonctionnelle, IFR 49, Paris (France); Thirion, B. [INRIA Futurs, Orsay (France); Dehaene-Lambertz, G. [INSERM, NeuroSpin, U562, Gif Sur Yvette (France)

    2008-07-01

Within-subject analysis in fMRI essentially addresses two problems, i.e., the detection of activated brain regions in response to an experimental task and the estimation of the underlying dynamics, also known as the characterisation of the Hemodynamic Response Function (HRF). So far, both issues have been treated sequentially, while it is known that the HRF model has a dramatic impact on the localisation of activations and that the HRF shape may vary from one region to another. In this paper, we conciliate both issues in a region-based joint detection-estimation framework that we develop in the Bayesian formalism. Instead of considering basis functions to account for spatial variability, spatially adaptive General Linear Models are built upon region-based non-parametric estimation of brain dynamics. Regions are first identified as functionally homogeneous parcels in the mask of the grey matter using a specific procedure [Thirion, B., Flandin, G., Pinel, P., Roche, A., Ciuciu, P., Poline, J.B., August 2006. Dealing with the shortcomings of spatial normalization: Multi-subject parcellation of fMRI datasets. Hum. Brain Mapp. 27 (8), 678-693]. Then, in each parcel, prior information is embedded to constrain this estimation. Detection is achieved by modelling activating, deactivating and non-activating voxels through mixture models within each parcel. From the posterior distribution, we infer upon the model parameters using Markov Chain Monte Carlo (MCMC) techniques. Bayesian model comparison allows us to show, first on artificial datasets, that inhomogeneous gamma-Gaussian mixture models outperform Gaussian mixtures in terms of the sensitivity/specificity trade-off and, second, that it is worthwhile modelling serial correlation through an AR(1) noise process at low signal-to-noise ratio (SNR). Our approach is then validated on an fMRI experiment that studies habituation to auditory sentence repetition. This phenomenon is clearly recovered as well as the hierarchical temporal

  14. A discriminative model-constrained EM approach to 3D MRI brain tissue classification and intensity non-uniformity correction

    Energy Technology Data Exchange (ETDEWEB)

    Wels, Michael; Hornegger, Joachim [Pattern Recognition Lab, Department of Computer Science, Friedrich-Alexander University Erlangen-Nuremberg, Martensstr. 3, 91058 Erlangen (Germany); Zheng Yefeng; Comaniciu, Dorin [Corporate Research and Technologies, Siemens Corporate Technology, 755 College Road East, Princeton, NJ 08540 (United States); Huber, Martin, E-mail: michael.wels@informatik.uni-erlangen.de [Corporate Research and Technologies, Siemens Corporate Technology, Guenther-Scharowsky-Str. 1, 91058 Erlangen (Germany)

    2011-06-07

    We describe a fully automated method for tissue classification, which is the segmentation into cerebral gray matter (GM), cerebral white matter (WM), and cerebral spinal fluid (CSF), and intensity non-uniformity (INU) correction in brain magnetic resonance imaging (MRI) volumes. It combines supervised MRI modality-specific discriminative modeling and unsupervised statistical expectation maximization (EM) segmentation into an integrated Bayesian framework. While both the parametric observation models and the non-parametrically modeled INUs are estimated via EM during segmentation itself, a Markov random field (MRF) prior model regularizes segmentation and parameter estimation. Firstly, the regularization takes into account knowledge about spatial and appearance-related homogeneity of segments in terms of pairwise clique potentials of adjacent voxels. Secondly and more importantly, patient-specific knowledge about the global spatial distribution of brain tissue is incorporated into the segmentation process via unary clique potentials. They are based on a strong discriminative model provided by a probabilistic boosting tree (PBT) for classifying image voxels. It relies on the surrounding context and alignment-based features derived from a probabilistic anatomical atlas. The context considered is encoded by 3D Haar-like features of reduced INU sensitivity. Alignment is carried out fully automatically by means of an affine registration algorithm minimizing cross-correlation. Both types of features do not immediately use the observed intensities provided by the MRI modality but instead rely on specifically transformed features, which are less sensitive to MRI artifacts. Detailed quantitative evaluations on standard phantom scans and standard real-world data show the accuracy and robustness of the proposed method. They also demonstrate relative superiority in comparison to other state-of-the-art approaches to this kind of computational task: our method achieves average
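
As a rough illustration of the unsupervised EM core that such frameworks build on, the toy below fits a three-class Gaussian mixture to synthetic 1D intensities; the discriminative PBT prior, the MRF regularization and the INU correction described in the abstract are deliberately not reproduced.

```python
import numpy as np

rng = np.random.default_rng(11)
intensity = np.r_[rng.normal(40, 6, 800),     # CSF-like
                  rng.normal(90, 8, 1500),    # GM-like
                  rng.normal(130, 7, 1200)]   # WM-like

K = 3
pi = np.full(K, 1 / K)
mu = np.percentile(intensity, [25, 50, 75]).astype(float)
var = np.full(K, intensity.var())

for _ in range(50):                           # EM iterations
    # E-step: posterior responsibilities of each class for each voxel
    dens = (np.exp(-0.5 * (intensity[:, None] - mu) ** 2 / var)
            / np.sqrt(2 * np.pi * var)) * pi
    resp = dens / dens.sum(axis=1, keepdims=True)
    # M-step: re-estimate mixture weights, means and variances
    nk = resp.sum(axis=0)
    pi = nk / intensity.size
    mu = (resp * intensity[:, None]).sum(axis=0) / nk
    var = (resp * (intensity[:, None] - mu) ** 2).sum(axis=0) / nk

print("class means:", np.round(mu, 1))        # should approach 40, 90, 130
```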

  15. A geostatistical approach for describing spatial pattern in stream networks

    Science.gov (United States)

    Ganio, L.M.; Torgersen, C.E.; Gresswell, R.E.

    2005-01-01

    The shape and configuration of branched networks influence ecological patterns and processes. Recent investigations of network influences in riverine ecology stress the need to quantify spatial structure not only in a two-dimensional plane, but also in networks. An initial step in understanding data from stream networks is discerning non-random patterns along the network. On the other hand, data collected in the network may be spatially autocorrelated and thus not suitable for traditional statistical analyses. Here we provide a method that uses commercially available software to construct an empirical variogram to describe spatial pattern in the relative abundance of coastal cutthroat trout in headwater stream networks. We describe the mathematical and practical considerations involved in calculating a variogram using a non-Euclidean distance metric to incorporate the network pathway structure in the analysis of spatial variability, and use a non-parametric technique to ascertain if the pattern in the empirical variogram is non-random.
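
A generic sketch of the central computation: an empirical semivariogram driven by a precomputed distance matrix, so that a network (non-Euclidean) hydrologic distance can be dropped in wherever the Euclidean one would go. The distances and abundance data below are synthetic stand-ins.

```python
import numpy as np

def empirical_variogram(d, z, bins):
    """d: n x n symmetric distance matrix (e.g. along-network distances);
    z: n observations; bins: lag bin edges. Returns (lag centres, gamma)."""
    iu = np.triu_indices_from(d, k=1)                  # each pair once
    lags, sqdiff = d[iu], 0.5 * (z[iu[0]] - z[iu[1]]) ** 2
    which = np.digitize(lags, bins)
    gamma = np.array([sqdiff[which == b].mean() for b in range(1, len(bins))])
    centres = 0.5 * (bins[:-1] + bins[1:])
    return centres, gamma

rng = np.random.default_rng(12)
n = 120
pos = np.sort(rng.uniform(0, 100, n))                  # positions along a stream
d = np.abs(pos[:, None] - pos[None, :])                # stand-in for network distance
z = np.sin(pos / 15) + rng.normal(0, 0.3, n)           # autocorrelated abundance
print(empirical_variogram(d, z, np.linspace(0, 50, 11)))
```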

  17. Segmenting Multiple Sclerosis Lesions Using a Spatially Constrained K-Nearest Neighbour Approach

    DEFF Research Database (Denmark)

    Lyksborg, Mark; Larsen, Rasmus; Sørensen, Per Soelberg;

    2012-01-01

We propose a method for the segmentation of Multiple Sclerosis lesions. The method is based on probability maps derived from a K-Nearest Neighbours classification. These are used as a non-parametric likelihood in a Bayesian formulation with a prior that assumes connectivity of neighbouring voxels..., the diffusion MRI measures of Fractional Anisotropy (FA), Mean Diffusivity (MD) and several spatial features. Results show a benefit from the inclusion of diffusion primarily to the most difficult cases. Results show that combining probabilistic K-Nearest Neighbour with a Markov Random Field formulation leads... to a slight improvement of segmentations...

  18. Complementary Health Approaches

    Science.gov (United States)

    ... on some complementary approaches, such as acupuncture and yoga, but there have been fewer studies on other approaches, so much less is known about them. The National Institutes of Health (NIH) is sponsoring research to learn more about ...

  19. Evaluating six soft approaches

    Directory of Open Access Journals (Sweden)

    Lene Sørensen

    2008-09-01

    Full Text Available The paper introduces and evaluates six soft approaches used in strategy development and planning. We take a planner’s perspective on discussing the concepts of strategy development and planning. This means that we see strategy development and planning as learning processes based on Ackoff’s interactive planning principles to be supported by soft approaches in carrying out the principles in action. These six soft approaches are suitable for supporting various steps of the strategy development and planning process. These are the SWOT analysis, the Future Workshop, the Scenario methodology, Strategic Option Development and Analysis, Strategic Choice Approach and Soft Systems Methodology. Evaluations of each methodology are carried out using a conceptual framework in which the organisation, the result, the process and the technology of the specific approach are taken into consideration. Using such a conceptual framework for evaluations of soft approaches increases the understanding of them, their transparency, and their usability in practice.

  20. The sustainable livelihoods approach

    DEFF Research Database (Denmark)

    Oelofse, Myles; Jensen, Henning Høgh

    2008-01-01

    food chain has on producers and their families, an analysis was conducted of the use of the Sustainable Livelihoods Approach (SLA). The SLA provides a holistic and integrative approach which researchers can use as the overriding frame for their research. The application of the approach is recommended...... as it enables us to maintain important elements of the sustainability vision, yet emphasises that a number of assets influence farmers' livelihoods and it maintains the focus on salience, legitimacy, and credibility in the research....

  1. Evaluating six soft approaches

    DEFF Research Database (Denmark)

    Sørensen, Lene Tolstrup; Vidal, Rene Victor Valqui

    2008-01-01

    's interactive planning principles to be supported by soft approaches in carrying out the principles in action. These six soft approaches are suitable forsupporting various steps of the strategy development and planning process. These are the SWOT analysis, the Future Workshop, the Scenario methodology......, Strategic Option Development and Analysis, Strategic Choice Approach and Soft Systems Methodology. Evaluations of each methodology are carried out using a conceptual framework in which the organisation, the result, the process and the technology of the specific approach are taken into consideration. Using...

  3. Intradural anterior transpetrosal approach.

    Science.gov (United States)

    Ichimura, Shinya; Hori, Satoshi; Hecht, Nils; Czabanka, Marcus; Vajkoczy, Peter

    2016-10-01

The standard anterior transpetrosal approach (ATPA) for petroclival lesions is fundamentally an epidural approach and has been practiced for many decades quite successfully. However, this approach has some disadvantages, such as epidural venous bleeding around the foramen ovale. We describe here our experience with a modified technique for anterior petrosectomy via an intradural approach that overcomes these disadvantages. Five patients with petroclival lesions underwent surgery via the intradural ATPA. The intraoperative hallmarks are detailed, and surgical results are reported. Total removal of the lesions was achieved in two patients with petroclival meningioma and two patients with pontine cavernoma, whereas subtotal removal was achieved in one patient with petroclival meningioma, without significant morbidity. No patient experienced cerebrospinal fluid leakage. The intradural approach allows the extent of anterior petrosectomy to be tailored to the individually required exposure, and the surgical procedure appeared to be more straightforward than via the epidural route. Caveats encountered with the approach were the temporal basal veins, which could be spared, as well as identification of the petrous apex due to the lack of familiar epidural landmarks. The risk of injury to the temporal bridging veins is higher in this approach than in the epidural approach. The intradural approach is recommended in patients with a large epidural venous route, such as a sphenobasal or sphenopetrosal vein. Navigation based on bone-window computed tomography is useful to identify the petrous apex.

  4. Alternative Auditing Approaches

    Energy Technology Data Exchange (ETDEWEB)

    Kandt, Alicen J [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-09-15

    This presentation for the 2017 Energy Exchange in Tampa, Florida, offers information about advanced auditing technologies and techniques including alternative auditing approaches and considerations and caveats.

  6. A Comparison on the Audiolingual Approach and the Communicative Approach

    Institute of Scientific and Technical Information of China (English)

    代海娜

    2015-01-01

The audiolingual approach and the communicative approach are two important approaches in language teaching. In this paper, some of their differences, as well as the advantages and disadvantages of each, are discussed, in order to show how these approaches can be put to good use in language teaching.

  7. The TLC Approach.

    Science.gov (United States)

    Welker, William A.

    2002-01-01

    Notes how the author has developed the Teaching and Learning Cues (TLC) approach, an offspring of textbook organizational patterns instruction that stresses the significance of certain words and phrases in reading. Concludes that with the TLC approach, students learn to appreciate the important role cue words and phrases play in understanding…

  8. Articulating Design Approaches?

    DEFF Research Database (Denmark)

    Kensing, Finn; Bødker, Keld; Simonsen, Jesper

We are working on an approach for designing CSCW systems, since we advocate the importance of generalizing from our own work practice as designers and from studies of designers working under industrial conditions. We use the term approach as something in between commodified methods and isolated techni...

  9. Stuttering-Psycholinguistic Approach

    Science.gov (United States)

    Hategan, Carolina Bodea; Anca, Maria; Prihoi, Lacramioara

    2012-01-01

This research promotes the psycholinguistic paradigm, focusing on delimiting several specific particularities of stuttering pathology. A structural approach to the various sides of language demonstrates both the recurrent aspects found within the specialized national and international literature and the psycholinguistic approach's dependence on the features of the…

  10. Ten practice redesign approaches.

    Science.gov (United States)

    Slayton, Val

    2013-01-01

    As healthcare delivery continues to evolve at a rapid pace, practices need to consider redesign approaches to stay ahead of the pack. From national policy and private payer initiatives to societal macro trends and the growing use of mobile technologies, delivering value, understanding customer needs, and assessing satisfaction are important elements to achieve and maintain success. This article discusses 10 practice redesign approaches.

  11. Modular Approach for Ethics

    Science.gov (United States)

    Wyne, Mudasser F.

    2010-01-01

It is hard to define a single set of ethics that will cover an entire community of computer users. In this paper, the issue is addressed in reference to codes of ethics implemented by various professionals, institutes and organizations. The paper presents a higher-level model using a hierarchical approach. The code developed using this approach could be…

  12. Approaches to understand culture

    DEFF Research Database (Denmark)

    Rasmussen, Lauge Baungaard; Rauner, Felix

    1996-01-01

Different approaches to understanding the concept of culture are presented and evaluated. The author's concept of culture is defined, and different aspects of the concept are discussed.

  14. When is Menzerath-Altmann law mathematically trivial? A new approach.

    Science.gov (United States)

    Ferrer-i-Cancho, Ramon; Hernández-Fernández, Antoni; Baixeries, Jaume; Dębowski, Łukasz; Mačutek, Ján

    2014-12-01

    Menzerath's law, the tendency of Z (the mean size of the parts) to decrease as X (the number of parts) increases, is found in language, music and genomes. Recently, it has been argued that the presence of the law in genomes is an inevitable consequence of the fact that Z=Y/X, which would imply that Z scales with X as Z ∼ 1/X. That scaling is a very particular case of the Menzerath-Altmann law that has been rejected by means of a correlation test between X and Y in genomes, where X is the number of chromosomes of a species, Y its genome size in bases and Z the mean chromosome size. Here we review the statistical foundations of that test and consider three non-parametric tests based upon different correlation metrics and one parametric test to evaluate whether Z ∼ 1/X in genomes. The most powerful test is a new non-parametric one based upon the correlation ratio, which is able to reject Z ∼ 1/X in nine out of 11 taxonomic groups and detect a borderline group. Rather than a fact, Z ∼ 1/X is a baseline that real genomes do not meet. The view of the Menzerath-Altmann law as inevitable is seriously flawed.
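    The abstract only names the test; as a rough illustration of how a correlation-ratio permutation test of this kind can be run, here is a minimal Python sketch. It assumes (as one reading of the abstract) that rejecting a constant conditional mean of Y given X amounts to rejecting Z ∼ 1/X, since Z = Y/X; the function names and toy data are illustrative, not the paper's actual code or data.

```python
import numpy as np

def correlation_ratio(x, y):
    """Correlation ratio (eta) of y on the grouping variable x: the share
    of y's spread explained by the per-group means of y for each distinct
    value of x (0 = no dependence of the conditional mean on x)."""
    x, y = np.asarray(x), np.asarray(y, dtype=float)
    total = np.sum((y - y.mean()) ** 2)
    between = sum(
        (x == v).sum() * (y[x == v].mean() - y.mean()) ** 2
        for v in np.unique(x)
    )
    return np.sqrt(between / total)

def permutation_test(x, y, n_perm=5000, seed=0):
    """Permutation p-value for H0: eta(y|x) = 0. With Z = Y/X, a constant
    conditional mean of Y given X is equivalent to E[Z|X] scaling as 1/X,
    so a small p-value speaks against the Z ~ 1/X baseline."""
    rng = np.random.default_rng(seed)
    observed = correlation_ratio(x, y)
    null = [correlation_ratio(x, rng.permutation(y)) for _ in range(n_perm)]
    p = (1 + sum(e >= observed for e in null)) / (1 + n_perm)
    return observed, p

# Toy data (hypothetical, not the paper's genome data): x = chromosome
# counts, y = genome sizes whose mean clearly grows with x.
x = np.repeat([2, 4, 8, 16], 25)
y = 1000.0 + 50.0 * x + np.random.default_rng(1).normal(0.0, 100.0, x.size)
eta, p = permutation_test(x, y)
print(f"eta = {eta:.3f}, permutation p = {p:.4f}")  # small p rejects Z ~ 1/X
```

    Under this formulation, a small p-value indicates that the conditional mean of Y varies with X, i.e. that the Z ∼ 1/X baseline does not hold for the data at hand.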

  15. A new approach for product cost estimation using data envelopment analysis

    Directory of Open Access Journals (Sweden)

    Adil Salam

    2012-10-01

    Full Text Available Cost estimation of new products has always been difficult, as only a few design, manufacturing and operational features will be known. In these situations, parametric or non-parametric methods are commonly used to estimate the cost of a product given the corresponding cost drivers. The parametric models use an a priori determined cost function whose parameters are evaluated from historical data. Non-parametric methods, on the other hand, attempt to fit curves to the historical data without a predetermined function. In both methods, it is assumed that the historical data used in the analysis are a true representation of the relation between the cost drivers and the corresponding costs. However, because of efficiency variations of the manufacturers and suppliers, changes in supplier selections, market fluctuations, and several other reasons, certain costs in the historical data may be too high, whereas other costs may represent better deals for their corresponding cost drivers. Thus, it may be important to rank the historical data, identify benchmarks, and estimate the target costs of the product based on these benchmarks. In this paper, a novel adaptation of cost drivers and cost data is introduced in order to use data envelopment analysis for the purpose of ranking cost data and identifying benchmarks, and then estimating the target costs of a new product based on these benchmarks. An illustrative case study is presented for the cost estimation of landing gears of an aircraft manufactured by an aerospace company located in Montreal, Canada.
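    To make the benchmarking idea concrete, the following is a minimal sketch of an input-oriented, constant-returns DEA model with a single input (the cost) and the cost drivers treated as outputs, solved as a linear program. This is a generic CCR formulation, not the paper's exact adaptation; all names and data are hypothetical.

```python
import numpy as np
from scipy.optimize import linprog

def dea_cost_efficiency(costs, drivers):
    """Input-oriented CCR DEA with a single input (cost) and the cost
    drivers as outputs. Returns one efficiency score per unit; a score
    of 1.0 marks a benchmark on the efficient frontier.

    costs   : (n,) observed costs of past products
    drivers : (n, m) cost-driver values of those products
    """
    costs = np.asarray(costs, float)
    drivers = np.asarray(drivers, float)
    n, m = drivers.shape
    scores = []
    for o in range(n):
        # Decision variables: [theta, lambda_1 .. lambda_n].
        c = np.zeros(n + 1)
        c[0] = 1.0                          # minimise theta
        A_ub = np.zeros((1 + m, n + 1))
        b_ub = np.zeros(1 + m)
        A_ub[0, 0] = -costs[o]              # sum(l_j * c_j) <= theta * c_o
        A_ub[0, 1:] = costs
        for r in range(m):                  # sum(l_j * d_rj) >= d_ro
            A_ub[1 + r, 1:] = -drivers[:, r]
            b_ub[1 + r] = -drivers[o, r]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                      bounds=[(None, None)] + [(0, None)] * n)
        scores.append(res.x[0])
    return np.array(scores)

# Hypothetical historical data: cost of four past products vs. two drivers.
costs = [120.0, 95.0, 150.0, 110.0]
drivers = [[10, 3], [9, 3], [11, 2], [10, 4]]
print(dea_cost_efficiency(costs, drivers))  # entries equal to 1.0 are benchmarks
```

    A unit with a score below 1.0 could, on the evidence of the benchmarks, deliver the same driver values at a proportionally lower cost; that benchmark cost is one plausible reading of the paper's "target cost".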

  16. Medicinal plants in the cultural landscape of a Mapuche-Tehuelche community in arid Argentine Patagonia: an eco-sensorial approach

    Science.gov (United States)

    2014-01-01

    Background The taste and smell of medicinal plants and their relation to the cultural landscape of a Mapuche-Tehuelche community in the Patagonian steppe were investigated. We assume that the landscape, as a source of therapeutic resources, is perceived, classified and named according to different symbolic, ecological and utilitarian criteria, which are influenced by the chemosensorial appearance of the medicinal plants valued by inhabitants. Methods Information relating to the cultural landscape experienced by 18 inhabitants, representing 85% of the families, in terms of medicinal plants, knowledge of species and their organoleptic perception, was obtained through participant observation, interviews and free listing. The data were examined using a qualitative and quantitative approach, including discourse analysis and non-parametric statistics. Results Informants use 121 medicinal species, obtained from both wild and non-wild environments, most of which (66%) present aroma and/or taste. It was found that the plants with the highest use consensus, used for digestive, respiratory, cardio-vascular, analgesic-anti-inflammatory, obstetric-gynaecological and genito-urinary complaints, have the highest frequencies of citations reporting flavor; and those with the highest frequencies relating to digestive, analgesic-anti-inflammatory and cultural syndromes present the highest frequencies of aroma. Flavor and/or aroma are interpreted as strong or soft, and the strongest are associated with treatment of supernatural ailments. Also, taste is a distinctive trait for most of the species collected in all natural units of the landscape, while aroma is more closely associated with species growing at higher altitudes. The local pharmacopeia is also enriched with plants that come from more distant phytogeographical environments, such as the Andean forest and the Patagonian Monte, which are obtained through barter with neighboring populations. Herbal products are also obtained in…

  17. Medicinal plants in the cultural landscape of a Mapuche-Tehuelche community in arid Argentine Patagonia: an eco-sensorial approach.

    Science.gov (United States)

    Molares, Soledad; Ladio, Ana

    2014-08-26

    The taste and smell of medicinal plants and their relation to the cultural landscape of a Mapuche-Tehuelche community in the Patagonian steppe were investigated. We assume that the landscape, as a source of therapeutic resources, is perceived, classified and named according to different symbolic, ecological and utilitarian criteria, which are influenced by the chemosensorial appearance of the medicinal plants valued by inhabitants. Information relating to the cultural landscape experienced by 18 inhabitants, representing 85% of the families, in terms of medicinal plants, knowledge of species and their organoleptic perception, was obtained through participant observation, interviews and free listing. The data were examined using a qualitative and quantitative approach, including discourse analysis and non-parametric statistics. Informants use 121 medicinal species, obtained from both wild and non-wild environments, most of which (66%) present aroma and/or taste. It was found that the plants with the highest use consensus, used for digestive, respiratory, cardio-vascular, analgesic-anti-inflammatory, obstetric-gynaecological and genito-urinary complaints, have the highest frequencies of citations reporting flavor; and those with the highest frequencies relating to digestive, analgesic-anti-inflammatory and cultural syndromes present the highest frequencies of aroma. Flavor and/or aroma are interpreted as strong or soft, and the strongest are associated with treatment of supernatural ailments. Also, taste is a distinctive trait for most of the species collected in all natural units of the landscape, while aroma is more closely associated with species growing at higher altitudes. The local pharmacopeia is also enriched with plants that come from more distant phytogeographical environments, such as the Andean forest and the Patagonian Monte, which are obtained through barter with neighboring populations. Herbal products are also obtained in regional shops. The practices of…

  18. Combining empirical approaches and error modelling to enhance predictive uncertainty estimation in extrapolation for operational flood forecasting. Tests on flood events on the Loire basin, France.

    Science.gov (United States)

    Berthet, Lionel; Marty, Renaud; Bourgin, François; Viatgé, Julie; Piotte, Olivier; Perrin, Charles

    2017-04-01

    An increasing number of operational flood forecasting centres assess the predictive uncertainty associated with their forecasts and communicate it to the end users. This information can match the end users' needs (i.e., prove useful for efficient crisis management) only if it is reliable: reliability is therefore a key quality of operational flood forecasts. In 2015, the French national and regional flood forecasting services (Vigicrues network; www.vigicrues.gouv.fr) implemented a framework to compute quantitative discharge and water level forecasts and to assess the predictive uncertainty. Among the possible technical options to achieve this goal, a statistical analysis of the past forecasting errors of deterministic models was selected (QUOIQUE method, Bourgin, 2014). It is a data-based and non-parametric approach resting on as few assumptions as possible about the mathematical structure of the forecasting error. In particular, a very simple assumption is made regarding the predictive uncertainty distributions for large events outside the range of the calibration data: the multiplicative error distribution is assumed to be constant, whatever the magnitude of the flood. Indeed, the predictive distributions may not be reliable in extrapolation. However, estimating the predictive uncertainty for these rare events is crucial when major floods are of concern. In order to improve forecast reliability for major floods, an attempt is made at combining the operational strength of the empirical statistical analysis with simple error modelling. Since the heteroscedasticity of forecast errors can considerably weaken predictive reliability for large floods, this error modelling is based on the log-sinh transformation, which has been shown to significantly reduce the heteroscedasticity of the transformed error in a simulation context, even for flood peaks (Wang et al., 2012). Exploratory tests on some operational forecasts issued during the recent floods experienced in…
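    For readers unfamiliar with the transform mentioned above, the following sketch illustrates the log-sinh transformation of Wang et al. (2012) and its variance-stabilising effect on multiplicative (heteroscedastic) forecast errors. The parameters and synthetic flows are made up for illustration; operational values would be calibrated to the basin, and this is not the Vigicrues implementation.

```python
import numpy as np

def log_sinh(y, a, b):
    """Log-sinh transform of Wang et al. (2012): z = log(sinh(a + b*y)) / b.
    It acts like a log transform for small flows and approaches a linear
    shift for large ones, damping the growth of error variance with flow."""
    return np.log(np.sinh(a + b * y)) / b

def inv_log_sinh(z, a, b):
    """Inverse transform: y = (arcsinh(exp(b*z)) - a) / b."""
    return (np.arcsinh(np.exp(b * z)) - a) / b

# Illustrative (made-up) parameters and synthetic flows, not calibrated values.
a, b = 0.01, 1e-4
rng = np.random.default_rng(42)
q_obs = rng.gamma(shape=2.0, scale=300.0, size=5000)    # "observed" flows
q_fc = q_obs * rng.lognormal(0.0, 0.15, q_obs.size)     # multiplicative errors

assert np.allclose(inv_log_sinh(log_sinh(q_obs, a, b), a, b), q_obs)

raw_err = q_fc - q_obs
trans_err = log_sinh(q_fc, a, b) - log_sinh(q_obs, a, b)

# The error spread grows strongly with flow in raw space and considerably
# less in transformed space.
low, high = q_obs < np.median(q_obs), q_obs >= np.median(q_obs)
for label, err in [("raw", raw_err), ("log-sinh", trans_err)]:
    print(f"{label:9s} std, low flows: {err[low].std():10.2f}"
          f"   high flows: {err[high].std():10.2f}")
```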

  19. Life Span Developmental Approach

    Directory of Open Access Journals (Sweden)

    Ali Eryilmaz

    2011-03-01

    Full Text Available The Life Span Developmental Approach examines the development of individuals from birth to death. It is a multi-disciplinary approach related to disciplines such as psychology, psychiatry, sociology, anthropology and geriatrics, and it holds that development is not completed in adulthood but continues throughout the life course. Development is a complex process that also encompasses dying and death. This approach carefully investigates the development of individuals with respect to developmental stages. It suggests that scientific disciplines should not explain developmental facts only through age changes. Along with aging, cognitive, biological, and socioemotional development throughout life should also be considered to provide a reasonable and acceptable context, guideposts, and reasonable expectations for the person. There are three important subjects with which the life span developmental approach deals: nature vs. nurture, continuity vs. discontinuity, and change vs. stability. Researchers using the life span developmental approach gather and produce knowledge on these three most important domains of individual development with their unique scientific methodology.

  20. Otoplasty: A graduated approach.

    Science.gov (United States)

    Foda, H M

    1999-01-01

    Numerous otoplastic techniques have been described for the correction of protruding ears. Technique selection in otoplasty should be done only after careful analysis of the abnormal anatomy responsible for the protruding ear deformity. A graduated surgical approach is presented which is designed to address all factors contributing to the presenting auricular deformity. The approach starts with the more conservative cartilage-sparing suturing techniques, then proceeds to incorporate other, more aggressive cartilage-weakening maneuvers. Applying this approach resulted in better long-term results with less postoperative lateralization than that encountered when using the cartilage-sparing techniques alone.

  1. Introducing Systems Approaches

    Science.gov (United States)

    Reynolds, Martin; Holwell, Sue

    Systems Approaches to Managing Change brings together five systems approaches to managing complex issues, each having a proven track record of over 25 years. The five approaches are: System Dynamics (SD), developed originally in the late 1950s by Jay Forrester; the Viable Systems Model (VSM), developed originally in the late 1960s by Stafford Beer; Strategic Options Development and Analysis (SODA, with cognitive mapping), developed originally in the 1970s by Colin Eden; Soft Systems Methodology (SSM), developed originally in the 1970s by Peter Checkland; and Critical Systems Heuristics (CSH), developed originally in the late 1970s by Werner Ulrich.

  2. Flipped Classroom Approach

    Directory of Open Access Journals (Sweden)

    Fezile Ozdamli

    2016-07-01

    Full Text Available The flipped classroom is an active, student-centered approach formed to increase the quality of time within class. Although its applications have so far been mostly in the physical sciences, this approach has recently attracted the attention of educators and researchers in other disciplines as well. Flipped classroom learning, which is spreading rapidly around the world, is not well recognized in our country. That is why the aim of this study is to draw attention to its potential in the field of education and to make it better known to educators and researchers. With this aim, the study explains what the flipped classroom approach is, presents flipped classroom technology models, and discusses the approach's advantages and limitations.

  3. Revitalizing the setting approach

    DEFF Research Database (Denmark)

    Bloch, Paul; Toft, Ulla; Reinbach, Helene Christine

    2014-01-01

    Background: The concept of health promotion rests on aspirations aiming at enabling people to increase control over and improve their health. Health promotion action is facilitated in settings such as schools, homes and work places. As a contribution to the promotion of healthy lifestyles, we have… The supersetting approach argues for optimised effectiveness of health promotion action through integrated efforts and long-lasting partnerships involving a diverse range of actors in public institutions, private enterprises, non-governmental organisations and civil society. Discussion: The supersetting approach is a further development of the setting approach in which the significance of integrated and coordinated actions together with a participatory approach are emphasised and important principles are specified, all of which contribute to the attainment of synergistic effects and sustainable…

  4. The transformativity approach

    DEFF Research Database (Denmark)

    Holm, Isak Winkel; Lauta, Kristian Cedervall

    2017-01-01

    During the last five to ten years, a considerable body of research has begun to explore how disasters, real and imagined, trigger social transformations. Even if the contributions to this research stem from a multitude of academic disciplines, we argue in the article that they constitute an identifiable and promising approach for future disaster research. We suggest naming it the transformativity approach. Whereas the vulnerability approach explores the social causation of disasters, the transformativity approach reverses the direction of the gaze and investigates the social transformation brought about by disasters. Put simply, the concept of vulnerability is about the upstream causes of disaster and the concept of transformativity about the downstream effects. By discussing three recent contributions (by the historian Greg Bankoff, the legal sociologist Michelle Dauber…

  5. Revitalizing the setting approach

    DEFF Research Database (Denmark)

    Bloch, Paul; Toft, Ulla; Reinbach, Helene Christine

    2014-01-01

    Background: The concept of health promotion rests on aspirations aiming at enabling people to increase control over and improve their health. Health promotion action is facilitated in settings such as schools, homes and work places. As a contribution to the promotion of healthy lifestyles, we have… Discussion: The supersetting approach is a further development of the setting approach in which the significance of integrated and coordinated actions together with a participatory approach are emphasised and important principles are specified, all of which contribute to the attainment of synergistic effects and sustainable… Summary: The supersetting approach is a relevant and useful conceptual framework for developing intervention-based initiatives for sustainable impact in community health promotion. It strives to attain synergistic effects from activities that are carried out in multiple settings in a coordinated manner. The supersetting…

  6. Hormonal approach in Hirsutism

    OpenAIRE

    Abdullah, Nusratuddin

    2015-01-01

    Hirsutism is a clinical sign that primarily indicates androgen excess and is often caused by relatively benign functional conditions. Hirsutism requires a careful and systematic clinical evaluation combined with a rational approach to treatment. Initiate therapy only in patients who give informed consent after a complete explanation of the potential benefits and risks of a particular treatment and alternative approaches. The goals of the correct management of hirsutism are to ame...

  7. Sustainable fashion: New approaches

    OpenAIRE

    Niinimäki, Kirsi

    2013-01-01

    This publication is intended to be used as a source of inspiration for designers and companies, and all stakeholders whose interest lies in the area of sustainable fashion. While the strategies for sustainability are complex and approaches are many, this publication presents only a few ways to approach sustainable fashion. I hope the publication offers inspiration on how to make positive change in current practices and how to effect new mindsets, creating transformative fashion. Theoretica...

  8. The electronic approach VSCF

    Science.gov (United States)

    Hladky, Mark

    The advantages of the high-power electronic approach to variable speed constant frequency (VSCF) systems are examined. It is shown, in particular, how the inherent flexibility of the VSCF approach allows it to be configured for different applications, contributing to the evolution towards the more electric aircraft. The discussion covers criteria for selection, aircraft electric power system architectures, power level, performance, reliability, and maintainability. The future trends of the VSCF converter technology are also briefly discussed.

  9. Theoretical Approaches to Coping

    Directory of Open Access Journals (Sweden)

    Sofia Zyga

    2013-01-01

    Full Text Available Introduction: Dealing with stress requires conscious effort; it cannot be perceived as equal to the individual's spontaneous reactions. The intentional management of stress must not be confused with defense mechanisms. Coping differs from adjustment in that the latter is more general, has a broader meaning and includes diverse ways of facing a difficulty. Aim: An exploration of the definition of the term "coping", the function of the coping process and its differentiation from other similar meanings, through a literature review. Methodology: Three theoretical approaches to coping are introduced: the psychoanalytic approach; approaching by characteristics; and the Lazarus and Folkman interactive model. Results: The strategic methods of the coping approaches are described, and the article ends with a review of the approaches, including the functioning of the stress-coping process, the classification types of coping strategies in stress-inducing situations, and a criticism of coping approaches. Conclusions: The comparison of coping in different situations is difficult, if not impossible. The coping process is a slow process, so an individual may select one method of coping under one set of circumstances and a different strategy at some other time. Such selection of strategies takes place as the situation changes.

  10. A BSC-DEA approach to measure the relative efficiency of service industry: A case study of banking sector

    Directory of Open Access Journals (Sweden)

    M. B. Aryanezhad

    2011-04-01

    Full Text Available Performance evaluation plays an important role in determining the faults and difficulties of any organization, as well as in attempting to increase capabilities and improve activities. Data envelopment analysis (DEA), as a non-parametric method, has been one of the most important and significant management tools for measuring output or efficiency. In this paper, we propose a method that utilizes the balanced scorecard (BSC) as a tool for designing the performance evaluation indices of an organization. The integrated BSC-DEA has been applied as an empirical case to a major private bank organization and the results are analyzed.

  11. Insights Through Performative Approaches

    Directory of Open Access Journals (Sweden)

    Martina Battisti

    2008-05-01

    Full Text Available This script aims to explore how performative approaches can be used to enhance the understanding of social situations by going beyond the presenting or outermost layer of a problem. The script evolves in six acts and focuses on a group of academics and consultants who meet to develop a theoretical understanding of performative approaches, to experiment with performative approaches by applying them to a consulting case, and finally to reflect on the learning experiences and the understanding of social situations implicit in the case. We found that with traditional scientific methods it may be difficult to understand the underlying—often unconscious—dynamics, emotions and resistances within social situations. Using performative approaches opens up the possibility to gain an understanding of the social situation beyond the rational and cognitive level. In particular, the use of creative approaches like painting, role-plays or fairy tales may allow new and alternative perspectives and interpretations of a social situation to emerge. The script concludes with practical implications for action research in the context of organizational consulting and development. URN: urn:nbn:de:0114-fqs0802444

  12. Personal Approaches to Career Planning.

    Science.gov (United States)

    DeMont, Billie; DeMont, Roger

    1983-01-01

    Identifies four approaches to career planning based on situational leadership theory: the network approach, self-help approach, engineering approach, and mentor approach. Guidelines for the selection of a planning method based on the nature of the work environment and personal preference are discussed. (JAC)

  13. Personal Approaches to Career Planning.

    Science.gov (United States)

    DeMont, Billie; DeMont, Roger

    1983-01-01

    Identifies four approaches to career planning based on situational leadership theory: the network approach, self-help approach, engineering approach, and mentor approach. Guidelines for the selection of a planning method based on the nature of the work environment and personal preference are discussed. (JAC)

  14. Approaching a Postcolonial Arctic

    DEFF Research Database (Denmark)

    Jensen, Lars

    2016-01-01

    This article explores different postcolonially configured approaches to the Arctic. It begins by considering the Arctic as a region, an entity, and how the customary political science informed approaches are delimited by their focus on understanding the Arctic as a region at the service of the contemporary neoliberal order. It moves on to explore how different parts of the Arctic are inscribed in a number of sub-Arctic nation-state binds, focusing mainly on Canada and Denmark. The article argues that the postcolonial can be understood as a prism or a methodology that asks pivotal questions to all approaches to the Arctic. Yet the postcolonial itself is characterised by limitations, not least in this context its lack of interest in the Arctic, and its bias towards conventional forms of representation in art. The article points to the need to develop a more integrated critique of colonial and neo…

  15. Life History Approach

    DEFF Research Database (Denmark)

    Olesen, Henning Salling

    2015-01-01

    …as in everyday life. Life histories represent lived lives past, present and anticipated future. As such they are interpretations of individuals' experiences of the way in which societal dynamics take place in the individual body and mind, either by the individual him/herself or by another biographer. The Life History approach developed from interpreting autobiographical, and later certain other forms of language-interactive, material as moments of life history; i.e., it is basically a hermeneutic approach. Talking about a psycho-societal approach indicates the ambition of attacking the dichotomy of the social and the psychic, both in the interpretation procedure and in some main theoretical understandings of language, body and mind. My article will present the reflections on the use of life-history-based methodology in learning and education research as a kind of learning story of research work.

  16. Approaches to Methadone Treatment

    DEFF Research Database (Denmark)

    Järvinen, Margaretha

    2008-01-01

    The paper analyses methadone treatment in Copenhagen, as it is described by methadone users and staff at different outpatient centres. The starting point is a theoretical model distinguishing between two different approaches to methadone treatment: 'palliative' and 'curative'. Included in the model are three dimensions: (1) treatment goals at the methadone centres (abstinence vs. stabilisation); (2) treatment focus (focus on addiction vs. focus on the consequences of addiction); and (3) conceptualisation of methadone (methadone as similar to or different from heroin). The paper shows that there is a discrepancy between the attitudes of the staff and those of the users. While the staff favour an almost clear-cut palliative approach to methadone treatment, defining curative goals as both unrealistic and as belonging to the past, the users prefer an approach that does not exclude the goal of abstinence…

  17. Life History Approach

    DEFF Research Database (Denmark)

    Olesen, Henning Salling

    2015-01-01

    …as in everyday life. Life histories represent lived lives past, present and anticipated future. As such they are interpretations of individuals' experiences of the way in which societal dynamics take place in the individual body and mind, either by the individual him/herself or by another biographer. The Life History approach developed from interpreting autobiographical, and later certain other forms of language-interactive, material as moments of life history; i.e., it is basically a hermeneutic approach. Talking about a psycho-societal approach indicates the ambition of attacking the dichotomy of the social and the psychic, both in the interpretation procedure and in some main theoretical understandings of language, body and mind. My article will present the reflections on the use of life-history-based methodology in learning and education research as a kind of learning story of research work.

  18. Technical approach document

    Energy Technology Data Exchange (ETDEWEB)

    1989-12-01

    The Uranium Mill Tailings Radiation Control Act (UMTRCA) of 1978, Public Law 95-604 (PL95-604), grants the Secretary of Energy the authority and responsibility to perform such actions as are necessary to minimize radiation health hazards and other environmental hazards caused by inactive uranium mill sites. This Technical Approach Document (TAD) describes the general technical approaches and design criteria adopted by the US Department of Energy (DOE) in order to implement remedial action plans (RAPs) and final designs that comply with EPA standards. It does not address the technical approaches necessary for aquifer restoration at processing sites; a guidance document, currently in preparation, will describe aquifer restoration concerns and technical protocols. This document is a second revision of the original document issued in May 1986; the revision has been made in response to changes to the groundwater standards of 40 CFR 192, Subparts A-C, proposed by EPA as draft standards. New sections were added to define the design approaches and designs necessary to comply with the groundwater standards. These new sections are in addition to changes made throughout the document to reflect current procedures, especially in cover design, water resources protection, and alternate site selection; only minor revisions were made to some of the sections. Section 3.0 is a new section defining the approach taken in the design of disposal cells; Section 4.0 has been revised to include design of vegetated covers; Section 8.0 discusses design approaches necessary for compliance with the groundwater standards; and Section 9.0 is a new section dealing with nonradiological hazardous constituents. 203 refs., 18 figs., 26 tabs.

  19. Radiolab - three different approaches

    DEFF Research Database (Denmark)

    Lønstrup, Ansa

    2012-01-01

    …different scholarly approaches to sound studies. The object was selected by Torben Sangild, who was familiar with the chosen context: the signature of the US radio programme and podcast Radiolab. The two other participants did not know the context and chose to analyse the sound object without further… object with a global audience, taken from one of the most popular podcasts worldwide, accessible on the internet. Finally, it is a piece of functional sound design, rather than a work of art, which raises the question of context more clearly. The result is three rather different approaches: 1) a process…

  20. Financial Management: An Organic Approach

    Science.gov (United States)

    Laux, Judy

    2013-01-01

    Although textbooks present corporate finance using a topical approach, good financial management requires an organic approach that integrates the various assignments financial managers confront every day. Breaking the tasks into meaningful subcategories, the current article offers one approach.

  1. Approaches to acceptable risk

    Energy Technology Data Exchange (ETDEWEB)

    Whipple, C.

    1997-04-30

    Several alternative approaches to address the question "How safe is safe enough?" are reviewed and an attempt is made to apply the reasoning behind these approaches to the issue of acceptability of radiation exposures received in space. The approaches to the issue of the acceptability of technological risk described here are primarily analytical, and are drawn from examples in the management of environmental health risks. These include risk-based approaches, in which specific quantitative risk targets determine the acceptability of an activity, and cost-benefit and decision analysis, which generally focus on the estimation and evaluation of risks, benefits and costs, in a framework that balances these factors against each other. These analytical methods tend by their quantitative nature to emphasize the magnitude of risks, costs and alternatives, and to downplay other factors, especially those that are not easily expressed in quantitative terms, that affect acceptance or rejection of risk. Such other factors include the issues of risk perceptions and how and by whom risk decisions are made.

  2. Orion Emergency Mask Approach

    Science.gov (United States)

    Tuan, George C.; Graf, John C.

    2009-01-01

    The emergency mask approach on Orion poses a challenge compared to the traditional Shuttle or Station approaches. Currently, in the case of a fire or toxic spill event, the crew utilizes open-loop oxygen masks that provide the crew with oxygen to breathe, but also dump the exhaled oxygen into the cabin. For Orion, with its small cabin volume, the extra oxygen will exceed the flammability limit within a short period of time unless a nitrogen purge is also provided. Another approach to a fire or toxic spill event is the use of filtering emergency masks. These masks utilize some form of chemical bed to scrub the air clean of toxics, providing the crew with safe breathing air for a period without elevating the oxygen level in the cabin. Using the masks and a form of smoke-eater filter, it may be possible to clean the cabin completely, or to a level that allows a safe transition to a space suit in order to perform a cabin purge. Issues with filters in the past have been reaction time, breakthrough, and high breathing resistance. Developments in a new form of chemical filter have shown promise to make the filtering approach feasible.
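    A back-of-the-envelope mass balance suggests why open-loop masks are problematic in a small cabin. The sketch below uses assumed, illustrative numbers only (the cabin volume, ventilation rates and flammability threshold are not Orion specifications) to estimate how quickly mask-vented oxygen could push the cabin O2 fraction past a materials-flammability limit.

```python
# All numbers are illustrative assumptions, not Orion specifications.
V = 9.0              # cabin free volume, m^3 (assumed)
T = 294.0            # cabin temperature, K
P = 101_325.0        # cabin pressure, Pa
R = 8.314            # gas constant, J/(mol K)
crew = 4
vent = 12.0          # minute ventilation per crew member, L/min (assumed)
metab = 0.5          # metabolic O2 uptake per crew member, L/min (assumed)
f0, f_limit = 0.21, 0.25   # initial and assumed flammability-limit O2 fractions

n_total = P * V / (R * T)          # total moles of cabin gas
n_o2 = f0 * n_total                # initial moles of O2
# Net O2 vented to the cabin by open-loop masks (supply minus metabolic
# uptake), converted from L/min to mol/min at cabin conditions; CO2/O2
# exchange is treated as roughly mole-neutral.
r = crew * (vent - metab) / 1000.0 * P / (R * T)

# O2 fraction f(t) = (n_o2 + r*t) / (n_total + r*t); solve f(t) = f_limit.
t = (f_limit * n_total - n_o2) / (r * (1.0 - f_limit))
print(f"~{t:.0f} minutes until O2 reaches {f_limit:.0%} (toy model)")
```

    With these assumed numbers the limit is reached in roughly ten minutes, which is consistent with the abstract's "short period of time" and with the need for either a nitrogen purge or a closed filtering mask.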

  3. The Knowledge Governance Approach

    DEFF Research Database (Denmark)

    Foss, Nicolai

    2005-01-01

    An attempt is made to characterize a `knowledge governance approach' as a distinctive, emerging field that cuts across the fields of knowledge management, organisation studies, strategy and human resource management. Knowledge governance is taken up with how the deployment of administrative apparatus influences knowledge processes, such as sharing, retaining and creating knowledge. … represents various challenges to more `closed' social science disciplines, notably economics.

  4. Evaluating six soft approaches

    DEFF Research Database (Denmark)

    Sørensen, Lene; Vidal, Rene Victor Valqui

    2006-01-01

    The paper introduces and evaluates six soft approaches used in strategy development and planning. We take a planner’s perspective on discussing the concepts of strategy development and planning. This means that we see strategy development and planning as learning processes based on Ackoff’s inter...

  5. Islamic approach in counseling.

    Science.gov (United States)

    Hanin Hamjah, Salasiah; Mat Akhir, Noor Shakirah

    2014-02-01

    A religious approach is one of the matters emphasized in counseling today. Many researchers find that there is a need to apply the religious element in counseling because religion is important in a client's life. The purpose of this research is to identify aspects of the Islamic approach applied in counseling clients by counselors at Pusat Kaunseling Majlis Agama Islam Negeri Sembilan (PKMAINS). In addition, this research also analyses the Islamic approach applied in counseling at PKMAINS with reference to al-Quran and al-Sunnah. This is qualitative research in the form of a case study at PKMAINS. The main method used in this research is the interview, and the research instrument used is an interview protocol. The respondents in this study include 9 counselors who serve at one of the counseling centers in Malaysia. This study also uses a questionnaire as an additional instrument, distributed to 36 clients who receive counseling services at the center. The findings of the study show that the Islamic approach applied in counseling at PKMAINS may be categorized into three main aspects: aqidah (faith), ibadah (worship/ultimate devotion and love for God) and akhlaq (moral conduct). The findings also show that counseling in these aspects is in line with Islamic teachings as contained in al-Quran and al-Sunnah.

  6. Adopting a Pluricentric Approach

    Science.gov (United States)

    van Kerckvoorde, Colette

    2012-01-01

    This article argues for a "D-A-CH" approach, which stands for Germany (D), Austria (A), and Switzerland (CH), in language classes from the introductory level on. I begin by tracing the emergence and development of distinct Standard Swiss and Austrian German varieties. I then discuss marketing efforts for Swiss and Austrian German, and…

  7. Marxian Approaches to Education.

    Science.gov (United States)

    Carnoy, Martin

    Traditional Marxist approaches to the state relegate superstructural institutions like the school to a minor role in the process of social change. More recent theories like those of Gramsci, Althusser, and Poulantzas raise the state and the class struggle in the state apparatuses to a much more prominent position: superstructure, including the…

  8. Implementation of Communicative Approach

    Science.gov (United States)

    Jabeen, Shazi Shah

    2014-01-01

    In the contemporary age of high professional requirements, such as excellent communicative skills, the need for successful learning of the communicative skills of the English language suggests that communicative ability should be the goal of language teaching. In other words, teaching the English language using the communicative approach becomes essential. Studies to…

  9. Comparing Information Access Approaches.

    Science.gov (United States)

    Chalmers, Matthew

    1999-01-01

    Presents a broad view of information access, drawing from philosophy and semiology in constructing a framework for comparative discussion that is used to examine the information representations that underlie four approaches to information access--information retrieval, workflow, collaborative filtering, and the path model. Contains 32 references.…

  10. The Capability Approach

    NARCIS (Netherlands)

    I.A.M. Robeyns (Ingrid)

    2011-01-01

    In its most general description, the capability approach is a flexible and multi-purpose normative framework, rather than a precise theory of well-being, freedom or justice. At its core are two normative claims: first, the claim that the freedom to achieve well-being is of primary mora…

  11. Ecosystem approach in education

    Science.gov (United States)

    Nabiullin, Iskander

    2017-04-01

    Environmental education is a base for sustainable development. Therefore, in our school we pay great attention to environmental education, which is based on the ecosystem approach. What is an ecosystem approach? The ecosystem is a fundamental concept of ecology. Living organisms and their non-living environments interact with each other as a system, and the planet's biosphere functions as a global ecosystem. Therefore, it is necessary for children to understand relationships in ecosystems, and we have to develop systems thinking in our students. The ecosystem approach and systems thinking should help us to solve global environmental problems. How do we implement the ecosystem approach? Students must understand that our biosphere functions as a single ecosystem and that even small changes can lead to environmental disasters. Even the disappearance of one plant or animal species can lead to irreversible consequences. So in the classroom we learn the importance of each living organism for nature. We pay special attention to endangered species, which are listed in the Red Data List. Kids do projects about these organisms, make videos, and print brochures and newspapers. Fieldwork also plays an important role in the ecosystem approach. Every summer, we go out on expeditions to study species of plants and animals listed in the Red Data List of Tatarstan. In class, students often write essays on behalf of an endangered species of plant or animal; this also helps them to understand the importance of each living organism in nature. Each spring we organise a festival of environmental projects among students. Groups of 4-5 students work on solutions to environmental problems, such as water, air or soil pollution, waste recycling, and the loss of biodiversity. Participants shoot a clip about their project and print brochures. Furthermore, some of the students participate in national and international scientific Olympiads with their projects. In addition to

  12. Domain Approach: An Alternative Approach in Moral Education

    Science.gov (United States)

    Vengadasalam, Chander; Mamat, Wan Hasmah Wan; Mail, Fauziah; Sudramanian, Munimah

    2014-01-01

    This paper discusses the use of the domain approach to moral education in an upper secondary school in Malaysia. Moral Education needs a creative and innovative approach; therefore, several forms of approaches are used in the teaching and learning of Moral Education. This research describes the use of the domain approach, which comprises the moral domain…

  13. Sociocultural approach to textbook

    Directory of Open Access Journals (Sweden)

    Pešić Jelena M.

    2005-01-01

    Full Text Available The aim of this paper is to present an overview of textbook research at the Institute of Psychology, intended to postulate a socio-cultural approach to the textbook. Shifting the textbook from the classical pedagogical-psychological context into the broader and more inspiring cultural context has led to the conceptualization of the textbook as a cultural-supportive system of individual development. We consider, firstly, the theoretical background of this conception, founded in the Vygotskian idea of cultural mediation of development, and then its operationalization through the concept of cultural-supportive tools. The transfer from theory to practice is presented through the most important practical implications, such as defining the genre specificities of the textbook and the principles of educational design of textbooks. As a distinctive issue, we also consider the way this approach to the textbook (theoretical articulation, analytical concepts, and practical implications) contributes to the development of the socio-cultural paradigm in psychology.

  14. A Personal Approach

    Directory of Open Access Journals (Sweden)

    Kellyn Weir

    2015-12-01

    Full Text Available This study explored the process of taking a personal approach to my son's problems with computer games. As a psychology student, I should have been in a good position to explore the paradoxical emotions and this situation of conflict. Yet I was also aware that relating closely to the people we are studying has long been a taboo even in qualitative research. I nevertheless adopted a collaborative methodology in which I balanced the dual roles of parent and researcher. Taking a personal approach, allowing intimate, reciprocal negotiation, I was not only able to put this taboo to the rare empirical test but also achieved an insight that would otherwise not have been available to me. By engaging in dialogue and encouraging the ability to object, a first-person plural (We) position was achieved, in which an understanding of this situation developed and has transferred to our everyday lives.

  15. Rail transport systems approach

    CERN Document Server

    2017-01-01

    This book shows how the systems approach is employed by scientists in various countries to solve specific problems concerning railway transport. In particular, the book describes the experiences of scientists from Romania, Germany, the Czech Republic, the UK, Russia, Ukraine, Lithuania and Poland. For many of these countries there is a problem with the historical differences between the railways. In particular, there are railways with different rail gauges, with different signaling and communication systems, with different energy supplies and, finally, with different political systems, which are reflected in the different approaches to the management of railway economies. The book’s content is divided into two main parts, the first of which provides a systematic analysis of individual means of providing and maintaining rail transport. In turn, the second part addresses infrastructure and management development, with particular attention to security issues. Though primarily written for professionals involved...

  16. Thermodynamics an engineering approach

    CERN Document Server

    Cengel, Yunus A

    2014-01-01

    Thermodynamics, An Engineering Approach, eighth edition, covers the basic principles of thermodynamics while presenting a wealth of real-world engineering examples so students get a feel for how thermodynamics is applied in engineering practice. This text helps students develop an intuitive understanding by emphasizing the physics and physical arguments. Cengel and Boles explore the various facets of thermodynamics through careful explanations of concepts and use of numerous practical examples and figures, having students develop necessary skills to bridge the gap between knowledge and the confidence to properly apply their knowledge. McGraw-Hill is proud to offer Connect with the eighth edition of Cengel/Boles, Thermodynamics, An Engineering Approach. This innovative and powerful new system helps your students learn more efficiently and gives you the ability to assign homework problems simply and easily. Problems are graded automatically, and the results are recorded immediately. Track individual stude...

  17. The Knowledge Governance Approach

    DEFF Research Database (Denmark)

    Foss, Nicolai

    2005-01-01

    An attempt is made to characterize a `knowledge governance approach' as a distinctive, emerging field that cuts across the fields of knowledge management, organisation studies, strategy and human resource management. Knowledge governance is taken up with how the deployment of administrative apparatus influences knowledge processes, such as sharing, retaining and creating knowledge. It insists on clear behavioural foundations, adopts an economizing perspective and examines efficient alignment between knowledge transactions with diverse characteristics and governance structures and mechanisms with diverse capabilities of handling these transactions. Various open research issues that a knowledge governance approach may illuminate are sketched. Although knowledge governance draws clear inspiration from organizational economics and `rational' organization theory, it recognizes that knowledge…

  18. Transaction based approach

    Science.gov (United States)

    Hunka, Frantisek; Matula, Jiri

    2017-07-01

    A transaction-based approach is utilized in some business process modeling methodologies. Essential parts of these transactions are human beings; the notion of an agent or actor role is usually used for them. Using a particular example, the paper describes the possibilities of the Design Engineering Methodology for Organizations (DEMO) and the Resource-Event-Agent (REA) methodology. Whereas the DEMO methodology can be regarded as a generic methodology with its foundation in the theory of Enterprise Ontology, the REA methodology is a domain-specific methodology with its origin in accountancy systems. The result of these approaches is that the DEMO methodology captures everything that happens in reality, with good empirical evidence, whereas the REA methodology captures only changes connected with economic events. Economic events represent either a change of property rights to an economic resource, or the consumption or production of economic resources. This follows from the essence of economic events and their connection to economic resources.
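    As a concrete illustration of the REA side of the comparison, the sketch below models REA's core ontology: resources, economic events, agents, and the duality that pairs give and take events. The class names and the sale example are illustrative choices of mine; neither DEMO nor REA prescribes this particular code.

```python
from dataclasses import dataclass, field
from enum import Enum

class Stockflow(Enum):
    INCREMENT = "increment"   # event increases a resource under our control
    DECREMENT = "decrement"   # event decreases a resource under our control

@dataclass
class Agent:
    name: str

@dataclass
class Resource:
    name: str
    unit: str

@dataclass
class EconomicEvent:
    """A change connected to an economic resource: a transfer of property
    rights, or a consumption/production of the resource."""
    resource: Resource
    quantity: float
    kind: Stockflow
    provider: Agent
    receiver: Agent

@dataclass
class Duality:
    """REA pairs each decrement with the increments that compensate it,
    e.g. goods handed over (decrement) against cash received (increment)."""
    decrements: list = field(default_factory=list)
    increments: list = field(default_factory=list)

# A simple sale transaction between two agents.
company, customer = Agent("Company"), Agent("Customer")
goods, cash = Resource("Widget", "pcs"), Resource("Cash", "EUR")
sale = Duality(
    decrements=[EconomicEvent(goods, 10, Stockflow.DECREMENT, company, customer)],
    increments=[EconomicEvent(cash, 250, Stockflow.INCREMENT, customer, company)],
)
print(len(sale.decrements), len(sale.increments))
```

    The duality pairing is what restricts REA to economic events, in contrast to DEMO's broader ambition of capturing everything that happens in the organisation.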

  19. Breakfast: a multidisciplinary approach

    OpenAIRE

    Affinita, Antonio; Catalani, Loredana; Cecchetto, Giovanna; De Lorenzo, Gianfranco; Dilillo, Dario; Donegani, Giorgio; Fransos, Lucia; Lucidi, Fabio; Mameli, Chiara; Manna, Elisa; Marconi, Paolo; Mele, Giuseppe; Minestroni, Laura; Montanari, Massimo; Morcellini, Mario

    2013-01-01

    Background The role of breakfast as an essential part of a healthy diet has only recently been promoted, even though breakfast practices have been known since the Middle Ages. The growing scientific evidence on this topic is extremely sector-based; nevertheless, breakfast can be regarded from different points of view and from different areas of expertise. This approach, which takes into account history, sociology, anthropology, medicine, psychology and pedagogy, is useful to better understand the value of this…

  20. Towards a Tectonic Approach

    DEFF Research Database (Denmark)

    Hvejsel, Marie Frier; Kirkegaard, Poul Henning; Mortensen, Sophie Bondgaard

    2015-01-01

    Given the increasing environmental and legislative demands to reduce energy consumption, not only new constructions but also the existing urban fabric is about to change radically in the coming decades. Existing buildings cannot simply be restored but must undergo a transformation to comply with … the building envelope as an aesthetic 'gesture', this paper discusses the architectural challenges related to energy renovation in a Danish context and a tectonic design method as an approach to these challenges in everyday practice.

  1. The Branding Management Approaches

    Institute of Scientific and Technical Information of China (English)

    YAKOUBI, Mohamed Lamine

    2014-01-01

    [Abstract] We will try to present, through a display of the various branding concepts and theories, the different branding management approaches. This will present the different visions of the discipline depending on the author, demonstrating their differences at first and their complementarities at last, to help the different branding management practitioners (brand managers, marketing managers, advertisers, media planners, …) apprehend the right brand positioning strategy to engage.

  2. Approaching Service Innovation Patterns

    OpenAIRE

    Nagy, Andrea

    2013-01-01

    The present paper aims at analyzing the types of innovation in the field of services. First, the concept of innovation is defined and second, field literature is reviewed from the perspective of service innovation. The main types of innovation are identified based on several attempts at defining innovation, the most notable being Schumpeter’s. Thus, it is possible to approach concepts such as product and process innovation, incremental and radical innovation. Another aim has been to regard se...

  3. APPROACHES FOR SUSTAINABLE MANUFACTURING

    Institute of Scientific and Technical Information of China (English)

    GÜNTHER Seliger; SEBASTIAN Kernbaum; MARCO Zettl

    2007-01-01

    Sustainable development is a holistic approach harmonizing ecological, economical and socio-political needs with respect to the superior objective of enhancing human living standards. Thereby, the availability of natural resources and the conservation of ecosystems have to be considered so that future generations will have the possibility to meet their own needs. Long-term economic development demands the transition from a source-sink economy to a cycle economy, as a result of limited resources, limited environmental capacities to absorb waste and emissions, and the increasing needs of a growing population. A reference model for sustainability in manufacturing is presented and used to illustrate sustainable approaches with respect to management, technology, process and product. Adaptation of products and components is a vital element for supporting efficient reuse of products and components; consequently, adaptation contributes to the ambitious goals of sustainability. Technological enablers for adaptation, such as modularity and information and communication technology, are introduced by way of example. Moreover, approaches for disseminating knowledge in sustainability are given.

  4. Multivariate Bioclimatic Ecosystem Change Approaches

    Science.gov (United States)

    2015-02-06

    …conclude that an analogous patch did not exist. It must exist somewhere, but some of the other MVA techniques were restricted by the mathematical … found that the Primarily Analogous Multivariate approach developed during this research clearly distinguished itself from the other five approaches in …

  5. [Approaches to radial shaft].

    Science.gov (United States)

    Bartoníček, J; Naňka, O; Tuček, M

    2015-10-01

    In clinical practice, the radial shaft may be exposed via two approaches, namely the posterolateral Thompson and the volar (anterior) Henry approaches. A feared complication of both of them is injury to the deep branch of the radial nerve. No consensus has been reached yet as to which of the two approaches is more beneficial for the proximal half of the radius. According to our anatomical studies and clinical experience, the Thompson approach is safe only in fractures of the middle and distal thirds of the radial shaft, but highly risky in fractures of its proximal third. The Henry approach may be used in any fracture of the radial shaft and provides a safe exposure of the entire lateral and anterior surfaces of the radius. The Henry approach has three phases. In the first phase, the incision is made along the line connecting the biceps brachii tendon and the styloid process of the radius. Care must be taken not to damage the lateral cutaneous nerve of the forearm. In the second phase, the fascia is incised and the brachioradialis is identified by the typical transition from the muscle belly to the tendon and by the shape of the tendon. On the lateral side, the brachioradialis lines the space with the radial artery and veins and the superficial branch of the radial nerve running at its bottom. On the medial side, the space is defined by the pronator teres in the proximal part and the flexor carpi radialis in the distal part. The superficial branch of the radial nerve is retracted together with the brachioradialis laterally, and the radial artery medially. In the third phase, the attachment of the pronator teres is identified by its typical tendon in the middle of the convexity of the lateral surface of the radial shaft. The proximal half of the radius must be exposed very carefully in order not to damage the deep branch of the radial nerve. Dissection starts at the insertion of the pronator teres and proceeds proximally along its lateral border in the interval between this muscle and the insertion of the supinator

  6. Mitochondrial diseases: therapeutic approaches.

    Science.gov (United States)

    DiMauro, Salvatore; Mancuso, Michelangelo

    2007-06-01

    Therapy of mitochondrial encephalomyopathies (defined restrictively as defects of the mitochondrial respiratory chain) is woefully inadequate, despite great progress in our understanding of the molecular bases of these disorders. In this review, we consider sequentially several different therapeutic approaches. Palliative therapy is dictated by good medical practice and includes anticonvulsant medication, control of endocrine dysfunction, and surgical procedures. Removal of noxious metabolites is centered on combating lactic acidosis, but extends to other metabolites. Attempts to bypass blocks in the respiratory chain by administration of electron acceptors have not been successful, but this may be amenable to genetic engineering. Administration of metabolites and cofactors is the mainstay of real-life therapy and is especially important in disorders due to primary deficiencies of specific compounds, such as carnitine or coenzyme Q10. There is increasing interest in the administration of reactive oxygen species scavengers both in primary mitochondrial diseases and in neurodegenerative diseases directly or indirectly related to mitochondrial dysfunction. Aerobic exercise and physical therapy prevent or correct deconditioning and improve exercise tolerance in patients with mitochondrial myopathies due to mitochondrial DNA (mtDNA) mutations. Gene therapy is a challenge because of polyplasmy and heteroplasmy, but interesting experimental approaches are being pursued and include, for example, decreasing the ratio of mutant to wild-type mitochondrial genomes (gene shifting), converting mutated mtDNA genes into normal nuclear DNA genes (allotopic expression), importing cognate genes from other species, or correcting mtDNA mutations with specific restriction endonucleases. Germline therapy raises ethical problems but is being considered for prevention of maternal transmission of mtDNA mutations. Preventive therapy through genetic counseling and prenatal diagnosis is

  7. Peritonitis: laparoscopic approach

    Directory of Open Access Journals (Sweden)

    Agresta Ferdinando

    2006-03-01

    Full Text Available Abstract Background Laparoscopy has become the preferred surgical approach to a number of different diseases because it allows a correct diagnosis and treatment at the same time. In abdominal emergencies, both components of treatment – exploration to identify the causative pathology and performance of an appropriate operation – can often be accomplished via laparoscopy. There is still debate about peritonitis as a contraindication to this kind of approach. The aim of the present work is to illustrate retrospectively the results of a case-control experience of laparoscopic vs. open surgery for abdominal peritonitis emergencies carried out at our institution. Methods Between January 1992 and January 2002 a total of 935 patients (mean age 42.3 ± 17.2 years) underwent emergent and/or urgent surgery. Among them, 602 (64.3%) were operated on laparoscopically (of whom 112, 18.7%, with peritonitis), according to the presence of a surgical team trained in laparoscopy. Patients with a history of malignancy, more than two previous major abdominal surgeries or massive bowel distension were not treated laparoscopically. Peritonitis was not considered a contraindication to laparoscopy. Results The conversion rate was 23.2% in patients with peritonitis and was mainly due to the presence of dense intra-abdominal adhesions. Major complications ranged as high as 5.3%, with a postoperative mortality of 1.7%. A definitive diagnosis was accomplished in 85.7% (96 patients) of cases, and 90.6% (87) of these patients were treated successfully by laparoscopy. Conclusion Even if limited by its retrospective nature, the present experience leads us to consider the laparoscopic approach to abdominal peritonitis emergencies as safe and effective as conventional surgery, with a higher diagnostic yield, less trauma and a more rapid postoperative recovery. Such features make laparoscopy a challenging alternative to open surgery in the management algorithm for abdominal…

  8. URBAN POLITICS: KEY APPROACHES

    Directory of Open Access Journals (Sweden)

    Ledyaeva Ol'ga Mikhaylovna

    2012-10-01

    Full Text Available Several approaches that underlie urban politics are discussed in the paper. They include neo-liberalism, political economy discourse, elitist/pluralist debates, and postmodernism. The neoliberal approach focuses on the limited role of the state and individual responsibility. The legal framework protects both the rights and responsibilities of individuals and regulates the operation of the market. It is the market that fosters individual choices and provides goods and services by virtue of processes which are flexible, efficient and transparent. The political economy approaches (regulation theory, public choice theory, neo-Marxism) explain urban politics via the analysis of national and international economic processes and changes in contemporary capitalism. Changes in national and international economies determine what solutions are possible. The discourse has been influenced by the debate on globalization of capital and labour markets. Modern elitism and neopluralism are represented by theories of "growth machines" and "urban regimes". The former focuses on bargaining alliances between political and business leaders in order to manage the urban system and to promote its growth. The latter develops neopluralist explanations of power within local communities with an emphasis on the fragmented nature of government, where local authorities lack comprehensive governing powers. Postmodernism views the city as the site of the crisis of late capitalism, which leads to segregation of neighbourhoods into prosperous areas and ghettoes. In contrast to the modern city, the postmodern city is not defined by its industrial base; rather, it is determined by its consumerist environment of malls and museums, characterized by revivalist architecture. At the same time, the suburban shopping mall and a motorway network make nonsense of the idea of the city as a unique and well-defined space. These and other approaches encompass a wide spectrum of possibilities.

  9. Approaching Service Innovation Patterns

    Directory of Open Access Journals (Sweden)

    Andrea NAGY

    2013-06-01

    Full Text Available The present paper aims at analyzing the types of innovation in the field of services. First, the concept of innovation is defined and second, field literature is reviewed from the perspective of service innovation. The main types of innovation are identified based on several attempts at defining innovation, the most notable being Schumpeter’s. Thus, it is possible to approach concepts such as product and process innovation, incremental and radical innovation. Another aim has been to regard service innovation as a standout type of innovation.

  10. [Regenerative approach for COPD].

    Science.gov (United States)

    Kubo, Hiroshi

    2011-10-01

    No curative treatment for chronic obstructive pulmonary disease (COPD) is available. Regenerative medicine is one of the promising areas for this intractable disease. Several reagents and growth factors are known to promote lung regeneration in small animal models. However, regenerative medicine for human lungs has not been achieved yet. Recent advances in stem cell biology and tissue engineering have expanded our understanding of lung endogenous stem cells, and this new knowledge provides us with new ideas for future regenerative therapy for lung diseases. Although lungs are the most challenging organ for regenerative medicine, our cumulative knowledge of lung regeneration and of endogenous progenitor cells makes clear the possibilities for a regenerative approach to COPD.

  11. Papilledema. An updated approach.

    Directory of Open Access Journals (Sweden)

    Yaney González Yglesias

    2009-07-01

    Full Text Available The term papilledema refers to the edema of the optic papilla that appears after intracranial hypertension. Some of its most frequent causes are intracranial lesions, hydrocephaly, venous sinus thrombosis, and meningitis. The increase in intracranial tension is related to the findings of the fundus examination, which are classified as incipient, established, chronic, or atrophic, depending on severity and course. It is indispensable to carry out imaging tests before performing a lumbar puncture. This article is an approach to the diagnosis and treatment of this nosological entity.

  12. [Cystic pyeloureteritis. Our approach].

    Science.gov (United States)

    Castillo Jimeno, J M; González de Garibay, A S; Ruiz Rubio, J L; Sebastián Borruel, J L

    1992-05-01

    We report a case of massive cystic pyeloureteritis that had been diagnosed by ureterorenoscopy in a patient with recurrent urinary infection and episodes of nephritic colic. The reports published in the literature indicate there is no specific treatment for this disease whose etiology is unknown. Its pathogenesis has not been well-established and it is difficult to distinguish from other urothelial filling defects. Although it has also been reported that it may progress to malignancy, we believe that the therapeutic approach should be conservative.

  13. Experimental approaches and applications

    CERN Document Server

    Crasemann, Bernd

    1975-01-01

    Atomic Inner-Shell Processes, Volume II: Experimental Approaches and Applications focuses on the physics of atomic inner shells, with emphasis on experimental aspects including the use of radioactive atoms for studies of atomic transition probabilities. Surveys of modern techniques of electron and photon spectrometry are also presented, and selected practical applications of inner-shell processes are outlined. Comprised of six chapters, this volume begins with an overview of the general principles underlying the experimental techniques that make use of radioactive isotopes for inner-shell …

  14. Towards a Dual Approach

    DEFF Research Database (Denmark)

    Holli, Anne Maria; Harder, Mette Marie Stæhr

    2016-01-01

    Drawing on insights from state feminism and legislative studies on parliamentary committees, this article develops a dual approach for the comparative analysis of committees on gender equality. Empirically, it compares the standing committees on gender equality in Denmark and Finland, two Nordic countries acknowledged as forerunners in gender equality, which also have ‘fairly strong’ parliamentary standing committees. The results show that both committees on gender equality can be regarded as ‘feminist’ in character and both interact with relevant civil society organisations. Their impact …

  15. The collaboratory approach

    Energy Technology Data Exchange (ETDEWEB)

    Peskin, A.M.

    1997-04-01

    A "collaboratory" has been defined as a center without walls, in which researchers can perform their work without regard to geographical location. To an increasing degree, engineering design and development is also taking the form of far-flung collaborations among divisions of a plant, subcontractors, university consultants and customers. It has long been recognized that quality engineering education presents the student with an environment that duplicates as much as possible that which the graduate will encounter in industry. To that end, it is important that engineering schools begin to introduce the collaboratory approach in their preparation of students, and even use it in the delivery of subject matter.

  16. An evolutionary approach

    Science.gov (United States)

    Healy, Thomas J.

    1993-04-01

    The paper describes an evolutionary approach to the development of aerospace systems, represented by the introduction of integrated product teams (IPTs), which are now used at Rockwell's Space Systems Division on all new programs and are introduced into existing projects after demonstrated increases in quality and reductions in cost and schedule due to IPTs. Each IPT is unique, reflects its own program, and lasts for the life of the program. An IPT includes customers, suppliers, subcontractors, and associate contractors, and has a charter, mission, scope of authority, budget, and schedule. Functional management is responsible for staffing, training, method development, and generic technology development.

  17. Learning Mixtures of Polynomials of Conditional Densities from Data

    DEFF Research Database (Denmark)

    López-Cruz, Pedro L.; Nielsen, Thomas Dyhre; Bielza, Concha

    2013-01-01

    Mixtures of polynomials (MoPs) are a non-parametric density estimation technique for hybrid Bayesian networks with continuous and discrete variables. We propose two methods for learning MoP approximations of conditional densities from data. Both approaches are based on learning MoP approximations … We compare these methods with the approach for learning mixtures of truncated basis functions from data.
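
    To give a concrete feel for the technique described above (a sketch of the general MoP idea, not of the learning algorithms the record proposes), the following approximates a univariate density by fitting a separate low-order polynomial on each piece of a partition of the support; the data, partition, and polynomial degree are illustrative assumptions.

```python
# Sketch of the mixture-of-polynomials idea for one continuous variable:
# a piecewise polynomial approximation of a density, fit here by simple
# least squares to a histogram (not the paper's learning method).
import numpy as np

rng = np.random.default_rng(2)
data = rng.normal(loc=0.0, scale=1.0, size=5000)

edges = np.linspace(-4, 4, 5)  # partition the support into 4 pieces
counts, bin_edges = np.histogram(data, bins=40, range=(-4, 4), density=True)
centers = 0.5 * (bin_edges[:-1] + bin_edges[1:])

pieces = []
for lo, hi in zip(edges[:-1], edges[1:]):
    mask = (centers >= lo) & (centers < hi)
    coef = np.polyfit(centers[mask], counts[mask], deg=3)  # one cubic per piece
    pieces.append((lo, hi, coef))

def mop_pdf(x):
    """Evaluate the piecewise-polynomial density approximation at x."""
    for lo, hi, coef in pieces:
        if lo <= x < hi:
            return max(np.polyval(coef, x), 0.0)  # clip negative values
    return 0.0

print(mop_pdf(0.0))  # close to the standard normal peak, ~0.399
```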

  18. [What really distinguishes hard and soft structural equation models? An attempt to clarify the LISREL-PLS question]

    DEFF Research Database (Denmark)

    Scholderer, Joachim; Balderjahn, Ingo

    2006-01-01

    Summary: Structural equation modeling has developed into a standard application in marketing. However, researchers are divided into two camps: proponents of a "hard", parametric approach exemplified by LISREL, and proponents of a "soft", non-parametric approach exemplified by PLS. Hardly any up-to-date...

  19. Browse Title Index

    African Journals Online (AJOL)

    Items 351 - 400 of 1389 ... Vol 14, No 2 (2014), Dental approach to erosive tooth wear in ... of Nigerian Composite Lifestyle CVD risk factors questionnaire for adolescents ... Vol 16, No 1 (2016), Diagnostic value of diffusion weighted MRI and ... years: parametric and non-parametric survival analysis approach.

  20. THE NONSTATIONARITY OF GERMAN AGGREGATE OUTPUT

    NARCIS (Netherlands)

    ZELHORST, D; DEHAAN, J

    1993-01-01

    In this paper the stationarity of German per capita output for the period 1870-1989 is examined, using both a parametric and a non-parametric approach. The standard unit root tests and the scaled variogram suggest that German output is not stationary. Following the approach suggested by Perron (1989
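
    As a concrete illustration of the parametric half of such an analysis, the sketch below runs an augmented Dickey-Fuller unit root test (statsmodels is assumed to be available) on a synthetic random walk with drift standing in for the per-capita output series; the actual German data are not reproduced here.

```python
# Sketch: augmented Dickey-Fuller test on a synthetic non-stationary series.
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(0)
y = np.cumsum(0.02 + rng.normal(size=120))  # random walk with drift

# regression="ct" includes a constant and a linear trend, as is common for
# output series; the test's null hypothesis is the presence of a unit root.
stat, pvalue, usedlag, nobs, crit, icbest = adfuller(y, regression="ct")
print(f"ADF statistic: {stat:.3f}, p-value: {pvalue:.3f}")
# A large p-value means the unit-root null cannot be rejected, i.e. the
# series looks non-stationary, as the paper reports for German output.
```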

  1. Making literature reviews more reliable through application of lessons from systematic reviews.

    Science.gov (United States)

    Haddaway, N R; Woodcock, P; Macura, B; Collins, A

    2015-12-01

    Review articles can provide valuable summaries of the ever-increasing volume of primary research in conservation biology. Where findings may influence important resource-allocation decisions in policy or practice, there is a need for a high degree of reliability when reviewing evidence. However, traditional literature reviews are susceptible to a number of biases during the identification, selection, and synthesis of included studies (e.g., publication bias, selection bias, and vote counting). Systematic reviews, pioneered in medicine and translated into conservation in 2006, address these issues through a strict methodology that aims to maximize transparency, objectivity, and repeatability. Systematic reviews will always be the gold standard for reliable synthesis of evidence. However, traditional literature reviews remain popular and will continue to be valuable where systematic reviews are not feasible. Where traditional reviews are used, lessons can be taken from systematic reviews and applied to traditional reviews in order to increase their reliability. Key aspects of systematic review methods that can be used in a context-specific manner in traditional reviews include mitigating bias; increasing transparency, consistency, and objectivity; critically appraising the evidence; and avoiding vote counting. In situations where conducting a full systematic review is not feasible, the proposed approach to reviewing evidence in a more systematic way can substantially improve the reliability of review findings, providing a time- and resource-efficient means of maximizing the value of traditional reviews. These methods are aimed particularly at those conducting literature reviews where systematic review is not feasible, for example, for graduate students, single reviewers, or small organizations. © 2015 Society for Conservation Biology.

  2. Oral rinse as a simpler approach to exfoliative cytology: a comparative study.

    Science.gov (United States)

    Mulki, Shaila; Shetty, Pushparaj; Pai, Prakash

    2013-12-01

    Oral rinse is a novel method that can be used to detect dysplasia in potentially malignant disorders and malignant oral lesions in resource-challenged areas. A study was undertaken to compare the quality of normal smears prepared with an oral rinse with that of smears taken with a wooden tongue spatula. One hundred and five normal subjects were selected for the study. Two smears were prepared from clinically normal mucosa using an oral rinse, and a further two smears were scraped from clinically normal buccal mucosa using a wooden spatula. The smears were graded for cell yield, dispersion and cellular clarity on a three-point scale by two observers. The results were analyzed using the Mann-Whitney non-parametric test. The oral rinse was found to be significantly more efficient than the wooden spatula in terms of cell yield (p …) and may thus serve as a simpler approach to the exfoliative cytology of normal oral mucosa.
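
    For readers unfamiliar with the test used here, the following is a minimal sketch of a Mann-Whitney comparison on hypothetical three-point quality grades (scipy assumed); the numbers are invented and are not the study's data.

```python
# Sketch: Mann-Whitney U test on hypothetical ordinal quality grades
# (1 = poor, 2 = fair, 3 = good) for the two smear-collection methods.
from scipy.stats import mannwhitneyu

oral_rinse     = [3, 3, 2, 3, 3, 2, 3, 3, 3, 2]  # hypothetical grades
wooden_spatula = [2, 1, 2, 2, 3, 1, 2, 2, 1, 2]  # hypothetical grades

stat, p = mannwhitneyu(oral_rinse, wooden_spatula, alternative="two-sided")
print(f"U = {stat}, p = {p:.4f}")  # a small p suggests the methods differ
```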

  3. Quantitative assessment of drivers of recent climate variability: An information theoretic approach

    CERN Document Server

    Bhaskar, Ankush; Vichare, Geeta; Koganti, Triven; Gurubaran, S

    2016-01-01

    Identification and quantification of possible drivers of recent climate variability remain a challenging task. This important issue is addressed by adopting a non-parametric information-theoretic technique, the Transfer Entropy and its normalized variant. It distinctly quantifies the actual information exchanged, along with the directional flow of information, between any two variables, with no bearing on their common history or inputs, unlike correlation, mutual information, etc. Measurements of greenhouse gases (CO2, CH4, and N2O); volcanic aerosols; solar activity (UV radiation, total solar irradiance (TSI) and cosmic ray flux (CR)); the El Niño Southern Oscillation (ENSO); and the Global Mean Temperature Anomaly (GMTA) made during 1984-2005 are utilized to distinguish driving and responding climate signals. Estimates of their relative contributions reveal that CO2 (~24%), CH4 (~19%) and volcanic aerosols (~23%) are the primary contributors to the observed variations in GMTA, while UV (~9%) and ENSO (~12%) act as secondary drivers...
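
    A minimal sketch of the quantity involved, assuming a simple plug-in histogram estimator with lag-1 histories (the paper's exact estimator, binning, and normalization may differ):

```python
# Sketch: histogram-based transfer entropy TE(X -> Y) = I(Y_{t+1}; X_t | Y_t),
# the kind of quantity used above to rank climate drivers.
import numpy as np

def transfer_entropy(x, y, bins=8):
    """Plug-in estimate of TE(X -> Y) in bits, with lag-1 histories."""
    x_t, y_t, y_next = x[:-1], y[:-1], y[1:]
    def disc(v):  # discretize into equal-width bins labelled 0..bins-1
        edges = np.histogram_bin_edges(v, bins=bins)
        return np.clip(np.digitize(v, edges[1:-1]), 0, bins - 1)
    xd, yd, ynd = disc(x_t), disc(y_t), disc(y_next)
    joint = np.zeros((bins, bins, bins))  # p(y_next, y_t, x_t)
    for a, b, c in zip(ynd, yd, xd):
        joint[a, b, c] += 1
    joint /= joint.sum()
    p_yy = joint.sum(axis=2)        # p(y_next, y_t)
    p_yx = joint.sum(axis=0)        # p(y_t, x_t)
    p_y = joint.sum(axis=(0, 2))    # p(y_t)
    te = 0.0
    for a in range(bins):
        for b in range(bins):
            for c in range(bins):
                p = joint[a, b, c]
                if p > 0 and p_yy[a, b] > 0 and p_yx[b, c] > 0:
                    te += p * np.log2(p * p_y[b] / (p_yy[a, b] * p_yx[b, c]))
    return te

rng = np.random.default_rng(1)
x = rng.normal(size=2000)
y = np.roll(x, 1) + 0.5 * rng.normal(size=2000)  # y is driven by lagged x
# TE(x -> y) should clearly exceed TE(y -> x):
print(transfer_entropy(x, y), transfer_entropy(y, x))
```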

  4. Interstage Flammability Analysis Approach

    Science.gov (United States)

    Little, Jeffrey K.; Eppard, William M.

    2011-01-01

    The Interstage of the Ares I launch platform houses several key components which are on standby during First Stage operation: the Reaction Control System (ReCS), the Upper Stage (US) Thrust Vector Control (TVC) and the J-2X with the Main Propulsion System (MPS) propellant feed system. Therefore, potentially dangerous leaks of propellants could develop. The Interstage leaks analysis addresses the concerns of localized mixing of hydrogen and oxygen gases to produce deflagration zones in the Interstage of the Ares I launch vehicle during First Stage operation. This report details the approach taken to accomplish the analysis. Specified leakage profiles and actual flammability results are not presented due to proprietary and security restrictions. The interior volume formed by the Interstage walls, bounding interfaces with the Upper and First Stages, and surrounding the J-2X engine was modeled using Loci-CHEM to assess the potential for flammable gas mixtures to develop during First Stage operations. The transient analysis included a derived flammability indicator based on mixture ratios to maintain achievable simulation times. Validation of results was based on a comparison to Interstage pressure profiles outlined in prior NASA studies. The approach proved useful in bounding the flammability risk in support of program hazard reviews.
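
    A hedged sketch of what a mixture-ratio flammability indicator of this general kind might look like; the limits below are the commonly quoted hydrogen flammability limits in air, not the report's values (which are not public), and the field data are random placeholders.

```python
# Illustrative sketch only: flag grid cells whose local hydrogen mole
# fraction falls inside assumed flammability limits. LFL/UFL below are
# the commonly quoted H2 limits in air (~4% and ~75% by volume); the
# report's actual indicator, limits, and CFD data are not public.
import numpy as np

LFL, UFL = 0.04, 0.75  # assumed lower/upper flammability limits (mole fraction)

def flammable_fraction(x_h2):
    """Fraction of cells whose H2 mole fraction lies within the limits."""
    return ((x_h2 >= LFL) & (x_h2 <= UFL)).mean()

# Placeholder 3-D field standing in for a CFD snapshot of the volume.
field = np.random.default_rng(5).uniform(0.0, 0.2, size=(32, 32, 32))
print(f"{flammable_fraction(field):.1%} of cells are in the flammable band")
```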

  5. Approaches for Stereo Matching

    Directory of Open Access Journals (Sweden)

    Takouhi Ozanian

    1995-04-01

    Full Text Available This review focuses on the last decade's development of computational stereopsis for recovering three-dimensional information. The main components of stereo analysis are exposed: image acquisition and camera modeling, feature selection, feature matching and disparity interpretation. A brief survey is given of the well-known feature selection approaches, and the estimation parameters for this selection are mentioned. The difficulties in identifying corresponding locations in the two images are explained. Methods for effectively constraining the search for the correct solution of the correspondence problem are discussed, as are strategies for the whole matching process. Reasons for the occurrence of matching errors are considered. Some recently proposed approaches, employing new ideas in the modeling of stereo matching in terms of energy minimization, are described. Acknowledging the importance of computation time for real-time applications, special attention is paid to parallelism as a way to achieve the required level of performance. The development of trinocular stereo analysis as an alternative to the conventional binocular one is described. Finally, a classification based on the test images used for verification of stereo matching algorithms is supplied.
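
    As an illustration of the simplest area-based formulation of the matching stage discussed above, the sketch below performs sum-of-squared-differences block matching along horizontal (epipolar) scanlines of a rectified pair; window size, disparity range, and the synthetic test images are illustrative assumptions.

```python
# Sketch: naive SSD block matching for a rectified grayscale stereo pair.
# Real systems add the constraints and energy-minimization formulations
# surveyed in the review above.
import numpy as np

def ssd_disparity(left, right, max_disp=16, win=5):
    """Return an integer disparity map for two rectified H x W images."""
    h, w = left.shape
    half = win // 2
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(half, h - half):
        for x in range(half + max_disp, w - half):
            patch = left[y-half:y+half+1, x-half:x+half+1]
            best, best_d = np.inf, 0
            for d in range(max_disp):  # candidate shift along the scanline
                cand = right[y-half:y+half+1, x-d-half:x-d+half+1]
                cost = np.sum((patch - cand) ** 2)  # SSD matching cost
                if cost < best:
                    best, best_d = cost, d
            disp[y, x] = best_d
    return disp

# Tiny synthetic test: the right image is the left image shifted by 3 px.
rng = np.random.default_rng(3)
left = rng.random((40, 60))
right = np.roll(left, -3, axis=1)
print(np.bincount(ssd_disparity(left, right).ravel()).argmax())  # expect 3
```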

  6. Diagnostic approaches for cholangiocarcinoma

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Cholangiocarcinomas arise from the epithelial cells of the bile ducts and are associated with a poor prognosis. Despite new diagnostic approaches, the definite diagnosis of this malignancy continues to be challenging. Cholangiocarcinomas often grow longitudinally along the bile duct rather than in a radial direction. Thus, large tumor masses are frequently absent and imaging techniques, including ultrasound, CT, and MRI, have only limited sensitivity. Tissue collection during endoscopic (ERCP) and/or percutaneous transhepatic (PTC) procedures is usually used to confirm a definitive diagnosis of cholangiocarcinoma. However, forceps biopsy and brush cytology provide positive results for malignancy in only about 50% of patients. Percutaneous and peroral cholangioscopy using fiber-optic techniques were therefore developed for direct visualization of the biliary tree, yielding additional information about endoscopic appearance and tumor extension, as well as guided biopsy acquisition. Finally, endoscopic ultrasonography (EUS) complements endoscopic and percutaneous approaches and may provide a tissue diagnosis of tumors in the biliary region through fine-needle aspiration. In the future, new techniques allowing for early detection, including molecular markers, should be developed to improve diagnostic sensitivity for this increasingly common tumor entity.

  7. Modular Approach to Spintronics.

    Science.gov (United States)

    Camsari, Kerem Yunus; Ganguly, Samiran; Datta, Supriyo

    2015-06-11

    There has been enormous progress in the last two decades, effectively combining spintronics and magnetics into a powerful force that is shaping the field of memory devices. New materials and phenomena continue to be discovered at an impressive rate, providing an ever-increasing set of building blocks that could be exploited in designing transistor-like functional devices of the future. The objective of this paper is to provide a quantitative foundation for this building block approach, so that new discoveries can be integrated into functional device concepts, quickly analyzed and critically evaluated. Through careful benchmarking against available theory and experiment we establish a set of elemental modules representing diverse materials and phenomena. These elemental modules can be integrated seamlessly to model composite devices involving both spintronic and nanomagnetic phenomena. We envision the library of modules to evolve both by incorporating new modules and by improving existing modules as the field progresses. The primary contribution of this paper is to establish the ground rules or protocols for a modular approach that can build a lasting bridge between materials scientists and circuit designers in the field of spintronics and nanomagnetics.

  8. Advanced intelligence and mechanism approach

    Institute of Scientific and Technical Information of China (English)

    ZHONG Yixin

    2007-01-01

    Advanced intelligence will characterize intelligence research in the next 50 years. An understanding of the concept of advanced intelligence, as well as its importance, is provided first, followed by a detailed analysis of an approach, the mechanism approach, suitable for advanced intelligence research. The mutual relationship among the mechanism approach, the traditional approaches in artificial intelligence research, and cognitive informatics is then discussed. It is interesting to discover that the mechanism approach is well suited to advanced intelligence research and constitutes a unified form of the existing approaches to artificial intelligence.

  9. The Point Approach and the Phrase Approach to Vocabulary Learning

    Institute of Scientific and Technical Information of China (English)

    刘梦媛

    2013-01-01

    As is known to all, vocabulary acquisition plays an essential role in English learning. However, it is considered very difficult by many Chinese learners. Because so many kinds of approaches exist, English learners often do not know which one is suitable and more effective. To address this problem, this paper analyzes two approaches: the point approach and the phrase approach.

  10. Using Spline Regression in Semi-Parametric Stochastic Frontier Analysis: An Application to Polish Dairy Farms

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    The estimation of technical efficiency comprises a vast literature in the field of applied production economics. There are two predominant approaches: the non-parametric and non-stochastic Data Envelopment Analysis (DEA) and the parametric Stochastic Frontier Analysis (SFA). The DEA … of specifying an unsuitable functional form and, thus, model misspecification and biased parameter estimates. Given these problems of the DEA and the SFA, Fan, Li and Weersink (1996) proposed a semi-parametric stochastic frontier model that estimates the production function (frontier) by non-parametric … , Kumbhakar et al. (2007), and Henningsen and Kumbhakar (2009). The aim of this paper, and its main contribution to the existing literature, is the estimation of semi-parametric stochastic frontier models using a different non-parametric estimation technique: spline regression (Ma et al. 2011). We apply …
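
    A minimal sketch of the first, non-parametric stage of such a semi-parametric frontier model, using a smoothing spline (scipy) for the conditional mean of output given the input; the second stage, decomposing the residuals into two-sided noise and one-sided inefficiency and shifting the curve up to a frontier, is omitted, and all data and names are synthetic.

```python
# Sketch: non-parametric first stage of a Fan-Li-Weersink style
# semi-parametric stochastic frontier, via a smoothing spline.
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(4)
x = np.sort(rng.uniform(0, 10, 200))        # single input (illustrative)
noise = rng.normal(0, 0.1, 200)             # two-sided noise v
ineff = np.abs(rng.normal(0, 0.2, 200))     # one-sided inefficiency u >= 0
y = np.log1p(x) + noise - ineff             # output = frontier + v - u

spline = UnivariateSpline(x, y, k=3, s=2.0)  # penalized cubic spline fit
residuals = y - spline(x)
# Negative skewness in the residuals hints at one-sided inefficiency,
# which the (omitted) second stage would quantify.
print("mean residual:", residuals.mean().round(3),
      "skewness:", ((residuals - residuals.mean()) ** 3).mean()
                   / residuals.std() ** 3)
```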

  11. Bioengineering a conceptual approach

    CERN Document Server

    Pavlovic, Mirjana

    2015-01-01

    This book explores critical principles and new concepts in bioengineering, integrating the biological, physical and chemical laws and principles that provide a foundation for the field. Both biological and engineering perspectives are included, with key topics such as the physical-chemical properties of cells, tissues and organs; principles of molecules; composition and interplay in physiological scenarios; and the complex physiological functions of heart, neuronal cells, muscle cells and tissues. Chapters evaluate the emerging fields of nanotechnology, drug delivery concepts, biomaterials, and regenerative therapy. The leading individuals and events are introduced along with their critical research. Bioengineering: A Conceptual Approach is a valuable resource for professionals or researchers interested in understanding the central elements of bioengineering. Advanced-level students in biomedical engineering and computer science will also find this book valuable as a secondary textbook or reference.

  12. Microscopic approach to polaritons

    DEFF Research Database (Denmark)

    Skettrup, Torben

    1981-01-01

    … contrary to experimental experience. In order to remove this absurdity, the semiclassical approach must be abandoned and the electromagnetic field quantized. A simple microscopic polariton model is then derived. From this, the wave function for the interacting exciton-photon complex is obtained … of light of the crystal. The introduction of damping smears out the excitonic spectra. The wave function of the polariton, however, turns out to be largely independent of damping up to large damping values. Finally, this simplified microscopic polariton model is compared with the exact solutions obtained … for the macroscopic polariton model by Hopfield. It is seen that standing photon and exciton waves must be included in an exact microscopic polariton model. However, it is concluded that for practical purposes only the propagating waves are of importance, and the simple microscopic polariton wave function derived …

  13. Integration a functional approach

    CERN Document Server

    Bichteler, Klaus

    1998-01-01

    This book covers Lebesgue integration and its generalizations from Daniell's point of view, modified by the use of seminorms. Integrating functions rather than measuring sets is posited as the main purpose of measure theory. From this point of view Lebesgue's integral can be had as a rather straightforward, even simplistic, extension of Riemann's integral; and its aims, definitions, and procedures can be motivated at an elementary level. The notion of measurability, for example, is suggested by Littlewood's observations rather than being conveyed authoritatively through definitions of (sigma)-algebras and good-cut-conditions, the latter of which are hard to justify and thus appear mysterious, even nettlesome, to the beginner. The approach taken provides the additional benefit of cutting the labor in half. The use of seminorms, ubiquitous in modern analysis, speeds things up even further. The book is intended for the reader who has some experience with proofs, a beginning graduate student for example. It might...

  14. Radiolab - three different approaches

    DEFF Research Database (Denmark)

    Lønstrup, Ansa

    2012-01-01

    Radiolab – three different approaches. The three papers in this ‘suite’ have a special background and context. At the 2010 conference SoundActs in Aarhus, the three panellists were each given the task of providing a paper with an analysis of the same sound object, thus exhibiting and contrasting … contextual investigation. This object was chosen for several reasons. First of all, it is brief (less than 17 seconds), which meant that it was possible to make a detailed analysis; at the same time, though, it is relatively complex, which means that it can accommodate three different analyses. It is a sound … , who methodologically operates within three levels of investigation: 1) the syntax, 2) the semantics and 3) the ontology level. Accordingly, this analysis is conducted as if the sound object were performed by a vocal ensemble oscillating ‘between a musical and a speech act’. Torben Sangild’s paper …

  15. Cognitive approaches to emotions.

    Science.gov (United States)

    Oatley, Keith; Johnson-Laird, P N

    2014-03-01

    Cognitive approaches offer clear links between how emotions are thought about in everyday life and how they are investigated psychologically. Cognitive researchers have focused on how emotions are caused when events or other people affect concerns and on how emotions influence processes such as reasoning, memory, and attention. Three representative cognitive theories of emotion continue to develop productively: the action-readiness theory, the core-affect theory, and the communicative theory. Some principles are common to them and divergences can be resolved by future research. Recent explanations have included how emotions structure social relationships, how they function in psychological illnesses, and how they are central to music and fiction. Copyright © 2013 Elsevier Ltd. All rights reserved.

  16. Therapeutic approaches for shankopathies.

    Science.gov (United States)

    Wang, Xiaoming; Bey, Alexandra L; Chung, Leeyup; Krystal, Andrew D; Jiang, Yong-Hui

    2014-02-01

    Despite recent advances in understanding the molecular mechanisms of autism spectrum disorders (ASD), the current treatments for these disorders are mostly focused on behavioral and educational approaches. The considerable clinical and molecular heterogeneity of ASD presents a significant challenge to the development of an effective treatment targeting underlying molecular defects. Deficiency of SHANK family genes causing ASD represents an exciting opportunity for developing molecular therapies because of the strong genetic evidence for SHANK genes as causative in ASD and the availability of a panel of Shank mutant mouse models. In this article, we review the literature suggesting the potential for developing therapies based on molecular characteristics and discuss several exciting themes that are emerging from studying Shank mutant mice at the molecular level and in terms of synaptic function.

  17. The Indirect Approach

    Directory of Open Access Journals (Sweden)

    Geir H Moshuus

    2016-07-01

    Full Text Available How do we do good guesswork at meaning if our informant lives in a secret world? Doing research often includes awkward moments, unforeseen events, and incidents. Here we name some of these “happenstances.” We suggest that happenstances may offer a solution to the problem of meaning discrepancies: The happenstance is one of those moments that allow the researcher to temporarily bridge into the meanings of his or her informant. We have carried out research on marginal youth. In both of our studies, happenstances have turned interview situations upside down. Here we identify how these unforeseen events provided us with valuable insights into our informants’ contexts. We conclude by addressing how these happenstances, though they appear to be a product of pure accident, may become part of a systematic approach in discovering contextual knowledge.

  18. Editorial: Approaching 125.

    Science.gov (United States)

    Goodman, Sherryl

    2012-02-01

    With this issue, beginning Volume 121, the editorial team shifts from the strong leadership of David Watson to a team under my direction. Approaching 125 years of publication, the Journal of Abnormal Psychology has earned its place as the preeminent outlet for research in psychopathology. With gratitude to the newly assembled team of associate editors (AEs), consulting editors, and ad hoc reviewers, I look forward to guiding the journal through this next term. Nine well-respected scholars have agreed to serve as AEs: Timothy Brown, Laurie Chassin, Jeff Epstein, Jutta Joormann, Pamela Keel, Kate Keenan, Scott Lilienfeld, Angus MacDonald, and Michael Young. The new team is dedicated to working tirelessly to maintain and enhance the journal's esteemed tradition of excellence. Given the well-established strengths of the journal, I will not suggest any fundamental changes.

  19. Laparoscopic approach to hysterectomy

    Directory of Open Access Journals (Sweden)

    Hakan Nazik

    2013-04-01

    Full Text Available Modern laparoscopic surgery is widely used throughout the world as it offers greater advantages than open procedures. The laparoscopic approach to hysterectomy has evolved over the last 20 years. Hysterectomies are performed abdominally, vaginally, laparoscopically or, more recently, with robotic assistance. Indications for a total laparoscopic hysterectomy are similar to those for total abdominal hysterectomy, and most commonly include uterine leiomyomata, pelvic organ prolapse, and abnormal uterine bleeding. When hysterectomy is going to be performed, the surgeon should decide which method is safer and more cost-effective. This paper aims to make a review of the indications, techniques and advantages of laparoscopic hysterectomy as well as the criteria to be used for appropriate patient selection.

  20. [Orthopedic approach to asymmetry].

    Science.gov (United States)

    Bardinet, Etienne; Duhart, Anne-Marie

    2002-06-01

    Asymmetry being a clinical manifestation of various pathologies, the orthopedic attitude varies greatly from one practitioner to another. Thanks to better knowledge of the etiopathogenesis, the orthopedic approach proves very effective in some particular cases, either alone or together with other therapeutics. In mandibular laterodeviations, most often secondary to a maxillary contraction, the best treatment is to expand the maxilla, which allows a mandibular centric repositioning. This therapy is often carried out early to limit the asymmetric expression of growth and normalize dental eruption. In unilateral condylar hypoplasias of variable extent, from a simple defect in condylar growth to a hemifacial microsomia, the therapeutic attitude has evolved considerably. A surgical-orthodontic protocol can now integrate an increasingly significant orthopedic phase. Some authors show that a surgical case may be treated with orthopedics alone. The devices used are of the activator or hyperpropulsor type. In unilateral condylar hyperplasias, orthopedic therapy must be considered with reservations.