WorldWideScience

Sample records for non-parametric vote-counting approach

  1. Vote Counting as Mathematical Proof

    DEFF Research Database (Denmark)

    Schürmann, Carsten; Pattinson, Dirk

    2015-01-01

    then consists of a sequence (or tree) of rule applications and provides an independently checkable certificate of the validity of the result. This reduces the need to trust, or otherwise verify, the correctness of the vote counting software once the certificate has been validated. Using a rule...

  2. Debt and growth: A non-parametric approach

    Science.gov (United States)

    Brida, Juan Gabriel; Gómez, David Matesanz; Seijas, Maria Nela

    2017-11-01

In this study, we explore the dynamic relationship between public debt and economic growth by using a non-parametric approach based on data symbolization and clustering methods. The study uses annual data of the general government consolidated gross debt-to-GDP ratio and gross domestic product for sixteen countries between 1977 and 2015. Using symbolic sequences, we introduce a notion of distance between the dynamical paths of different countries. Then, a Minimal Spanning Tree and a Hierarchical Tree are constructed from the time series to help detect groups of countries sharing similar economic performance. The main finding of the study appears for the period 2008-2016, when several countries surpassed the 90% debt-to-GDP threshold. During this period, three groups (clubs) of countries are obtained: high, mid and low indebted countries, suggesting that the employed debt-to-GDP threshold drives economic dynamics for the selected countries.
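The tree-building step described in this record can be sketched with SciPy's minimum spanning tree applied to a pairwise distance matrix. The matrix below is purely hypothetical, standing in for the paper's symbolic distances between countries' debt/GDP paths:

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

# Hypothetical symmetric distance matrix between 4 "countries";
# the paper's actual distance is computed from symbolized time series.
D = np.array([
    [0.0, 0.2, 0.9, 0.8],
    [0.2, 0.0, 0.7, 0.9],
    [0.9, 0.7, 0.0, 0.1],
    [0.8, 0.9, 0.1, 0.0],
])

mst = minimum_spanning_tree(D).toarray()
n = D.shape[0]
# nonzero entries of the result are the tree's edges and their weights
edges = [(i, j, mst[i, j]) for i in range(n) for j in range(n) if mst[i, j] > 0]
print(edges)
```

Countries 0-1 and 2-3 end up linked by cheap edges, so the tree visually separates the two "clubs"; a hierarchical tree would then be obtained from the same distances with agglomerative clustering.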

  3. A non-parametric Bayesian approach to decompounding from high frequency data

    NARCIS (Netherlands)

    Gugushvili, Shota; van der Meulen, F.H.; Spreij, Peter

    2016-01-01

    Given a sample from a discretely observed compound Poisson process, we consider non-parametric estimation of the density f0 of its jump sizes, as well as of its intensity λ0. We take a Bayesian approach to the problem and specify the prior on f0 as the Dirichlet location mixture of normal densities.

  4. Non-parametric Tuning of PID Controllers A Modified Relay-Feedback-Test Approach

    CERN Document Server

    Boiko, Igor

    2013-01-01

The relay feedback test (RFT) has become a popular and efficient tool used in process identification and automatic controller tuning. Non-parametric Tuning of PID Controllers couples new modifications of the classical RFT with application-specific optimal tuning rules to form a non-parametric method of test-and-tuning. Test and tuning are coordinated through a set of common parameters so that a PID controller can obtain the desired gain or phase margins in a system exactly, even with unknown process dynamics. The concept of process-specific optimal tuning rules in the non-parametric setup, with corresponding tuning rules for flow, level, pressure, and temperature control loops, is presented in the text. Common problems of tuning accuracy based on parametric and non-parametric approaches are addressed. In addition, the text treats the parametric approach to tuning based on the modified RFT approach and the exact model of oscillations in the system under test using the locus of a perturbed relay system (LPRS) meth...

  5. A multitemporal and non-parametric approach for assessing the impacts of drought on vegetation greenness

    DEFF Research Database (Denmark)

    Carrao, Hugo; Sepulcre, Guadalupe; Horion, Stéphanie Marie Anne F

    2013-01-01

    This study evaluates the relationship between the frequency and duration of meteorological droughts and the subsequent temporal changes on the quantity of actively photosynthesizing biomass (greenness) estimated from satellite imagery on rainfed croplands in Latin America. An innovative non-parametric...... and non-supervised approach, based on the Fisher-Jenks optimal classification algorithm, is used to identify multi-scale meteorological droughts on the basis of empirical cumulative distributions of 1, 3, 6, and 12-monthly precipitation totals. As input data for the classifier, we use the gridded GPCC...... for the period between 1998 and 2010. The time-series analysis of vegetation greenness is performed during the growing season with a non-parametric method, namely the seasonal Relative Greenness (RG) of spatially accumulated fAPAR. The Global Land Cover map of 2000 and the GlobCover maps of 2005/2006 and 2009...

  6. Comparative Study of Parametric and Non-parametric Approaches in Fault Detection and Isolation

    DEFF Research Database (Denmark)

    Katebi, S.D.; Blanke, M.; Katebi, M.R.

    This report describes a comparative study between two approaches to fault detection and isolation in dynamic systems. The first approach uses a parametric model of the system. The main components of such techniques are residual and signature generation for processing and analyzing. The second...... approach is non-parametric in the sense that the signature analysis is only dependent on the frequency or time domain information extracted directly from the input-output signals. Based on these approaches, two different fault monitoring schemes are developed where the feature extraction and fault decision...

  7. The geometry of distributional preferences and a non-parametric identification approach: The Equality Equivalence Test.

    Science.gov (United States)

    Kerschbamer, Rudolf

    2015-05-01

    This paper proposes a geometric delineation of distributional preference types and a non-parametric approach for their identification in a two-person context. It starts with a small set of assumptions on preferences and shows that this set (i) naturally results in a taxonomy of distributional archetypes that nests all empirically relevant types considered in previous work; and (ii) gives rise to a clean experimental identification procedure - the Equality Equivalence Test - that discriminates between archetypes according to core features of preferences rather than properties of specific modeling variants. As a by-product the test yields a two-dimensional index of preference intensity.

  8. Measuring energy performance with sectoral heterogeneity: A non-parametric frontier approach

    International Nuclear Information System (INIS)

    Wang, H.; Ang, B.W.; Wang, Q.W.; Zhou, P.

    2017-01-01

Evaluating economy-wide energy performance is an integral part of assessing the effectiveness of a country's energy efficiency policy. The non-parametric frontier approach has been widely used by researchers for this purpose. This paper proposes an extended non-parametric frontier approach to studying economy-wide energy efficiency and productivity performances by accounting for sectoral heterogeneity. Relevant techniques in index number theory are incorporated to quantify the driving forces behind changes in the economy-wide energy productivity index. The proposed approach facilitates flexible modelling of different sectors' production processes, and helps to examine sectors' impact on the aggregate energy performance. A case study of China's economy-wide energy efficiency and productivity performances in its 11th five-year plan period (2006–2010) is presented. It is found that sectoral heterogeneities in terms of energy performance are significant in China. Meanwhile, China's economy-wide energy productivity increased slightly during the study period, mainly driven by technical efficiency improvement. A number of other findings have also been reported. - Highlights: • We model economy-wide energy performance by considering sectoral heterogeneity. • The proposed approach can identify sectors' impact on the aggregate energy performance. • Obvious sectoral heterogeneities are identified in evaluating China's energy performance.
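The non-parametric frontier idea underlying this record (and the DEA-based records below) can be sketched as a small linear program. This is a minimal input-oriented CCR envelopment model under constant returns to scale, not the paper's extended sectoral model; the three units and their data are hypothetical:

```python
import numpy as np
from scipy.optimize import linprog

def dea_input_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of unit o.

    X: (n_units, n_inputs), Y: (n_units, n_outputs).
    Solves: min theta  s.t.  X.T @ lam <= theta * X[o],  Y.T @ lam >= Y[o],  lam >= 0.
    """
    n, m = X.shape
    _, s = Y.shape
    c = np.zeros(n + 1)           # decision vector: [theta, lam_1..lam_n]
    c[0] = 1.0
    A_ub, b_ub = [], []
    for i in range(m):            # inputs: sum_j lam_j x_ij - theta * x_io <= 0
        A_ub.append(np.concatenate(([-X[o, i]], X[:, i])))
        b_ub.append(0.0)
    for r in range(s):            # outputs: -sum_j lam_j y_rj <= -y_ro
        A_ub.append(np.concatenate(([0.0], -Y[:, r])))
        b_ub.append(-Y[o, r])
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(0, None)] * (n + 1))
    return res.fun

X = np.array([[2.0], [4.0], [8.0]])   # e.g. energy input per unit
Y = np.array([[2.0], [4.0], [4.0]])   # e.g. economic output per unit
scores = [dea_input_efficiency(X, Y, o) for o in range(3)]
print(scores)
```

Units 0 and 1 lie on the frontier (score 1), while unit 2 produces the same output as unit 1 from twice the input and scores 0.5.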

  9. Assessing T cell clonal size distribution: a non-parametric approach.

    Directory of Open Access Journals (Sweden)

    Olesya V Bolkhovskaya

Full Text Available Clonal structure of the human peripheral T-cell repertoire is shaped by a number of homeostatic mechanisms, including antigen presentation, cytokine and cell regulation. Its accurate tuning leads to a remarkable ability to combat pathogens in all their variety, while systemic failures may lead to severe consequences like autoimmune diseases. Here we develop and make use of a non-parametric statistical approach to assess T cell clonal size distributions from recent next generation sequencing data. For 41 healthy individuals and a patient with ankylosing spondylitis who underwent treatment, we invariably find power law scaling over several decades and for the first time calculate quantitatively meaningful values of the decay exponent. It proved to be much the same among healthy donors, significantly different for an autoimmune patient before the therapy, and converging towards a typical value afterwards. We discuss implications of the findings for theoretical understanding and mathematical modeling of adaptive immunity.

  10. Assessing T cell clonal size distribution: a non-parametric approach.

    Science.gov (United States)

    Bolkhovskaya, Olesya V; Zorin, Daniil Yu; Ivanchenko, Mikhail V

    2014-01-01

Clonal structure of the human peripheral T-cell repertoire is shaped by a number of homeostatic mechanisms, including antigen presentation, cytokine and cell regulation. Its accurate tuning leads to a remarkable ability to combat pathogens in all their variety, while systemic failures may lead to severe consequences like autoimmune diseases. Here we develop and make use of a non-parametric statistical approach to assess T cell clonal size distributions from recent next generation sequencing data. For 41 healthy individuals and a patient with ankylosing spondylitis who underwent treatment, we invariably find power law scaling over several decades and for the first time calculate quantitatively meaningful values of the decay exponent. It proved to be much the same among healthy donors, significantly different for an autoimmune patient before the therapy, and converging towards a typical value afterwards. We discuss implications of the findings for theoretical understanding and mathematical modeling of adaptive immunity.
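A standard non-parametric way to estimate a power-law decay exponent of the kind this record describes is the Hill (maximum-likelihood) estimator for the tail of the distribution. The sketch below recovers a known exponent from synthetic Pareto draws; the clone-size data themselves are of course not reproduced here:

```python
import numpy as np

def hill_alpha(samples, x_min):
    """Hill-type MLE of the exponent alpha for a continuous power law
    p(x) ~ x**(-alpha) on x >= x_min."""
    tail = samples[samples >= x_min]
    return 1.0 + len(tail) / np.sum(np.log(tail / x_min))

rng = np.random.default_rng(0)
alpha_true = 2.5
# inverse-CDF sampling: F(x) = 1 - x**(-(alpha-1)) for x >= 1
u = rng.random(50_000)
samples = (1.0 - u) ** (-1.0 / (alpha_true - 1.0))

alpha_hat = hill_alpha(samples, 1.0)
print(alpha_hat)   # close to alpha_true
```

Because the estimator uses only logs of the observations above a threshold, it makes no assumption about the body of the distribution, which matches the non-parametric spirit of the study.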

  11. A NON-PARAMETRIC APPROACH TO CONSTRAIN THE TRANSFER FUNCTION IN REVERBERATION MAPPING

    International Nuclear Information System (INIS)

    Li, Yan-Rong; Wang, Jian-Min; Bai, Jin-Ming

    2016-01-01

    Broad emission lines of active galactic nuclei stem from a spatially extended region (broad-line region, BLR) that is composed of discrete clouds and photoionized by the central ionizing continuum. The temporal behaviors of these emission lines are blurred echoes of continuum variations (i.e., reverberation mapping, RM) and directly reflect the structures and kinematic information of BLRs through the so-called transfer function (also known as the velocity-delay map). Based on the previous works of Rybicki and Press and Zu et al., we develop an extended, non-parametric approach to determine the transfer function for RM data, in which the transfer function is expressed as a sum of a family of relatively displaced Gaussian response functions. Therefore, arbitrary shapes of transfer functions associated with complicated BLR geometry can be seamlessly included, enabling us to relax the presumption of a specified transfer function frequently adopted in previous studies and to let it be determined by observation data. We formulate our approach in a previously well-established framework that incorporates the statistical modeling of continuum variations as a damped random walk process and takes into account long-term secular variations which are irrelevant to RM signals. The application to RM data shows the fidelity of our approach.

  12. A NON-PARAMETRIC APPROACH TO CONSTRAIN THE TRANSFER FUNCTION IN REVERBERATION MAPPING

    Energy Technology Data Exchange (ETDEWEB)

    Li, Yan-Rong; Wang, Jian-Min [Key Laboratory for Particle Astrophysics, Institute of High Energy Physics, Chinese Academy of Sciences, 19B Yuquan Road, Beijing 100049 (China); Bai, Jin-Ming, E-mail: liyanrong@mail.ihep.ac.cn [Yunnan Observatories, Chinese Academy of Sciences, Kunming 650011 (China)

    2016-11-10

    Broad emission lines of active galactic nuclei stem from a spatially extended region (broad-line region, BLR) that is composed of discrete clouds and photoionized by the central ionizing continuum. The temporal behaviors of these emission lines are blurred echoes of continuum variations (i.e., reverberation mapping, RM) and directly reflect the structures and kinematic information of BLRs through the so-called transfer function (also known as the velocity-delay map). Based on the previous works of Rybicki and Press and Zu et al., we develop an extended, non-parametric approach to determine the transfer function for RM data, in which the transfer function is expressed as a sum of a family of relatively displaced Gaussian response functions. Therefore, arbitrary shapes of transfer functions associated with complicated BLR geometry can be seamlessly included, enabling us to relax the presumption of a specified transfer function frequently adopted in previous studies and to let it be determined by observation data. We formulate our approach in a previously well-established framework that incorporates the statistical modeling of continuum variations as a damped random walk process and takes into account long-term secular variations which are irrelevant to RM signals. The application to RM data shows the fidelity of our approach.

  13. Rank-based permutation approaches for non-parametric factorial designs.

    Science.gov (United States)

    Umlauft, Maria; Konietschke, Frank; Pauly, Markus

    2017-11-01

    Inference methods for null hypotheses formulated in terms of distribution functions in general non-parametric factorial designs are studied. The methods can be applied to continuous, ordinal or even ordered categorical data in a unified way, and are based only on ranks. In this set-up Wald-type statistics and ANOVA-type statistics are the current state of the art. The first method is asymptotically exact but a rather liberal statistical testing procedure for small to moderate sample size, while the latter is only an approximation which does not possess the correct asymptotic α level under the null. To bridge these gaps, a novel permutation approach is proposed which can be seen as a flexible generalization of the Kruskal-Wallis test to all kinds of factorial designs with independent observations. It is proven that the permutation principle is asymptotically correct while keeping its finite exactness property when data are exchangeable. The results of extensive simulation studies foster these theoretical findings. A real data set exemplifies its applicability. © 2017 The British Psychological Society.
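The permutation principle described in this record, generalizing the Kruskal-Wallis test, can be sketched as follows: pool all observations, repeatedly reshuffle the group labels, and compare the observed rank statistic with its permutation distribution. This is a minimal one-factor illustration, not the authors' full factorial procedure:

```python
import numpy as np
from scipy.stats import kruskal

def permutation_kruskal(groups, n_perm=2000, seed=0):
    """Permutation p-value for the Kruskal-Wallis statistic."""
    rng = np.random.default_rng(seed)
    obs_stat = kruskal(*groups).statistic
    pooled = np.concatenate(groups)           # copy; original groups untouched
    sizes = [len(g) for g in groups]
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)                   # random reassignment of labels
        parts = np.split(pooled, np.cumsum(sizes)[:-1])
        if kruskal(*parts).statistic >= obs_stat:
            count += 1
    return (count + 1) / (n_perm + 1)         # add-one to avoid p = 0

rng = np.random.default_rng(1)
a = rng.normal(0.0, 1.0, 20)
b = rng.normal(0.0, 1.0, 20)
c = rng.normal(1.5, 1.0, 20)                  # shifted group
p_val = permutation_kruskal([a, b, c])
print(p_val)
```

Under exchangeability the permutation distribution is exact in finite samples, which is the property the record's asymptotic results build on.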

  14. Parametric and non-parametric approach for sensory RATA (Rate-All-That-Apply) method of ledre profile attributes

    Science.gov (United States)

    Hastuti, S.; Harijono; Murtini, E. S.; Fibrianto, K.

    2018-03-01

This study aims to investigate the use of parametric and non-parametric approaches for the sensory RATA (Rate-All-That-Apply) method. Ledre, a unique local food product of Bojonegoro, was used as the point of interest, and 319 panelists were involved in the study. The results showed that ledre is characterized by an easily crushed texture, stickiness in the mouth, a stingy sensation and ease of swallowing. It also has a strong banana flavour and a brown colour. Compared to eggroll and semprong, ledre shows more variance in taste as well as in roll length. As the RATA questionnaire is designed to collect categorical data, a non-parametric approach is the common statistical procedure. However, similar results were also obtained with a parametric approach, despite the non-normally distributed data. This suggests that a parametric approach can be applicable for consumer studies with a large number of respondents, even though the data may not satisfy the assumptions of ANOVA (Analysis of Variance).

  15. Sensitivity of Technical Efficiency Estimates to Estimation Methods: An Empirical Comparison of Parametric and Non-Parametric Approaches

    OpenAIRE

    de-Graft Acquah, Henry

    2014-01-01

This paper highlights the sensitivity of technical efficiency estimates to estimation approaches using empirical data. Firm-specific technical efficiency and mean technical efficiency are estimated using the non-parametric Data Envelopment Analysis (DEA) and the parametric Corrected Ordinary Least Squares (COLS) and Stochastic Frontier Analysis (SFA) approaches. Mean technical efficiency is found to be sensitive to the choice of estimation technique. Analysis of variance and Tukey’s test sugge...

  16. Evaluation of world's largest social welfare scheme: An assessment using non-parametric approach.

    Science.gov (United States)

    Singh, Sanjeet

    2016-08-01

Mahatma Gandhi National Rural Employment Guarantee Act (MGNREGA) is the world's largest social welfare scheme, aimed at poverty alleviation in India through rural employment generation. This paper aims to evaluate and rank the performance of the Indian states under the MGNREGA scheme. A non-parametric approach, Data Envelopment Analysis (DEA), is used to calculate the overall technical, pure technical, and scale efficiencies of states in India. The sample data are drawn from the annual official reports published by the Ministry of Rural Development, Government of India. Based on three selected input parameters (expenditure indicators) and five output parameters (employment generation indicators), I apply both input- and output-oriented DEA models to estimate how well the states utilized their resources and generated outputs during the financial year 2013-14. The relative performance evaluation has been made under the assumption of constant returns to scale and also under variable returns to scale to assess the impact of scale on performance. The results indicate that the main source of inefficiency lies in both the technical and managerial practices adopted. 11 states are overall technically efficient and operate at the optimum scale, whereas 18 states are pure technical or managerially efficient. It has been found that some states need to alter their scheme size to perform at par with the best performing states. For inefficient states, optimal input and output targets along with the resource savings and output gains are calculated. Analysis shows that if all inefficient states operated at optimal input and output levels, on average 17.89% of total expenditure and a total amount of $780 million could have been saved in a single year. Most of the inefficient states perform poorly when it comes to the participation of women and disadvantaged sections (SC&ST) in the scheme.
In order to catch up with the performance of best performing states, inefficient states on average need to enhance

  17. A Non-Parametric Delphi Approach to Foster Innovation Policy Debate in Spain

    Directory of Open Access Journals (Sweden)

    Juan Carlos Salazar-Elena

    2016-05-01

Full Text Available The aim of this paper is to identify some changes needed in Spain’s innovation policy to fill the gap between its innovation results and those of other European countries in pursuit of sustainable leadership. To do this we apply the Delphi methodology to experts from academia, business, and government. To overcome the shortcomings of traditional descriptive methods, we develop an inferential analysis by following a non-parametric bootstrap method which enables us to identify important changes that should be implemented. Particularly interesting is the support found for improving the interconnections among the relevant agents of the innovation system (instead of focusing exclusively on the provision of knowledge and technological inputs through R&D activities), or the support found for “soft” policy instruments aimed at providing a homogeneous framework to assess the innovation capabilities of firms (e.g., for funding purposes). Attention to potential innovators among small and medium enterprises (SMEs and traditional industries is particularly encouraged by experts.

  18. A non-parametric Data Envelopment Analysis approach for improving energy efficiency of grape production

    International Nuclear Information System (INIS)

    Khoshroo, Alireza; Mulwa, Richard; Emrouznejad, Ali; Arabi, Behrouz

    2013-01-01

Grape is one of the world's largest fruit crops with approximately 67.5 million tonnes produced each year, and energy is an important element in modern grape production as it heavily depends on fossil and other energy resources. Efficient use of these energies is a necessary step toward reducing environmental hazards, preventing destruction of natural resources and ensuring agricultural sustainability. Hence, identifying excessive use of energy as well as reducing energy resources is the main focus of this paper to optimize energy consumption in grape production. In this study we use a two-stage methodology to find the association of energy efficiency and performance explained by farmers' specific characteristics. In the first stage a non-parametric Data Envelopment Analysis is used to model efficiencies as an explicit function of human labor, machinery, chemicals, FYM (farmyard manure), diesel fuel, electricity and water for irrigation energies. In the second step, farm specific variables such as farmers' age, gender, level of education and agricultural experience are used in a Tobit regression framework to explain how these factors influence efficiency of grape farming. The result of the first stage shows substantial inefficiency among the grape producers in the studied area, while the second stage shows that the main difference between efficient and inefficient farmers was in the use of chemicals, diesel fuel and water for irrigation. Efficient farmers' use of chemicals such as insecticides, herbicides and fungicides was considerably lower than that of inefficient ones. The results revealed that the more educated farmers are more energy efficient in comparison with their less educated counterparts. - Highlights: • The focus of this paper is to identify excessive use of energy and optimize energy consumption in grape production. • We measure the efficiency as a function of labor/machinery/chemicals/farmyard manure/diesel-fuel/electricity/water. • Data were obtained from 41 grape

  19. rSeqNP: a non-parametric approach for detecting differential expression and splicing from RNA-Seq data.

    Science.gov (United States)

    Shi, Yang; Chinnaiyan, Arul M; Jiang, Hui

    2015-07-01

High-throughput sequencing of transcriptomes (RNA-Seq) has become a powerful tool to study gene expression. Here we present an R package, rSeqNP, which implements a non-parametric approach to test for differential expression and splicing from RNA-Seq data. rSeqNP uses permutation tests to assess statistical significance and can be applied to a variety of experimental designs. By combining information across isoforms, rSeqNP is able to detect more differentially expressed or spliced genes from RNA-Seq data. The R package with its source code and documentation is freely available at http://www-personal.umich.edu/∼jianghui/rseqnp/. jianghui@umich.edu Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  20. Prediction intervals for future BMI values of individual children - a non-parametric approach by quantile boosting

    Directory of Open Access Journals (Sweden)

    Mayr Andreas

    2012-01-01

    Full Text Available Abstract Background The construction of prediction intervals (PIs for future body mass index (BMI values of individual children based on a recent German birth cohort study with n = 2007 children is problematic for standard parametric approaches, as the BMI distribution in childhood is typically skewed depending on age. Methods We avoid distributional assumptions by directly modelling the borders of PIs by additive quantile regression, estimated by boosting. We point out the concept of conditional coverage to prove the accuracy of PIs. As conditional coverage can hardly be evaluated in practical applications, we conduct a simulation study before fitting child- and covariate-specific PIs for future BMI values and BMI patterns for the present data. Results The results of our simulation study suggest that PIs fitted by quantile boosting cover future observations with the predefined coverage probability and outperform the benchmark approach. For the prediction of future BMI values, quantile boosting automatically selects informative covariates and adapts to the age-specific skewness of the BMI distribution. The lengths of the estimated PIs are child-specific and increase, as expected, with the age of the child. Conclusions Quantile boosting is a promising approach to construct PIs with correct conditional coverage in a non-parametric way. It is in particular suitable for the prediction of BMI patterns depending on covariates, since it provides an interpretable predictor structure, inherent variable selection properties and can even account for longitudinal data structures.
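The core idea of this record, modelling the borders of a prediction interval directly by quantile regression, can be sketched with scikit-learn's gradient boosting under the quantile loss. This is a stand-in for the authors' additive quantile boosting, and the age/BMI data below are synthetic with a deliberately right-skewed, age-dependent error:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
# hypothetical data: skewness and spread of "BMI" grow with age
age = rng.uniform(2.0, 10.0, 2000)
bmi = 15.0 + 0.3 * age + rng.gamma(2.0, 0.3 + 0.05 * age)
X = age.reshape(-1, 1)

# fit the 5% and 95% borders of a 90% prediction interval directly,
# with no distributional assumption on the response
lower = GradientBoostingRegressor(loss="quantile", alpha=0.05).fit(X, bmi)
upper = GradientBoostingRegressor(loss="quantile", alpha=0.95).fit(X, bmi)

inside = (bmi >= lower.predict(X)) & (bmi <= upper.predict(X))
coverage = inside.mean()
print(coverage)   # roughly 0.90 on the training data
```

Because each border is modelled separately as a function of the covariate, the interval width adapts to the age-specific skewness, which is exactly what a symmetric parametric interval cannot do.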

  1. A non-parametric meta-analysis approach for combining independent microarray datasets: application using two microarray datasets pertaining to chronic allograft nephropathy

    Directory of Open Access Journals (Sweden)

    Archer Kellie J

    2008-02-01

    Full Text Available Abstract Background With the popularity of DNA microarray technology, multiple groups of researchers have studied the gene expression of similar biological conditions. Different methods have been developed to integrate the results from various microarray studies, though most of them rely on distributional assumptions, such as the t-statistic based, mixed-effects model, or Bayesian model methods. However, often the sample size for each individual microarray experiment is small. Therefore, in this paper we present a non-parametric meta-analysis approach for combining data from independent microarray studies, and illustrate its application on two independent Affymetrix GeneChip studies that compared the gene expression of biopsies from kidney transplant recipients with chronic allograft nephropathy (CAN to those with normal functioning allograft. Results The simulation study comparing the non-parametric meta-analysis approach to a commonly used t-statistic based approach shows that the non-parametric approach has better sensitivity and specificity. For the application on the two CAN studies, we identified 309 distinct genes that expressed differently in CAN. By applying Fisher's exact test to identify enriched KEGG pathways among those genes called differentially expressed, we found 6 KEGG pathways to be over-represented among the identified genes. We used the expression measurements of the identified genes as predictors to predict the class labels for 6 additional biopsy samples, and the predicted results all conformed to their pathologist diagnosed class labels. Conclusion We present a new approach for combining data from multiple independent microarray studies. This approach is non-parametric and does not rely on any distributional assumptions. The rationale behind the approach is logically intuitive and can be easily understood by researchers not having advanced training in statistics. Some of the identified genes and pathways have been

  2. Non-parametric data-based approach for the quantification and communication of uncertainties in river flood forecasts

    Science.gov (United States)

    Van Steenbergen, N.; Willems, P.

    2012-04-01

Reliable flood forecasts are the most important non-structural measures to reduce the impact of floods. However, flood forecasting systems are subject to uncertainty originating from the input data, model structure and model parameters of the different hydraulic and hydrological submodels. To quantify this uncertainty a non-parametric data-based approach has been developed. This approach analyses the historical forecast residuals (differences between the predictions and the observations at river gauging stations) without using a predefined statistical error distribution. Because the residuals are correlated with the value of the forecasted water level and the lead time, the residuals are split up into discrete classes of simulated water levels and lead times. For each class, percentile values of the model residuals are calculated and stored in a three-dimensional error matrix. By 3D interpolation in this error matrix, the uncertainty in newly forecasted water levels can be quantified. In addition to the quantification of the uncertainty, the communication of this uncertainty is equally important. The communication has to be done in a consistent way, reducing the chance of misinterpretation. Also, the communication needs to be adapted to the audience; the majority of the general public is not interested in in-depth information on the uncertainty in the predicted water levels, but is only interested in information on the likelihood of exceedance of certain alarm levels. Water managers need more information, e.g. time-dependent uncertainty information, because they rely on this information to undertake the appropriate flood mitigation action. There are various ways of presenting uncertainty information (numerical, linguistic, graphical, time (in)dependent, etc.) each with their advantages and disadvantages for a specific audience. A useful method to communicate uncertainty of flood forecasts is by probabilistic flood mapping. These maps give a representation of the
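The error-matrix construction this record describes, residual percentiles per class of forecasted water level and lead time, can be sketched directly with NumPy. The class boundaries and synthetic residuals below are hypothetical:

```python
import numpy as np

def error_matrix(levels, residuals, lead_times, level_bins, lt_bins, qs=(5, 50, 95)):
    """Percentiles of historical forecast residuals per class of
    (forecasted water level, lead time); NaN where a class is empty."""
    li = np.digitize(levels, level_bins)
    ti = np.digitize(lead_times, lt_bins)
    M = np.full((len(level_bins) + 1, len(lt_bins) + 1, len(qs)), np.nan)
    for a in range(len(level_bins) + 1):
        for b in range(len(lt_bins) + 1):
            r = residuals[(li == a) & (ti == b)]
            if r.size:
                M[a, b] = np.percentile(r, qs)
    return M

rng = np.random.default_rng(0)
levels = rng.uniform(0.0, 5.0, 10_000)            # forecasted water level [m]
lead = rng.integers(1, 49, 10_000).astype(float)  # lead time [h]
resid = rng.normal(0.0, 0.1 + 0.05 * levels)      # error widens with level
M = error_matrix(levels, resid, lead, level_bins=[1.0, 2.5, 4.0], lt_bins=[12.0, 24.0])

print(M.shape)  # level classes x lead-time classes x percentiles
```

Interpolating in `M` at a new forecast's level and lead time then yields class-conditional uncertainty bands without any assumed error distribution.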

  3. Evaluation of model-based versus non-parametric monaural noise-reduction approaches for hearing aids.

    Science.gov (United States)

    Harlander, Niklas; Rosenkranz, Tobias; Hohmann, Volker

    2012-08-01

Single channel noise reduction has been well investigated and seems to have reached its limits in terms of speech intelligibility improvement; however, the quality of such schemes can still be advanced. This study tests to what extent novel model-based processing schemes might improve performance, in particular for non-stationary noise conditions. Two prototype model-based algorithms, a speech-model-based and an auditory-model-based algorithm, were compared to a state-of-the-art non-parametric minimum statistics algorithm. A speech intelligibility test, preference rating, and listening effort scaling were performed. Additionally, three objective quality measures for the signal, background, and overall distortions were applied. For a better comparison of all algorithms, particular attention was given to the usage of a similar Wiener-based gain rule. The perceptual investigation was performed with fourteen hearing-impaired subjects. The results revealed that the non-parametric algorithm and the auditory-model-based algorithm did not affect speech intelligibility, whereas the speech-model-based algorithm slightly decreased intelligibility. In terms of subjective quality, both model-based algorithms perform better than the unprocessed condition and the reference, in particular for highly non-stationary noise environments. Data support the hypothesis that model-based algorithms are promising for improving performance in non-stationary noise conditions.

  4. Estimation of the limit of detection with a bootstrap-derived standard error by a partly non-parametric approach. Application to HPLC drug assays

    DEFF Research Database (Denmark)

    Linnet, Kristian

    2005-01-01

Bootstrap, HPLC, limit of blank, limit of detection, non-parametric statistics, type I and II errors
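Only keywords survive for this record, but the named combination, a non-parametric detection limit with a bootstrap-derived standard error, can be sketched as follows. The blank-response data and the 95th-percentile definition of the limit of blank are illustrative assumptions, not the paper's exact procedure:

```python
import numpy as np

rng = np.random.default_rng(0)
blanks = rng.gamma(2.0, 0.5, 60)   # hypothetical blank-sample HPLC responses

def limit_of_blank(x):
    # non-parametric limit of blank: 95th percentile of blank responses,
    # i.e. a ~5% type I error rate on true blanks
    return np.percentile(x, 95)

lob = limit_of_blank(blanks)
# bootstrap: resample the blanks with replacement and recompute the limit
boot = np.array([limit_of_blank(rng.choice(blanks, size=blanks.size, replace=True))
                 for _ in range(2000)])
se = boot.std(ddof=1)
print(lob, se)   # point estimate and its bootstrap standard error
```

The limit of detection would then be placed above the limit of blank so that low-concentration samples are missed with a controlled type II error rate.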

  5. Selection bias, vote counting, and money-priming effects: A comment on Rohrer, Pashler, and Harris (2015) and Vohs (2015).

    Science.gov (United States)

    Vadillo, Miguel A; Hardwicke, Tom E; Shanks, David R

    2016-05-01

When a series of studies fails to replicate a well-documented effect, researchers might be tempted to use a "vote counting" approach to decide whether the effect is reliable; that is, simply comparing the number of successful and unsuccessful replications. Vohs's (2015) response to the absence of money priming effects reported by Rohrer, Pashler, and Harris (2015) provides an example of this approach. Unfortunately, vote counting is a poor strategy for assessing the reliability of psychological findings because it neglects the impact of selection bias and questionable research practices. In the present comment, we show that a range of meta-analytic tools indicate irregularities in the money priming literature discussed by Rohrer et al. and Vohs, all of which point to the conclusion that these effects are distorted by selection bias, reporting biases, or p-hacking. This could help to explain why money-priming effects have proven unreliable in a number of direct replication attempts in which biases have been minimized through preregistration or transparent reporting. Our major conclusion is that the simple proportion of significant findings is a poor guide to the reliability of research and that preregistered replications are an essential means of assessing the reliability of money-priming effects. (c) 2016 APA, all rights reserved.

  6. Analysing Vote Counting Algorithms Via Logic - And its Application to the CADE Election Scheme

    DEFF Research Database (Denmark)

Schürmann, Carsten; Beckert, Bernhard; Goré, Rajeev

    2013-01-01

    We present a method for using first-order logic to specify the semantics of preferences as used in common vote counting algorithms. We also present a corresponding system that uses Celf linear-logic programs to describe voting algorithms and which generates explicit examples when the algorithm de...

  7. Framing Electoral Transparency: A comparative analysis of three e-votes counting ceremonies

    DEFF Research Database (Denmark)

    Schürmann, Carsten; Markussen, Randi; Bélanger, Olivier

    2016-01-01

in the field of e-voting has been proved to be difficult to tackle practically as well as analytically. In this paper we introduce the notion of ‘frames of transparency’ and deploy it to conduct a comparative analysis of three e-votes counting ceremonies in Norway, Estonia and Australia. We ask the question...

  8. Non-parametric identification of multivariable systems : a local rational modeling approach with application to a vibration isolation benchmark

    NARCIS (Netherlands)

    Voorhoeve, R.J.; van der Maas, A.; Oomen, T.A.J.

    2018-01-01

    Frequency response function (FRF) identification is often used as a basis for control systems design and as a starting point for subsequent parametric system identification. The aim of this paper is to develop a multiple-input multiple-output (MIMO) local parametric modeling approach for FRF

  9. Parametric and Non-Parametric System Modelling

    DEFF Research Database (Denmark)

    Nielsen, Henrik Aalborg

    1999-01-01

    the focus is on combinations of parametric and non-parametric methods of regression. This combination can be in terms of additive models where e.g. one or more non-parametric term is added to a linear regression model. It can also be in terms of conditional parametric models where the coefficients...... considered. It is shown that adaptive estimation in conditional parametric models can be performed by combining the well known methods of local polynomial regression and recursive least squares with exponential forgetting. The approach used for estimation in conditional parametric models also highlights how...... networks is included. In this paper, neural networks are used for predicting the electricity production of a wind farm. The results are compared with results obtained using an adaptively estimated ARX-model. Finally, two papers on stochastic differential equations are included. In the first paper, among...

  10. Non-parametric smoothing of experimental data

    International Nuclear Information System (INIS)

    Kuketayev, A.T.; Pen'kov, F.M.

    2007-01-01

Full text: Rapid processing of experimental data samples in nuclear physics often requires differentiation in order to find extrema. Therefore, even at the preliminary stage of data analysis, a range of noise reduction methods are used to smooth experimental data. There are many non-parametric smoothing techniques: interval averages, moving averages, exponential smoothing, etc. Nevertheless, it is more common to use a priori information about the behavior of the experimental curve in order to construct smoothing schemes based on least squares techniques. The latter methodology's advantage is that the area under the curve can be preserved, which is equivalent to conservation of the total count rate. The disadvantage of the former approach is the lack of a priori information. For example, during data processing the sums of peaks unresolved by a detector are very often replaced with a single peak, introducing uncontrolled errors into the determination of the physical quantities. The problem can be handled only by experienced personnel whose skills far exceed the challenge. We propose a set of non-parametric techniques which allows the use of any additional information on the nature of the experimental dependence. The method is based on the construction of a functional which includes both the experimental data and the a priori information. The minimum of this functional is reached on a non-parametric smoothed curve. Euler (Lagrange) differential equations are constructed for these curves; their solutions are then obtained analytically or numerically. The proposed approach allows for automated processing of nuclear physics data, eliminating the need for highly skilled laboratory personnel. The approach also makes it possible to obtain smoothing curves within a given confidence interval, e.g. according to the χ² distribution. This approach is applicable when constructing smooth solutions of ill-posed problems, in particular when solving
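The functional-minimisation idea in the abstract above has a simple discrete analogue that is easy to illustrate. The sketch below is a generic penalised (Whittaker-style) smoother, not the authors' specific construction: it minimises the sum of squared residuals plus a roughness penalty on second differences, which leads to a linear Euler-Lagrange system and, notably, preserves the area under the curve (the total count) exactly.

```python
import numpy as np

def whittaker_smooth(y, lam=100.0):
    """Minimise sum((y - s)**2) + lam * sum((second differences of s)**2).

    The minimiser solves the linear system (I + lam * D'D) s = y, the
    discrete Euler-Lagrange condition. Because D annihilates constants,
    1'(I + lam*D'D) = 1', so sum(s) == sum(y): the total count is preserved.
    """
    n = len(y)
    D = np.diff(np.eye(n), 2, axis=0)      # second-difference operator, (n-2) x n
    A = np.eye(n) + lam * D.T @ D
    return np.linalg.solve(A, y)

# Noisy synthetic "peak" spectrum
x = np.linspace(0.0, 1.0, 200)
rng = np.random.default_rng(0)
y = np.exp(-((x - 0.5) / 0.1) ** 2) + 0.05 * rng.standard_normal(200)
s = whittaker_smooth(y, lam=50.0)
```

Increasing `lam` strengthens the a priori smoothness assumption; additional prior terms (e.g. known peak shapes) would enter the functional the same way.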

  11. Has Outsourcing/Contracting Out Saved Money and/or Improved Service Quality? A Vote-Counting Analysis

    OpenAIRE

    Bourbeau, John Allen

    2004-01-01

    Most privatization literature, of which outsourcing/contracting out is a sub-set, discusses: 1) localized anecdotes of how organizations privatized; 2) privatization's history; 3) its types; and/or 4) its pros and cons. What is missing is a methodologically defensible, comprehensive, macro-view of whether or not outsourcing has saved money and/or improved service quality. Using the vote-counting analytical procedure, this dissertation provides one comprehensive view by analyzing and combin...

  12. Speaker Linking and Applications using Non-Parametric Hashing Methods

    Science.gov (United States)

    2016-09-08

nonparametric estimate of a multivariate density function,” The Annals of Mathematical Statistics, vol. 36, no. 3, pp. 1049–1051, 1965. [9] E. A. Patrick...Speaker Linking and Applications using Non-Parametric Hashing Methods† Douglas Sturim and William M. Campbell, MIT Lincoln Laboratory, Lexington, MA...with many approaches [1, 2]. For this paper, we focus on using i-vectors [2], but the methods apply to any embedding. For the task of speaker QBE and

  13. Non-Parametric Estimation of Correlation Functions

    DEFF Research Database (Denmark)

    Brincker, Rune; Rytter, Anders; Krenk, Steen

    In this paper three methods of non-parametric correlation function estimation are reviewed and evaluated: the direct method, estimation by the Fast Fourier Transform and finally estimation by the Random Decrement technique. The basic ideas of the techniques are reviewed, sources of bias are point...

  14. Bayesian non parametric modelling of Higgs pair production

    Directory of Open Access Journals (Sweden)

    Scarpa Bruno

    2017-01-01

Statistical classification models are commonly used to separate a signal from a background. In this talk we face the problem of isolating the signal of Higgs pair production using the decay channel in which each boson decays into a pair of b-quarks. Typically in this context non-parametric methods are used, such as Random Forests or different types of boosting tools. We remain in the same non-parametric framework, but we propose to face the problem following a Bayesian approach. A Dirichlet process is used as the prior for the random effects in a logit model, which is fitted by leveraging the Polya-Gamma data augmentation. Refinements of the model include inserting P-splines into the simple model to relate explanatory variables to the response, and using Bayesian additive regression trees (BART) to describe the atoms in the Dirichlet process.

  15. A non-parametric, microdosimetric-based approach to the evaluation of the biological effects of low doses of ionizing radiation

    International Nuclear Information System (INIS)

    Varma, M.N.; Zaider, M.

    1992-01-01

A microdosimetric-based specific quality factor, q(y), is determined for eight sets of experimental data on cellular inactivation, mutation, chromosome aberration and neoplastic transformation. Bias-free Bayesian and maximum entropy approaches were used. A comparison of the q(y) curves thus determined reveals a surprising degree of uniformity. In our view this is prima facie evidence that the spatial pattern of microscopic energy deposition, rather than the specific end point or cellular system, is the quantity which determines the dependence of the cellular response on the quality of the ionizing radiation. For further applications of this approach to radiation protection, experimental microdosimetric spectra are urgently needed. Further improvement in the quality of the q(y) functions could provide important clues on fundamental biophysical mechanisms. (author)

  16. A non-parametric framework for estimating threshold limit values

    Directory of Open Access Journals (Sweden)

    Ulm Kurt

    2005-11-01

Background: To estimate a threshold limit value for a compound known to have harmful health effects, an 'elbow' threshold model is usually applied. We are interested in flexible non-parametric alternatives. Methods: We describe how a step function model fitted by isotonic regression can be used to estimate threshold limit values. This method returns a set of candidate locations, and we discuss two algorithms to select the threshold among them: the reduced isotonic regression and an algorithm considering the closed family of hypotheses. We assess the performance of these two alternative approaches under different scenarios in a simulation study. We illustrate the framework by analysing data from a study conducted by the German Research Foundation aiming to set a threshold limit value for exposure to total dust at the workplace, as a causal agent for developing chronic bronchitis. Results: We demonstrate the use and the properties of the proposed methodology along with the results from an application. The method appears to detect the threshold with satisfactory success. However, its performance can be compromised by the low power to reject the constant-risk assumption when the true dose-response relationship is weak. Conclusion: The estimation of thresholds based on the isotonic framework is conceptually simple and sufficiently powerful. Given that there is no gold-standard method for threshold value estimation, the proposed model provides a useful non-parametric alternative to the standard approaches and can corroborate or challenge their findings.
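A minimal sketch of the isotonic step described above. The data and threshold-selection rule here are illustrative assumptions, and scikit-learn's `IsotonicRegression` stands in for the paper's reduced isotonic regression; the point is only that the monotone fit yields a step function whose jump locations form the candidate threshold set.

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

rng = np.random.default_rng(1)
dose = rng.uniform(0, 10, 500)                 # hypothetical exposure levels
p_true = np.where(dose < 4.0, 0.10, 0.40)      # true risk jumps at dose 4
outcome = rng.binomial(1, p_true)              # 1 = disease observed

iso = IsotonicRegression(increasing=True)
risk = iso.fit_transform(dose, outcome)        # monotone step-function fit

# Candidate threshold locations are the jumps of the fitted step function.
order = np.argsort(dose)
d_sorted, r_sorted = dose[order], risk[order]
steps = np.diff(r_sorted)
candidates = d_sorted[1:][steps > 0]
threshold = d_sorted[1:][np.argmax(steps)]     # crude pick: the largest jump
```

The paper's two selection algorithms replace the crude "largest jump" rule with principled choices among `candidates`.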

  17. On Parametric (and Non-Parametric) Variation

    Directory of Open Access Journals (Sweden)

    Neil Smith

    2009-11-01

This article raises the issue of the correct characterization of ‘Parametric Variation’ in syntax and phonology. After specifying their theoretical commitments, the authors outline the relevant parts of the Principles–and–Parameters framework, and draw a three-way distinction among Universal Principles, Parameters, and Accidents. The core of the contribution then consists of an attempt to provide identity criteria for parametric, as opposed to non-parametric, variation. Parametric choices must be antecedently known, and it is suggested that they must also satisfy seven individually necessary and jointly sufficient criteria. These are that they be cognitively represented, systematic, dependent on the input, deterministic, discrete, mutually exclusive, and irreversible.

  18. Multi-Directional Non-Parametric Analysis of Agricultural Efficiency

    DEFF Research Database (Denmark)

    Balezentis, Tomas

    This thesis seeks to develop methodologies for assessment of agricultural efficiency and employ them to Lithuanian family farms. In particular, we focus on three particular objectives throughout the research: (i) to perform a fully non-parametric analysis of efficiency effects, (ii) to extend...... to the Multi-Directional Efficiency Analysis approach when the proposed models were employed to analyse empirical data of Lithuanian family farm performance, we saw substantial differences in efficiencies associated with different inputs. In particular, assets appeared to be the least efficiently used input...... relative to labour, intermediate consumption and land (in some cases land was not treated as a discretionary input). These findings call for further research on relationships among financial structure, investment decisions, and efficiency in Lithuanian family farms. Application of different techniques...

  19. Non-parametric tests of productive efficiency with errors-in-variables

    NARCIS (Netherlands)

    Kuosmanen, T.K.; Post, T.; Scholtes, S.

    2007-01-01

We develop a non-parametric test of productive efficiency that accounts for errors-in-variables, following the approach of Varian [1985. Nonparametric analysis of optimizing behavior with measurement error. Journal of Econometrics 30(1/2), 445-458]. The test is based on the general Pareto-Koopmans

  20. Discrete non-parametric kernel estimation for global sensitivity analysis

    International Nuclear Information System (INIS)

    Senga Kiessé, Tristan; Ventura, Anne

    2016-01-01

This work investigates the discrete kernel approach for evaluating the contribution of the variance of discrete input variables to the variance of model output, via analysis of variance (ANOVA) decomposition. Until recently only the continuous kernel approach had been applied as a metamodeling approach within the sensitivity analysis framework, for both discrete and continuous input variables. Discrete kernel estimation is now known to be suitable for smoothing discrete functions. We present a discrete non-parametric kernel estimator of the ANOVA decomposition of a given model. An estimator of the sensitivity indices is also presented, with its asymptotic convergence rate. Simulations on a test function and a real case study from agriculture have shown that the discrete kernel approach outperforms the continuous kernel one for evaluating the contribution of moderate or most influential discrete parameters to the model output. - Highlights: • We study a discrete kernel estimation for sensitivity analysis of a model. • A discrete kernel estimator of the ANOVA decomposition of the model is presented. • Sensitivity indices are calculated for discrete input parameters. • An estimator of sensitivity indices is also presented with its convergence rate. • An application is realized for improving the reliability of environmental models.

  1. A local non-parametric model for trade sign inference

    Science.gov (United States)

    Blazejewski, Adam; Coggins, Richard

    2005-03-01

    We investigate a regularity in market order submission strategies for 12 stocks with large market capitalization on the Australian Stock Exchange. The regularity is evidenced by a predictable relationship between the trade sign (trade initiator), size of the trade, and the contents of the limit order book before the trade. We demonstrate this predictability by developing an empirical inference model to classify trades into buyer-initiated and seller-initiated. The model employs a local non-parametric method, k-nearest neighbor, which in the past was used successfully for chaotic time series prediction. The k-nearest neighbor with three predictor variables achieves an average out-of-sample classification accuracy of 71.40%, compared to 63.32% for the linear logistic regression with seven predictor variables. The result suggests that a non-linear approach may produce a more parsimonious trade sign inference model with a higher out-of-sample classification accuracy. Furthermore, for most of our stocks the observed regularity in market order submissions seems to have a memory of at least 30 trading days.
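The k-nearest-neighbour classification step can be illustrated on synthetic data. The predictor variables below (quote imbalance, log trade size, recent price move) are made-up stand-ins for the order-book features used in the study, and the buy/sell generating rule is an assumption for the demo:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(42)
n = 2000
# Hypothetical predictor variables standing in for the paper's features.
imbalance = rng.uniform(-1, 1, n)     # best-quote volume imbalance
log_size = rng.normal(0, 1, n)        # log trade size
momentum = rng.normal(0, 1, n)        # recent mid-price change
X = np.column_stack([imbalance, log_size, momentum])

# Synthetic regularity: buys (+1) tend to arrive when the book is bid-heavy.
p_buy = 1.0 / (1.0 + np.exp(-(2.0 * imbalance + 0.5 * momentum)))
y = np.where(rng.uniform(size=n) < p_buy, 1, -1)

# Fit k-NN on the first 1500 trades, evaluate out-of-sample on the rest.
train, test = slice(0, 1500), slice(1500, None)
knn = KNeighborsClassifier(n_neighbors=25).fit(X[train], y[train])
accuracy = knn.score(X[test], y[test])
```

Because k-NN is local and non-parametric, it needs no assumed functional form linking the limit-order-book state to the trade sign, which is the appeal the abstract describes.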

  2. Bootstrapping the economy -- a non-parametric method of generating consistent future scenarios

    OpenAIRE

    Müller, Ulrich A; Bürgi, Roland; Dacorogna, Michel M

    2004-01-01

The fortune and the risk of a business venture depend on the future course of the economy. There is a strong demand for economic forecasts and scenarios that can be applied to planning and modeling. While there is an ongoing debate on modeling economic scenarios, the bootstrapping (or resampling) approach presented here has several advantages. As a non-parametric method, it directly relies on past market behaviors rather than debatable assumptions on models and parameters. Simultaneous dep...
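The core resampling step can be sketched as follows. The function and data are illustrative assumptions, not the authors' implementation: whole cross-sections of historical returns are resampled at once, so simultaneous dependencies between markets are preserved without any parametric model.

```python
import numpy as np

def bootstrap_scenarios(returns, horizon, n_scenarios, seed=0):
    """Resample historical return vectors (entire dates at a time, keeping
    cross-market dependence intact) to build future scenario paths."""
    rng = np.random.default_rng(seed)
    returns = np.asarray(returns)                     # shape (T, n_markets)
    idx = rng.integers(0, len(returns), size=(n_scenarios, horizon))
    paths = returns[idx]                              # (n_scenarios, horizon, n_markets)
    return paths.cumsum(axis=1)                       # cumulative return paths

# Synthetic "history": 1000 days of returns for two hypothetical markets.
hist = np.random.default_rng(3).normal(0.0005, 0.01, size=(1000, 2))
scen = bootstrap_scenarios(hist, horizon=250, n_scenarios=500)
```

This simple i.i.d. resampling ignores serial dependence; the paper's method is more elaborate, but the scenario means still inherit the historical means by construction.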

  3. Non-Parametric Analysis of Rating Transition and Default Data

    DEFF Research Database (Denmark)

    Fledelius, Peter; Lando, David; Perch Nielsen, Jens

    2004-01-01

    We demonstrate the use of non-parametric intensity estimation - including construction of pointwise confidence sets - for analyzing rating transition data. We find that transition intensities away from the class studied here for illustration strongly depend on the direction of the previous move b...

  4. Non-parametric analysis of production efficiency of poultry egg ...

    African Journals Online (AJOL)

    Non-parametric analysis of production efficiency of poultry egg farmers in Delta ... analysis of factors affecting the output of poultry farmers showed that stock ... should be put in place for farmers to learn the best farm practices carried out on the ...

  5. Using non-parametric methods in econometric production analysis

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    2012-01-01

    by investigating the relationship between the elasticity of scale and the farm size. We use a balanced panel data set of 371~specialised crop farms for the years 2004-2007. A non-parametric specification test shows that neither the Cobb-Douglas function nor the Translog function are consistent with the "true......Econometric estimation of production functions is one of the most common methods in applied economic production analysis. These studies usually apply parametric estimation techniques, which obligate the researcher to specify a functional form of the production function of which the Cobb...... parameter estimates, but also in biased measures which are derived from the parameters, such as elasticities. Therefore, we propose to use non-parametric econometric methods. First, these can be applied to verify the functional form used in parametric production analysis. Second, they can be directly used...

  6. Non-parametric estimation of the individual's utility map

    OpenAIRE

    Noguchi, Takao; Sanborn, Adam N.; Stewart, Neil

    2013-01-01

    Models of risky choice have attracted much attention in behavioural economics. Previous research has repeatedly demonstrated that individuals' choices are not well explained by expected utility theory, and a number of alternative models have been examined using carefully selected sets of choice alternatives. The model performance however, can depend on which choice alternatives are being tested. Here we develop a non-parametric method for estimating the utility map over the wide range of choi...

  7. STATCAT, Statistical Analysis of Parametric and Non-Parametric Data

    International Nuclear Information System (INIS)

    David, Hugh

    1990-01-01

1 - Description of program or function: A suite of 26 programs designed to facilitate the appropriate statistical analysis and data handling of parametric and non-parametric data, using classical and modern univariate and multivariate methods. 2 - Method of solution: Data is read entry by entry, using a choice of input formats, and the resultant data bank is checked for out-of-range, rare, extreme or missing data. The completed STATCAT data bank can be treated by a variety of descriptive and inferential statistical methods, and modified, using other standard programs as required

  8. Digital spectral analysis parametric, non-parametric and advanced methods

    CERN Document Server

    Castanié, Francis

    2013-01-01

Digital Spectral Analysis provides a single source that offers complete coverage of the spectral analysis domain. This self-contained work includes details on advanced topics that are usually presented in scattered sources throughout the literature. The theoretical principles necessary for the understanding of spectral analysis are discussed in the first four chapters: fundamentals, digital signal processing, estimation in spectral analysis, and time-series models. An entire chapter is devoted to the non-parametric methods most widely used in industry. High resolution methods a

  9. Generalized Correlation Coefficient for Non-Parametric Analysis of Microarray Time-Course Data.

    Science.gov (United States)

    Tan, Qihua; Thomassen, Mads; Burton, Mark; Mose, Kristian Fredløv; Andersen, Klaus Ejner; Hjelmborg, Jacob; Kruse, Torben

    2017-06-06

Modeling complex time-course patterns is a challenging issue in microarray studies due to complex gene expression patterns in response to the time-course experiment. We introduce the generalized correlation coefficient and propose a combinatory approach for detecting, testing and clustering heterogeneous time-course gene expression patterns. Application of the method identified nonlinear time-course patterns in high agreement with parametric analysis. We conclude that the non-parametric nature of the generalized correlation analysis could make it a useful and efficient tool for analyzing microarray time-course data and for exploring the complex relationships in omics data for studying their association with disease and health.

  10. Non-parametric transformation for data correlation and integration: From theory to practice

    Energy Technology Data Exchange (ETDEWEB)

Datta-Gupta, A.; Xue, Guoping; Lee, Sang Heon [Texas A&M Univ., College Station, TX (United States)]

    1997-08-01

The purpose of this paper is two-fold. First, we introduce the use of non-parametric transformations for correlating petrophysical data during reservoir characterization. Such transformations are completely data driven and do not require an a priori functional relationship between response and predictor variables, which is the case with traditional multiple regression. The transformations are very general, computationally efficient and can easily handle mixed data types, for example continuous variables such as porosity and permeability, and categorical variables such as rock type and lithofacies. The power of the non-parametric transformation techniques for data correlation is illustrated through synthetic and field examples. Second, we utilize these transformations to propose a two-stage approach for data integration during heterogeneity characterization. The principal advantages of our approach over traditional cokriging or cosimulation methods are: (1) it does not require a linear relationship between primary and secondary data, (2) it exploits the secondary information to its fullest potential by maximizing the correlation between the primary and secondary data, (3) it can be easily applied to cases where several types of secondary or soft data are involved, and (4) it significantly reduces variance function calculations and thus greatly facilitates non-Gaussian cosimulation. We demonstrate the data integration procedure using synthetic and field examples. The field example involves estimation of pore-footage distribution using well data and multiple seismic attributes.

  11. Spurious Seasonality Detection: A Non-Parametric Test Proposal

    Directory of Open Access Journals (Sweden)

    Aurelio F. Bariviera

    2018-01-01

This paper offers a general and comprehensive definition of the day-of-the-week effect. Using symbolic dynamics, we develop a unique test based on ordinal patterns in order to detect it. This test uncovers the fact that the so-called “day-of-the-week” effect is partly an artifact of the hidden correlation structure of the data. We present simulations based on artificial time series as well. While time series generated with long memory are prone to exhibit daily seasonality, pure white noise signals exhibit no pattern preference. Since ours is a non-parametric test, it requires no assumptions about the distribution of returns, so that it could be a practical alternative to conventional econometric tests. We also made an exhaustive application of the here-proposed technique to 83 stock indexes around the world. Finally, the paper highlights the relevance of symbolic analysis in economic time series studies.

  12. Non-parametric Bayesian networks: Improving theory and reviewing applications

    International Nuclear Information System (INIS)

    Hanea, Anca; Morales Napoles, Oswaldo; Ababei, Dan

    2015-01-01

Applications in various domains often lead to high dimensional dependence modelling. A Bayesian network (BN) is a probabilistic graphical model that provides an elegant way of expressing the joint distribution of a large number of interrelated variables. BNs have been successfully used to represent uncertain knowledge in a variety of fields. The majority of applications use discrete BNs, i.e. BNs whose nodes represent discrete variables. Integrating continuous variables in BNs is an area fraught with difficulty. Several methods that handle discrete-continuous BNs have been proposed in the literature. This paper concentrates on one method called non-parametric BNs (NPBNs). NPBNs were introduced in 2004 and they have been or are currently being used in at least twelve professional applications. This paper provides a short introduction to NPBNs, a couple of theoretical advances, and an overview of applications. The aim of the paper is twofold: one is to present the latest improvements of the theory underlying NPBNs, and the other is to complement the existing overviews of BN applications with the NPBN applications. The latter opens the opportunity to discuss some difficulties that applications pose to the theoretical framework and in this way offers some NPBN modelling guidance to practitioners. - Highlights: • The paper gives an overview of the current NPBN methodology. • We extend the NPBN methodology by relaxing the conditions of one of its fundamental theorems. • We propose improvements of the data mining algorithm for the NPBNs. • We review the professional applications of the NPBNs.

  13. Bootstrap Prediction Intervals in Non-Parametric Regression with Applications to Anomaly Detection

    Science.gov (United States)

Kumar, Sricharan; Srivastava, Ashok N.

    2012-01-01

    Prediction intervals provide a measure of the probable interval in which the outputs of a regression model can be expected to occur. Subsequently, these prediction intervals can be used to determine if the observed output is anomalous or not, conditioned on the input. In this paper, a procedure for determining prediction intervals for outputs of nonparametric regression models using bootstrap methods is proposed. Bootstrap methods allow for a non-parametric approach to computing prediction intervals with no specific assumptions about the sampling distribution of the noise or the data. The asymptotic fidelity of the proposed prediction intervals is theoretically proved. Subsequently, the validity of the bootstrap based prediction intervals is illustrated via simulations. Finally, the bootstrap prediction intervals are applied to the problem of anomaly detection on aviation data.
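A minimal residual-bootstrap sketch of the idea. A Nadaraya-Watson smoother stands in for the regression model, the data are synthetic, and the procedure is a simplified version of the bootstrap prediction-interval construction, not the paper's exact algorithm:

```python
import numpy as np

def kernel_smooth(x_train, y_train, x_eval, bw=0.3):
    """Nadaraya-Watson kernel regression with a Gaussian kernel."""
    w = np.exp(-0.5 * ((x_eval[:, None] - x_train[None, :]) / bw) ** 2)
    return (w * y_train).sum(axis=1) / w.sum(axis=1)

rng = np.random.default_rng(7)
x = np.sort(rng.uniform(0, 6, 300))
y = np.sin(x) + 0.2 * rng.standard_normal(300)

fit = kernel_smooth(x, y, x)
resid = y - fit

# Residual bootstrap: refit on pseudo-data, collect bootstrap predictions.
B = 200
boot_preds = np.empty((B, len(x)))
for b in range(B):
    y_star = fit + rng.choice(resid, size=len(x), replace=True)
    boot_preds[b] = kernel_smooth(x, y_star, x)

# Add resampled noise so the interval covers new observations, then take
# pointwise percentiles; observations far outside [lo, hi] are anomalies.
noise = rng.choice(resid, size=(B, len(x)), replace=True)
lo, hi = np.percentile(boot_preds + noise, [2.5, 97.5], axis=0)
coverage = np.mean((y >= lo) & (y <= hi))
```

No distributional assumption on the noise is needed, which is the non-parametric appeal described in the abstract.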

  14. Non-parametric adaptive importance sampling for the probability estimation of a launcher impact position

    International Nuclear Information System (INIS)

    Morio, Jerome

    2011-01-01

Importance sampling (IS) is a useful simulation technique for estimating critical probabilities with better accuracy than Monte Carlo methods. It consists in generating random weighted samples from an auxiliary distribution rather than the distribution of interest. The crucial part of this algorithm is the choice of an efficient auxiliary PDF, which has to be able to generate the rare random events of interest more frequently. In practice, the optimisation of this auxiliary distribution is often very difficult. In this article, we propose to approximate the optimal IS auxiliary density with non-parametric adaptive importance sampling (NAIS). We apply this technique to the probability estimation of the spatial impact position of a launcher, since this has become an increasingly important issue in the field of aeronautics.
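A toy version of the idea, estimating a Gaussian tail probability with a kernel-density auxiliary distribution. Every choice here (the target, the inflated exploration proposal, the single adaptation stage) is an illustrative assumption rather than the paper's NAIS algorithm:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
threshold = 3.0                 # rare event: X > 3 under N(0, 1)
target = stats.norm(0, 1)       # distribution of interest

# Stage 1: crude exploration with an inflated proposal to reach the tail.
proposal0 = stats.norm(0, 3)
x0 = proposal0.rvs(5000, random_state=rng)
w0 = target.pdf(x0) / proposal0.pdf(x0)   # importance weights
mask = x0 > threshold

# Stage 2: non-parametric auxiliary density, a weighted KDE on rare samples.
kde = stats.gaussian_kde(x0[mask], weights=w0[mask])
x1 = kde.resample(20000, seed=12)[0]
w1 = target.pdf(x1) / kde.pdf(x1)
estimate = np.mean((x1 > threshold) * w1)   # IS estimate of P(X > 3)
```

Because the auxiliary KDE concentrates its mass in the failure region, most samples contribute to the estimate, unlike crude Monte Carlo where almost all samples would be wasted.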

  15. Performance of non-parametric algorithms for spatial mapping of tropical forest structure

    Directory of Open Access Journals (Sweden)

    Liang Xu

    2016-08-01

Background: Mapping tropical forest structure is a critical requirement for accurate estimation of emissions and removals from land use activities. With the availability of a wide range of remote sensing imagery of vegetation characteristics from space, development of finer resolution and more accurate maps has advanced in recent years. However, the mapping accuracy relies heavily on the quality of input layers, the algorithm chosen, and the size and quality of inventory samples for calibration and validation. Results: By using airborne lidar data as the “truth” and focusing on the mean canopy height (MCH) as a key structural parameter, we test two commonly-used non-parametric techniques, maximum entropy (ME) and random forest (RF), for developing maps over a study site in Central Gabon. Results of mapping show that both approaches improve in accuracy with more input layers when mapping canopy height at 100 m (1-ha) pixels. The bias-corrected spatial models further improve estimates for small and large trees across the tails of the height distributions, with a trade-off of increasing the overall mean squared error that can be readily compensated by increasing the sample size. Conclusions: A significant improvement in tropical forest mapping can be achieved by weighting the number of inventory samples against the choice of image layers and the non-parametric algorithms. Without future satellite observations with better sensitivity to forest biomass, maps based on existing data will remain slightly biased towards the mean of the distribution, underestimating the upper and overestimating the lower tails of the distribution.

  16. An artificial neural network architecture for non-parametric visual odometry in wireless capsule endoscopy

    International Nuclear Information System (INIS)

    Dimas, George; Iakovidis, Dimitris K; Karargyris, Alexandros; Ciuti, Gastone; Koulaouzidis, Anastasios

    2017-01-01

    Wireless capsule endoscopy is a non-invasive screening procedure of the gastrointestinal (GI) tract performed with an ingestible capsule endoscope (CE) of the size of a large vitamin pill. Such endoscopes are equipped with a usually low-frame-rate color camera which enables the visualization of the GI lumen and the detection of pathologies. The localization of the commercially available CEs is performed in the 3D abdominal space using radio-frequency (RF) triangulation from external sensor arrays, in combination with transit time estimation. State-of-the-art approaches, such as magnetic localization, which have been experimentally proved more accurate than the RF approach, are still at an early stage. Recently, we have demonstrated that CE localization is feasible using solely visual cues and geometric models. However, such approaches depend on camera parameters, many of which are unknown. In this paper the authors propose a novel non-parametric visual odometry (VO) approach to CE localization based on a feed-forward neural network architecture. The effectiveness of this approach in comparison to state-of-the-art geometric VO approaches is validated using a robotic-assisted in vitro experimental setup. (paper)

  17. An artificial neural network architecture for non-parametric visual odometry in wireless capsule endoscopy

    Science.gov (United States)

    Dimas, George; Iakovidis, Dimitris K.; Karargyris, Alexandros; Ciuti, Gastone; Koulaouzidis, Anastasios

    2017-09-01

    Wireless capsule endoscopy is a non-invasive screening procedure of the gastrointestinal (GI) tract performed with an ingestible capsule endoscope (CE) of the size of a large vitamin pill. Such endoscopes are equipped with a usually low-frame-rate color camera which enables the visualization of the GI lumen and the detection of pathologies. The localization of the commercially available CEs is performed in the 3D abdominal space using radio-frequency (RF) triangulation from external sensor arrays, in combination with transit time estimation. State-of-the-art approaches, such as magnetic localization, which have been experimentally proved more accurate than the RF approach, are still at an early stage. Recently, we have demonstrated that CE localization is feasible using solely visual cues and geometric models. However, such approaches depend on camera parameters, many of which are unknown. In this paper the authors propose a novel non-parametric visual odometry (VO) approach to CE localization based on a feed-forward neural network architecture. The effectiveness of this approach in comparison to state-of-the-art geometric VO approaches is validated using a robotic-assisted in vitro experimental setup.

  18. Continuous/discrete non parametric Bayesian belief nets with UNICORN and UNINET

    NARCIS (Netherlands)

    Cooke, R.M.; Kurowicka, D.; Hanea, A.M.; Morales Napoles, O.; Ababei, D.A.; Ale, B.J.M.; Roelen, A.

    2007-01-01

    Hanea et al. (2006) presented a method for quantifying and computing continuous/discrete non parametric Bayesian Belief Nets (BBN). Influences are represented as conditional rank correlations, and the joint normal copula enables rapid sampling and conditionalization. Further mathematical background

  19. Kernel bandwidth estimation for non-parametric density estimation: a comparative study

    CSIR Research Space (South Africa)

    Van der Walt, CM

    2013-12-01

    Full Text Available We investigate the performance of conventional bandwidth estimators for non-parametric kernel density estimation on a number of representative pattern-recognition tasks, to gain a better understanding of the behaviour of these estimators in high...
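Conventional bandwidth estimators like those compared in this record are typically rules of thumb or data-driven selectors. As a minimal illustration (our own sketch, not the authors' code; function names are assumptions), Silverman's rule of thumb is the standard baseline for a one-dimensional Gaussian kernel:

```python
import numpy as np

def silverman_bandwidth(x):
    """Silverman's rule-of-thumb bandwidth for a 1-D Gaussian KDE:
    h = 0.9 * min(std, IQR / 1.34) * n**(-1/5)."""
    x = np.asarray(x, dtype=float)
    n = x.size
    std = x.std(ddof=1)
    iqr = np.subtract(*np.percentile(x, [75, 25]))
    return 0.9 * min(std, iqr / 1.34) * n ** (-0.2)

def gaussian_kde(grid, data, h):
    """Evaluate the Gaussian kernel density estimate on `grid`."""
    z = (np.atleast_1d(grid)[:, None] - data[None, :]) / h
    return np.exp(-0.5 * z ** 2).sum(axis=1) / (data.size * h * np.sqrt(2 * np.pi))
```

The resulting estimate integrates to one by construction; rule-of-thumb bandwidths tend to oversmooth multimodal densities, which is one motivation for comparative studies such as the one above.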

  20. Notes on the Implementation of Non-Parametric Statistics within the Westinghouse Realistic Large Break LOCA Evaluation Model (ASTRUM)

    International Nuclear Information System (INIS)

    Frepoli, Cesare; Oriani, Luca

    2006-01-01

    In recent years, non-parametric or order statistics methods have been widely used to assess the impact of the uncertainties within Best-Estimate LOCA evaluation models. The bounding of the uncertainties is achieved with a direct Monte Carlo sampling of the uncertainty attributes, with the minimum trial number selected to 'stabilize' the estimation of the critical output values (peak cladding temperature (PCT), local maximum oxidation (LMO), and core-wide oxidation (CWO)). A non-parametric order statistics uncertainty analysis was recently implemented within the Westinghouse Realistic Large Break LOCA evaluation model, also referred to as the 'Automated Statistical Treatment of Uncertainty Method' (ASTRUM). The implementation or interpretation of order statistics in safety analysis is not fully consistent within the industry. This has led to an extensive public debate among regulators and researchers which can be found in the open literature. The USNRC-approved Westinghouse method follows a rigorous implementation of order statistics theory, which leads to the execution of 124 simulations within a Large Break LOCA analysis. This is a solid approach which guarantees that a bounding value (at 95% probability) of the 95th percentile for each of the three 10 CFR 50.46 ECCS design acceptance criteria (PCT, LMO and CWO) is obtained. The objective of this paper is to provide additional insights on the ASTRUM statistical approach, with a more in-depth analysis of the pros and cons of order statistics and of the Westinghouse approach in the implementation of this statistical methodology. (authors)
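The 124-run figure quoted above follows from the one-sided tolerance-limit (order statistics) formula for bounding the 95th percentile of several outputs simultaneously at 95% confidence. A minimal sketch of that calculation (our own illustration, not the ASTRUM code):

```python
from math import comb

def min_runs(p, gamma=0.95, beta=0.95):
    """Smallest number of Monte Carlo runs n such that the p-th largest
    observed value is a one-sided upper tolerance bound for the
    gamma-quantile with confidence beta (Wilks' formula, generalized
    to p output criteria)."""
    n = p
    while True:
        # Confidence that at most n - p samples fall above the gamma-quantile.
        conf = sum(comb(n, k) * gamma ** k * (1 - gamma) ** (n - k)
                   for k in range(n - p + 1))
        if conf >= beta:
            return n
        n += 1

print(min_runs(1), min_runs(2), min_runs(3))  # 59 93 124
```

With a single output criterion 59 runs suffice; bounding PCT, LMO and CWO jointly raises the requirement to 124, matching the figure in the abstract.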

  1. Impulse response identification with deterministic inputs using non-parametric methods

    International Nuclear Information System (INIS)

    Bhargava, U.K.; Kashyap, R.L.; Goodman, D.M.

    1985-01-01

    This paper addresses the problem of impulse response identification using non-parametric methods. Although the techniques developed herein apply to the truncated, untruncated, and circulant models, we focus on the truncated model, which is useful in certain applications. Two methods of impulse response identification will be presented. The first is based on the minimization of the C_L statistic, which is an estimate of the mean-square prediction error; the second is a Bayesian approach. For both of these methods, we consider the effects of using both the identity matrix and the Laplacian matrix as weights on the energy in the impulse response. In addition, we present a method for estimating the effective length of the impulse response. Estimating the length is particularly important in the truncated case. Finally, we develop a method for estimating the noise variance at the output. Often, prior information on the noise variance is not available, and a good estimate is crucial to the success of estimating the impulse response with a non-parametric technique.
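To illustrate the role of the identity versus Laplacian weighting mentioned above, here is a hedged sketch (our own, not the paper's method) of truncated-model impulse response estimation by regularized least squares; the scalar `lam` stands in for whatever weight a criterion such as C_L would select:

```python
import numpy as np

def estimate_impulse_response(u, y, m, lam=1e-2, laplacian=True):
    """Estimate a length-m impulse response h in the truncated model
    y = U h + noise, penalizing the energy of h through either the
    identity matrix or a discrete Laplacian (smoothness) matrix."""
    n = len(y)
    U = np.zeros((n, m))                       # convolution matrix: U[i, j] = u[i - j]
    for j in range(m):
        U[j:, j] = u[: n - j]
    if laplacian:
        D = np.diff(np.eye(m), n=2, axis=0)    # second-difference operator
        W = D.T @ D
    else:
        W = np.eye(m)
    return np.linalg.solve(U.T @ U + lam * W, U.T @ y)
```

With `lam` near zero this reduces to ordinary least squares; larger values trade bias for variance, with the Laplacian weight favoring smooth responses.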

  2. Non-parametric correlative uncertainty quantification and sensitivity analysis: Application to a Langmuir bimolecular adsorption model

    Science.gov (United States)

    Feng, Jinchao; Lansford, Joshua; Mironenko, Alexander; Pourkargar, Davood Babaei; Vlachos, Dionisios G.; Katsoulakis, Markos A.

    2018-03-01

    We propose non-parametric methods for both local and global sensitivity analysis of chemical reaction models with correlated parameter dependencies. The developed mathematical and statistical tools are applied to a benchmark Langmuir competitive adsorption model on a close packed platinum surface, whose parameters, estimated from quantum-scale computations, are correlated and are limited in size (small data). The proposed mathematical methodology employs gradient-based methods to compute sensitivity indices. We observe that ranking influential parameters depends critically on whether or not correlations between parameters are taken into account. The impact of uncertainty in the correlation and the necessity of the proposed non-parametric perspective are demonstrated.

  3. Non-parametric correlative uncertainty quantification and sensitivity analysis: Application to a Langmuir bimolecular adsorption model

    Directory of Open Access Journals (Sweden)

    Jinchao Feng

    2018-03-01

    Full Text Available We propose non-parametric methods for both local and global sensitivity analysis of chemical reaction models with correlated parameter dependencies. The developed mathematical and statistical tools are applied to a benchmark Langmuir competitive adsorption model on a close packed platinum surface, whose parameters, estimated from quantum-scale computations, are correlated and are limited in size (small data). The proposed mathematical methodology employs gradient-based methods to compute sensitivity indices. We observe that ranking influential parameters depends critically on whether or not correlations between parameters are taken into account. The impact of uncertainty in the correlation and the necessity of the proposed non-parametric perspective are demonstrated.

  4. Assessing pupil and school performance by non-parametric and parametric techniques

    NARCIS (Netherlands)

    de Witte, K.; Thanassoulis, E.; Simpson, G.; Battisti, G.; Charlesworth-May, A.

    2010-01-01

    This paper discusses the use of the non-parametric free disposal hull (FDH) and the parametric multi-level model (MLM) as alternative methods for measuring pupil and school attainment where hierarchical structured data are available. Using robust FDH estimates, we show how to decompose the overall

  5. Low default credit scoring using two-class non-parametric kernel density estimation

    CSIR Research Space (South Africa)

    Rademeyer, E

    2016-12-01

    Full Text Available This paper investigates the performance of two-class classification credit scoring data sets with low default ratios. The standard two-class parametric Gaussian and non-parametric Parzen classifiers are extended, using Bayes’ rule, to include either...

  6. Non-Parametric Bayesian Updating within the Assessment of Reliability for Offshore Wind Turbine Support Structures

    DEFF Research Database (Denmark)

    Ramirez, José Rangel; Sørensen, John Dalsgaard

    2011-01-01

    This work illustrates the updating and incorporation of information in the assessment of fatigue reliability for offshore wind turbine. The new information, coming from external and condition monitoring can be used to direct updating of the stochastic variables through a non-parametric Bayesian u...

  7. Non-parametric production analysis of pesticides use in the Netherlands

    NARCIS (Netherlands)

    Oude Lansink, A.G.J.M.; Silva, E.

    2004-01-01

    Many previous empirical studies on the productivity of pesticides suggest that pesticides are under-utilized in agriculture despite the generally held belief that these inputs are substantially over-utilized. This paper uses data envelopment analysis (DEA) to calculate non-parametric measures of the

  8. The Support Reduction Algorithm for Computing Non-Parametric Function Estimates in Mixture Models

    OpenAIRE

    GROENEBOOM, PIET; JONGBLOED, GEURT; WELLNER, JON A.

    2008-01-01

    In this paper, we study an algorithm (which we call the support reduction algorithm) that can be used to compute non-parametric M-estimators in mixture models. The algorithm is compared with natural competitors in the context of convex regression and the ‘Aspect problem’ in quantum physics.

  9. Non-parametric Estimation of Diffusion-Paths Using Wavelet Scaling Methods

    DEFF Research Database (Denmark)

    Høg, Esben

    In continuous time, diffusion processes have been used for modelling financial dynamics for a long time. For example, the Ornstein-Uhlenbeck process (the simplest mean-reverting process) has been used to model non-speculative price processes. We discuss non-parametric estimation of these processes...

  10. Non-Parametric Estimation of Diffusion-Paths Using Wavelet Scaling Methods

    DEFF Research Database (Denmark)

    Høg, Esben

    2003-01-01

    In continuous time, diffusion processes have been used for modelling financial dynamics for a long time. For example, the Ornstein-Uhlenbeck process (the simplest mean-reverting process) has been used to model non-speculative price processes. We discuss non-parametric estimation of these processes...

  11. A non-parametric method for correction of global radiation observations

    DEFF Research Database (Denmark)

    Bacher, Peder; Madsen, Henrik; Perers, Bengt

    2013-01-01

    in the observations are corrected. These are errors such as: tilt in the leveling of the sensor, shadowing from surrounding objects, clipping and saturation in the signal processing, and errors from dirt and wear. The method is based on a statistical non-parametric clear-sky model which is applied to both...

  12. A comparative study of non-parametric models for identification of ...

    African Journals Online (AJOL)

    However, the frequency response method using random binary signals was good for unpredicted white noise characteristics and was considered the best method for non-parametric system identification. The autoregressive external input (ARX) model was very useful for system identification, but on application, few input ...

  13. A non-parametric hierarchical model to discover behavior dynamics from tracks

    NARCIS (Netherlands)

    Kooij, J.F.P.; Englebienne, G.; Gavrila, D.M.

    2012-01-01

    We present a novel non-parametric Bayesian model to jointly discover the dynamics of low-level actions and high-level behaviors of tracked people in open environments. Our model represents behaviors as Markov chains of actions which capture high-level temporal dynamics. Actions may be shared by

  14. Experimental Sentinel-2 LAI estimation using parametric, non-parametric and physical retrieval methods - A comparison

    NARCIS (Netherlands)

    Verrelst, Jochem; Rivera, Juan Pablo; Veroustraete, Frank; Muñoz-Marí, Jordi; Clevers, J.G.P.W.; Camps-Valls, Gustau; Moreno, José

    2015-01-01

    Given the forthcoming availability of Sentinel-2 (S2) images, this paper provides a systematic comparison of retrieval accuracy and processing speed of a multitude of parametric, non-parametric and physically-based retrieval methods using simulated S2 data. An experimental field dataset (SPARC),

  15. A Non-Parametric Surrogate-based Test of Significance for T-Wave Alternans Detection

    Science.gov (United States)

    Nemati, Shamim; Abdala, Omar; Bazán, Violeta; Yim-Yeh, Susie; Malhotra, Atul; Clifford, Gari

    2010-01-01

    We present a non-parametric adaptive surrogate test that allows for the differentiation of statistically significant T-Wave Alternans (TWA) from alternating patterns that can be solely explained by the statistics of noise. The proposed test is based on estimating the distribution of noise-induced alternating patterns in a beat sequence from a set of surrogate data derived from repeated reshuffling of the original beat sequence. Thus, in assessing the significance of the observed alternating patterns in the data, no assumptions are made about the underlying noise distribution. In addition, since the distribution of noise-induced alternans magnitudes is calculated separately for each sequence of beats within the analysis window, the method is robust to data non-stationarities in both noise and TWA. The proposed surrogate method for rejecting noise was compared to the standard noise rejection methods used with the Spectral Method (SM) and the Modified Moving Average (MMA) techniques. Using a previously described realistic multi-lead model of TWA, and real physiological noise, we demonstrate that the proposed approach reduces false TWA detections while maintaining a lower missed-TWA detection rate compared with all the other methods tested. A simple averaging-based TWA estimation algorithm was coupled with the surrogate significance testing and was evaluated on three public databases: the Normal Sinus Rhythm Database (NRSDB), the Chronic Heart Failure Database (CHFDB) and the Sudden Cardiac Death Database (SCDDB). Differences in TWA amplitudes between each database were evaluated at matched heart rate (HR) intervals from 40 to 120 beats per minute (BPM). Using the two-sample Kolmogorov-Smirnov test, we found that significant differences in TWA levels exist between each patient group at all decades of heart rates. The most marked difference was generally found at higher heart rates, and the new technique resulted in a larger margin of separability between patient populations than
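The reshuffling idea can be sketched generically. The snippet below (a simplified illustration, not the authors' multi-lead pipeline) treats a scalar beat feature, uses the even/odd mean difference as the alternans statistic, and derives significance from shuffled surrogates:

```python
import numpy as np

def surrogate_alternans_test(beats, n_surrogates=199, seed=0):
    """Permutation (surrogate) test for alternans in a beat-feature sequence.
    Returns the observed alternans magnitude and a p-value estimated from
    reshuffled copies of the sequence, with no noise-model assumptions."""
    rng = np.random.default_rng(seed)
    beats = np.asarray(beats, dtype=float)

    def alternans(x):
        # Magnitude of the even-beat vs. odd-beat mean difference.
        return abs(x[::2].mean() - x[1::2].mean())

    observed = alternans(beats)
    exceed = sum(alternans(rng.permutation(beats)) >= observed
                 for _ in range(n_surrogates))
    return observed, (1 + exceed) / (1 + n_surrogates)
```

A strongly alternating sequence yields a small p-value, while a constant or purely noisy sequence does not, mirroring the false-detection control described above.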

  16. Power of non-parametric linkage analysis in mapping genes contributing to human longevity in long-lived sib-pairs

    DEFF Research Database (Denmark)

    Tan, Qihua; Zhao, J H; Iachine, I

    2004-01-01

    This report investigates the power issue in applying the non-parametric linkage analysis of affected sib-pairs (ASP) [Kruglyak and Lander, 1995: Am J Hum Genet 57:439-454] to localize genes that contribute to human longevity using long-lived sib-pairs. Data were simulated by introducing a recently...... developed statistical model for measuring marker-longevity associations [Yashin et al., 1999: Am J Hum Genet 65:1178-1193], enabling direct power comparison between linkage and association approaches. The non-parametric linkage (NPL) scores estimated in the region harboring the causal allele are evaluated...... in case of a dominant effect. Although the power issue may depend heavily on the true genetic nature in maintaining survival, our study suggests that results from small-scale sib-pair investigations should be treated with caution, given the complexity of human longevity....

  17. Hadron Energy Reconstruction for ATLAS Barrel Combined Calorimeter Using Non-Parametrical Method

    CERN Document Server

    Kulchitskii, Yu A

    2000-01-01

    Hadron energy reconstruction for the ATLAS barrel prototype combined calorimeter in the framework of the non-parametrical method is discussed. The non-parametrical method utilizes only the known e/h ratios and the electron calibration constants and does not require the determination of any parameters by a minimization technique. Thus, this technique lends itself to fast energy reconstruction in a first level trigger. The reconstructed mean values of the hadron energies are within ±1% of the true values and the fractional energy resolution is [(58 ± 3)%·√GeV/√E + (2.5 ± 0.3)%] ⊕ (1.7 ± 0.2) GeV/E. The value of the e/h ratio obtained for the electromagnetic compartment of the combined calorimeter is 1.74 ± 0.04. Results of a study of the longitudinal hadronic shower development are also presented.

  18. Efficiency Analysis of German Electricity Distribution Utilities : Non-Parametric and Parametric Tests

    OpenAIRE

    von Hirschhausen, Christian R.; Cullmann, Astrid

    2005-01-01

    This paper applies parametric and non-parametric tests to assess the efficiency of electricity distribution companies in Germany. We address traditional issues in electricity sector benchmarking, such as the role of scale effects and optimal utility size, as well as new evidence specific to the situation in Germany. We use labour, capital, and peak load capacity as inputs, and units sold and the number of customers as output. The data cover 307 (out of 553) ...

  19. A simple non-parametric goodness-of-fit test for elliptical copulas

    Directory of Open Access Journals (Sweden)

    Jaser Miriam

    2017-12-01

    Full Text Available In this paper, we propose a simple non-parametric goodness-of-fit test for elliptical copulas of any dimension. It is based on the equality of Kendall’s tau and Blomqvist’s beta for all bivariate margins. Nominal level and power of the proposed test are investigated in a Monte Carlo study. An empirical application illustrates our goodness-of-fit test at work.

  20. Generalized Correlation Coefficient for Non-Parametric Analysis of Microarray Time-Course Data

    DEFF Research Database (Denmark)

    Tan, Qihua; Thomassen, Mads; Burton, Mark

    2017-01-01

    the heterogeneous time-course gene expression patterns. Application of the method identified nonlinear time-course patterns in high agreement with parametric analysis. We conclude that the non-parametric nature of the generalized correlation analysis could be a useful and efficient tool for analyzing microarray...... time-course data and for exploring the complex relationships in the omics data for studying their association with disease and health....

  1. Using multinomial and imprecise probability for non-parametric modelling of rainfall in Manizales (Colombia

    Directory of Open Access Journals (Sweden)

    Ibsen Chivatá Cárdenas

    2008-05-01

    Full Text Available This article presents a rainfall model constructed by applying non-parametric modelling and imprecise probabilities; these tools were used because there was not enough homogeneous information in the study area. The area’s hydrological information regarding rainfall was scarce and existing hydrological time series were not uniform. A distributed extended rainfall model was constructed from so-called probability boxes (p-boxes), multinomial probability distribution and confidence intervals (a friendly algorithm was constructed for non-parametric modelling by combining the last two tools). This model confirmed the high level of uncertainty involved in local rainfall modelling. Uncertainty encompassed the whole range (domain) of probability values, thereby showing the severe limitations on information, leading to the conclusion that a detailed estimation of probability would lead to significant error. Nevertheless, relevant information was extracted; it was estimated that the maximum daily rainfall threshold (70 mm) would be surpassed at least once every three years, as was the magnitude of uncertainty affecting hydrological parameter estimation. This paper’s conclusions may be of interest to non-parametric modellers and decision-makers, as such modelling and imprecise probability represent an alternative for hydrological variable assessment and may become an obligatory procedure in the future. Its potential lies in treating scarce information and represents a robust modelling strategy for non-seasonal stochastic modelling conditions

  2. Tremor Detection Using Parametric and Non-Parametric Spectral Estimation Methods: A Comparison with Clinical Assessment

    Science.gov (United States)

    Martinez Manzanera, Octavio; Elting, Jan Willem; van der Hoeven, Johannes H.; Maurits, Natasha M.

    2016-01-01

    In the clinic, tremor is diagnosed during a time-limited process in which patients are observed and the characteristics of tremor are visually assessed. For some tremor disorders, a more detailed analysis of these characteristics is needed. Accelerometry and electromyography can be used to obtain a better insight into tremor. Typically, routine clinical assessment of accelerometry and electromyography data involves visual inspection by clinicians and occasionally computational analysis to obtain objective characteristics of tremor. However, for some tremor disorders these characteristics may be different during daily activity. This variability in presentation between the clinic and daily life makes a differential diagnosis more difficult. A long-term recording of tremor by accelerometry and/or electromyography in the home environment could help to give a better insight into the tremor disorder. However, an evaluation of such recordings using routine clinical standards would take too much time. We evaluated a range of techniques that automatically detect tremor segments in accelerometer data, as accelerometer data is more easily obtained in the home environment than electromyography data. Time can be saved if clinicians only have to evaluate the tremor characteristics of segments that have been automatically detected in longer daily activity recordings. We tested four non-parametric methods and five parametric methods on clinical accelerometer data from 14 patients with different tremor disorders. The consensus between two clinicians regarding the presence or absence of tremor on 3943 segments of accelerometer data was employed as reference. The nine methods were tested against this reference to identify their optimal parameters. Non-parametric methods generally performed better than parametric methods on our dataset when optimal parameters were used. However, one parametric method, employing the high frequency content of the tremor bandwidth under consideration

  3. Non-parametric system identification from non-linear stochastic response

    DEFF Research Database (Denmark)

    Rüdinger, Finn; Krenk, Steen

    2001-01-01

    An estimation method is proposed for identification of non-linear stiffness and damping of single-degree-of-freedom systems under stationary white noise excitation. Non-parametric estimates of the stiffness and damping along with an estimate of the white noise intensity are obtained by suitable...... of the energy at mean-level crossings, which yields the damping relative to white noise intensity. Finally, an estimate of the noise intensity is extracted by estimating the absolute damping from the autocovariance functions of a set of modified phase plane variables at different energy levels. The method...

  4. kruX: matrix-based non-parametric eQTL discovery.

    Science.gov (United States)

    Qi, Jianlong; Asl, Hassan Foroughi; Björkegren, Johan; Michoel, Tom

    2014-01-14

    The Kruskal-Wallis test is a popular non-parametric statistical test for identifying expression quantitative trait loci (eQTLs) from genome-wide data due to its robustness against variations in the underlying genetic model and expression trait distribution, but testing billions of marker-trait combinations one-by-one can become computationally prohibitive. We developed kruX, an algorithm implemented in Matlab, Python and R that uses matrix multiplications to simultaneously calculate the Kruskal-Wallis test statistic for several millions of marker-trait combinations at once. KruX is more than ten thousand times faster than computing associations one-by-one on a typical human dataset. We used kruX and a dataset of more than 500k SNPs and 20k expression traits measured in 102 human blood samples to compare eQTLs detected by the Kruskal-Wallis test to eQTLs detected by the parametric ANOVA and linear model methods. We found that the Kruskal-Wallis test is more robust against data outliers and heterogeneous genotype group sizes and detects a higher proportion of non-linear associations, but is more conservative for calling additive linear associations. kruX enables the use of robust non-parametric methods for massive eQTL mapping without the need for a high-performance computing infrastructure and is freely available from http://krux.googlecode.com.
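For reference, the statistic that kruX vectorizes is the ordinary Kruskal-Wallis H computed per marker-trait pair. A scalar (non-matrix) sketch of that computation, ignoring tie correction:

```python
import numpy as np

def kruskal_wallis_h(trait, genotype):
    """Kruskal-Wallis H statistic for one expression trait split by
    genotype group (e.g. 0/1/2 minor-allele counts). No tie correction,
    so ties in `trait` should be rare."""
    trait = np.asarray(trait, dtype=float)
    genotype = np.asarray(genotype)
    n = trait.size
    ranks = np.empty(n)
    ranks[np.argsort(trait)] = np.arange(1, n + 1)   # ranks 1..n
    h = sum(ranks[genotype == g].sum() ** 2 / (genotype == g).sum()
            for g in np.unique(genotype))
    return 12.0 / (n * (n + 1)) * h - 3.0 * (n + 1)
```

kruX's speed-up comes from expressing the per-group rank sums for millions of marker-trait pairs as matrix products rather than looping pair by pair as done here.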

  5. MEASURING DARK MATTER PROFILES NON-PARAMETRICALLY IN DWARF SPHEROIDALS: AN APPLICATION TO DRACO

    International Nuclear Information System (INIS)

    Jardel, John R.; Gebhardt, Karl; Fabricius, Maximilian H.; Williams, Michael J.; Drory, Niv

    2013-01-01

    We introduce a novel implementation of orbit-based (or Schwarzschild) modeling that allows dark matter density profiles to be calculated non-parametrically in nearby galaxies. Our models require no assumptions to be made about velocity anisotropy or the dark matter profile. The technique can be applied to any dispersion-supported stellar system, and we demonstrate its use by studying the Local Group dwarf spheroidal galaxy (dSph) Draco. We use existing kinematic data at larger radii and also present 12 new radial velocities within the central 13 pc obtained with the VIRUS-W integral field spectrograph on the 2.7 m telescope at McDonald Observatory. Our non-parametric Schwarzschild models find strong evidence that the dark matter profile in Draco is cuspy for 20 ≤ r ≤ 700 pc. The profile for r ≥ 20 pc is well fit by a power law with slope α = –1.0 ± 0.2, consistent with predictions from cold dark matter simulations. Our models confirm that, despite its low baryon content relative to other dSphs, Draco lives in a massive halo.

  6. Robust non-parametric one-sample tests for the analysis of recurrent events.

    Science.gov (United States)

    Rebora, Paola; Galimberti, Stefania; Valsecchi, Maria Grazia

    2010-12-30

    One-sample non-parametric tests are proposed here for inference on recurring events. The focus is on the marginal mean function of events and the basis for inference is the standardized distance between the observed and the expected number of events under a specified reference rate. Different weights are considered in order to account for various types of alternative hypotheses on the mean function of the recurrent events process. A robust version and a stratified version of the test are also proposed. The performance of these tests was investigated through simulation studies under various underlying event generation processes, such as homogeneous and nonhomogeneous Poisson processes, autoregressive and renewal processes, with and without frailty effects. The robust versions of the test have been shown to be suitable in a wide variety of event generating processes. The motivating context is a study on gene therapy in a very rare immunodeficiency in children, where a major end-point is the recurrence of severe infections. Robust non-parametric one-sample tests for recurrent events can be useful to assess efficacy and especially safety in non-randomized studies or in epidemiological studies for comparison with a standard population. Copyright © 2010 John Wiley & Sons, Ltd.

  7. Cliff's Delta Calculator: A non-parametric effect size program for two groups of observations

    Directory of Open Access Journals (Sweden)

    Guillermo Macbeth

    2011-05-01

    Full Text Available The Cliff's Delta statistic is an effect size measure that quantifies the amount of difference between two non-parametric variables beyond p-value interpretation. This measure can be understood as a useful complementary analysis for the corresponding hypothesis testing. During the last two decades the use of effect size measures has been strongly encouraged by methodologists and leading institutions of behavioral sciences. The aim of this contribution is to introduce the Cliff's Delta Calculator software that performs such analysis and offers some interpretation tips. Differences and similarities with the parametric case are analysed and illustrated. The implementation of this free program is fully described and compared with other calculators. Alternative algorithmic approaches are mathematically analysed and a basic linear algebra proof of their equivalence is formally presented. Two worked examples in cognitive psychology are discussed. A visual interpretation of Cliff's Delta is suggested. Availability, installation and applications of the program are presented and discussed.
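The statistic itself is simple to state: Cliff's delta is the proportion of pairs in which a value from one group exceeds a value from the other, minus the reverse proportion. A direct O(mn) sketch (our own, not the calculator's algorithm):

```python
def cliffs_delta(x, y):
    """Cliff's delta: (#{x_i > y_j} - #{x_i < y_j}) / (m * n), in [-1, 1].
    0 means complete overlap; +/-1 means complete separation."""
    greater = sum(1 for xi in x for yj in y if xi > yj)
    less = sum(1 for xi in x for yj in y if xi < yj)
    return (greater - less) / (len(x) * len(y))
```

The linear-algebra formulations the article analyses compute the same quantity via a dominance matrix instead of the explicit double loop.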

  8. A non-parametric consistency test of the ΛCDM model with Planck CMB data

    Energy Technology Data Exchange (ETDEWEB)

    Aghamousa, Amir; Shafieloo, Arman [Korea Astronomy and Space Science Institute, Daejeon 305-348 (Korea, Republic of); Hamann, Jan, E-mail: amir@aghamousa.com, E-mail: jan.hamann@unsw.edu.au, E-mail: shafieloo@kasi.re.kr [School of Physics, The University of New South Wales, Sydney NSW 2052 (Australia)

    2017-09-01

    Non-parametric reconstruction methods, such as Gaussian process (GP) regression, provide a model-independent way of estimating an underlying function and its uncertainty from noisy data. We demonstrate how GP-reconstruction can be used as a consistency test between a given data set and a specific model by looking for structures in the residuals of the data with respect to the model's best-fit. Applying this formalism to the Planck temperature and polarisation power spectrum measurements, we test their global consistency with the predictions of the base ΛCDM model. Our results do not show any serious inconsistencies, lending further support to the interpretation of the base ΛCDM model as cosmology's gold standard.
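As a hedged sketch of the GP machinery behind such a consistency test (a numpy-only simplification, not the paper's pipeline): regress the residuals of the data with respect to the model's best-fit with a zero-mean GP; structure in the posterior mean signals inconsistency, while residuals compatible with noise leave it near zero.

```python
import numpy as np

def gp_posterior(x_train, y_train, x_test, ell=1.0, sf=1.0, noise=0.1):
    """Posterior mean and variance of a zero-mean GP with an RBF kernel,
    observed under i.i.d. Gaussian noise."""
    def kern(a, b):
        d = a[:, None] - b[None, :]
        return sf ** 2 * np.exp(-0.5 * (d / ell) ** 2)

    K = kern(x_train, x_train) + noise ** 2 * np.eye(x_train.size)
    Ks = kern(x_test, x_train)
    mean = Ks @ np.linalg.solve(K, y_train)
    # diag(Ks K^-1 Ks^T) gives the variance reduction at each test point.
    var = sf ** 2 - np.einsum("ij,ji->i", Ks, np.linalg.solve(K, Ks.T))
    return mean, var
```

Residuals fully explained by noise give a posterior mean hugging zero; a systematic deviation from the model pulls it away by several posterior standard deviations.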

  9. The application of non-parametric statistical method for an ALARA implementation

    International Nuclear Information System (INIS)

    Cho, Young Ho; Herr, Young Hoi

    2003-01-01

    The cost-effective reduction of Occupational Radiation Dose (ORD) at a nuclear power plant cannot be achieved without going through an extensive analysis of the accumulated ORD data of existing plants. Through the data analysis, it is required to identify which jobs at the nuclear power plant repeatedly incur high ORD. In this study, the Percentile Rank Sum Method (PRSM), based on non-parametric statistical theory, is proposed to identify repetitive high-ORD jobs. As a case study, the method is applied to ORD data from maintenance and repair jobs at Kori units 3 and 4, pressurized water reactors with 950 MWe capacity that have been operated in Korea since 1986 and 1987, respectively. The results were verified and validated, and PRSM was demonstrated to be an efficient method of analyzing the data.

  10. Non-parametric PSF estimation from celestial transit solar images using blind deconvolution

    Directory of Open Access Journals (Sweden)

    González Adriana

    2016-01-01

    Context: Characterization of instrumental effects in astronomical imaging is important in order to extract accurate physical information from the observations. The measured image in a real optical instrument is usually represented by the convolution of an ideal image with a Point Spread Function (PSF). Additionally, the image acquisition process is also contaminated by other sources of noise (read-out, photon-counting). The problem of estimating both the PSF and a denoised image is called blind deconvolution and is ill-posed. Aims: We propose a blind deconvolution scheme that relies on image regularization. Contrary to most methods presented in the literature, our method does not assume a parametric model of the PSF and can thus be applied to any telescope. Methods: Our scheme uses a wavelet analysis prior model on the image and weak assumptions on the PSF. We use observations from a celestial transit, where the occulting body can be assumed to be a black disk. These constraints allow us to retain meaningful solutions for the filter and the image, eliminating trivial, translated, and interchanged solutions. Under an additive Gaussian noise assumption, they also enforce noise canceling and avoid reconstruction artifacts by promoting the whiteness of the residual between the blurred observations and the cleaned data. Results: Our method is applied to synthetic and experimental data. The PSF is estimated for the SECCHI/EUVI instrument using the 2007 Lunar transit, and for SDO/AIA using the 2012 Venus transit. Results show that the proposed non-parametric blind deconvolution method is able to estimate the core of the PSF with a similar quality to parametric methods proposed in the literature. We also show that, if these parametric estimations are incorporated in the acquisition model, the resulting PSF outperforms both the parametric and non-parametric methods.

  11. Comparing and optimizing land use classification in a Himalayan area using parametric and non parametric approaches

    NARCIS (Netherlands)

    Sterk, G.; Sameer Saran,; Raju, P.L.N.; Amit, Bharti

    2007-01-01

    Supervised classification is one of important tasks in remote sensing image interpretation, in which the image pixels are classified to various predefined land use/land cover classes based on the spectral reflectance values in different bands. In reality some classes may have very close spectral

  12. Rational inefficiency and non-performing loans in Chinese banking: A non-parametric bootstrapping approach

    OpenAIRE

    Matthews, Kent; Guo, Jianguang; Zhang, Nina

    2007-01-01

    The existing Chinese banking system was born out of a state-planning framework focussed on the funding of state-owned enterprises. Despite the development of a modern banking system, numerous studies of Chinese banking point to its high level of average inefficiency. Much of this inefficiency relates to the high level of non-performing loans held on the banks books. This study argues that a significant component of inefficiency relates to a defunct bureaucratic incentive structure. Using boot...

  13. Further Empirical Results on Parametric Versus Non-Parametric IRT Modeling of Likert-Type Personality Data

    Science.gov (United States)

    Maydeu-Olivares, Albert

    2005-01-01

    Chernyshenko, Stark, Chan, Drasgow, and Williams (2001) investigated the fit of Samejima's logistic graded model and Levine's non-parametric MFS model to the scales of two personality questionnaires and found that the graded model did not fit well. We attribute the poor fit of the graded model to small amounts of multidimensionality present in…

  14. Non-parametric model selection for subject-specific topological organization of resting-state functional connectivity.

    Science.gov (United States)

    Ferrarini, Luca; Veer, Ilya M; van Lew, Baldur; Oei, Nicole Y L; van Buchem, Mark A; Reiber, Johan H C; Rombouts, Serge A R B; Milles, J

    2011-06-01

    In recent years, graph theory has been successfully applied to study functional and anatomical connectivity networks in the human brain. Most of these networks have shown small-world topological characteristics: high efficiency in long distance communication between nodes, combined with highly interconnected local clusters of nodes. Moreover, functional studies performed at high resolutions have presented convincing evidence that resting-state functional connectivity networks exhibit (exponentially truncated) scale-free behavior. Such evidence, however, was mostly presented qualitatively, in terms of linear regressions of the degree distributions on log-log plots. Even when quantitative measures were given, these were usually limited to the r² correlation coefficient. However, the r² statistic is not an optimal estimator of explained variance when dealing with (truncated) power-law models. Recent developments in statistics have introduced new non-parametric approaches, based on the Kolmogorov-Smirnov test, for the problem of model selection. In this work, we have built on this idea to statistically tackle the issue of model selection for the degree distribution of functional connectivity at rest. The analysis, performed at voxel level and in a subject-specific fashion, confirmed the superiority of a truncated power-law model, showing high consistency across subjects. Moreover, the most highly connected voxels were found to be consistently part of the default mode network. Our results provide statistically sound support for the evidence previously presented in the literature for a truncated power-law model of resting-state functional connectivity. Copyright © 2010 Elsevier Inc. All rights reserved.
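The Kolmogorov-Smirnov idea referenced here can be sketched for a continuous power law: estimate the exponent by maximum likelihood, then measure the distance between the empirical and fitted CDFs instead of relying on r². This is an illustrative recipe under simple assumptions (the xmin value and data are synthetic, not the study's degree distributions):

```python
import numpy as np

def fit_power_law_ks(x, xmin=1.0):
    """MLE exponent of a continuous power law p(x) ~ x^(-alpha), x >= xmin,
    and the Kolmogorov-Smirnov distance between empirical and fitted CDFs."""
    x = np.sort(np.asarray(x, dtype=float))
    x = x[x >= xmin]
    n = x.size
    alpha = 1.0 + n / np.sum(np.log(x / xmin))      # maximum likelihood
    cdf_fit = 1.0 - (x / xmin) ** (1.0 - alpha)     # model CDF at the data
    cdf_emp = np.arange(1, n + 1) / n               # empirical CDF
    return alpha, np.max(np.abs(cdf_emp - cdf_fit))

# Inverse-transform sample from a power law with true alpha = 2.5.
u = np.random.default_rng(0).random(5000)
sample = (1.0 - u) ** (-1.0 / 1.5)
alpha, ks = fit_power_law_ks(sample)
print(round(alpha, 2), round(ks, 3))  # alpha near 2.5, small KS distance
```

For a truncated power law one would add the exponential cutoff to the model CDF and compare KS distances (or likelihood ratios) between the candidate models.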

  15. Trend Analysis of Pahang River Using Non-Parametric Analysis: Mann-Kendall Trend Test

    International Nuclear Information System (INIS)

    Nur Hishaam Sulaiman; Mohd Khairul Amri Kamarudin; Mohd Khairul Amri Kamarudin; Ahmad Dasuki Mustafa; Muhammad Azizi Amran; Fazureen Azaman; Ismail Zainal Abidin; Norsyuhada Hairoma

    2015-01-01

    Flooding is common in Pahang, especially during the northeast monsoon season from November to February. Three river cross-stations, Lubuk Paku, Sg. Yap and Temerloh, were selected as the study area. Stream flow and water level data were gathered from DID records. The data set for this study was analysed using a non-parametric method, the Mann-Kendall Trend Test. The results obtained from the stream flow and water level analysis indicate positive significant trends for Lubuk Paku (p = 0.001) and Sg. Yap (p < 0.0001) from 1972-2011, with p-values < 0.05. Temerloh (p = 0.178) data from 1963-2011 showed no trend for the stream flow parameter but a negative trend for the water level parameter. Hydrological patterns and trends are strongly affected by external factors such as the northeast monsoon season, which develops over the South China Sea and affects Pahang from November to March. Other factors, such as development and management of the areas, may also have affected the data and results. Hydrological patterns are important indicators of river trends, such as stream flow and water level, and can be used for flood mitigation by local authorities. (author)
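A minimal implementation of the Mann-Kendall test used in the record is straightforward; the series below is synthetic (an upward trend plus noise, not the Pahang gauging data), and tie correction is omitted for brevity:

```python
import math
import numpy as np

def mann_kendall(x):
    """Mann-Kendall trend test (no tie correction): returns the S
    statistic and a two-sided p-value from the normal approximation."""
    x = np.asarray(x, dtype=float)
    n = x.size
    # S sums sign(x_j - x_i) over all pairs with i < j.
    i, j = np.triu_indices(n, k=1)
    s = np.sum(np.sign(x[j] - x[i]))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / math.sqrt(var_s)   # continuity correction
    p = math.erfc(abs(z) / math.sqrt(2.0))    # two-sided tail probability
    return s, p

rng = np.random.default_rng(1)
flow = 0.5 * np.arange(30) + rng.normal(0, 2, 30)  # rising trend + noise
s, p = mann_kendall(flow)
print(s > 0, p < 0.05)
```

A positive S with a small p-value corresponds to the "positive significant trend" reported for Lubuk Paku and Sg. Yap.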

  16. Non-parametric early seizure detection in an animal model of temporal lobe epilepsy

    Science.gov (United States)

    Talathi, Sachin S.; Hwang, Dong-Uk; Spano, Mark L.; Simonotto, Jennifer; Furman, Michael D.; Myers, Stephen M.; Winters, Jason T.; Ditto, William L.; Carney, Paul R.

    2008-03-01

    The performance of five non-parametric, univariate seizure detection schemes (embedding delay, Hurst scale, wavelet scale, nonlinear autocorrelation and variance energy) was evaluated as a function of the sampling rate of EEG recordings, the electrode types used for EEG acquisition, and the spatial location of the EEG electrodes, in order to determine the applicability of the measures in real-time closed-loop seizure intervention. The criteria chosen for evaluating the performance were high statistical robustness (as determined through the sensitivity and the specificity of a given measure in detecting a seizure) and the lag in seizure detection with respect to the seizure onset time (as determined by visual inspection of the EEG signal by a trained epileptologist). An optimality index was designed to evaluate the overall performance of each measure. For the EEG data recorded with a microwire electrode array at a sampling rate of 12 kHz, the wavelet scale measure exhibited better overall performance in terms of its ability to detect a seizure with a high optimality index value and high statistics in terms of sensitivity and specificity.

  17. Comparison between linear and non-parametric regression models for genome-enabled prediction in wheat.

    Science.gov (United States)

    Pérez-Rodríguez, Paulino; Gianola, Daniel; González-Camacho, Juan Manuel; Crossa, José; Manès, Yann; Dreisigacker, Susanne

    2012-12-01

    In genome-enabled prediction, parametric, semi-parametric, and non-parametric regression models have been used. This study assessed the predictive ability of linear and non-linear models using dense molecular markers. The linear models were linear on marker effects and included the Bayesian LASSO, Bayesian ridge regression, Bayes A, and Bayes B. The non-linear models (this refers to non-linearity on markers) were reproducing kernel Hilbert space (RKHS) regression, Bayesian regularized neural networks (BRNN), and radial basis function neural networks (RBFNN). These statistical models were compared using 306 elite wheat lines from CIMMYT genotyped with 1717 diversity array technology (DArT) markers and two traits, days to heading (DTH) and grain yield (GY), measured in each of 12 environments. It was found that the three non-linear models had better overall prediction accuracy than the linear regression specification. Results showed a consistent superiority of RKHS and RBFNN over the Bayesian LASSO, Bayesian ridge regression, Bayes A, and Bayes B models.
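The RKHS regression named above is, in its simplest form, kernel ridge regression with a Gaussian kernel. A small NumPy sketch on synthetic "marker" data follows; the bandwidth, penalty, and data are illustrative assumptions, not the CIMMYT wheat setup:

```python
import numpy as np

def rkhs_fit_predict(X_tr, y_tr, X_te, bandwidth=2.0, lam=0.1):
    """Kernel ridge regression with a Gaussian (RBF) kernel: solve the
    closed form alpha = (K + lam*I)^{-1} y, then predict with K* @ alpha."""
    def rbf(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * bandwidth ** 2))
    alpha = np.linalg.solve(rbf(X_tr, X_tr) + lam * np.eye(len(X_tr)), y_tr)
    return rbf(X_te, X_tr) @ alpha

rng = np.random.default_rng(3)
X = rng.normal(size=(400, 5))                       # 400 lines, 5 "markers"
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + rng.normal(0, 0.1, 400)
pred = rkhs_fit_predict(X[:300], y[:300], X[300:])
corr = np.corrcoef(pred, y[300:])[0, 1]
print(round(corr, 2))
```

Because the kernel acts on whole marker profiles rather than on individual marker effects, the model captures the kind of non-linearity (e.g., epistatic-like interactions) that the linear Bayesian specifications cannot.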

  18. Design Automation Using Script Languages. High-Level CAD Templates in Non-Parametric Programs

    Science.gov (United States)

    Moreno, R.; Bazán, A. M.

    2017-10-01

    The main purpose of this work is to study the advantages offered by applying traditional technical drawing techniques to design automation processes in non-parametric CAD programs equipped with scripting languages. Given that an example drawing can be solved with traditional step-by-step detailed procedures, it is possible to do the same with CAD applications and to generalize the solution later by incorporating references. Today's CAD applications show striking absences of solutions for building engineering: oblique projections (military and cavalier), 3D modelling of complex stairs, roofs, furniture, and so on. The use of geometric references (via variables in script languages) and their incorporation into high-level CAD templates allows the automation of processes. Instead of repeatedly creating similar designs or modifying their data, users should be able to use these templates to generate future variations of the same design. This paper presents the automation process of several complex drawing examples based on CAD script files aided by parametric geometry calculation tools. The proposed method allows us to solve complex geometry designs not currently covered by CAD applications and to subsequently create new derivatives without user intervention. Automation in the generation of complex designs not only saves time but also increases the quality of the presentations and reduces the possibility of human errors.

  19. Modeling the World Health Organization Disability Assessment Schedule II using non-parametric item response models.

    Science.gov (United States)

    Galindo-Garre, Francisca; Hidalgo, María Dolores; Guilera, Georgina; Pino, Oscar; Rojo, J Emilio; Gómez-Benito, Juana

    2015-03-01

    The World Health Organization Disability Assessment Schedule II (WHO-DAS II) is a multidimensional instrument developed for measuring disability. It comprises six domains (understanding and communicating, getting around, self-care, getting along with others, life activities and participation in society). The main purpose of this paper is the evaluation of the psychometric properties of each domain of the WHO-DAS II with parametric and non-parametric Item Response Theory (IRT) models. A secondary objective is to assess whether the WHO-DAS II items within each domain form a hierarchy of invariantly ordered severity indicators of disability. A sample of 352 patients with a schizophrenia spectrum disorder is used in this study. The 36-item WHO-DAS II was administered during the consultation. Partial Credit and Mokken scale models are used to study the psychometric properties of the questionnaire. The psychometric properties of the WHO-DAS II scale are satisfactory for all the domains. However, we identify a few items that do not discriminate satisfactorily between different levels of disability and cannot be invariantly ordered in the scale. In conclusion, the WHO-DAS II can be used to assess overall disability in patients with schizophrenia, but some domains are too general to assess functionality in these patients because they contain items that are not applicable to this pathology. Copyright © 2014 John Wiley & Sons, Ltd.

  20. Two non-parametric methods for derivation of constraints from radiotherapy dose–histogram data

    International Nuclear Information System (INIS)

    Ebert, M A; Kennedy, A; Joseph, D J; Gulliford, S L; Buettner, F; Foo, K; Haworth, A; Denham, J W

    2014-01-01

    Dose constraints based on histograms provide a convenient and widely-used method for informing and guiding radiotherapy treatment planning. Methods of derivation of such constraints are often poorly described. Two non-parametric methods for derivation of constraints are described and investigated in the context of determination of dose-specific cut-points—values of the free parameter (e.g., percentage volume of the irradiated organ) which best reflect resulting changes in complication incidence. A method based on receiver operating characteristic (ROC) analysis and one based on a maximally-selected standardized rank sum are described and compared using rectal toxicity data from a prostate radiotherapy trial. Multiple test corrections are applied using a free step-down resampling algorithm, which accounts for the large number of tests undertaken to search for optimal cut-points and the inherent correlation between dose–histogram points. Both methods provide consistent significant cut-point values, with the rank sum method displaying some sensitivity to the underlying data. The ROC method is simple to implement and can utilize a complication atlas, though an advantage of the rank sum method is the ability to incorporate all complication grades without the need for grade dichotomization. (note)
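As a sketch of the ROC-based approach: scan candidate cut-points of the free parameter and keep the one maximizing sensitivity + specificity − 1 (the Youden index). The record does not name its exact optimality criterion, the data below are simulated, and the multiple-testing correction it describes is omitted:

```python
import numpy as np

# Simulated dose-histogram parameter (e.g., % organ volume above some dose)
# for patients without and with a complication. Purely illustrative values.
rng = np.random.default_rng(2)
volume = np.concatenate([rng.normal(20, 5, 200),   # no complication
                         rng.normal(30, 5, 60)])   # complication
toxicity = np.concatenate([np.zeros(200, bool), np.ones(60, bool)])

# ROC-style scan: at each threshold t, classify "volume >= t" as predicting
# a complication, and track the Youden index J = sensitivity + specificity - 1.
best_t, best_j = None, -1.0
for t in np.unique(volume):
    pred = volume >= t
    sens = np.mean(pred[toxicity])
    spec = np.mean(~pred[~toxicity])
    if sens + spec - 1.0 > best_j:
        best_j, best_t = sens + spec - 1.0, t
print(round(float(best_t), 1))  # cut-point near the density crossover
```

In practice each candidate cut-point would be tested across many dose levels, which is what makes the free step-down resampling correction necessary.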

  1. Performances of non-parametric statistics in sensitivity analysis and parameter ranking

    International Nuclear Information System (INIS)

    Saltelli, A.

    1987-01-01

    Twelve parametric and non-parametric sensitivity analysis techniques are compared in the case of non-linear model responses. The test models used are taken from the long-term risk analysis for the disposal of high level radioactive waste in a geological formation. They describe the transport of radionuclides through a set of engineered and natural barriers from the repository to the biosphere and to man. The output data from these models are the dose rates affecting the maximum exposed individual of a critical group at a given point in time. All the techniques are applied to the output from the same Monte Carlo simulations, where a modified version of the Latin Hypercube method is used for the sample selection. Hypothesis testing is systematically applied to quantify the degree of confidence in the results given by the various sensitivity estimators. The estimators are ranked according to their robustness and stability, on the basis of two test cases. The conclusion is that no estimator can be considered the best from all points of view, and the use of more than one estimator in sensitivity analysis is recommended.

  2. Parametric and Non-Parametric Vibration-Based Structural Identification Under Earthquake Excitation

    Science.gov (United States)

    Pentaris, Fragkiskos P.; Fouskitakis, George N.

    2014-05-01

    The problem of modal identification in civil structures is of crucial importance, and thus has been receiving increasing attention in recent years. Vibration-based methods are quite promising as they are capable of identifying the structure's global characteristics, they are relatively easy to implement and they tend to be time effective and less expensive than most alternatives [1]. This paper focuses on the off-line structural/modal identification of civil (concrete) structures subjected to low-level earthquake excitations, under which, they remain within their linear operating regime. Earthquakes and their details are recorded and provided by the seismological network of Crete [2], which 'monitors' the broad region of south Hellenic arc, an active seismic region which functions as a natural laboratory for earthquake engineering of this kind. A sufficient number of seismic events are analyzed in order to reveal the modal characteristics of the structures under study, that consist of the two concrete buildings of the School of Applied Sciences, Technological Education Institute of Crete, located in Chania, Crete, Hellas. Both buildings are equipped with high-sensitivity and accuracy seismographs - providing acceleration measurements - established at the basement (structure's foundation) presently considered as the ground's acceleration (excitation) and at all levels (ground floor, 1st floor, 2nd floor and terrace). Further details regarding the instrumentation setup and data acquisition may be found in [3]. The present study invokes stochastic, both non-parametric (frequency-based) and parametric methods for structural/modal identification (natural frequencies and/or damping ratios). Non-parametric methods include Welch-based spectrum and Frequency response Function (FrF) estimation, while parametric methods, include AutoRegressive (AR), AutoRegressive with eXogeneous input (ARX) and Autoregressive Moving-Average with eXogeneous input (ARMAX) models[4, 5

  3. Non-Parametric Kinetic (NPK) Analysis of Thermal Oxidation of Carbon Aerogels

    Directory of Open Access Journals (Sweden)

    Azadeh Seifi

    2017-05-01

    In recent years, much attention has been paid to aerogel materials (especially carbon aerogels) due to their potential uses in energy-related applications, such as thermal energy storage and thermal protection systems. These open-cell carbon-based porous materials (carbon aerogels) can react strongly with oxygen at relatively low temperatures (~400°C). Therefore, it is necessary to evaluate the thermal performance of carbon aerogels in view of their energy-related applications at high temperatures and under thermal oxidation conditions. The objective of this paper is to study, theoretically and experimentally, the oxidation reaction kinetics of carbon aerogel using the powerful non-parametric kinetic (NPK) method. For this purpose, a non-isothermal thermogravimetric analysis, at three different heating rates, was performed on three samples, each with its specific pore structure, density and specific surface area. The most significant feature of this method, in comparison with the model-free isoconversional methods, is its ability to separate the functionality of the reaction rate with the degree of conversion and temperature by the direct use of thermogravimetric data. Using this method, it was observed that the Nomen-Sempere model provided the best fit to the data, while the temperature dependence of the rate constant was best explained by a Vogel-Fulcher relationship, where the reference temperature was the onset temperature of oxidation. Moreover, it was found that the assumption of an Arrhenius relation for the temperature dependence of the rate constant led to over-estimation of the apparent activation energy (up to 160 kJ/mol) that was considerably different from the values (up to 3.5 kJ/mol) predicted by the Vogel-Fulcher relationship in isoconversional methods.

  4. A non-parametric peak calling algorithm for DamID-Seq.

    Directory of Open Access Journals (Sweden)

    Renhua Li

    Protein-DNA interactions play a significant role in gene regulation and expression. In order to identify transcription factor binding sites (TFBS) of double sex (DSX), an important transcription factor in sex determination, we applied the DNA adenine methylation identification (DamID) technology to the fat body tissue of Drosophila, followed by deep sequencing (DamID-Seq). One feature of DamID-Seq data is that induced adenine methylation signals are not assured to be symmetrically distributed at TFBS, which renders the existing peak calling algorithms for ChIP-Seq, including SPP and MACS, inappropriate for DamID-Seq data. This challenged us to develop a new algorithm for peak calling. A challenge in peak calling based on sequence data is estimating the averaged behavior of background signals. We applied a bootstrap resampling method to short sequence reads in the control (Dam only). After data quality check and mapping reads to a reference genome, the peak calling procedure comprises the following steps: 1) reads resampling; 2) reads scaling (normalization) and computing signal-to-noise fold changes; 3) filtering; 4) calling peaks based on a statistically significant threshold. This is a non-parametric method for peak calling (NPPC). We also used irreproducible discovery rate (IDR) analysis, as well as ChIP-Seq data, to compare the peaks called by the NPPC. We identified approximately 6,000 peaks for DSX, which point to 1,225 genes related to the fat body tissue difference between female and male Drosophila. Statistical evidence from IDR analysis indicated that these peaks are reproducible across biological replicates. In addition, these peaks are comparable to those identified by use of ChIP-Seq on S2 cells, in terms of peak number, location, and peak width.

  5. A non-parametric peak calling algorithm for DamID-Seq.

    Science.gov (United States)

    Li, Renhua; Hempel, Leonie U; Jiang, Tingbo

    2015-01-01

    Protein-DNA interactions play a significant role in gene regulation and expression. In order to identify transcription factor binding sites (TFBS) of double sex (DSX), an important transcription factor in sex determination, we applied the DNA adenine methylation identification (DamID) technology to the fat body tissue of Drosophila, followed by deep sequencing (DamID-Seq). One feature of DamID-Seq data is that induced adenine methylation signals are not assured to be symmetrically distributed at TFBS, which renders the existing peak calling algorithms for ChIP-Seq, including SPP and MACS, inappropriate for DamID-Seq data. This challenged us to develop a new algorithm for peak calling. A challenge in peak calling based on sequence data is estimating the averaged behavior of background signals. We applied a bootstrap resampling method to short sequence reads in the control (Dam only). After data quality check and mapping reads to a reference genome, the peak calling procedure comprises the following steps: 1) reads resampling; 2) reads scaling (normalization) and computing signal-to-noise fold changes; 3) filtering; 4) calling peaks based on a statistically significant threshold. This is a non-parametric method for peak calling (NPPC). We also used irreproducible discovery rate (IDR) analysis, as well as ChIP-Seq data, to compare the peaks called by the NPPC. We identified approximately 6,000 peaks for DSX, which point to 1,225 genes related to the fat body tissue difference between female and male Drosophila. Statistical evidence from IDR analysis indicated that these peaks are reproducible across biological replicates. In addition, these peaks are comparable to those identified by use of ChIP-Seq on S2 cells, in terms of peak number, location, and peak width.

  6. A Non-Parametric Item Response Theory Evaluation of the CAGE Instrument Among Older Adults.

    Science.gov (United States)

    Abdin, Edimansyah; Sagayadevan, Vathsala; Vaingankar, Janhavi Ajit; Picco, Louisa; Chong, Siow Ann; Subramaniam, Mythily

    2018-02-23

    The validity of the CAGE using item response theory (IRT) has not yet been examined in the older adult population. This study aims to investigate the psychometric properties of the CAGE using both non-parametric and parametric IRT models, assess whether there is any differential item functioning (DIF) by age, gender and ethnicity, and examine the measurement precision at the cut-off scores. We used data from the Well-being of the Singapore Elderly study to conduct Mokken scaling analysis (MSA), dichotomous Rasch and 2-parameter logistic IRT models. The measurement precision at the cut-off scores was evaluated using classification accuracy (CA) and classification consistency (CC). The MSA showed that the overall scalability H index was 0.459, indicating a medium performing instrument. All items were found to be homogenous, measuring the same construct, and able to discriminate well between respondents with high levels of the construct and those with lower levels. The item discrimination ranged from 1.07 to 6.73, while the item difficulty ranged from 0.33 to 2.80. Significant DIF was found for two items across ethnic groups. More than 90% (CC and CA ranged from 92.5% to 94.3%) of the respondents were consistently and accurately classified by the CAGE cut-off scores of 2 and 3. The current study provides new evidence on the validity of the CAGE from the IRT perspective. This study provides valuable information on each item in the assessment of the overall severity of alcohol problems and the precision of the cut-off scores in the older adult population.

  7. Speeding Up Non-Parametric Bootstrap Computations for Statistics Based on Sample Moments in Small/Moderate Sample Size Applications.

    Directory of Open Access Journals (Sweden)

    Elias Chaibub Neto

    In this paper we propose a vectorized implementation of the non-parametric bootstrap for statistics based on sample moments. Basically, we adopt the multinomial sampling formulation of the non-parametric bootstrap, and compute bootstrap replications of sample moment statistics by simply weighting the observed data according to multinomial counts, instead of evaluating the statistic on a resampled version of the observed data. Using this formulation we can generate a matrix of bootstrap weights and compute the entire vector of bootstrap replications with a few matrix multiplications. Vectorization is particularly important for matrix-oriented programming languages such as R, where matrix/vector calculations tend to be faster than scalar operations implemented in a loop. We illustrate the application of the vectorized implementation in real and simulated data sets, when bootstrapping Pearson's sample correlation coefficient, and compare its performance against two state-of-the-art R implementations of the non-parametric bootstrap, as well as a straightforward one based on a for loop. Our investigations spanned varying sample sizes and numbers of bootstrap replications. The vectorized bootstrap compared favorably against the state-of-the-art implementations in all cases tested, and was considerably faster for small to moderate sample sizes. The same results were observed in the comparison with the straightforward implementation, except for large sample sizes, where the vectorized bootstrap was slightly slower than the straightforward implementation due to increased time spent generating weight matrices via multinomial sampling.
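The multinomial-weighting trick described in the abstract can be sketched in a few lines of NumPy, here bootstrapping the sample mean for simplicity rather than Pearson's correlation:

```python
import numpy as np

def vectorized_bootstrap_mean(x, n_boot, seed=0):
    """Vectorized non-parametric bootstrap of the sample mean: instead of
    resampling the data, draw multinomial counts, turn them into weights,
    and compute every replication with a single matrix-vector product."""
    rng = np.random.default_rng(seed)
    n = x.size
    counts = rng.multinomial(n, np.full(n, 1.0 / n), size=n_boot)  # (n_boot, n)
    weights = counts / n
    return weights @ x  # one bootstrap mean per row

x = np.random.default_rng(42).normal(10.0, 2.0, 100)
reps = vectorized_bootstrap_mean(x, n_boot=5000)
print(round(float(reps.mean()), 2), round(float(reps.std()), 2))
```

Higher-order sample moments work the same way: each is a weighted sum of a transformed data vector (x², x·y, ...), so the full set of replications is still a handful of matrix products.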

  8. Estimating technical efficiency in the hospital sector with panel data: a comparison of parametric and non-parametric techniques.

    Science.gov (United States)

    Siciliani, Luigi

    2006-01-01

    Policy makers are increasingly interested in developing performance indicators that measure hospital efficiency. These indicators may give the purchasers of health services an additional regulatory tool to contain health expenditure. Using panel data, this study compares different parametric (econometric) and non-parametric (linear programming) techniques for the measurement of a hospital's technical efficiency. This comparison was made using a sample of 17 Italian hospitals in the years 1996-9. Highest correlations are found in the efficiency scores between the non-parametric data envelopment analysis under the constant returns to scale assumption (DEA-CRS) and several parametric models. Correlation reduces markedly when using more flexible non-parametric specifications such as data envelopment analysis under the variable returns to scale assumption (DEA-VRS) and the free disposal hull (FDH) model. Correlation also generally reduces when moving from one output to two-output specifications. This analysis suggests that there is scope for developing performance indicators at hospital level using panel data, but it is important that extensive sensitivity analysis is carried out if purchasers wish to make use of these indicators in practice.

  9. Monitoring coastal marshes biomass with CASI: a comparison of parametric and non-parametric models

    Science.gov (United States)

    Mo, Y.; Kearney, M.

    2017-12-01

    Coastal marshes are important carbon sinks that face multiple natural and anthropogenic stresses. Optical remote sensing is a powerful tool for closely monitoring the biomass of coastal marshes. However, application of hyperspectral sensors to assessing the biomass of diverse coastal marsh ecosystems is limited. This study samples spectral and biophysical data from coastal freshwater, intermediate, brackish, and saline marshes in Louisiana, and develops parametric and non-parametric models for using the Compact Airborne Spectrographic Imager (CASI) to retrieve the marshes' biomass. Linear models and random forest models are developed from simulated CASI data (48 bands, 380-1050 nm, bandwidth 14 nm). Linear models are also developed using narrowband vegetation indices computed from all possible band combinations from the blue, red, and near infrared wavelengths. It is found that the linear models derived from the optimal narrowband vegetation indices provide strong predictions for the marshes' Leaf Area Index (LAI; R2 > 0.74 for ARVI), but not for their Aboveground Green Biomass (AGB; R2 > 0.25). The linear models derived from the simulated CASI data strongly predict the marshes' LAI (R2 = 0.93) and AGB (R2 = 0.71) and have 27 and 30 bands/variables in the final models through stepwise regression, respectively. The random forest models derived from the simulated CASI data also strongly predict the marshes' LAI and AGB (R2 = 0.91 and 0.84, respectively), where the most important variables for predicting LAI are near infrared bands at 784 and 756 nm and for predicting AGB are red bands at 684 and 670 nm. In sum, the random forest model is preferable for assessing coastal marsh biomass using CASI data as it offers high R2 for both LAI and AGB. The superior performance of the random forest model is likely due to the fact that it fully utilizes the full-spectrum data and makes no assumption of approximate normality of the sampling population. This study offers solutions

  10. Non-parametric order statistics method applied to uncertainty propagation in fuel rod calculations

    International Nuclear Information System (INIS)

    Arimescu, V.E.; Heins, L.

    2001-01-01

    Advances in modeling fuel rod behavior and the accumulation of adequate experimental data have made it possible to introduce quantitative methods for estimating the uncertainty of predictions made with best-estimate fuel rod codes. The uncertainty range of each input variable is characterized by a truncated distribution, typically normal, lognormal, or uniform. While the distribution for fabrication parameters is defined to cover the design or fabrication tolerances, the distribution of modeling parameters is inferred from the experimental database consisting of separate-effects tests and global tests. The final step of the methodology uses Monte Carlo random sampling of all relevant input variables and performs best-estimate code calculations to propagate these uncertainties, in order to evaluate the uncertainty range of outputs of interest for design analysis, such as internal rod pressure and fuel centerline temperature. The statistical method underlying this Monte Carlo sampling is non-parametric order statistics, which is perfectly suited to evaluating quantiles of populations with unknown distribution. The application of this method is straightforward in the case of a single fuel rod, when a 95/95 statement is applicable: 'with a probability of 95% and a confidence level of 95%, the values of the output of interest are below a certain value'. The 0.95-quantile is thus estimated for the distribution of all possible values of one fuel rod with a statistical confidence of 95%. A more elaborate procedure is required if all the fuel rods in the core are being analyzed. In this case, the aim is to evaluate the global statement: with a 95% confidence level, all but a few of the fuel rods in the core are expected not to exceed a certain value. In both cases, the thresholds determined by the analysis should be below the acceptable safety design limit. An indirect
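The single-rod 95/95 statement above follows Wilks' classic order-statistics result: if the maximum of n independent code runs is used as the bound, the smallest n satisfying 1 - p**n >= c covers the p-quantile with confidence c, with no assumption on the output distribution. A minimal sketch of that sample-size computation:

```python
# Hedged sketch: Wilks' one-sided, first-order tolerance-limit sample size.
# The smallest n with 1 - p**n >= c bounds the p-quantile at confidence c
# using the sample maximum, regardless of the output distribution.

def wilks_sample_size(p=0.95, c=0.95):
    n = 1
    while 1.0 - p ** n < c:
        n += 1
    return n

n_9595 = wilks_sample_size(0.95, 0.95)  # the classic 95/95 case
```

For the 95/95 case this gives the well-known n = 59 code runs; tightening the confidence to 99% raises it to 90.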

  11. A non-parametric conditional bivariate reference region with an application to height/weight measurements on normal girls

    DEFF Research Database (Denmark)

    Petersen, Jørgen Holm

    2009-01-01

    A conceptually simple two-dimensional conditional reference curve is described. The curve gives a decision basis for determining whether a bivariate response from an individual is "normal" or "abnormal" when taking into account that a third (conditioning) variable may influence the bivariate response. The reference curve is not only characterized analytically but also by geometric properties that are easily communicated to medical doctors - the users of such curves. The reference curve estimator is completely non-parametric, so no distributional assumptions are needed about the two-dimensional response. An example that will serve to motivate and illustrate the reference is the study of the height/weight distribution of 7-8-year-old Danish school girls born in 1930, 1950, or 1970.

  12. Technical Topic 3.2.2.d Bayesian and Non-Parametric Statistics: Integration of Neural Networks with Bayesian Networks for Data Fusion and Predictive Modeling

    Science.gov (United States)

    2016-05-31

    Final Report: Technical Topic 3.2.2.d Bayesian and Non-Parametric Statistics: Integration of Neural Networks with Bayesian Networks for Data Fusion and Predictive Modeling. Report date 31-05-2016; reporting period 15-Apr-2014 to 14-Jan-2015. Distribution unlimited.

  13. Non-parametric data predistortion for non-linear channels with memory

    OpenAIRE

    Piazza, Roberto; Shankar, Bhavani; Ottersten, Björn

    2013-01-01

    With the growing application of high-order modulation techniques, the mitigation of the non-linear distortions introduced by power amplification has become a major issue in telecommunications. More sophisticated techniques to counteract the strong interference generated need to be investigated in order to achieve the desired power and spectral efficiency. This work proposes a novel approach to the design of a transmitter technique (predistortion) that outperforms the standard method...

  14. Parametric and non-parametric models for lifespan modeling of insulation systems in electrical machines

    OpenAIRE

    Salameh , Farah; Picot , Antoine; Chabert , Marie; Maussion , Pascal

    2017-01-01

    This paper describes an original statistical approach to the lifespan modeling of electric machine insulation materials. The presented models aim to study the effect of three main stress factors (voltage, frequency and temperature) and their interactions on the insulation lifespan. The proposed methodology is applied to two different insulation materials tested in the partial discharge regime. Accelerated ageing tests are organized according to experimental optimization m...

  15. A new measure for gene expression biclustering based on non-parametric correlation.

    Science.gov (United States)

    Flores, Jose L; Inza, Iñaki; Larrañaga, Pedro; Calvo, Borja

    2013-12-01

    One of the emerging techniques for analyzing DNA microarray data, known as biclustering, is the search for subsets of genes and conditions that are coherently expressed. These subgroups provide clues about the main biological processes. Different approaches to this problem have been proposed. Most of them use the mean squared residue as a quality measure, but it cannot detect relevant and interesting patterns such as shifting or scaling patterns. Furthermore, recent papers show that new coherence patterns are involved in different kinds of cancer and tumors, such as inverse relationships between genes, which cannot be captured. The proposed measure, called Spearman's biclustering measure (SBM), estimates the quality of a bicluster based on the non-linear correlation among genes and conditions simultaneously. The search for biclusters is performed using an evolutionary technique called estimation of distribution algorithms, with SBM as the fitness function. This approach has been examined from different points of view using artificial and real microarrays. The assessment process involved quality indexes, a set of reference bicluster patterns including new patterns, and a set of statistical tests. Performance was also examined on real microarrays and compared to different algorithmic approaches such as Bimax, CC, OPSM, Plaid and xMotifs. SBM shows several advantages, such as the ability to recognize more complex coherence patterns (shifting, scaling and inversion) and the capability to selectively marginalize genes and conditions depending on their statistical significance. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
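The core idea of a Spearman-based quality measure can be sketched as the mean absolute rank correlation over all gene (row) pairs of a candidate bicluster. This is an illustrative simplification: the paper's SBM also scores conditions, and the aggregation below (plain mean of absolute values, ties ignored) is an assumption of this sketch.

```python
# Hedged sketch: a Spearman-style row-coherence score for a bicluster.
# Taking the absolute correlation lets inverted (negatively correlated)
# gene profiles count as coherent, as the abstract describes.

def ranks(xs):
    # Ranks for distinct values (ties are not handled in this sketch).
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0] * len(xs)
    for rank, i in enumerate(order):
        r[i] = rank + 1
    return r

def spearman(x, y):
    """Spearman's rho = Pearson correlation of the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) *
           sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

def bicluster_score(rows):
    """Mean |rho| over all row pairs; 1.0 means perfectly coherent."""
    pairs = [(i, j) for i in range(len(rows)) for j in range(i + 1, len(rows))]
    return sum(abs(spearman(rows[i], rows[j])) for i, j in pairs) / len(pairs)
```

A scaled row ([2, 4, 6]) and an inverted row ([3, 2, 1]) both score as coherent with [1, 2, 3] under this measure, which is exactly what the mean squared residue misses.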

  16. Convergence in energy consumption per capita across the US states, 1970–2013: An exploration through selected parametric and non-parametric methods

    International Nuclear Information System (INIS)

    Mohammadi, Hassan; Ram, Rati

    2017-01-01

    Noting the paucity of studies of convergence in energy consumption across the US states, and the usefulness of a study that shares the spirit of the enormous research on convergence in energy-related variables in cross-country contexts, this paper explores convergence in per-capita energy consumption across the US states over the 44-year period 1970–2013. Several well-known parametric and non-parametric approaches are explored partly to shed light on the substantive question and partly to provide a comparative methodological perspective on these approaches. Several statements summarize the outcome of our explorations. First, the widely-used Barro-type regressions do not indicate beta-convergence during the entire period or any of several sub-periods. Second, lack of sigma-convergence is also noted in terms of standard deviation of logarithms and coefficient of variation which do not show a decline between 1970 and 2013, but show slight upward trends. Third, kernel density function plots indicate some flattening of the distribution which is consistent with the results from sigma-convergence scenario. Fourth, intra-distribution mobility (“gamma convergence”) in terms of an index of rank concordance suggests a slow decline in the index. Fifth, the general impression from several types of panel and time-series unit-root tests is that of non-stationarity of the series and thus the lack of stochastic convergence during the period. Sixth, therefore, the overall impression seems to be that of the lack of convergence across states in per-capita energy consumption. The present interstate inequality in per-capita energy consumption may, therefore, reflect variations in structural factors and might not be expected to diminish.
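The sigma-convergence check described above reduces to tracking a cross-state dispersion statistic year by year. A minimal sketch with synthetic numbers (not the study's data), using the coefficient of variation:

```python
# Hedged sketch: sigma-convergence as a declining cross-sectional
# coefficient of variation. The panel below is synthetic.

def coeff_of_variation(values):
    n = len(values)
    mean = sum(values) / n
    sd = (sum((v - mean) ** 2 for v in values) / n) ** 0.5
    return sd / mean

# Per-capita energy consumption for four hypothetical "states".
panel = {
    1970: [200.0, 300.0, 400.0, 500.0],
    1990: [240.0, 310.0, 390.0, 460.0],
    2013: [280.0, 320.0, 380.0, 420.0],
}
cv_by_year = {yr: coeff_of_variation(v) for yr, v in sorted(panel.items())}
```

In this synthetic panel the CV falls over time (sigma-convergence); the paper's finding is the opposite, a flat-to-slightly-rising dispersion across the actual US states.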

  17. Non parametric denoising methods based on wavelets: Application to electron microscopy images in low exposure time

    International Nuclear Information System (INIS)

    Soumia, Sid Ahmed; Messali, Zoubeida; Ouahabi, Abdeldjalil; Trepout, Sylvain; Messaoudi, Cedric; Marco, Sergio

    2015-01-01

    The 3D reconstruction of Cryo-Transmission Electron Microscopy (Cryo-TEM) and Energy-Filtering TEM (EFTEM) images is hampered by the noisy nature of these images, which makes their alignment difficult. This noise arises from the interaction between the electron beam and the frozen hydrated biological samples when the specimen is exposed to radiation for a long exposure time. This sensitivity to the electron beam has led specialists to acquire the specimen projection images at very low exposure time, giving rise to a new problem: an extremely low signal-to-noise ratio (SNR). This paper investigates the problem of denoising TEM images acquired at very low exposure time. Our main objective is to enhance the quality of TEM images in order to improve the alignment process, which will in turn improve the three-dimensional tomographic reconstructions. We have performed multiple tests on TEM images acquired at different exposure times (0.5 s, 0.2 s, 0.1 s and 1 s, i.e., with different values of SNR), equipped with gold beads to assist in the assessment step. We propose a structure to combine multiple noisy copies of the TEM images. The structure is based on four different denoising methods for combining the multiple noisy TEM image copies: soft and hard wavelet thresholding; the bilateral filter, a non-linear technique able to preserve edges; and a Bayesian approach in the wavelet domain, in which context modeling is used to estimate the parameter for each coefficient. To ensure a high signal-to-noise ratio, we verified that we use the appropriate wavelet family at the appropriate level, and chose the "sym8" wavelet at level 3 as the most appropriate parameter. For the bilateral filtering, many tests were carried out to determine the proper filter parameters, namely the size of the filter, the range parameter and the
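The soft and hard thresholding rules at the heart of the wavelet methods above are one-liners once the detail coefficients are in hand. A minimal sketch of the two rules (the surrounding wavelet decomposition and reconstruction, e.g. with sym8 at level 3, is assumed):

```python
# Hedged sketch: soft vs. hard thresholding of wavelet detail coefficients.

def soft_threshold(c, t):
    """Kill coefficients with |c| <= t; shrink the rest toward zero by t."""
    if abs(c) <= t:
        return 0.0
    return (c - t) if c > 0 else (c + t)

def hard_threshold(c, t):
    """Keep coefficients with |c| > t unchanged; kill the rest."""
    return c if abs(c) > t else 0.0

coeffs = [3.0, -0.4, 1.2, -2.5, 0.1]
soft = [soft_threshold(c, 1.0) for c in coeffs]
hard = [hard_threshold(c, 1.0) for c in coeffs]
```

Soft thresholding yields smoother reconstructions (all surviving coefficients are shrunk), while hard thresholding better preserves large, sharp features; the paper compares both variants.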

  18. Non parametric forecasting of functional-valued processes: application to the electricity load

    International Nuclear Information System (INIS)

    Cugliari, J.

    2011-01-01

    This thesis addresses the problem of predicting a functional-valued stochastic process. We first explore the model proposed by Antoniadis et al. (2006) in the context of a practical application - the French electrical power demand - where the hypothesis of stationarity may fail. The departure from stationarity is twofold: an evolving mean level and the existence of groups that may be seen as classes of stationarity. We explore corrections that enhance the prediction performance by taking these nonstationary features into account. In particular, to handle the existence of groups, we constrain the model to use only the data belonging to the same group as the last available data. If the grouping is known, a simple post-treatment suffices to obtain better prediction performance. If the grouping is unknown, we infer it from the data using clustering analysis. The infinite dimension of the not necessarily stationary trajectories has to be taken into account by the clustering algorithm. We propose two strategies for this, both based on wavelet transforms. The first uses a feature-extraction approach through the Discrete Wavelet Transform, combined with a feature-selection algorithm to select the significant features to be used in a classical clustering algorithm. The second clusters the functions directly by means of a dissimilarity measure of the Continuous Wavelet spectra. The third part of the thesis is dedicated to exploring an alternative prediction model that incorporates exogenous information. For this purpose we use the framework of Autoregressive Hilbertian processes. We propose a new class of processes that we call Conditional Autoregressive Hilbertian (carh) and develop the equivalent of projection and resolvent classes of estimators to predict such processes. (author)

  19. Non-parametric characterization of long-term rainfall time series

    Science.gov (United States)

    Tiwari, Harinarayan; Pandey, Brij Kishor

    2018-03-01

    The statistical study of rainfall time series is one approach to efficient hydrological system design. Identifying and characterizing long-term rainfall time series can improve hydrological forecasting. In the present study, eventual statistics was applied to the long-term (1851-2006) rainfall time series of seven meteorological regions of India. Linear trend analysis was carried out using the Mann-Kendall test on the observed rainfall series. The trend observed with this approach was then ascertained using the innovative trend analysis method, which proved a strong tool for detecting the general trend of rainfall time series. The sequential Mann-Kendall test was also carried out to examine nonlinear trends in the series, and the partial sum of cumulative deviation test was likewise found suitable for detecting nonlinear trends. Innovative trend analysis, the sequential Mann-Kendall test and the partial cumulative deviation test all have the potential to detect both general and nonlinear trends in rainfall time series. Annual rainfall analysis suggests that the maximum rise in mean rainfall is 11.53% for West Peninsular India, whereas the maximum fall in mean rainfall is 7.8% for the North Mountainous India region. The innovative trend analysis method is also capable of finding the number of change points present in the time series. Additionally, we performed the von Neumann ratio test and the cumulative deviation test to estimate departures from homogeneity. Singular spectrum analysis was applied to evaluate the order of departure from homogeneity in the rainfall time series. The monsoon season (JS) of the North Mountainous India and West Peninsular India zones shows a higher departure from homogeneity, and the singular spectrum analysis results are in coherence with this finding.
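The Mann-Kendall test used above is easy to state concretely: the S statistic counts concordant minus discordant pairs, and a normal approximation gives a Z score. A minimal sketch (without the tie correction, which the full test includes):

```python
# Hedged sketch: Mann-Kendall trend test, no-ties variance formula.

def mann_kendall(xs):
    n = len(xs)
    # S = sum over all pairs (i < j) of sign(x_j - x_i).
    s = sum(
        (xs[j] > xs[i]) - (xs[j] < xs[i])
        for i in range(n - 1)
        for j in range(i + 1, n)
    )
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    # Continuity-corrected normal approximation.
    if s > 0:
        z = (s - 1) / var_s ** 0.5
    elif s < 0:
        z = (s + 1) / var_s ** 0.5
    else:
        z = 0.0
    return s, z  # |z| > 1.96 rejects "no trend" at the 5% level
```

A strictly increasing 10-point series gives S = 45 (every pair concordant) and Z ≈ 3.94, a highly significant upward trend; the sequential variant applied in the study simply evaluates this statistic on progressively longer prefixes of the series.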

  20. Non parametric, self organizing, scalable modeling of spatiotemporal inputs: the sign language paradigm.

    Science.gov (United States)

    Caridakis, G; Karpouzis, K; Drosopoulos, A; Kollias, S

    2012-12-01

    Modeling and recognizing spatiotemporal, as opposed to static, input is a challenging task since it incorporates input dynamics as part of the problem. The vast majority of existing methods tackle the problem as an extension of the static counterpart, using dynamics such as input derivatives at the feature level and adopting artificial intelligence and machine learning techniques originally designed for problems that do not specifically address the temporal aspect. The proposed approach deals with the temporal and spatial aspects of the spatiotemporal domain in a discriminative as well as coupled manner. Self-Organizing Maps (SOMs) model the spatial aspect of the problem, and Markov models capture its temporal counterpart. Incorporating adjacency, both in training and classification, endows the overall architecture with robustness and adaptability. The proposed scheme is validated both theoretically, through an error propagation study, and experimentally, on the recognition of individual signs performed by different native Greek Sign Language users. Results illustrate the architecture's superiority over Hidden Markov Model techniques and variations, both in terms of classification performance and computational cost. Copyright © 2012 Elsevier Ltd. All rights reserved.

  1. Zero- vs. one-dimensional, parametric vs. non-parametric, and confidence interval vs. hypothesis testing procedures in one-dimensional biomechanical trajectory analysis.

    Science.gov (United States)

    Pataky, Todd C; Vanrenterghem, Jos; Robinson, Mark A

    2015-05-01

    Biomechanical processes are often manifested as one-dimensional (1D) trajectories. It has been shown that 1D confidence intervals (CIs) are biased when based on 0D statistical procedures, and the non-parametric 1D bootstrap CI has emerged in the Biomechanics literature as a viable solution. The primary purpose of this paper was to clarify that, for 1D biomechanics datasets, the distinction between 0D and 1D methods is much more important than the distinction between parametric and non-parametric procedures. A secondary purpose was to demonstrate that a parametric equivalent to the 1D bootstrap exists in the form of a random field theory (RFT) correction for multiple comparisons. To emphasize these points we analyzed six datasets consisting of force and kinematic trajectories in one-sample, paired, two-sample and regression designs. Results showed, first, that the 1D bootstrap and other 1D non-parametric CIs were qualitatively identical to RFT CIs, and all were very different from 0D CIs. Second, 1D parametric and 1D non-parametric hypothesis testing results were qualitatively identical for all six datasets. Last, we highlight the limitations of 1D CIs by demonstrating that they are complex, design-dependent, and thus non-generalizable. These results suggest that (i) analyses of 1D data based on 0D models of randomness are generally biased unless one explicitly identifies 0D variables before the experiment, and (ii) parametric and non-parametric 1D hypothesis testing provide an unambiguous framework for analysis when one's hypothesis explicitly or implicitly pertains to whole 1D trajectories. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. A multi-instrument non-parametric reconstruction of the electron pressure profile in the galaxy cluster CLJ1226.9+3332

    Science.gov (United States)

    Romero, C.; McWilliam, M.; Macías-Pérez, J.-F.; Adam, R.; Ade, P.; André, P.; Aussel, H.; Beelen, A.; Benoît, A.; Bideaud, A.; Billot, N.; Bourrion, O.; Calvo, M.; Catalano, A.; Coiffard, G.; Comis, B.; de Petris, M.; Désert, F.-X.; Doyle, S.; Goupy, J.; Kramer, C.; Lagache, G.; Leclercq, S.; Lestrade, J.-F.; Mauskopf, P.; Mayet, F.; Monfardini, A.; Pascale, E.; Perotto, L.; Pisano, G.; Ponthieu, N.; Revéret, V.; Ritacco, A.; Roussel, H.; Ruppin, F.; Schuster, K.; Sievers, A.; Triqueneaux, S.; Tucker, C.; Zylka, R.

    2018-04-01

    Context. In the past decade, sensitive, resolved Sunyaev-Zel'dovich (SZ) studies of galaxy clusters have become common. Whereas many previous SZ studies have parameterized the pressure profiles of galaxy clusters, non-parametric reconstructions can provide insights into the thermodynamic state of the intracluster medium. Aims. We seek to recover the non-parametric pressure profile of the high-redshift (z = 0.89) galaxy cluster CLJ 1226.9+3332 as inferred from SZ data from the MUSTANG, NIKA, Bolocam, and Planck instruments, which all probe different angular scales. Methods. Our non-parametric algorithm makes use of logarithmic interpolation, which under the assumption of ellipsoidal symmetry is analytically integrable. For MUSTANG, NIKA, and Bolocam we derive a non-parametric pressure profile independently and find good agreement among the instruments. In particular, we find that the non-parametric profiles are consistent with a fitted generalized Navarro-Frenk-White (gNFW) profile. Given the ability of Planck to constrain the total signal, we include a prior on the integrated Compton Y parameter as determined by Planck. Results. For a given instrument, constraints on the pressure profile diminish rapidly beyond the field of view. The overlap in spatial scales probed by these four datasets is therefore critical in checking for consistency between instruments. By using multiple instruments, our analysis of CLJ 1226.9+3332 covers a large radial range, from the central regions (0.05 R500) to the cluster outskirts, and anticipates the next generation of SZ instruments such as NIKA2 and MUSTANG2.
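The gNFW profile against which the non-parametric reconstruction is cross-checked has a compact closed form. A minimal sketch, with illustrative default shape parameters (in the spirit of commonly used "universal" values, not the fitted values for CLJ 1226.9+3332):

```python
# Hedged sketch: generalized NFW (gNFW) pressure profile,
# P(r) = P0 / [(r/r_s)^gamma * (1 + (r/r_s)^alpha)^((beta - gamma)/alpha)].
# gamma and beta set the inner and outer logarithmic slopes; alpha sets
# the sharpness of the transition at the scale radius r_s.

def gnfw_pressure(r, p0=1.0, r_s=1.0, alpha=1.05, beta=5.49, gamma=0.31):
    x = r / r_s
    return p0 / (x ** gamma * (1.0 + x ** alpha) ** ((beta - gamma) / alpha))

# Evaluate on a coarse radial grid (units of r_s), as one would when
# comparing against non-parametric pressure bins.
radii = [0.05, 0.1, 0.5, 1.0, 2.0]
profile = [gnfw_pressure(r) for r in radii]
```

With alpha = 1, beta = 4, gamma = 0 the formula reduces to a simple check: P(r_s) = P0 / 2^4, which makes the slope parameters easy to sanity-test.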

  3. Non-parametric reconstruction of an inflaton potential from Einstein–Cartan–Sciama–Kibble gravity with particle production

    Directory of Open Access Journals (Sweden)

    Shantanu Desai

    2016-04-01

    The coupling between spin and torsion in the Einstein–Cartan–Sciama–Kibble theory of gravity generates gravitational repulsion at very high densities, which prevents a singularity in a black hole and may create there a new universe. We show that quantum particle production in such a universe near the last bounce, which represents the Big Bang, gives dynamics that solves the horizon, flatness, and homogeneity problems in cosmology. For a particular range of the particle production coefficient, we obtain a nearly constant Hubble parameter that gives an exponential expansion of the universe with more than 60 e-folds, lasting about ∼10^-42 s. This scenario can thus explain cosmic inflation without requiring a fundamental scalar field and reheating. From the obtained time dependence of the scale factor, we follow the prescription of Ellis and Madsen to reconstruct in a non-parametric way a scalar field potential which gives the same dynamics of the early universe. This potential gives the slow-roll parameters of cosmic inflation, from which we calculate the tensor-to-scalar ratio, the scalar spectral index of density perturbations, and its running as functions of the production coefficient. We find that these quantities do not significantly depend on the scale factor at the Big Bounce. Our predictions for these quantities are consistent with the Planck 2015 observations.

  4. A new powerful non-parametric two-stage approach for testing multiple phenotypes in family-based association studies

    NARCIS (Netherlands)

    Lange, C; Lyon, H; DeMeo, D; Raby, B; Silverman, EK; Weiss, ST

    2003-01-01

    We introduce a new powerful nonparametric testing strategy for family-based association studies in which multiple quantitative traits are recorded and the phenotype with the strongest genetic component is not known prior to the analysis. In the first stage, using a population-based test based on the

  5. Community structure in real-world networks from a non-parametrical synchronization-based dynamical approach

    International Nuclear Information System (INIS)

    Moujahid, Abdelmalik; D’Anjou, Alicia; Cases, Blanca

    2012-01-01

    Highlights: ► A synchronization-based algorithm for community structure detection is proposed. ► We model a complex network based on coupled nonidentical chaotic Rössler oscillators. ► The interaction scheme contemplates a uniformly increasing coupling force. ► The frequencies of the oscillators are adapted according to a parameterless mechanism. ► The adaptation mechanism reveals the community structure present in the network. - Abstract: This work analyzes the problem of community structure in real-world networks based on the synchronization of nonidentical coupled chaotic Rössler oscillators, each characterized by a defined natural frequency and coupled according to a predefined network topology. The interaction scheme contemplates a uniformly increasing coupling force to simulate a society in which the association between agents grows over time. To enhance the stability of the correlated states that could emerge from the synchronization process, we propose a parameterless mechanism that adapts the characteristic frequencies of the coupled oscillators according to a dynamic connectivity matrix deduced from correlated data. We show that the characteristic frequency vector resulting from the adaptation mechanism reveals the underlying community structure present in the network.

  6. Transit Timing Observations from Kepler: II. Confirmation of Two Multiplanet Systems via a Non-parametric Correlation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ford, Eric B.; /Florida U.; Fabrycky, Daniel C.; /Lick Observ.; Steffen, Jason H.; /Fermilab; Carter, Joshua A.; /Harvard-Smithsonian Ctr. Astrophys.; Fressin, Francois; /Harvard-Smithsonian Ctr. Astrophys.; Holman, Matthew J.; /Harvard-Smithsonian Ctr. Astrophys.; Lissauer, Jack J.; /NASA, Ames; Moorhead, Althea V.; /Florida U.; Morehead, Robert C.; /Florida U.; Ragozzine, Darin; /Harvard-Smithsonian Ctr. Astrophys.; Rowe, Jason F.; /NASA, Ames /SETI Inst., Mtn. View /San Diego State U., Astron. Dept.

    2012-01-01

    We present a new method for confirming transiting planets based on the combination of transit timing variations (TTVs) and dynamical stability. Correlated TTVs provide evidence that the pair of bodies are in the same physical system. Orbital stability provides upper limits for the masses of the transiting companions that are in the planetary regime. This paper describes a non-parametric technique for quantifying the statistical significance of TTVs based on the correlation of two TTV data sets. We apply this method to an analysis of the transit timing variations of two stars with multiple transiting planet candidates identified by Kepler. We confirm four transiting planets in two multiple planet systems based on their TTVs and the constraints imposed by dynamical stability. An additional three candidates in these same systems are not confirmed as planets, but are likely to be validated as real planets once further observations and analyses are possible. If all were confirmed, these systems would be near 4:6:9 and 2:4:6:9 period commensurabilities. Our results demonstrate that TTVs provide a powerful tool for confirming transiting planets, including low-mass planets and planets around faint stars for which Doppler follow-up is not practical with existing facilities. Continued Kepler observations will dramatically improve the constraints on the planet masses and orbits and provide sensitivity for detecting additional non-transiting planets. If Kepler observations were extended to eight years, then a similar analysis could likely confirm systems with multiple closely spaced, small transiting planets in or near the habitable zone of solar-type stars.
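The non-parametric significance test described above can be sketched as a permutation test: compute the correlation of the two TTV series, then compare it against the correlations obtained after randomly shuffling one series. The data, statistic (Pearson here, for brevity) and permutation count below are illustrative, not the paper's exact procedure.

```python
import random

# Hedged sketch: permutation test for the correlation of two TTV series.
# A small p-value indicates the correlated timing deviations are unlikely
# under the null hypothesis of independent (unassociated) series.

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) *
           sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def permutation_pvalue(x, y, n_perm=999, seed=42):
    rng = random.Random(seed)
    observed = abs(pearson(x, y))
    hits = 0
    ys = list(y)
    for _ in range(n_perm):
        rng.shuffle(ys)
        if abs(pearson(x, ys)) >= observed:
            hits += 1
    # Add-one correction keeps the p-value strictly positive.
    return (hits + 1) / (n_perm + 1)

# Synthetic anti-correlated TTVs, as expected for an interacting pair.
ttv_inner = [1.2, -0.8, 0.3, 1.9, -1.4, 0.6, -0.2, 1.1, -1.0, 0.4]
ttv_outer = [-t for t in ttv_inner]
p_value = permutation_pvalue(ttv_inner, ttv_outer)
```

In practice the confirmation argument combines a small p-value like this with dynamical-stability mass limits, as the abstract explains.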

  7. Energy-saving and emission-abatement potential of Chinese coal-fired power enterprise: A non-parametric analysis

    International Nuclear Information System (INIS)

    Wei, Chu; Löschel, Andreas; Liu, Bing

    2015-01-01

    In the context of soaring demand for electricity, mitigating and controlling greenhouse gas emissions is a great challenge for China's power sector. Increasing attention has been placed on the evaluation of energy efficiency and CO2 abatement potential in the power sector. However, studies at the micro-level are relatively rare due to serious data limitations. This study uses the 2004 and 2008 Census data of Zhejiang province to construct a non-parametric frontier in order to assess the abatement space of energy and associated CO2 emissions from China's coal-fired power enterprises. A Weighted Russell Directional Distance Function (WRDDF) is applied to construct an energy-saving potential index and a CO2 emission-abatement potential index. Both indicators depict the inefficiency level in energy utilization and CO2 emissions of electric power plants. Our results show substantial variation in energy-saving potential and CO2 abatement potential among enterprises. We find that large power enterprises are less efficient in 2004, but become more efficient than smaller enterprises in 2008. State-owned enterprises (SOEs) do not perform significantly differently in 2008 from 2004, but perform better than their non-SOE counterparts in 2008. This change in performance for large enterprises and SOEs might be driven by the "top-1000 Enterprise Energy Conservation Action" implemented in 2006. - Highlights: • Energy-saving potential and CO2 abatement potential for Chinese power enterprises are evaluated. • The potential to curb energy and emissions shows great variation and dynamic changes. • Large enterprises are less efficient than small enterprises in 2004, but more efficient in 2008. • State-owned enterprises perform better than non-state-owned enterprises in 2008

  8. TRANSIT TIMING OBSERVATIONS FROM KEPLER. II. CONFIRMATION OF TWO MULTIPLANET SYSTEMS VIA A NON-PARAMETRIC CORRELATION ANALYSIS

    International Nuclear Information System (INIS)

    Ford, Eric B.; Moorhead, Althea V.; Morehead, Robert C.; Fabrycky, Daniel C.; Steffen, Jason H.; Carter, Joshua A.; Fressin, Francois; Holman, Matthew J.; Ragozzine, Darin; Charbonneau, David; Lissauer, Jack J.; Rowe, Jason F.; Borucki, William J.; Bryson, Stephen T.; Burke, Christopher J.; Caldwell, Douglas A.; Welsh, William F.; Allen, Christopher; Batalha, Natalie M.; Buchhave, Lars A.

    2012-01-01

    We present a new method for confirming transiting planets based on the combination of transit timing variations (TTVs) and dynamical stability. Correlated TTVs provide evidence that the pair of bodies is in the same physical system. Orbital stability provides upper limits for the masses of the transiting companions that are in the planetary regime. This paper describes a non-parametric technique for quantifying the statistical significance of TTVs based on the correlation of two TTV data sets. We apply this method to an analysis of the TTVs of two stars with multiple transiting planet candidates identified by Kepler. We confirm four transiting planets in two multiple-planet systems based on their TTVs and the constraints imposed by dynamical stability. An additional three candidates in these same systems are not confirmed as planets, but are likely to be validated as real planets once further observations and analyses are possible. If all were confirmed, these systems would be near 4:6:9 and 2:4:6:9 period commensurabilities. Our results demonstrate that TTVs provide a powerful tool for confirming transiting planets, including low-mass planets and planets around faint stars for which Doppler follow-up is not practical with existing facilities. Continued Kepler observations will dramatically improve the constraints on the planet masses and orbits and provide sensitivity for detecting additional non-transiting planets. If Kepler observations were extended to eight years, then a similar analysis could likely confirm systems with multiple closely spaced, small transiting planets in or near the habitable zone of solar-type stars.

  9. Non-parametric trend analysis of the aridity index for three large arid and semi-arid basins in Iran

    Science.gov (United States)

    Ahani, Hossien; Kherad, Mehrzad; Kousari, Mohammad Reza; van Roosmalen, Lieke; Aryanfar, Ramin; Hosseini, Seyyed Mashaallah

    2013-05-01

    Currently, an important scientific challenge is to gain a better understanding of climate change at the regional scale, which can be especially difficult in an area with low and highly variable precipitation amounts such as Iran. Trend analysis of medium-term change using ground station observations of meteorological variables can enhance our knowledge of the dominant processes in an area and contribute to the analysis of future climate projections. Generally, studies focus on the long-term variability of temperature and precipitation and, to a lesser extent, on other important parameters such as moisture indices. In this study the recent 50-year trends (1955-2005) of precipitation (P), potential evapotranspiration (PET), and the aridity index (AI) were studied at the monthly time scale over 14 synoptic stations in three large Iranian basins using the Mann-Kendall non-parametric test. Additionally, an analysis of the monthly, seasonal and annual trend of each parameter was performed. Results showed no significant trends in the monthly time series. However, PET showed significant, mostly decreasing trends for the seasonal values, which resulted in a significant negative trend in annual PET at five stations. Significant negative trends in seasonal P values were only found at a number of stations in spring and summer, and no station showed significant negative trends in annual P. Due to the varied positive and negative trends in annual P and, to a lesser extent, PET, almost as many stations with negative as positive trends in annual AI were found, indicating that both drying and wetting trends occurred in Iran. Overall, the northern part of the study area showed an increasing trend in annual AI, which meant that the region became wetter, while the south showed decreasing trends in AI.
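
    The Mann-Kendall test used above is straightforward to implement. A minimal sketch in Python, without the tie correction that real station data may require; the test series is illustrative:

```python
import math

def mann_kendall(series):
    """Mann-Kendall trend test (no tie correction): returns the S
    statistic, the normal-approximation Z, and a two-sided p-value."""
    n = len(series)
    # S counts concordant minus discordant pairs over all i < j.
    s = sum((series[j] > series[i]) - (series[j] < series[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0   # variance of S without ties
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    p = math.erfc(abs(z) / math.sqrt(2.0))     # two-sided p-value
    return s, z, p

# A strictly increasing 20-point series: every pair is concordant,
# so S = 20*19/2 = 190 and the p-value is tiny.
s, z, p = mann_kendall(list(range(20)))
```

    For seasonal or annual analyses as in the study, the same test would simply be run on each sub-series separately.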

  10. TRANSIT TIMING OBSERVATIONS FROM KEPLER. II. CONFIRMATION OF TWO MULTIPLANET SYSTEMS VIA A NON-PARAMETRIC CORRELATION ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Ford, Eric B.; Moorhead, Althea V.; Morehead, Robert C. [Astronomy Department, University of Florida, 211 Bryant Space Sciences Center, Gainesville, FL 32611 (United States); Fabrycky, Daniel C. [UCO/Lick Observatory, University of California, Santa Cruz, CA 95064 (United States); Steffen, Jason H. [Fermilab Center for Particle Astrophysics, P.O. Box 500, MS 127, Batavia, IL 60510 (United States); Carter, Joshua A.; Fressin, Francois; Holman, Matthew J.; Ragozzine, Darin; Charbonneau, David [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Lissauer, Jack J.; Rowe, Jason F.; Borucki, William J.; Bryson, Stephen T.; Burke, Christopher J.; Caldwell, Douglas A. [NASA Ames Research Center, Moffett Field, CA 94035 (United States); Welsh, William F. [Astronomy Department, San Diego State University, San Diego, CA 92182-1221 (United States); Allen, Christopher [Orbital Sciences Corporation/NASA Ames Research Center, Moffett Field, CA 94035 (United States); Batalha, Natalie M. [Department of Physics and Astronomy, San Jose State University, San Jose, CA 95192 (United States); Buchhave, Lars A., E-mail: eford@astro.ufl.edu [Niels Bohr Institute, Copenhagen University, DK-2100 Copenhagen (Denmark); Collaboration: Kepler Science Team; and others

    2012-05-10

    We present a new method for confirming transiting planets based on the combination of transit timing variations (TTVs) and dynamical stability. Correlated TTVs provide evidence that the pair of bodies is in the same physical system. Orbital stability provides upper limits for the masses of the transiting companions that are in the planetary regime. This paper describes a non-parametric technique for quantifying the statistical significance of TTVs based on the correlation of two TTV data sets. We apply this method to an analysis of the TTVs of two stars with multiple transiting planet candidates identified by Kepler. We confirm four transiting planets in two multiple-planet systems based on their TTVs and the constraints imposed by dynamical stability. An additional three candidates in these same systems are not confirmed as planets, but are likely to be validated as real planets once further observations and analyses are possible. If all were confirmed, these systems would be near 4:6:9 and 2:4:6:9 period commensurabilities. Our results demonstrate that TTVs provide a powerful tool for confirming transiting planets, including low-mass planets and planets around faint stars for which Doppler follow-up is not practical with existing facilities. Continued Kepler observations will dramatically improve the constraints on the planet masses and orbits and provide sensitivity for detecting additional non-transiting planets. If Kepler observations were extended to eight years, then a similar analysis could likely confirm systems with multiple closely spaced, small transiting planets in or near the habitable zone of solar-type stars.
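
    The non-parametric significance assessment of correlated TTVs can be illustrated with a simple permutation test: shuffle one TTV series many times and ask how often a shuffled correlation reaches the observed one. This is a sketch of the general idea with invented TTV values, not the paper's algorithm in detail:

```python
import random

def pearson(a, b):
    """Plain Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / (va * vb) ** 0.5

def permutation_pvalue(ttv1, ttv2, n_perm=10000, seed=1):
    """Fraction of random shufflings whose |correlation| reaches the
    observed one -- a non-parametric significance estimate."""
    rng = random.Random(seed)
    observed = abs(pearson(ttv1, ttv2))
    shuffled = list(ttv2)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(shuffled)
        if abs(pearson(ttv1, shuffled)) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)

# Mock anti-correlated TTVs, as expected for a gravitationally
# interacting pair of planets (values are invented):
ttv_inner = [0.1, -0.3, 0.5, -0.2, 0.4, -0.5, 0.2, -0.1, 0.3, -0.4]
ttv_outer = [-x for x in ttv_inner]
p = permutation_pvalue(ttv_inner, ttv_outer)
```

    A small p then supports the claim that the two candidates orbit the same star, at which point the stability argument bounds their masses.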

  11. Proposing a framework for airline service quality evaluation using Type-2 Fuzzy TOPSIS and non-parametric analysis

    Directory of Open Access Journals (Sweden)

    Navid Haghighat

    2017-12-01

    Full Text Available This paper focuses on evaluating airline service quality from the perspective of passengers. Although a substantial body of research on airline service quality evaluation exists worldwide, little research has yet been conducted in Iran. In this study, a framework for measuring airline service quality in Iran is proposed. After reviewing airline service quality criteria, the SSQAI model was selected because of its comprehensiveness in covering airline service quality dimensions. SSQAI questionnaire items were redesigned to fit the requirements of Iranian airlines and the environmental circumstances of Iran's economic and cultural context. This study draws on fuzzy decision-making theory, accounting for the possibly fuzzy subjective judgments of the evaluators during airline service quality evaluation. Fuzzy TOPSIS has been applied for ranking airline service quality performance. Three major Iranian airlines with the largest passenger transfer volumes in domestic and foreign flights were chosen for evaluation in this research. Results demonstrated that, among the three major Iranian airlines, Mahan airline achieved the best service quality rank in gaining passengers' satisfaction through the delivery of high-quality services. IranAir and Aseman airlines placed second and third, respectively, according to the passengers' evaluation. Statistical analysis was used to analyze passenger responses. Due to the non-normality of the data, non-parametric tests were applied. To demonstrate airline ranks on each criterion separately, the Friedman test was performed. Variance analysis and the Tukey test were applied to study the influence of increases in passengers' age and educational level on their degree of satisfaction with airline service quality. Results showed that age has no significant relation to passenger satisfaction with airlines; however, an increasing educational level demonstrated a negative impact on
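
    For readers unfamiliar with the ranking step, classic (crisp) TOPSIS is easy to sketch; the interval Type-2 fuzzy variant used in the paper wraps the same ideal-distance logic around fuzzy ratings. The airline ratings and weights below are invented for illustration:

```python
def topsis(matrix, weights, benefit):
    """Rank alternatives with classic (crisp) TOPSIS.

    matrix  : rows = alternatives, columns = criteria
    weights : criterion weights summing to 1
    benefit : True where larger values are better for that criterion
    """
    n_crit = len(matrix[0])
    # Vector-normalise each column, then apply the weights.
    norms = [sum(row[j] ** 2 for row in matrix) ** 0.5 for j in range(n_crit)]
    v = [[weights[j] * row[j] / norms[j] for j in range(n_crit)]
         for row in matrix]
    # Ideal and anti-ideal points per criterion.
    ideal = [max(col) if benefit[j] else min(col)
             for j, col in enumerate(zip(*v))]
    worst = [min(col) if benefit[j] else max(col)
             for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_pos = sum((a - b) ** 2 for a, b in zip(row, ideal)) ** 0.5
        d_neg = sum((a - b) ** 2 for a, b in zip(row, worst)) ** 0.5
        scores.append(d_neg / (d_pos + d_neg))  # closeness coefficient
    return scores

# Hypothetical mean service-quality ratings for three airlines on
# three benefit criteria; a higher closeness coefficient ranks higher.
ratings = [[4.2, 3.9, 4.5], [3.8, 4.1, 3.7], [3.1, 3.0, 3.3]]
scores = topsis(ratings, [0.4, 0.3, 0.3], [True, True, True])
```

    The Type-2 fuzzy version replaces the crisp ratings with fuzzy membership functions but ranks by the same closeness-to-ideal principle.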

  12. Dependence between fusion temperatures and chemical components of a certain type of coal using classical, non-parametric and bootstrap techniques

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez-Manteiga, W.; Prada-Sanchez, J.M.; Fiestras-Janeiro, M.G.; Garcia-Jurado, I. (Universidad de Santiago de Compostela, Santiago de Compostela (Spain). Dept. de Estadistica e Investigacion Operativa)

    1990-11-01

    A statistical study of the dependence between various critical fusion temperatures of a certain kind of coal and its chemical components is carried out. As well as using classical dependence techniques (multiple, stepwise and PLS regression, principal components, canonical correlation, etc.) together with the corresponding inference on the parameters of interest, non-parametric regression and bootstrap inference are also performed. 11 refs., 3 figs., 8 tabs.
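
    As an illustration of how bootstrap inference can complement the classical regression analysis, a percentile bootstrap confidence interval for a fusion-temperature-on-composition slope might look like this (the coal data below are invented):

```python
import random

def ols_slope(x, y):
    """Ordinary least-squares slope of y on x."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    sxx = sum((a - mx) ** 2 for a in x)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return sxy / sxx

def bootstrap_slope_ci(x, y, n_boot=2000, alpha=0.05, seed=7):
    """Percentile bootstrap CI for the OLS slope: resample (x, y)
    pairs with replacement and collect the refitted slopes."""
    rng = random.Random(seed)
    n = len(x)
    slopes = []
    for _ in range(n_boot):
        sample = [rng.randrange(n) for _ in range(n)]
        xs = [x[i] for i in sample]
        if len(set(xs)) < 2:          # degenerate resample: slope undefined
            continue
        slopes.append(ols_slope(xs, [y[i] for i in sample]))
    slopes.sort()
    m = len(slopes)
    return slopes[int(m * alpha / 2)], slopes[int(m * (1 - alpha / 2)) - 1]

# Hypothetical ash-component content (%) vs. fusion temperature (deg C):
ash  = [8.1, 9.4, 10.2, 11.5, 12.3, 13.0, 14.2, 15.1]
temp = [1180, 1210, 1235, 1260, 1275, 1300, 1330, 1355]
lo, hi = bootstrap_slope_ci(ash, temp)
```

    Unlike the classical t-based interval, the percentile interval makes no normality assumption about the residuals, which is the motivation for bootstrap inference alongside the parametric fits.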

  13. Mokken scale analysis of mental health and well-being questionnaire item responses: a non-parametric IRT method in empirical research for applied health researchers

    Directory of Open Access Journals (Sweden)

    Stochl Jan

    2012-06-01

    Full Text Available Abstract Background Mokken scaling techniques are a useful tool for researchers who wish to construct unidimensional tests or use questionnaires that comprise multiple binary or polytomous items. The stochastic cumulative scaling model offered by this approach is ideally suited when the intention is to score an underlying latent trait by simple addition of the item response values. In our experience, the Mokken model appears to be less well-known than, for example, the (related) Rasch model, but is seeing increasing use in contemporary clinical research and public health. Mokken's method is a generalisation of Guttman scaling that can assist in the determination of the dimensionality of tests or scales, and enables consideration of reliability, without reliance on Cronbach's alpha. This paper provides a practical guide to the application and interpretation of this non-parametric item response theory method in empirical research with health and well-being questionnaires. Methods Scalability of data from 1) a cross-sectional health survey (the Scottish Health Education Population Survey) and 2) a general population birth cohort study (the National Child Development Study) illustrate the method and modeling steps for dichotomous and polytomous items respectively. The questionnaire data analyzed comprise responses to the 12-item General Health Questionnaire, under the binary recoding recommended for screening applications, and the ordinal/polytomous responses to the Warwick-Edinburgh Mental Well-being Scale. Results and conclusions After an initial analysis example in which we select items by phrasing (six positively versus six negatively worded items) we show that all items from the 12-item General Health Questionnaire (GHQ-12) – when binary scored – were scalable according to the double monotonicity model, in two short scales comprising six items each (Bech's "well-being" and "distress" clinical scales). An illustration of ordinal item analysis

  14. Mokken scale analysis of mental health and well-being questionnaire item responses: a non-parametric IRT method in empirical research for applied health researchers.

    Science.gov (United States)

    Stochl, Jan; Jones, Peter B; Croudace, Tim J

    2012-06-11

    Mokken scaling techniques are a useful tool for researchers who wish to construct unidimensional tests or use questionnaires that comprise multiple binary or polytomous items. The stochastic cumulative scaling model offered by this approach is ideally suited when the intention is to score an underlying latent trait by simple addition of the item response values. In our experience, the Mokken model appears to be less well-known than for example the (related) Rasch model, but is seeing increasing use in contemporary clinical research and public health. Mokken's method is a generalisation of Guttman scaling that can assist in the determination of the dimensionality of tests or scales, and enables consideration of reliability, without reliance on Cronbach's alpha. This paper provides a practical guide to the application and interpretation of this non-parametric item response theory method in empirical research with health and well-being questionnaires. Scalability of data from 1) a cross-sectional health survey (the Scottish Health Education Population Survey) and 2) a general population birth cohort study (the National Child Development Study) illustrate the method and modeling steps for dichotomous and polytomous items respectively. The questionnaire data analyzed comprise responses to the 12 item General Health Questionnaire, under the binary recoding recommended for screening applications, and the ordinal/polytomous responses to the Warwick-Edinburgh Mental Well-being Scale. After an initial analysis example in which we select items by phrasing (six positive versus six negatively worded items) we show that all items from the 12-item General Health Questionnaire (GHQ-12)--when binary scored--were scalable according to the double monotonicity model, in two short scales comprising six items each (Bech's "well-being" and "distress" clinical scales). An illustration of ordinal item analysis confirmed that all 14 positively worded items of the Warwick-Edinburgh Mental
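
    The scalability criterion at the heart of Mokken analysis is Loevinger's H, which compares observed inter-item covariances with the maxima attainable given the item popularities. A minimal sketch for binary items (the rule-of-thumb threshold follows standard Mokken practice; the toy data are invented):

```python
def loevinger_H(data):
    """Overall Loevinger H for binary items (rows = persons, columns =
    items): sum of observed inter-item covariances divided by the sum
    of the maximum covariances attainable given the item popularities.
    H >= 0.3 is the conventional minimum for a Mokken scale."""
    n, k = len(data), len(data[0])
    p = [sum(row[j] for row in data) / n for j in range(k)]
    cov_sum = cov_max_sum = 0.0
    for i in range(k - 1):
        for j in range(i + 1, k):
            p_both = sum(row[i] * row[j] for row in data) / n
            cov_sum += p_both - p[i] * p[j]
            # For binary items, cov is maximal when the rarer item
            # implies the commoner one: min(p_i, p_j) - p_i * p_j.
            cov_max_sum += min(p[i], p[j]) - p[i] * p[j]
    return cov_sum / cov_max_sum

# A perfect Guttman pattern (each person endorses a prefix of the
# items ordered by difficulty) has no scaling errors, so H = 1:
guttman = [[0, 0, 0], [1, 0, 0], [1, 1, 0], [1, 1, 1]]
H = loevinger_H(guttman)
```

    Item-level H_i and pairwise H_ij coefficients are the same ratios restricted to one item or one pair, which is how noisy items are flagged for removal.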

  15. On Rigorous Drought Assessment Using Daily Time Scale: Non-Stationary Frequency Analyses, Revisited Concepts, and a New Method to Yield Non-Parametric Indices

    Directory of Open Access Journals (Sweden)

    Charles Onyutha

    2017-10-01

    Full Text Available Some of the problems in drought assessment are that: analyses tend to focus on coarse temporal scales, many of the methods yield skewed indices, a few terminologies are used ambiguously, and analyses carry an implicit assumption that the observations come from a stationary process. To solve these problems, this paper introduces non-stationary frequency analyses of quantiles. How to use non-parametric rescaling to obtain robust indices that are not (or only minimally) skewed is also introduced. To avoid ambiguity, some concepts on, e.g., incidence, extremity, etc., were revisited through a shift from the monthly to the daily time scale. The introduced methods were demonstrated using daily flow and precipitation insufficiency (precipitation minus potential evapotranspiration) from the Blue Nile basin in Africa. Results show that, when a significant trend exists in extreme events, stationarity-based quantiles can be far different from those when non-stationarity is considered. The introduced non-parametric indices were found to closely agree with the well-known standardized precipitation evapotranspiration indices in many aspects but skewness. Apart from revisiting some concepts, the advantages of using fine instead of coarse time scales in drought assessment are given. Links to freely downloadable tools for implementing the introduced methods are provided.
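
    One common way to obtain a non-parametric, minimally skewed index, in the spirit of the rescaling introduced above though not necessarily the author's exact procedure, is to map each observation through its empirical probability to a standard-normal quantile:

```python
from statistics import NormalDist

def nonparametric_index(values):
    """Map each value to a standard-normal quantile via its empirical
    (Weibull plotting-position) probability -- a rescaling that keeps
    the resulting index symmetric regardless of the input's skew."""
    n = len(values)
    order = sorted(range(n), key=lambda i: values[i])
    nd = NormalDist()
    index = [0.0] * n
    for rank, i in enumerate(order, start=1):
        index[i] = nd.inv_cdf(rank / (n + 1))  # Weibull position r/(n+1)
    return index

# A strongly right-skewed mock daily precipitation-deficit series;
# the resulting index is symmetric about zero by construction:
deficit = [0.1, 0.2, 0.3, 0.5, 0.9, 1.6, 3.0, 7.5, 20.0]
z = nonparametric_index(deficit)
```

    Because the mapping uses ranks only, no distribution (gamma, log-logistic, etc.) needs to be fitted, which is what makes the index robust to skew.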

  16. The relationship between multilevel models and non-parametric multilevel mixture models: Discrete approximation of intraclass correlation, random coefficient distributions, and residual heteroscedasticity.

    Science.gov (United States)

    Rights, Jason D; Sterba, Sonya K

    2016-11-01

    Multilevel data structures are common in the social sciences. Often, such nested data are analysed with multilevel models (MLMs) in which heterogeneity between clusters is modelled by continuously distributed random intercepts and/or slopes. Alternatively, the non-parametric multilevel regression mixture model (NPMM) can accommodate the same nested data structures through discrete latent class variation. The purpose of this article is to delineate analytic relationships between NPMM and MLM parameters that are useful for understanding the indirect interpretation of the NPMM as a non-parametric approximation of the MLM, with relaxed distributional assumptions. We define how seven standard and non-standard MLM specifications can be indirectly approximated by particular NPMM specifications. We provide formulas showing how the NPMM can serve as an approximation of the MLM in terms of intraclass correlation, random coefficient means and (co)variances, heteroscedasticity of residuals at level 1, and heteroscedasticity of residuals at level 2. Further, we discuss how these relationships can be useful in practice. The specific relationships are illustrated with simulated graphical demonstrations, and direct and indirect interpretations of NPMM classes are contrasted. We provide an R function to aid in implementing and visualizing an indirect interpretation of NPMM classes. An empirical example is presented and future directions are discussed. © 2016 The British Psychological Society.
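
    As a flavor of the level-2 approximation, the intraclass correlation implied by a discrete latent-class representation of a random intercept follows directly from the class weights and means. This sketch is a plausible reading of that relationship with invented classes, not the article's formulas verbatim:

```python
def npmm_icc(weights, class_means, resid_var):
    """ICC implied by a non-parametric (discrete latent class)
    approximation of a random-intercept model: between-cluster
    variance of the class mixture over total variance."""
    grand = sum(w * m for w, m in zip(weights, class_means))
    between = sum(w * (m - grand) ** 2
                  for w, m in zip(weights, class_means))
    return between / (between + resid_var)

# Three hypothetical intercept classes standing in for a continuous
# random-intercept distribution, with level-1 residual variance 1.5:
icc = npmm_icc([0.25, 0.5, 0.25], [-1.0, 0.0, 1.0], resid_var=1.5)
```

    Adding more classes refines the discrete approximation of the continuous random-effect distribution, which is the sense in which the NPMM relaxes the MLM's normality assumption.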

  17. Search of significant features in a direct non parametric pattern recognition method. Application to the classification of a multiwire spark chamber picture

    International Nuclear Information System (INIS)

    Buccheri, R.; Coffaro, P.; Di Gesu, V.; Salemi, S.; Colomba, G.

    1975-01-01

    Preliminary results are given of the application of a direct non-parametric pattern recognition method to the classification of pictures from a multiwire spark chamber. The method, developed in an earlier work for an optical spark chamber, looks promising. The picture sample used has, with respect to the previous one, the following characteristics: a) the event pictures have a more complicated structure; b) the amount of background sparks in an event is greater; c) there exists a kind of noise which is almost always present in some structured way (double sparking, bursts...). New features have been used to characterize the event pictures; the results show that the method could also be used as a super filter to reduce the cost of further analysis. (Auth.)

  18. Parametric and non-parametric masking of randomness in sequence alignments can be improved and leads to better resolved trees

    Directory of Open Access Journals (Sweden)

    von Reumont Björn M

    2010-03-01

    Full Text Available Abstract Background Methods of alignment masking, which refers to the technique of excluding alignment blocks prior to tree reconstruction, have been successful in improving the signal-to-noise ratio in sequence alignments. However, the lack of formally well-defined methods to identify randomness in sequence alignments has prevented a routine application of alignment masking. In this study, we compared the effects on tree reconstructions of the most commonly used profiling method (GBLOCKS), which uses a predefined set of rules in combination with alignment masking, with a new profiling approach (ALISCORE) based on Monte Carlo resampling within a sliding window, using different data sets and alignment methods. While the GBLOCKS approach excludes variable sections above a certain threshold, the choice of which is left arbitrary, the ALISCORE algorithm is free of a priori rating of parameter space and therefore more objective. Results ALISCORE was successfully extended to amino acids using a proportional model and empirical substitution matrices to score randomness in multiple sequence alignments. A complex bootstrap resampling leads to an even distribution of scores of randomly similar sequences to assess randomness of the observed sequence similarity. Testing performance on real data, both masking methods, GBLOCKS and ALISCORE, helped to improve tree resolution. The sliding window approach was less sensitive to different alignments of identical data sets and performed equally well on all data sets. Concurrently, ALISCORE is capable of dealing with different substitution patterns and heterogeneous base composition. ALISCORE and the most relaxed GBLOCKS gap parameter setting performed best on all data sets. Correspondingly, Neighbor-Net analyses showed the greatest decrease in conflict. Conclusions Alignment masking improves the signal-to-noise ratio in multiple sequence alignments prior to phylogenetic reconstruction. Given the robust performance of alignment
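
    A highly simplified sliding-window masking sketch conveys the idea, scoring columns by Shannon entropy rather than by GBLOCKS rules or ALISCORE's Monte Carlo resampling; the alignment and threshold are invented:

```python
import math

def mask_noisy_windows(alignment, window=3, max_entropy=1.0):
    """Toy alignment masking: score each column by the Shannon entropy
    of its residues and drop every sliding window whose mean entropy
    exceeds a threshold. Illustrative only -- real profilers such as
    GBLOCKS and ALISCORE use far more elaborate criteria."""
    n_col = len(alignment[0])

    def col_entropy(j):
        column = [seq[j] for seq in alignment]
        probs = [column.count(c) / len(column) for c in set(column)]
        return -sum(p * math.log2(p) for p in probs)

    entropy = [col_entropy(j) for j in range(n_col)]
    keep = [True] * n_col
    for start in range(n_col - window + 1):
        if sum(entropy[start:start + window]) / window > max_entropy:
            for j in range(start, start + window):
                keep[j] = False
    return [''.join(seq[j] for j in range(n_col) if keep[j])
            for seq in alignment]

# Four conserved columns, three random-looking ones, one conserved
# column trapped inside a noisy window (so it is masked too):
aln = ["AAAACGTA", "AAAAGTCA", "AAAATCGA", "AAAACGAA"]
masked = mask_noisy_windows(aln)
```

    The window-based decision is what gives such methods their robustness to alignment ambiguity: a column is judged in the context of its neighbors, not in isolation.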

  19. HOMOGENEOUS UGRIZ PHOTOMETRY FOR ACS VIRGO CLUSTER SURVEY GALAXIES: A NON-PARAMETRIC ANALYSIS FROM SDSS IMAGING

    International Nuclear Information System (INIS)

    Chen, Chin-Wei; Cote, Patrick; Ferrarese, Laura; West, Andrew A.; Peng, Eric W.

    2010-01-01

    We present photometric and structural parameters for 100 ACS Virgo Cluster Survey (ACSVCS) galaxies based on homogeneous, multi-wavelength (ugriz), wide-field SDSS (DR5) imaging. These early-type galaxies, which trace out the red sequence in the Virgo Cluster, span a factor of nearly ∼10³ in g-band luminosity. We describe an automated pipeline that generates background-subtracted mosaic images, masks field sources and measures mean shapes, total magnitudes, effective radii, and effective surface brightnesses using a model-independent approach. A parametric analysis of the surface brightness profiles is also carried out to obtain Sersic-based structural parameters and mean galaxy colors. We compare the galaxy parameters to those in the literature, including those from the ACSVCS, finding good agreement in most cases, although the sizes of the brightest, and most extended, galaxies are found to be most uncertain and model dependent. Our photometry provides an external measurement of the random errors on total magnitudes from the widely used Virgo Cluster Catalog, which we estimate to be σ(B_T) ∼ 0.13 mag for the brightest galaxies, rising to ∼0.3 mag for galaxies at the faint end of our sample (B_T ∼ 16). The distribution of axial ratios of low-mass ("dwarf") galaxies bears a strong resemblance to the one observed for the higher-mass ("giant") galaxies. The global structural parameters for the full galaxy sample (profile shape, effective radius, and mean surface brightness) are found to vary smoothly and systematically as a function of luminosity, with unmistakable evidence for changes in structural homology along the red sequence. As noted in previous studies, the ugriz galaxy colors show a nonlinear but smooth variation over a ∼7 mag range in absolute magnitude, with an enhanced scatter for the faintest systems that is likely the signature of their more diverse star formation histories.

  20. Use of NON-PARAMETRIC Item Response Theory to develop a shortened version of the Positive and Negative Syndrome Scale (PANSS)

    Science.gov (United States)

    2011-01-01

    Background Nonparametric item response theory (IRT) was used to examine (a) the performance of the 30 Positive and Negative Syndrome Scale (PANSS) items and their options (levels of severity), (b) the effectiveness of various subscales to discriminate among differences in symptom severity, and (c) the development of an abbreviated PANSS (Mini-PANSS) based on IRT and a method to link scores to the original PANSS. Methods Baseline PANSS scores from 7,187 patients with schizophrenia or schizoaffective disorder who were enrolled between 1995 and 2005 in psychopharmacology trials were obtained. Option characteristic curves (OCCs) and item characteristic curves (ICCs) were constructed to examine the probability of rating each of seven options within each of 30 PANSS items as a function of subscale severity, and summed-score linking was applied to items selected for the Mini-PANSS. Results The majority of items forming the Positive and Negative subscales (i.e. 19 items) performed very well and discriminated better along symptom severity compared to the General Psychopathology subscale. Six of the seven Positive Symptom items, six of the seven Negative Symptom items, and seven of the 16 General Psychopathology items were retained for inclusion in the Mini-PANSS. Summed-score linking and linear interpolation were able to produce a translation table for comparing total subscale scores of the Mini-PANSS to total subscale scores on the original PANSS. Results show scores on the subscales of the Mini-PANSS can be linked to scores on the original PANSS subscales, with very little bias. Conclusions The study demonstrated the utility of non-parametric IRT in examining the item properties of the PANSS and in allowing selection of items for an abbreviated PANSS scale. The comparisons between the 30-item PANSS and the Mini-PANSS revealed that the shorter version is comparable to the 30-item PANSS, but when applying IRT, the Mini-PANSS is also a good indicator of illness severity.

  1. Use of non-parametric item response theory to develop a shortened version of the Positive and Negative Syndrome Scale (PANSS).

    Science.gov (United States)

    Khan, Anzalee; Lewis, Charles; Lindenmayer, Jean-Pierre

    2011-11-16

    Nonparametric item response theory (IRT) was used to examine (a) the performance of the 30 Positive and Negative Syndrome Scale (PANSS) items and their options (levels of severity), (b) the effectiveness of various subscales to discriminate among differences in symptom severity, and (c) the development of an abbreviated PANSS (Mini-PANSS) based on IRT and a method to link scores to the original PANSS. Baseline PANSS scores from 7,187 patients with schizophrenia or schizoaffective disorder who were enrolled between 1995 and 2005 in psychopharmacology trials were obtained. Option characteristic curves (OCCs) and item characteristic curves (ICCs) were constructed to examine the probability of rating each of seven options within each of 30 PANSS items as a function of subscale severity, and summed-score linking was applied to items selected for the Mini-PANSS. The majority of items forming the Positive and Negative subscales (i.e. 19 items) performed very well and discriminated better along symptom severity compared to the General Psychopathology subscale. Six of the seven Positive Symptom items, six of the seven Negative Symptom items, and seven of the 16 General Psychopathology items were retained for inclusion in the Mini-PANSS. Summed-score linking and linear interpolation were able to produce a translation table for comparing total subscale scores of the Mini-PANSS to total subscale scores on the original PANSS. Results show scores on the subscales of the Mini-PANSS can be linked to scores on the original PANSS subscales, with very little bias. The study demonstrated the utility of non-parametric IRT in examining the item properties of the PANSS and in allowing selection of items for an abbreviated PANSS scale. The comparisons between the 30-item PANSS and the Mini-PANSS revealed that the shorter version is comparable to the 30-item PANSS, but when applying IRT, the Mini-PANSS is also a good indicator of illness severity.
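
    The non-parametric backbone of such curves is simple: for each item, plot the observed proportion endorsing it against the rest score (the total over the remaining items). Monotone non-decreasing curves are the kind of evidence these analyses look for. A sketch with an invented binary response matrix:

```python
from collections import defaultdict

def rest_score_curve(data, item):
    """Non-parametric item characteristic sketch: for each rest score
    (total over the other items), the observed proportion endorsing
    the target item."""
    hits, counts = defaultdict(int), defaultdict(int)
    for row in data:
        rest = sum(v for j, v in enumerate(row) if j != item)
        counts[rest] += 1
        hits[rest] += row[item]
    return {r: hits[r] / counts[r] for r in sorted(counts)}

# Mock binary responses (rows = respondents, columns = items):
responses = [
    [0, 0, 0, 0], [0, 1, 0, 0], [1, 1, 0, 0],
    [1, 1, 1, 0], [1, 1, 1, 1], [1, 1, 1, 1],
]
curve = rest_score_curve(responses, item=0)
```

    For polytomous PANSS items the same construction is applied per response option, yielding the option characteristic curves described above; in practice the curves are kernel-smoothed rather than binned.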

  2. Estimation from PET data of transient changes in dopamine concentration induced by alcohol: support for a non-parametric signal estimation method

    Energy Technology Data Exchange (ETDEWEB)

    Constantinescu, C C; Yoder, K K; Normandin, M D; Morris, E D [Department of Radiology, Indiana University School of Medicine, Indianapolis, IN (United States); Kareken, D A [Department of Neurology, Indiana University School of Medicine, Indianapolis, IN (United States); Bouman, C A [Weldon School of Biomedical Engineering, Purdue University, West Lafayette, IN (United States); O' Connor, S J [Department of Psychiatry, Indiana University School of Medicine, Indianapolis, IN (United States)], E-mail: emorris@iupui.edu

    2008-03-07

    We previously developed a model-independent technique (non-parametric ntPET) for extracting the transient changes in neurotransmitter concentration from paired (rest and activation) PET studies with a receptor ligand. To provide support for our method, we introduced three hypotheses of validation based on work by Endres and Carson (1998 J. Cereb. Blood Flow Metab. 18 1196-210) and Yoder et al (2004 J. Nucl. Med. 45 903-11), and tested them on experimental data. All three hypotheses describe relationships between the estimated free (synaptic) dopamine curves (F{sup DA}(t)) and the change in binding potential ({delta}BP). The veracity of the F{sup DA}(t) curves recovered by nonparametric ntPET is supported when the data adhere to the following hypothesized behaviors: (1) {delta}BP should decline with increasing DA peak time, (2) {delta}BP should increase as the strength of the temporal correlation between F{sup DA}(t) and the free raclopride (F{sup RAC}(t)) curve increases, (3) {delta}BP should decline linearly with the effective weighted availability of the receptor sites. We analyzed regional brain data from 8 healthy subjects who received two [{sup 11}C]raclopride scans: one at rest, and one during which unanticipated IV alcohol was administered to stimulate dopamine release. For several striatal regions, nonparametric ntPET was applied to recover F{sup DA}(t), and binding potential values were determined. Kendall rank-correlation analysis confirmed that the F{sup DA}(t) data followed the expected trends for all three validation hypotheses. Our findings lend credence to our model-independent estimates of F{sup DA}(t). Application of nonparametric ntPET may yield important insights into how alterations in timing of dopaminergic neurotransmission are involved in the pathologies of addiction and other psychiatric disorders.

  3. International comparisons of the technical efficiency of the hospital sector: panel data analysis of OECD countries using parametric and non-parametric approaches.

    Science.gov (United States)

    Varabyova, Yauheniya; Schreyögg, Jonas

    2013-09-01

    There is a growing interest in the cross-country comparisons of the performance of national health care systems. The present work provides a comparison of the technical efficiency of the hospital sector using unbalanced panel data from OECD countries over the period 2000-2009. The estimation of the technical efficiency of the hospital sector is performed using nonparametric data envelopment analysis (DEA) and parametric stochastic frontier analysis (SFA). Internal and external validity of findings is assessed by estimating the Spearman rank correlations between the results obtained in different model specifications. The panel-data analyses using two-step DEA and one-stage SFA show that countries, which have higher health care expenditure per capita, tend to have a more technically efficient hospital sector. Whether the expenditure is financed through private or public sources is not related to the technical efficiency of the hospital sector. On the other hand, the hospital sector in countries with higher income inequality and longer average hospital length of stay is less technically efficient. Copyright © 2013 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
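
    The Spearman rank correlations used to check internal and external validity are simple to compute; a sketch with invented DEA and SFA efficiency scores (no-ties case):

```python
def spearman(a, b):
    """Spearman rank correlation for samples without ties, via the
    classic formula 1 - 6 * sum(d^2) / (n * (n^2 - 1))."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r
    ra, rb = ranks(a), ranks(b)
    n = len(a)
    d2 = sum((x - y) ** 2 for x, y in zip(ra, rb))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Hypothetical efficiency scores for six hospital sectors under the
# two frontier methods; a high rho means the rankings agree even if
# the absolute efficiency levels differ.
dea = [0.92, 0.81, 0.77, 0.88, 0.69, 0.95]
sfa = [0.89, 0.78, 0.80, 0.85, 0.66, 0.97]
rho = spearman(dea, sfa)
```

    Rank-based agreement is the natural check here because DEA and SFA scores are not on a common scale.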

  4. Validation and psychometric properties of the Somatic and Psychological HEalth REport (SPHERE) in a young Australian-based population sample using non-parametric item response theory.

    Science.gov (United States)

    Couvy-Duchesne, Baptiste; Davenport, Tracey A; Martin, Nicholas G; Wright, Margaret J; Hickie, Ian B

    2017-08-01

    The Somatic and Psychological HEalth REport (SPHERE) is a 34-item self-report questionnaire that assesses symptoms of mental distress and persistent fatigue. As it was developed as a screening instrument for use mainly in primary care-based clinical settings, its validity and psychometric properties have not been studied extensively in population-based samples. We used non-parametric Item Response Theory to assess scale validity and item properties of the SPHERE-34 scales, collected through four waves of the Brisbane Longitudinal Twin Study (N = 1707, mean age = 12, 51% females; N = 1273, mean age = 14, 50% females; N = 1513, mean age = 16, 54% females; N = 1263, mean age = 18, 56% females). We estimated the heritability of the new scores, their genetic correlation, and their predictive ability in a sub-sample (N = 1993) who completed the Composite International Diagnostic Interview. After excluding the items most responsible for noise, sex or wave bias, the SPHERE-34 questionnaire was reduced to 21 items (SPHERE-21), comprising a 14-item scale for anxiety-depression and a 10-item scale for chronic fatigue (3 items overlapping). These new scores showed high internal consistency (alpha > 0.78), moderate three-month reliability (ICC = 0.47-0.58) and item scalability (Hi > 0.23), and were positively correlated (phenotypic correlations r = 0.57-0.70; rG = 0.77-1.00). Heritability estimates ranged from 0.27 to 0.51. In addition, both scores were associated with later DSM-IV diagnoses of MDD, social anxiety and alcohol dependence (ORs in 1.23-1.47). Finally, a post-hoc comparison showed that several psychometric properties of the SPHERE-21 were similar to those of the Beck Depression Inventory. The scales of SPHERE-21 measure valid and comparable constructs across sex and age groups (from 9 to 28 years). SPHERE-21 scores are heritable, genetically correlated and show good predictive ability of mental health in an Australian-based population
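
    For reference, the internal-consistency statistic quoted above (Cronbach's alpha) is a short computation over an item-score matrix; the respondent data here are invented:

```python
def cronbach_alpha(items):
    """Cronbach's alpha from an item-score matrix (rows = persons,
    columns = items): k/(k-1) * (1 - sum of item variances / variance
    of the total score)."""
    k = len(items[0])

    def var(v):                       # population variance
        m = sum(v) / len(v)
        return sum((x - m) ** 2 for x in v) / len(v)

    item_vars = sum(var([row[j] for row in items]) for j in range(k))
    total_var = var([sum(row) for row in items])
    return k / (k - 1) * (1 - item_vars / total_var)

# Mock 4-item symptom scores for five respondents; the items co-vary
# strongly, so alpha is high:
scores = [[1, 2, 1, 2], [3, 3, 2, 3], [4, 4, 4, 3],
          [2, 2, 2, 2], [5, 4, 5, 5]]
alpha = cronbach_alpha(scores)
```

    The scalability coefficients (Hi) reported alongside alpha come from the Mokken framework and are rank-based, which is why the two statistics give complementary evidence.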

  5. Quantitative Phylogenomics of Within-Species Mitogenome Variation: Monte Carlo and Non-Parametric Analysis of Phylogeographic Structure among Discrete Transatlantic Breeding Areas of Harp Seals (Pagophilus groenlandicus).

    Directory of Open Access Journals (Sweden)

    Steven M Carr

    -stepping-stone biogeographic models, but not a simple 1-step trans-Atlantic model. Plots of the cumulative pairwise sequence difference curves among seals in each of the four populations provide continuous proxies for phylogenetic diversification within each. Non-parametric Kolmogorov-Smirnov (K-S) tests of maximum pairwise differences between these curves indicate that the Greenland Sea population has a markedly younger phylogenetic structure than either the White Sea population or the two Northwest Atlantic populations, which are of intermediate age and homogeneous structure. The Monte Carlo and K-S assessments provide sensitive quantitative tests of within-species mitogenomic phylogeography. This is the first study to indicate that the White Sea and Greenland Sea populations have different population genetic histories. The analysis supports the hypothesis that Harp Seals comprise three genetically distinguishable breeding populations, in the White Sea, Greenland Sea, and Northwest Atlantic. Implications for an ice-dependent species during ongoing climate change are discussed.
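The two-sample K-S comparison of cumulative difference curves can be sketched as follows. This is an illustrative implementation, not the study's code, and the sample values are hypothetical pairwise mitogenome difference counts:

```python
# Minimal sketch of a two-sample Kolmogorov-Smirnov statistic: the maximum
# vertical distance between two empirical CDFs. Sample values are hypothetical.
import bisect

def ks_statistic(sample_a, sample_b):
    """Maximum vertical distance between the two empirical CDFs."""
    a, b = sorted(sample_a), sorted(sample_b)

    def ecdf(sorted_vals, x):
        # fraction of values <= x
        return bisect.bisect_right(sorted_vals, x) / len(sorted_vals)

    return max(abs(ecdf(a, x) - ecdf(b, x)) for x in sorted(set(a) | set(b)))

# Hypothetical pairwise difference counts for two breeding populations
greenland_sea = [2, 3, 3, 4, 5]
white_sea = [5, 6, 7, 8, 9]
print(ks_statistic(greenland_sea, white_sea))  # a large D suggests different age structure
```

A large maximum difference D between the curves is what the record interprets as a younger (steeper, left-shifted) phylogenetic structure in one population.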

  6. The Stellar Initial Mass Function in Early-type Galaxies from Absorption Line Spectroscopy. IV. A Super-Salpeter IMF in the Center of NGC 1407 from Non-parametric Models

    Energy Technology Data Exchange (ETDEWEB)

    Conroy, Charlie [Department of Astronomy, Harvard University, Cambridge, MA, 02138 (United States); Van Dokkum, Pieter G. [Department of Astronomy, Yale University, New Haven, CT, 06511 (United States); Villaume, Alexa [Department of Astronomy and Astrophysics, University of California, Santa Cruz, CA 95064 (United States)

    2017-03-10

    It is now well-established that the stellar initial mass function (IMF) can be determined from the absorption line spectra of old stellar systems, and this has been used to measure the IMF and its variation across the early-type galaxy population. Previous work focused on measuring the slope of the IMF over one or more stellar mass intervals, implicitly assuming that this is a good description of the IMF and that the IMF has a universal low-mass cutoff. In this work we consider more flexible IMFs, including two-component power laws with a variable low-mass cutoff and a general non-parametric model. We demonstrate with mock spectra that the detailed shape of the IMF can be accurately recovered as long as the data quality is high (S/N ≳ 300 Å{sup −1}) and cover a wide wavelength range (0.4–1.0 μ m). We apply these flexible IMF models to a high S/N spectrum of the center of the massive elliptical galaxy NGC 1407. Fitting the spectrum with non-parametric IMFs, we find that the IMF in the center shows a continuous rise extending toward the hydrogen-burning limit, with a behavior that is well-approximated by a power law with an index of −2.7. These results provide strong evidence for the existence of extreme (super-Salpeter) IMFs in the cores of massive galaxies.

  7. Comparação de duas metodologias de amostragem atmosférica com ferramenta estatística não paramétrica Comparison of two atmospheric sampling methodologies with non-parametric statistical tools

    Directory of Open Access Journals (Sweden)

    Maria João Nunes

    2005-03-01

    Full Text Available In atmospheric aerosol sampling, it is inevitable that the air that carries particles is in motion, as a result of both externally driven wind and the sucking action of the sampler itself. High or low air flow sampling speeds may lead to significant particle size bias. The objective of this work is the validation of measurements enabling the comparison of species concentration from both air flow sampling techniques. The presence of several outliers and increase of residuals with concentration becomes obvious, requiring non-parametric methods, recommended for the handling of data which may not be normally distributed. This way, conversion factors are obtained for each of the various species under study using Kendall regression.
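Kendall-type rank regression of the kind used here to derive inter-sampler conversion factors is commonly implemented as the Theil-Sen estimator (the median of pairwise slopes), which is robust to the outliers the record mentions. A minimal sketch with hypothetical paired concentrations, not the study's data:

```python
# Theil-Sen estimator: slope = median of all pairwise slopes, intercept =
# median residual. Robust to outliers, unlike ordinary least squares.
from statistics import median

def theil_sen(x, y):
    slopes = [(y[j] - y[i]) / (x[j] - x[i])
              for i in range(len(x)) for j in range(i + 1, len(x))
              if x[j] != x[i]]
    slope = median(slopes)
    intercept = median(yi - slope * xi for xi, yi in zip(x, y))
    return slope, intercept

# Hypothetical species concentrations from the two sampling techniques;
# the last low-flow value is a deliberate outlier.
high_flow = [1.0, 2.0, 3.0, 4.0, 5.0]
low_flow = [2.1, 4.0, 6.1, 8.0, 30.0]
slope, intercept = theil_sen(high_flow, low_flow)
print(slope)  # conversion factor, barely affected by the outlier
```

The recovered slope stays near 2 despite the outlier, which is why rank-based regression suits data that are not normally distributed.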

  8. Primal and dual approaches to fishing capacity

    DEFF Research Database (Denmark)

    Kerstens, Kristiaan; Vestergaard, Niels

    2013-01-01

    Application of primal non-parametric approaches to estimation of fishing capacity provides useful disaggregated information about fishing firm’s capacities utilizations. A potentially serious issue is that the estimated capacity utilization rates can be relatively low. This may call for reductions...... in the fishing fleet that are political impossible to defend. In this paper two modifications of the traditional approach are explored. First, non-convex technologies are introduced and it is shown how the primal non-parametric approach leads to different capacity utilization rates. Then capacity utilization...... measures using cost functions are specified for both convex and non-convex technologies. It is illustrated how the convexity assumption impacts capacity utilization rates and how this dual approach differs from the primal approach. Second, the effect of utilizing these different convex versus non...

  9. Benchmark of the non-parametric Bayesian deconvolution method implemented in the SINBAD code for X/γ rays spectra processing

    Energy Technology Data Exchange (ETDEWEB)

    Rohée, E. [CEA, LIST, Laboratoire Capteurs et Architectures Electroniques, F-91191 Gif-sur-Yvette (France); Coulon, R., E-mail: romain.coulon@cea.fr [CEA, LIST, Laboratoire Capteurs et Architectures Electroniques, F-91191 Gif-sur-Yvette (France); Carrel, F. [CEA, LIST, Laboratoire Capteurs et Architectures Electroniques, F-91191 Gif-sur-Yvette (France); Dautremer, T.; Barat, E.; Montagu, T. [CEA, LIST, Laboratoire de Modélisation et Simulation des Systèmes, F-91191 Gif-sur-Yvette (France); Normand, S. [CEA, DAM, Le Ponant, DPN/STXN, F-75015 Paris (France); Jammes, C. [CEA, DEN, Cadarache, DER/SPEx/LDCI, F-13108 Saint-Paul-lez-Durance (France)

    2016-11-11

    Radionuclide identification and quantification are a serious concern for many applications, such as in situ monitoring at nuclear facilities, laboratory analysis, special nuclear materials detection, environmental monitoring, and waste measurements. High resolution gamma-ray spectrometry based on high purity germanium diode detectors is the best solution available for isotopic identification. Over the last decades, methods have been developed to improve gamma spectra analysis. However, some difficulties remain in the analysis when full energy peaks are folded together with a high ratio between their amplitudes, and when the Compton background is much larger than the signal of a single peak. In this context, this study deals with the comparison between a conventional analysis based on the “iterative peak fitting deconvolution” method and a “nonparametric Bayesian deconvolution” approach developed by the CEA LIST and implemented in the SINBAD code. The iterative peak fit deconvolution is used in this study as a reference method, largely validated by industrial standards, to unfold complex spectra from HPGe detectors. Complex cases of spectra are studied from IAEA benchmark protocol tests and with measured spectra. The SINBAD code shows promising deconvolution capabilities compared to the conventional method without any expert parameter fine tuning.

  10. Short-term monitoring of benzene air concentration in an urban area: a preliminary study of application of Kruskal-Wallis non-parametric test to assess pollutant impact on global environment and indoor.

    Science.gov (United States)

    Mura, Maria Chiara; De Felice, Marco; Morlino, Roberta; Fuselli, Sergio

    2010-01-01

    In step with the need to develop statistical procedures to manage small-size environmental samples, in this work we have used concentration values of benzene (C6H6), concurrently detected by seven outdoor and indoor monitoring stations over 12 000 minutes, in order to assess the representativeness of collected data and the impact of the pollutant on the indoor environment. Clearly, the former issue is strictly connected to sampling-site geometry, which proves critical to correctly retrieving information from analysis of pollutants of sanitary interest. Therefore, according to current criteria for network-planning, single stations have been interpreted as nodes of a set of adjoining triangles; then, a) node pairs have been taken into account in order to estimate pollutant stationarity on triangle sides, as well as b) node triplets, to statistically associate data from air-monitoring with the corresponding territory area, and c) node sextuplets, to assess the impact probability of the outdoor pollutant on the indoor environment for each area. Distributions from the various node combinations are all non-Gaussian; consequently, Kruskal-Wallis (KW) non-parametric statistics have been exploited to test variability of the continuous density function from each pair, triplet and sextuplet. Results from the above-mentioned statistical analysis have shown randomness of site selection, which has not allowed a reliable generalization of monitoring data to the entire selected territory, except for a single "forced" case (70%); most importantly, they suggest a possible procedure to optimize network design.
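The KW test applied to node combinations can be sketched as follows. The concentrations are hypothetical, and this minimal H statistic omits the tie correction (it assumes distinct sample values):

```python
# Kruskal-Wallis H statistic from pooled ranks (no tie correction; values
# are assumed distinct). Hypothetical benzene concentrations at three nodes.
def kruskal_wallis_h(*groups):
    pooled = sorted(v for g in groups for v in g)
    rank = {v: i + 1 for i, v in enumerate(pooled)}  # ranks 1..N
    n = len(pooled)
    return 12.0 / (n * (n + 1)) * sum(
        sum(rank[v] for v in g) ** 2 / len(g) for g in groups) - 3 * (n + 1)

node_a = [1.1, 1.3, 1.8]  # hypothetical ug/m3 at one triangle node
node_b = [2.2, 2.5, 2.9]
node_c = [3.4, 3.7, 4.1]
print(kruskal_wallis_h(node_a, node_b, node_c))
```

A large H (compared against a chi-squared distribution with k-1 degrees of freedom) rejects the hypothesis that the node distributions are homogeneous, which is the stationarity question the record poses for each triangle side.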

  11. Short-term monitoring of benzene air concentration in an urban area: a preliminary study of application of Kruskal-Wallis non-parametric test to assess pollutant impact on global environment and indoor

    Directory of Open Access Journals (Sweden)

    Maria Chiara Mura

    2010-12-01

    Full Text Available In step with the need to develop statistical procedures to manage small-size environmental samples, in this work we have used concentration values of benzene (C6H6), concurrently detected by seven outdoor and indoor monitoring stations over 12 000 minutes, in order to assess the representativeness of collected data and the impact of the pollutant on the indoor environment. Clearly, the former issue is strictly connected to sampling-site geometry, which proves critical to correctly retrieving information from analysis of pollutants of sanitary interest. Therefore, according to current criteria for network-planning, single stations have been interpreted as nodes of a set of adjoining triangles; then, a) node pairs have been taken into account in order to estimate pollutant stationarity on triangle sides, as well as b) node triplets, to statistically associate data from air-monitoring with the corresponding territory area, and c) node sextuplets, to assess the impact probability of the outdoor pollutant on the indoor environment for each area. Distributions from the various node combinations are all non-Gaussian; consequently, Kruskal-Wallis (KW) non-parametric statistics have been exploited to test variability of the continuous density function from each pair, triplet and sextuplet. Results from the above-mentioned statistical analysis have shown randomness of site selection, which has not allowed a reliable generalization of monitoring data to the entire selected territory, except for a single "forced" case (70%); most importantly, they suggest a possible procedure to optimize network design.

  12. Non-Parametric Model Drift Detection

    Science.gov (United States)

    2016-07-01

    framework on two tasks in the NLP domain: topic modeling and machine translation. Our main findings are summarized as follows: • We can measure important...

  13. Which percentile-based approach should be preferred for calculating normalized citation impact values? An empirical comparison of five approaches including a newly developed citation-rank approach (P100)

    NARCIS (Netherlands)

    Bornmann, L.; Leydesdorff, L.; Wang, J.

    2013-01-01

    For comparisons of citation impacts across fields and over time, bibliometricians normalize the observed citation counts with reference to an expected citation value. Percentile-based approaches have been proposed as a non-parametric alternative to parametric central-tendency statistics. Percentiles

  14. Non-parametric cell-based photometric proxies for galaxy morphology: methodology and application to the morphologically defined star formation-stellar mass relation of spiral galaxies in the local universe

    Science.gov (United States)

    Grootes, M. W.; Tuffs, R. J.; Popescu, C. C.; Robotham, A. S. G.; Seibert, M.; Kelvin, L. S.

    2014-02-01

    We present a non-parametric cell-based method of selecting highly pure and largely complete samples of spiral galaxies using photometric and structural parameters as provided by standard photometric pipelines and simple shape fitting algorithms. The performance of the method is quantified for different parameter combinations, using purely human-based classifications as a benchmark. The discretization of the parameter space allows a markedly superior selection than commonly used proxies relying on a fixed curve or surface of separation. Moreover, we find structural parameters derived using passbands longwards of the g band and linked to older stellar populations, especially the stellar mass surface density μ* and the r-band effective radius re, to perform at least equally well as parameters more traditionally linked to the identification of spirals by means of their young stellar populations, e.g. UV/optical colours. In particular, the distinct bimodality in the parameter μ*, consistent with expectations of different evolutionary paths for spirals and ellipticals, represents an often overlooked yet powerful parameter in differentiating between spiral and non-spiral/elliptical galaxies. We use the cell-based method for the optical parameter set including re in combination with the Sérsic index n and the i-band magnitude to investigate the intrinsic specific star formation rate-stellar mass relation (ψ*-M*) for a morphologically defined volume-limited sample of local Universe spiral galaxies. The relation is found to be well described by ψ* ∝ M*^{-0.5} over the range of 10^9.5 ≤ M* ≤ 10^11 M⊙ with a mean interquartile range of 0.4 dex. This is somewhat steeper than previous determinations based on colour-selected samples of star-forming galaxies, primarily due to the inclusion in the sample of red quiescent discs.

  15. Parametric estimation in the wave buoy analogy - an elaborated approach based on energy considerations

    DEFF Research Database (Denmark)

    Montazeri, Najmeh; Nielsen, Ulrik Dam

    2014-01-01

    the ship’s wave-induced responses based on different statistical inferences including parametric and non-parametric approaches. This paper considers a concept to improve the estimate obtained by the parametric method for sea state estimation. The idea is illustrated by an analysis made on full-scale...

  16. Statistical modelling approach to derive quantitative nanowastes classification index; estimation of nanomaterials exposure

    CSIR Research Space (South Africa)

    Ntaka, L

    2013-08-01

    Full Text Available In this work, statistical inference approaches, specifically non-parametric bootstrapping and a linear model, were applied. Data used to develop the model were sourced from the literature. 104 data points with information on aggregation, natural organic matter...

  17. Source apportionment of speciated PM2.5 and non-parametric regressions of PM2.5 and PM(coarse) mass concentrations from Denver and Greeley, Colorado, and construction and evaluation of dichotomous filter samplers

    Science.gov (United States)

    Piedrahita, Ricardo A.

    The Denver Aerosol Sources and Health study (DASH) was a long-term study of the relationship between the variability in fine particulate mass and chemical constituents (PM2.5, particulate matter less than 2.5 μm) and adverse health effects such as cardio-respiratory illnesses and mortality. Daily filter samples were chemically analyzed for multiple species. We present findings based on 2.8 years of DASH data, from 2003 to 2005. Multilinear Engine 2 (ME-2), a receptor-based source apportionment model, was applied to the data to estimate source contributions to PM2.5 mass concentrations. This study relied on two different ME-2 models: (1) a 2-way model that closely reflects PMF-2; and (2) an enhanced model with meteorological data that used additional temporal and meteorological factors. The Coarse Rural Urban Sources and Health study (CRUSH) is a long-term study of the relationship between the variability in coarse particulate mass (PMcoarse, particulate matter between 2.5 and 10 μm) and adverse health effects such as cardio-respiratory illnesses, pre-term births, and mortality. Hourly mass concentrations of PMcoarse and fine particulate matter (PM2.5) are measured using tapered element oscillating microbalances (TEOMs) with Filter Dynamics Measurement Systems (FDMS), at two rural and two urban sites. We present findings based on nine months of mass concentration data, including temporal trends and non-parametric regression (NPR) results, which were used to characterize the wind speed and wind direction relationships that might point to sources. As part of CRUSH, a 1-year coarse and fine mode particulate matter filter sampling network will allow us to characterize the chemical composition of the particulate matter collected and perform spatial comparisons. This work describes the construction and validation testing of four dichotomous filter samplers for this purpose. The use of dichotomous splitters with an approximate 2.5 μm cut point, coupled with a 10 μm cut

  18. Un estudio no paramétrico de eficiencia para la minería de Zacatecas, México || A Non-Parametric Approach to Efficiency for Mining in Zacatecas, Mexico

    Directory of Open Access Journals (Sweden)

    Rodallegas Portillo, Mayra C.

    2012-01-01

    Full Text Available La intención de este trabajo es abordar el sector minero en el estado de Zacatecas bajo la técnica del análisis envolvente de datos para construir indicadores de eficiencia en los años 1998, 2003 y 2008. Se realiza un análisis entre entidades federativas para comparar el desempeño a nivel nacional; asimismo, se efectúa un estudio comparativo entre tipos específicos de producción minera. El artículo constituye una fuente de información significativa sobre el desempeño de la industria minera en la entidad, pudiendo identificar los productos del estado que son susceptibles de mejora técnica mediante las clases de actividad ineficientes. || The aim of the paper is to use the data envelopment analysis (DEA for mining in Zacatecas and provide efficiency indicators in the years 1998, 2003, and 2008. We compare the performance of the state of Zacatecas with other mining states in Mexico. Then the empirical analysis extends to specific mining. The study is an important source of information about mining behavior. By using the national industries level, it was possible to identify the products that are susceptible of improvement.

  19. Statistical Approaches to Aerosol Dynamics for Climate Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Zhu, Wei

    2014-09-02

    In this work, we introduce two general non-parametric regression analysis methods for errors-in-variable (EIV) models: the compound regression, and the constrained regression. It is shown that these approaches are equivalent to each other and to the general parametric structural modeling approach. The advantages of these methods lie in their intuitive geometric representations, their distribution free nature, and their ability to offer a practical solution when the ratio of the error variances is unknown. Each includes the classic non-parametric regression methods of ordinary least squares, geometric mean regression, and orthogonal regression as special cases. Both methods can be readily generalized to multiple linear regression with two or more random regressors.
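One of the special cases named above, geometric mean regression, can be computed directly from sample moments. A minimal sketch with hypothetical data, not the report's implementation:

```python
# Geometric mean regression (GMR): slope = sign(covariance) * sd(y)/sd(x).
# A standard EIV choice when the ratio of error variances is unknown.
from statistics import mean, stdev

def geometric_mean_regression(x, y):
    mx, my = mean(x), mean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sign = 1.0 if sxy >= 0 else -1.0
    slope = sign * stdev(y) / stdev(x)
    return slope, my - slope * mx

# Hypothetical measurements where both x and y carry error
x = [1.0, 2.0, 3.0, 4.0]
y = [2.0, 4.1, 5.9, 8.0]
slope, intercept = geometric_mean_regression(x, y)
print(slope)
```

Unlike ordinary least squares, the GMR slope is symmetric in x and y (regressing y on x and x on y gives reciprocal slopes), which is the geometric intuition the record refers to.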

  20. Evaluación de la estabilidad de un cultivar de caña de azúcar (Saccharum spp.) en diferentes ambientes agroecológicos a través de una técnica no paramétrica en Tucumán, R. Argentina Assessment of the stability of a sugarcane (Saccharum spp.) cultivar in different environments by a non-parametric test in Tucumán, Argentina

    Directory of Open Access Journals (Sweden)

    Santiago Ostengo

    2011-12-01

    considered in a breeding program. It is for that reason that in sugar cane breeding, multienvironmental trials (MET) are conducted at the last stage of the selection process. There exist different approaches to study genotype-environment interaction. One of these is the non-parametric technique, a valid and useful tool which allows making an initial exploration that can be easily interpreted. The non-parametric technique called relative consistency of performance enables the classification of genotypes into the following four categories: (i) consistently superior; (ii) inconsistently superior; (iii) inconsistently inferior and (iv) consistently inferior. This work aims to evaluate the consistency of performance of the TUC 95-10 variety across different agro-ecological environments in the province of Tucumán (Argentina), as regards the variable tons of sugar per hectare and considering different crop ages. Data were obtained from MET of the Sugarcane Breeding Program of Estación Experimental Agroindustrial Obispo Colombres (EEAOC) from Tucumán (Argentina), conducted at six sites through four crop ages. Results showed that TUC 95-10, recently released by EEAOC, can be labeled as consistently superior at all ages, i.e. it held the top position in sugar production in all tested environments. Therefore, it can be concluded that TUC 95-10 shows an excellent performance and good adaptation to different agro-ecological environments in Tucumán, at all crop ages.

  1. AucPR: An AUC-based approach using penalized regression for disease prediction with high-dimensional omics data

    OpenAIRE

    Yu, Wenbao; Park, Taesung

    2014-01-01

    Motivation It is common to get an optimal combination of markers for disease classification and prediction when multiple markers are available. Many approaches based on the area under the receiver operating characteristic curve (AUC) have been proposed. Existing works based on AUC in a high-dimensional context depend mainly on a non-parametric, smooth approximation of AUC, with no work using a parametric AUC-based approach for high-dimensional data. Results We propose an AUC-based approach u...

  2. Comparison of four approaches to a rock facies classification problem

    Science.gov (United States)

    Dubois, M.K.; Bohling, Geoffrey C.; Chakrabarti, S.

    2007-01-01

    In this study, seven classifiers based on four different approaches were tested in a rock facies classification problem: classical parametric methods using Bayes' rule, and non-parametric methods using fuzzy logic, k-nearest neighbor, and a feed-forward back-propagating artificial neural network. Determining the most effective classifier for geologic facies prediction in wells without cores in the Panoma gas field, in Southwest Kansas, was the objective. Study data include 3600 samples with known rock facies class (from core), with each sample having either four or five measured properties (wire-line log curves) and two derived geologic properties (geologic constraining variables). The sample set was divided into two subsets, one for training and one for testing the ability of the trained classifier to correctly assign classes. Artificial neural networks clearly outperformed all other classifiers and are effective tools for this particular classification problem. Classical parametric models were inadequate due to the nature of the predictor variables (high dimensional and not linearly correlated) and the feature space of the classes (overlapping). The other non-parametric methods tested, k-nearest neighbor and fuzzy logic, would need considerable improvement to match the neural network effectiveness, but further work, possibly combining certain aspects of the three non-parametric methods, may be justified. © 2006 Elsevier Ltd. All rights reserved.

  3. Decompounding random sums: A nonparametric approach

    DEFF Research Database (Denmark)

    Hansen, Martin Bøgsted; Pitts, Susan M.

    Observations from sums of random variables with a random number of summands, known as random, compound or stopped sums arise within many areas of engineering and science. Quite often it is desirable to infer properties of the distribution of the terms in the random sum. In the present paper we...... review a number of applications and consider the nonlinear inverse problem of inferring the cumulative distribution function of the components in the random sum. We review the existing literature on non-parametric approaches to the problem. The models amenable to the analysis are generalized considerably...

  4. (AJST) RELATIVE EFFICIENCY OF NON-PARAMETRIC ERROR ...

    African Journals Online (AJOL)

    NORBERT OPIYO AKECH

    on 100 bootstrap samples, a sample of size n being taken with replacement in each initial sample of size n. .... the overlap (or optimal error rate) of the populations. However, the expression (2.3) for the computation of ..... Analysis and Machine Intelligence, 9, 628-633. Lachenbruch P. A. (1967). An almost unbiased method ...

  5. Using non-parametric methods in econometric production analysis

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    Econometric estimation of production functions is one of the most common methods in applied economic production analysis. These studies usually apply parametric estimation techniques, which obligate the researcher to specify the functional form of the production function. Most often, the Cobb...... results—including measures that are of interest to applied economists, such as elasticities. Therefore, we propose to use nonparametric econometric methods. First, they can be applied to verify the functional form used in parametric estimations of production functions. Second, they can be directly used...

  6. Non-parametric Bayesian inference for inhomogeneous Markov point processes

    DEFF Research Database (Denmark)

    Berthelsen, Kasper Klitgaard; Møller, Jesper; Johansen, Per Michael

    is a shot noise process, and the interaction function for a pair of points depends only on the distance between the two points and is a piecewise linear function modelled by a marked Poisson process. Simulation of the resulting posterior using a Metropolis-Hastings algorithm in the "conventional" way...

  7. A non-parametric 2D deformable template classifier

    DEFF Research Database (Denmark)

    Schultz, Nette; Nielsen, Allan Aasbjerg; Conradsen, Knut

    2005-01-01

    feature space the ship-master will be able to interactively define a segmentation map, which is refined and optimized by the deformable template algorithms. The deformable templates are defined as two-dimensional vector-cycles. Local random transformations are applied to the vector-cycles, and stochastic...

  8. Active learning and adaptive sampling for non-parametric inference

    NARCIS (Netherlands)

    Castro, R.M.

    2007-01-01

    This thesis presents a general discussion of active learning and adaptive sampling. In many practical scenarios it is possible to use information gleaned from previous observations to focus the sampling process, in the spirit of the "twenty-questions" game. As more samples are collected one can

  9. An integrated approach to estimate storage reliability with initial failures based on E-Bayesian estimates

    International Nuclear Information System (INIS)

    Zhang, Yongjin; Zhao, Ming; Zhang, Shitao; Wang, Jiamei; Zhang, Yanjun

    2017-01-01

    An integrated approach to estimate the storage reliability is proposed. • A non-parametric measure to estimate the number of failures and the reliability at each testing time is presented. • An E-Bayesian method to estimate the failure probability is introduced. • The possible initial failures in storage are introduced. • The non-parametric estimates of failure numbers can be used in the parametric models.

  10. Seleção de híbridos diplóides (AA) de bananeira com base em três índices não paramétricos Selection of (AA) diploid banana hybrids using three non-parametric indices

    Directory of Open Access Journals (Sweden)

    Lauro Saraiva Lessa

    2010-01-01

    Full Text Available Objetivou-se selecionar híbridos diplóides (AA) de bananeira com base em três índices não paramétricos, a fim de orientar a seleção e aumentar o aproveitamento da variabilidade existente no Banco de Germoplasma de Bananeira da Embrapa Mandioca e Fruticultura Tropical. Foram avaliados 11 híbridos, no delineamento de blocos ao acaso, com quatro repetições. As parcelas constituíram-se de seis plantas, espaçadas de 2,5 m x 2,5 m, tendo na bordadura plantas da cultivar Pacovan. Tomaram-se dados dos seguintes caracteres: altura da planta, diâmetro do pseudocaule, número de filhos na floração, número de folhas na floração, ciclo da planta do plantio à emissão do cacho, presença de pólen, número de pencas, número de frutos, comprimento do fruto e resistência à Sigatoka-amarela. As médias desses 10 caracteres foram empregadas no cálculo dos índices multiplicativos, de soma de classificação e da distância genótipo-ideótipo. Os dois híbridos de melhor desempenho geral, o SH3263 e o 1318-01, foram classificados, respectivamente, em primeiro e segundo lugares pelos índices multiplicativos e de soma de classificação, enquanto o índice da distância genótipo-ideótipo os classificou em primeiro e quarto lugares respectivamente. Embora os três índices tenham demonstrado uma boa correspondência entre o desempenho geral dos híbridos e a sua classificação, os índices multiplicativo e de soma de classificação propiciaram classificação mais adequada desses híbridos. The objective of the present study was to select diploid (AA) hybrids of banana based on three non-parametric indices so as to guide the selection and increase the use of the variability present in the Banana Germplasm Bank of Embrapa Cassava and Tropical Fruits. Eleven hybrids were evaluated in random blocks with four replicates. The plots consisted of six plants spaced 2.5 m x 2.5 m whereas the border rows were from the Pacovan cultivar. The following

  11. Spectral-spatial classification of hyperspectral data with mutual information based segmented stacked autoencoder approach

    Science.gov (United States)

    Paul, Subir; Nagesh Kumar, D.

    2018-04-01

    Hyperspectral (HS) data comprises continuous spectral responses of hundreds of narrow spectral bands with very fine spectral resolution or bandwidth, which offer feature identification and classification with high accuracy. In the present study, a Mutual Information (MI) based Segmented Stacked Autoencoder (S-SAE) approach for spectral-spatial classification of the HS data is proposed to reduce the complexity and computational time compared to Stacked Autoencoder (SAE) based feature extraction. A non-parametric dependency measure (MI) based spectral segmentation is proposed instead of a linear and parametric dependency measure to take care of both linear and nonlinear inter-band dependency for spectral segmentation of the HS bands. Then morphological profiles are created corresponding to segmented spectral features to assimilate the spatial information in the spectral-spatial classification approach. Two non-parametric classifiers, Support Vector Machine (SVM) with Gaussian kernel and Random Forest (RF), are used for classification of the three most popular HS datasets. Results of the numerical experiments carried out in this study have shown that SVM with a Gaussian kernel provides better results for the Pavia University and Botswana datasets whereas RF performs better for the Indian Pines dataset. The experiments performed with the proposed methodology provide encouraging results compared to numerous existing approaches.
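A minimal sketch (hypothetical quantized values, not the paper's pipeline) of the MI computation that drives the spectral segmentation: estimate MI between two discretized band vectors from their joint and marginal frequencies:

```python
# Mutual information (in bits) between two discretized band vectors,
# estimated from joint and marginal counts. Values are hypothetical.
from math import log2
from collections import Counter

def mutual_information(a, b):
    n = len(a)
    joint = Counter(zip(a, b))
    pa, pb = Counter(a), Counter(b)
    return sum(c / n * log2((c / n) / ((pa[x] / n) * (pb[y] / n)))
               for (x, y), c in joint.items())

# Hypothetical quantized reflectance values for two adjacent bands;
# identical vectors, so MI equals the entropy of either band.
band1 = [0, 0, 1, 1, 2, 2, 0, 1]
band2 = [0, 0, 1, 1, 2, 2, 0, 1]
print(mutual_information(band1, band2))
```

High MI between adjacent bands signals redundancy (linear or nonlinear), so contiguous high-MI bands can be grouped into one segment before feature extraction.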

  12. A Maximum Entropy Approach to Loss Distribution Analysis

    Directory of Open Access Journals (Sweden)

    Marco Bee

    2013-03-01

    Full Text Available In this paper we propose an approach to the estimation and simulation of loss distributions based on Maximum Entropy (ME), a non-parametric technique that maximizes the Shannon entropy of the data under moment constraints. Special cases of the ME density correspond to standard distributions; therefore, this methodology is very general, as it nests most classical parametric approaches. Sampling the ME distribution is essential in many contexts, such as loss models constructed via compound distributions. Given the difficulties in carrying out exact simulation, we propose an innovative algorithm, obtained by means of an extension of Adaptive Importance Sampling (AIS), for the approximate simulation of the ME distribution. Several numerical experiments confirm that the AIS-based simulation technique works well, and an application to insurance data gives further insights into the usefulness of the method for modelling, estimating and simulating loss distributions.
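A maximum-entropy density under moment constraints has the exponential-family form p(x) ∝ exp(Σ_k λ_k x^k), and the multipliers λ can be found by minimizing the convex dual log Z(λ) − λ·μ. The sketch below fits this on a grid under the first two moment constraints (in which case the ME solution is exactly Gaussian); it is a generic illustration, not the paper's AIS algorithm, and all names are illustrative:

```python
import numpy as np
from scipy.optimize import minimize

def fit_maxent(sample, n_moments=2, n_grid=2001):
    """Maximum-entropy density exp(lam . phi(x)) / Z on a grid, fitted by
    minimizing the convex dual log Z(lam) - lam . mu, where mu holds the
    sample moments E[x^k], k = 1..n_moments."""
    grid = np.linspace(sample.min() - 1.0, sample.max() + 1.0, n_grid)
    dx = grid[1] - grid[0]
    powers = np.vstack([grid**k for k in range(1, n_moments + 1)])
    mu = np.array([np.mean(sample**k) for k in range(1, n_moments + 1)])

    def dual(lam):
        logq = lam @ powers                 # unnormalised log-density
        m = logq.max()                      # subtract max for stability
        return m + np.log(np.sum(np.exp(logq - m)) * dx) - lam @ mu

    lam = minimize(dual, np.zeros(n_moments), method="Nelder-Mead").x
    p = np.exp(lam @ powers - (lam @ powers).max())
    p /= p.sum() * dx                       # normalise to integrate to 1
    return grid, p

rng = np.random.default_rng(0)
sample = rng.normal(size=2000)
grid, p = fit_maxent(sample)
dx = grid[1] - grid[0]
fitted_mean = np.sum(grid * p) * dx
fitted_var = np.sum(grid**2 * p) * dx - fitted_mean**2
```

At the dual optimum the fitted moments match the sample moments, which is the defining property of the ME density; adding higher-order moment constraints lets the density depart from Gaussian shape.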

  13. Modeling critical episodes of air pollution by particulate matter (PM10 in Santiago, Chile: Comparison of the predictive efficiency of parametric and non-parametric statistical models

    Directory of Open Access Journals (Sweden)

    Sergio A. Alvarado

    2010-12-01

    Full Text Available Objective: To evaluate the predictive efficiency of two statistical models (one parametric and the other non-parametric) to predict critical episodes of air pollution exceeding daily air quality standards in Santiago, Chile, by using the next-day PM10 maximum 24 h value. Accurate prediction of such episodes would allow restrictive measures to be applied by health authorities to reduce their seriousness and protect the community's health. Methods: We used the PM10 concentrations registered by a station of the Air Quality Monitoring Network MACAM-2 (152 daily observations of 14 variables) and meteorological information gathered from 2001 to 2004. To construct predictive models, we fitted a parametric Gamma model using STATA v11 software and a non-parametric MARS model using a demo version of MARS v2.0 distributed by Salford Systems. Results: Both modeling methods show a high correlation between observed and predicted values, with the Gamma models achieving better hit rates than MARS for the PM10 concentrations considered.
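Predictive efficiency for episode forecasting of this kind is typically summarised by hit and false-alarm rates for exceedances of the daily standard. A minimal sketch of that evaluation (the function name, toy data, and the 150 ug/m3 threshold are illustrative, not taken from the study):

```python
import numpy as np

def exceedance_skill(obs, pred, threshold=150.0):
    """Hit rate and false-alarm rate for next-day forecasts of exceedances
    of a daily PM10 standard (threshold in ug/m3, value illustrative)."""
    obs_ex = np.asarray(obs) >= threshold
    pred_ex = np.asarray(pred) >= threshold
    hits = np.sum(obs_ex & pred_ex)
    misses = np.sum(obs_ex & ~pred_ex)
    false_alarms = np.sum(~obs_ex & pred_ex)
    correct_neg = np.sum(~obs_ex & ~pred_ex)
    hit_rate = hits / max(hits + misses, 1)
    false_alarm_rate = false_alarms / max(false_alarms + correct_neg, 1)
    return hit_rate, false_alarm_rate

# four toy days: one exceedance correctly forecast, one missed
hr, far = exceedance_skill([100, 160, 200, 90], [120, 170, 140, 80])
```

A model with a higher hit rate at a comparable false-alarm rate is the one that better supports the authority's decision to decree restrictive measures.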

  14. A hierarchical bayesian approach to ecological count data: a flexible tool for ecologists.

    Directory of Open Access Journals (Sweden)

    James A Fordyce

    Full Text Available Many ecological studies use the analysis of count data to arrive at biologically meaningful inferences. Here, we introduce a hierarchical Bayesian approach to count data. This approach has the advantage over traditional approaches in that it directly estimates the parameters of interest at both the individual level and the population level, appropriately models uncertainty, and allows for comparisons among models, including those that exceed the complexity of many traditional approaches, such as ANOVA or non-parametric analogs. As an example, we apply this method to oviposition preference data for butterflies in the genus Lycaeides. Using this method, we estimate the parameters that describe preference for each population, compare the preference hierarchies among populations, and explore various models that group populations that share the same preference hierarchy.

  15. Illiquidity premium and expected stock returns in the UK: A new approach

    Science.gov (United States)

    Chen, Jiaqi; Sherif, Mohamed

    2016-09-01

    This study examines the relative importance of liquidity risk for the time-series and cross-section of stock returns in the UK. We propose a simple way to capture the multidimensionality of illiquidity. Our analysis indicates that existing illiquidity measures have considerable asset specific components, which justifies our new approach. Further, we use an alternative test of the Amihud (2002) measure and parametric and non-parametric methods to investigate whether liquidity risk is priced in the UK. We find that the inclusion of the illiquidity factor in the capital asset pricing model plays a significant role in explaining the cross-sectional variation in stock returns, in particular with the Fama-French three-factor model. Further, using Hansen-Jagannathan non-parametric bounds, we find that the illiquidity-augmented capital asset pricing models yield a small distance error, other non-liquidity based models fail to yield economically plausible distance values. Our findings have important implications for managing the liquidity risk of equity portfolios.
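The Amihud (2002) measure tested in the paper is the average absolute return per unit of trading volume. A minimal sketch of its computation (variable names and toy figures are illustrative):

```python
import numpy as np

def amihud_illiquidity(prices, volumes):
    """Amihud (2002) price-impact measure: mean of |daily return| divided
    by same-day trading volume (often scaled by 1e6 in practice)."""
    prices = np.asarray(prices, float)
    volumes = np.asarray(volumes, float)
    returns = np.diff(prices) / prices[:-1]   # simple daily returns
    return float(np.mean(np.abs(returns) / volumes[1:]))

# toy series: +10% on heavy volume, -10% on light volume
illiq = amihud_illiquidity([100.0, 110.0, 99.0], [800.0, 1000.0, 500.0])
```

A higher value means a given amount of trading moves the price more, i.e. the stock is less liquid; a cross-section of such values is what gets priced in the illiquidity-augmented CAPM tests described above.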

  16. AucPR: an AUC-based approach using penalized regression for disease prediction with high-dimensional omics data.

    Science.gov (United States)

    Yu, Wenbao; Park, Taesung

    2014-01-01

    It is common to seek an optimal combination of markers for disease classification and prediction when multiple markers are available. Many approaches based on the area under the receiver operating characteristic curve (AUC) have been proposed. Existing works based on AUC in a high-dimensional context depend mainly on a non-parametric, smooth approximation of AUC, with no work using a parametric AUC-based approach for high-dimensional data. We propose an AUC-based approach using penalized regression (AucPR), which is a parametric method for obtaining a linear combination that maximizes the AUC. To obtain the AUC maximizer in a high-dimensional context, we transform a classical parametric AUC maximizer, which is used in a low-dimensional context, into a regression framework and thus apply the penalized regression approach directly. Two kinds of penalization, lasso and elastic net, are considered. The parametric approach can avoid some of the difficulties of a conventional non-parametric AUC-based approach, such as the lack of an appropriate concave objective function and a prudent choice of the smoothing parameter. We apply the proposed AucPR for gene selection and classification using four real microarray datasets and synthetic data. Through numerical studies, AucPR is shown to perform better than penalized logistic regression and the non-parametric AUC-based method, in the sense of AUC and sensitivity for a given specificity, particularly when there are many correlated genes. We propose a powerful, parametric and easily implementable linear classifier, AucPR, for gene selection and disease prediction with high-dimensional data. AucPR is recommended for its good prediction performance. Besides gene expression microarray data, AucPR can be applied to other types of high-dimensional omics data, such as miRNA and protein data.
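The non-parametric AUC that these methods optimise an approximation of is the Mann-Whitney statistic: the probability that a randomly chosen positive case outscores a randomly chosen negative one. A minimal sketch of the exact empirical AUC (not the paper's penalized estimator; the function name is illustrative):

```python
import numpy as np

def empirical_auc(scores, labels):
    """Rank-based (Mann-Whitney) AUC: the probability that a randomly
    chosen positive case outscores a randomly chosen negative one,
    with ties counted as one half."""
    scores = np.asarray(scores, float)
    labels = np.asarray(labels)
    # all (positive, negative) score differences
    diff = scores[labels == 1][:, None] - scores[labels == 0][None, :]
    return float((np.sum(diff > 0) + 0.5 * np.sum(diff == 0)) / diff.size)

auc = empirical_auc([0.9, 0.8, 0.4, 0.3], [1, 0, 1, 0])
```

This empirical AUC is a step function of the marker weights, which is exactly the non-concavity problem that motivates either smooth approximations or, as in AucPR, a parametric regression reformulation.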

  17. A hybrid modeling approach for option pricing

    Science.gov (United States)

    Hajizadeh, Ehsan; Seifi, Abbas

    2011-11-01

    The complexity of option pricing has led many researchers to develop sophisticated models for such purposes. The commonly used Black-Scholes model suffers from a number of limitations, one of which is the controversial assumption that the underlying probability distribution is lognormal. We propose a couple of hybrid models to reduce these limitations and enhance the ability of option pricing. The key input to an option pricing model is volatility. In this paper, we use three popular GARCH-type models for estimating volatility. Then, we develop two non-parametric models based on neural networks and neuro-fuzzy networks to price call options for the S&P 500 index. We compare the results with those of the Black-Scholes model and show that both the neural network and neuro-fuzzy network models outperform the Black-Scholes model. Furthermore, comparing the neural network and neuro-fuzzy approaches, we observe that for at-the-money options the neural network model performs better, while for both in-the-money and out-of-the-money options the neuro-fuzzy model provides better results.
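The parametric benchmark the hybrid models are compared against is the Black-Scholes call price, which can be sketched in a few lines (a standard textbook formula, with illustrative parameter values):

```python
import math

def bs_call(S, K, T, r, sigma):
    """Black-Scholes European call price: the parametric benchmark that
    data-driven (neural / neuro-fuzzy) pricers are compared against."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    # standard normal CDF via the error function
    N = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return S * N(d1) - K * math.exp(-r * T) * N(d2)

# at-the-money call, one year to expiry, 5% rate, 20% volatility
price = bs_call(S=100.0, K=100.0, T=1.0, r=0.05, sigma=0.2)
```

In the hybrid setup described above, the constant sigma would be replaced by a GARCH-based volatility forecast, and the pricing map itself learned by the network rather than fixed to this closed form.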

  18. CADDIS Volume 4. Data Analysis: PECBO Appendix - R Scripts for Non-Parametric Regressions

    Science.gov (United States)

    Script for computing nonparametric regression analysis. Overview of using scripts to infer environmental conditions from biological observations, statistically estimating species-environment relationships, statistical scripts.
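The CADDIS scripts themselves are written in R, but the kind of non-parametric regression they perform can be illustrated with a minimal Nadaraya-Watson kernel smoother (a generic estimator, not the CADDIS code; names and bandwidth are illustrative):

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_eval, bandwidth=0.1):
    """Nadaraya-Watson kernel regression: a locally weighted average of
    y_train with Gaussian weights centred at each evaluation point."""
    w = np.exp(-0.5 * ((np.asarray(x_eval)[:, None]
                        - x_train[None, :]) / bandwidth) ** 2)
    return (w @ y_train) / w.sum(axis=1)

# regress y = x on x; the smoother should recover the line in the interior
x = np.linspace(0.0, 1.0, 101)
est = nadaraya_watson(x, x, np.array([0.5]))
```

Applied to biological observations, x would be an environmental gradient and y a species response, yielding a smooth species-environment relationship without assuming a parametric form.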

  19. A Probabilistic, Non-parametric Framework for Inter-modality Label Fusion

    DEFF Research Database (Denmark)

    Iglesias, Juan Eugenio; Sabuncu, Mert Rory; Van Leemput, Koen

    2013-01-01

    Multi-atlas techniques are commonplace in medical image segmentation due to their high performance and ease of implementation. Locally weighting the contributions from the different atlases in the label fusion process can improve the quality of the segmentation. However, how to define these weights...

  20. Non-Parametric, Closed-Loop Testing of Autonomy in Unmanned Aircraft Systems, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed Phase I program aims to develop new methods to support safety testing for integration of Unmanned Aircraft Systems into the National Airspace (NAS) with...

  1. Non-parametric Bayesian models of response function in dynamic image sequences

    Czech Academy of Sciences Publication Activity Database

    Tichý, Ondřej; Šmídl, Václav

    2016-01-01

    Roč. 151, č. 1 (2016), s. 90-100 ISSN 1077-3142 R&D Projects: GA ČR GA13-29225S Institutional support: RVO:67985556 Keywords : Response function * Blind source separation * Dynamic medical imaging * Probabilistic models * Bayesian methods Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 2.498, year: 2016 http://library.utia.cas.cz/separaty/2016/AS/tichy-0456983.pdf

  2. A critique of non-parametric efficiency analysis in energy economics studies

    International Nuclear Information System (INIS)

    Chen, Chien-Ming

    2013-01-01

    The paper reexamines non-additive environmental efficiency models with weakly-disposable undesirable outputs that have appeared in the literature of energy economics. These efficiency models are used in numerous studies published in this journal and other energy-related outlets. Recent studies, however, have found key limitations of the weak-disposability assumption in its application to environmental efficiency analysis. It is found that efficiency scores obtained from non-additive efficiency models can be non-monotonic in pollution quantities under the weak-disposability assumption, which is against common intuition and the principles of environmental economics. In this paper, I present a taxonomy of efficiency models found in the energy economics literature, illustrate the above limitations, and discuss the implications of monotonicity from a practical viewpoint. Finally, I review the formulations for a variable returns-to-scale technology with weakly-disposable undesirable outputs, which have been misused in a number of papers in the energy economics literature. An application to evaluating the energy efficiencies of 23 European Union states is presented to illustrate the problem. - Highlights: • Review different environmental efficiency models used in energy economics studies • Highlight limitations of these environmental efficiency models • These limitations have not been recognized in the existing energy economics literature • Data from 23 European Union states are used to illustrate the methodological consequences

  3. A non-parametric estimator for the doubly-periodic Poisson intensity function

    NARCIS (Netherlands)

    R. Helmers (Roelof); I.W. Mangku (Wayan); R. Zitikis

    2007-01-01

    textabstractIn a series of papers, J. Garrido and Y. Lu have proposed and investigated a doubly-periodic Poisson model, and then applied it to analyze hurricane data. The authors have suggested several parametric models for the underlying intensity function. In the present paper we construct and

  4. Inferential, non-parametric statistics to assess the quality of probabilistic forecast systems

    NARCIS (Netherlands)

    Maia, A.H.N.; Meinke, H.B.; Lennox, S.; Stone, R.C.

    2007-01-01

    Many statistical forecast systems are available to interested users. To be useful for decision making, these systems must be based on evidence of underlying mechanisms. Once causal connections between the mechanism and its statistical manifestation have been firmly established, the forecasts must

  5. A non-parametric test for partial monotonicity in multiple regression

    NARCIS (Netherlands)

    van Beek, M.; Daniëls, H.A.M.

    Partial positive (negative) monotonicity in a dataset is the property that an increase in an independent variable, ceteris paribus, generates an increase (decrease) in the dependent variable. A test for partial monotonicity in datasets could (1) increase model performance if monotonicity may be

  6. Parametric methods outperformed non-parametric methods in comparisons of discrete numerical variables

    Directory of Open Access Journals (Sweden)

    Sandvik Leiv

    2011-04-01

    Full Text Available Abstract Background The number of events per individual is a widely reported variable in medical research papers. Such variables are the most common representation of the general variable type called discrete numerical. There is currently no consensus on how to compare and present such variables, and recommendations are lacking. The objective of this paper is to present recommendations for analysis and presentation of results for discrete numerical variables. Methods Two simulation studies were used to investigate the performance of hypothesis tests and confidence interval methods for variables with outcomes {0, 1, 2}, {0, 1, 2, 3}, {0, 1, 2, 3, 4}, and {0, 1, 2, 3, 4, 5}, using the difference between the means as an effect measure. Results The Welch U test (the T test with adjustment for unequal variances) and its associated confidence interval performed well for almost all situations considered. The Brunner-Munzel test also performed well, except for small sample sizes (10 in each group). The ordinary T test, the Wilcoxon-Mann-Whitney test, the percentile bootstrap interval, and the bootstrap-t interval did not perform satisfactorily. Conclusions The difference between the means is an appropriate effect measure for comparing two independent discrete numerical variables that have both lower and upper bounds. To analyze this problem, we encourage more frequent use of parametric hypothesis tests and confidence intervals.
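The recommended Welch U test is available off the shelf; a minimal sketch comparing two simulated groups of small counts (the data-generating choices here are illustrative, not the paper's simulation design):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
a = rng.integers(0, 3, size=120)        # counts with outcomes {0, 1, 2}
b = rng.integers(0, 3, size=120) + 1    # counts with outcomes {1, 2, 3}

# Welch U test: the T test without assuming equal variances
t_stat, p_value = stats.ttest_ind(a, b, equal_var=False)
```

`equal_var=False` is what turns SciPy's ordinary two-sample T test into the Welch version that the simulations above favour for bounded discrete outcomes.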

  7. Measuring the Influence of Information Networks on Transaction Costs Using a Non-parametric Regression Technique

    DEFF Research Database (Denmark)

    Henningsen, Geraldine; Henningsen, Arne; Henning, Christian

    2011-01-01

    All business transactions as well as achieving innovations take up resources, subsumed under the concept of transaction costs (TAC). One of the major factors in TAC theory is information. Information networks can catalyse the interpersonal information exchange and hence, increase the access...... to nonpublic information. Our analysis shows that information networks have an impact on the level of TAC. Many resources that are sacrificed for TAC are inputs that also enter the technical production process. As most production data do not separate between these two usages of inputs, high transaction costs...

  8. Non-parametric co-clustering of large scale sparse bipartite networks on the GPU

    DEFF Research Database (Denmark)

    Hansen, Toke Jansen; Mørup, Morten; Hansen, Lars Kai

    2011-01-01

    of row and column clusters from a hypothesis space of an infinite number of clusters. To reach large scale applications of co-clustering we exploit that parameter inference for co-clustering is well suited for parallel computing. We develop a generic GPU framework for efficient inference on large scale...... sparse bipartite networks and achieve a speedup of two orders of magnitude compared to estimation based on conventional CPUs. In terms of scalability we find for networks with more than 100 million links that reliable inference can be achieved in less than an hour on a single GPU. To efficiently manage...

  9. Non-parametric analysis of technical efficiency: factors affecting efficiency of West Java rice farms

    Czech Academy of Sciences Publication Activity Database

    Brázdik, František

    -, č. 286 (2006), s. 1-45 ISSN 1211-3298 R&D Projects: GA MŠk LC542 Institutional research plan: CEZ:AV0Z70850503 Keywords : rice farms * data envelopment analysis Subject RIV: AH - Economics http://www.cerge-ei.cz/pdf/wp/Wp286.pdf

  10. Measuring the influence of networks on transaction costs using a non-parametric regression technique

    DEFF Research Database (Denmark)

    Henningsen, Géraldine; Henningsen, Arne; Henning, Christian H.C.A.

    All business transactions as well as achieving innovations take up resources, subsumed under the concept of transaction costs. One of the major factors in transaction costs theory is information. Firm networks can catalyse the interpersonal information exchange and hence, increase the access to non......-public information so that transaction costs are reduced. Many resources that are sacrificed for transaction costs are inputs that also enter the technical production process. As most production data do not distinguish between these two usages of inputs, high transaction costs result in reduced observed productivity...

  11. Non-parametric Bayesian graph models reveal community structure in resting state fMRI

    DEFF Research Database (Denmark)

    Andersen, Kasper Winther; Madsen, Kristoffer H.; Siebner, Hartwig Roman

    2014-01-01

    Modeling of resting state functional magnetic resonance imaging (rs-fMRI) data using network models is of increasing interest. It is often desirable to group nodes into clusters to interpret the communication patterns between nodes. In this study we consider three different nonparametric Bayesian...... models for node clustering in complex networks. In particular, we test their ability to predict unseen data and their ability to reproduce clustering across datasets. The three generative models considered are the Infinite Relational Model (IRM), Bayesian Community Detection (BCD), and the Infinite...... between clusters. BCD restricts the between-cluster link probabilities to be strictly lower than within-cluster link probabilities to conform to the community structure typically seen in social networks. IDM only models a single between-cluster link probability, which can be interpreted as a background...

  12. Non-parametric estimation of the availability in a general repairable system

    International Nuclear Information System (INIS)

    Gamiz, M.L.; Roman, Y.

    2008-01-01

    This work deals with repairable systems with unknown failure and repair time distributions. We focus on the estimation of the instantaneous availability, that is, the probability that the system is functioning at a given time, which we consider as the most significant measure for evaluating the effectiveness of a repairable system. The estimation of the availability function is not, in general, an easy task, i.e., analytical techniques are difficult to apply. We propose a smooth estimation of the availability based on kernel estimator of the cumulative distribution functions (CDF) of the failure and repair times, for which the bandwidth parameters are obtained by bootstrap procedures. The consistency properties of the availability estimator are established by using techniques based on the Laplace transform
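The building block of the proposed availability estimator is a kernel-smoothed CDF of the failure (or repair) times. A minimal sketch of such an estimator (generic, not the authors' code; the rule-of-thumb bandwidth is an illustrative default and the paper instead selects it by bootstrap):

```python
import numpy as np
from scipy.special import erf

def kernel_cdf(data, x, bandwidth=None):
    """Smooth CDF estimate: the average of Gaussian CDFs centred at the
    observations (Silverman-type rule-of-thumb bandwidth by default)."""
    data = np.asarray(data, float)
    if bandwidth is None:
        bandwidth = 1.06 * data.std() * len(data) ** (-0.2)
    z = (np.asarray(x, float)[:, None] - data[None, :]) / bandwidth
    return (0.5 * (1.0 + erf(z / np.sqrt(2.0)))).mean(axis=1)

# symmetric toy sample: the smoothed CDF must equal 0.5 at the centre
F = kernel_cdf(np.array([-2.0, -1.0, 1.0, 2.0]), np.array([0.0, 3.0]))
```

Plugging smoothed failure-time and repair-time CDFs into the renewal equations (e.g. via the Laplace transform, as the consistency argument suggests) then yields a smooth estimate of the instantaneous availability.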

  13. Non-parametric comparison of histogrammed two-dimensional data distributions using the Energy Test

    International Nuclear Information System (INIS)

    Reid, Ivan D; Lopes, Raul H C; Hobson, Peter R

    2012-01-01

    When monitoring complex experiments, comparison is often made between regularly acquired histograms of data and reference histograms which represent the ideal state of the equipment. With the larger HEP experiments now ramping up, there is a need for automation of this task since the volume of comparisons could overwhelm human operators. However, the two-dimensional histogram comparison tools available in ROOT have been noted in the past to exhibit shortcomings. We discuss a newer comparison test for two-dimensional histograms, based on the Energy Test of Aslan and Zech, which provides more conclusive discrimination between histograms of data coming from different distributions than methods provided in a recent ROOT release.
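A sketch of the two-sample energy statistic with the logarithmic distance weight R(r) = -ln r used by Aslan and Zech, applied to point samples rather than binned histograms for simplicity (function name, sample sizes, and the small eps guard are illustrative):

```python
import numpy as np

def energy_statistic(x, y):
    """Two-sample energy test statistic (Aslan-Zech) with weight
    R(r) = -ln r; larger values indicate the two samples are drawn
    from different distributions."""
    def dist(a, b):
        return np.sqrt(((a[:, None, :] - b[None, :, :]) ** 2).sum(-1))

    n, m, eps = len(x), len(y), 1e-12   # eps guards against log(0)
    sxx = -np.log(dist(x, x)[np.triu_indices(n, 1)] + eps).sum() / n**2
    syy = -np.log(dist(y, y)[np.triu_indices(m, 1)] + eps).sum() / m**2
    sxy = -np.log(dist(x, y) + eps).sum() / (n * m)
    return sxx + syy - sxy

rng = np.random.default_rng(1)
x = rng.normal(size=(200, 2))
y = rng.normal(size=(200, 2))            # same distribution as x
z = rng.normal(loc=3.0, size=(200, 2))   # shifted distribution
```

In monitoring practice the statistic would be computed between bin centres (weighted by bin contents) of the acquired and reference histograms, and its significance assessed by permutation.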

  14. The development of a new algorithm to calculate a survival function in non-parametric ways

    International Nuclear Information System (INIS)

    Ahn, Kwang Won; Kim, Yoon Ik; Chung, Chang Hyun; Kim, Kil Yoo

    2001-01-01

    In this study, a generalized formula of the Kaplan-Meier method is developed. The idea of this algorithm is that the result of the Kaplan-Meier estimator is the same as that of the redistribute-to-the-right algorithm; hence, the Kaplan-Meier result is obtained by redistributing mass to the right. The procedure can be described in the following steps. First, the same mass is assigned to every point. Second, on reaching a censored point, its mass must be redistributed to the points on its right according to the following rule: normalize the masses located to the right of the censored point, and redistribute the censored point's mass to the right in proportion to these normalized masses. This is the main idea of the algorithm. It is more efficient than the PL estimator in the sense that it decreases the mass lost beyond the censored observations, and, just like the redistribute-to-the-right algorithm, it is sufficient from the standpoint of probability theory.
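The redistribute-to-the-right steps described above can be sketched directly (a generic implementation of the classical algorithm, not the authors' generalized formula; it assumes no ties between event and censored times):

```python
import numpy as np

def km_redistribute(times, censored):
    """Kaplan-Meier survival curve via redistribute-to-the-right: every
    sorted observation starts with mass 1/n; a censored point passes its
    mass to the points on its right, in proportion to their masses."""
    order = np.argsort(times)
    t = np.asarray(times, float)[order]
    c = np.asarray(censored, bool)[order]
    n = len(t)
    mass = np.full(n, 1.0 / n)
    for i in range(n):
        if c[i]:
            right = mass[i + 1:].sum()
            if right > 0:
                # proportional redistribution of the censored mass
                mass[i + 1:] *= 1.0 + mass[i] / right
            mass[i] = 0.0
    return t, 1.0 - np.cumsum(mass)   # survival just after each time

# events at t=1,3,4; censoring at t=2
t, surv = km_redistribute([1, 2, 3, 4], [False, True, False, False])
```

The resulting curve (0.75, 0.75, 0.375, 0.0) matches the product-limit Kaplan-Meier estimate for the same data, which is exactly the equivalence the abstract appeals to.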

  15. Non-parametric estimation of the availability in a general repairable system

    Energy Technology Data Exchange (ETDEWEB)

    Gamiz, M.L. [Departamento de Estadistica e I.O., Facultad de Ciencias, Universidad de Granada, Granada 18071 (Spain)], E-mail: mgamiz@ugr.es; Roman, Y. [Departamento de Estadistica e I.O., Facultad de Ciencias, Universidad de Granada, Granada 18071 (Spain)

    2008-08-15

    This work deals with repairable systems with unknown failure and repair time distributions. We focus on the estimation of the instantaneous availability, that is, the probability that the system is functioning at a given time, which we consider as the most significant measure for evaluating the effectiveness of a repairable system. The estimation of the availability function is not, in general, an easy task, i.e., analytical techniques are difficult to apply. We propose a smooth estimation of the availability based on kernel estimator of the cumulative distribution functions (CDF) of the failure and repair times, for which the bandwidth parameters are obtained by bootstrap procedures. The consistency properties of the availability estimator are established by using techniques based on the Laplace transform.

  16. Comparison of non-parametric methods for ungrouping coarsely aggregated data

    DEFF Research Database (Denmark)

    Rizzi, Silvia; Thinggaard, Mikael; Engholm, Gerda

    2016-01-01

    group at the highest ages. When histogram intervals are too coarse, information is lost and comparison between histograms with different boundaries is arduous. In these cases it is useful to estimate detailed distributions from grouped data. Methods From an extensive literature search we identify five...

  17. A new non-parametric stationarity test of time series in the time domain

    KAUST Repository

    Jin, Lei

    2014-11-07

    © 2015 The Royal Statistical Society and Blackwell Publishing Ltd. We propose a new double-order selection test for checking second-order stationarity of a time series. To develop the test, a sequence of systematic samples is defined via Walsh functions. Then the deviations of the autocovariances based on these systematic samples from the corresponding autocovariances of the whole time series are calculated and the uniform asymptotic joint normality of these deviations over different systematic samples is obtained. With a double-order selection scheme, our test statistic is constructed by combining the deviations at different lags in the systematic samples. The null asymptotic distribution of the statistic proposed is derived and the consistency of the test is shown under fixed and local alternatives. Simulation studies demonstrate well-behaved finite sample properties of the method proposed. Comparisons with some existing tests in terms of power are given both analytically and empirically. In addition, the method proposed is applied to check the stationarity assumption of a chemical process viscosity readings data set.

  18. Non-parametric method for separating domestic hot water heating spikes and space heating

    DEFF Research Database (Denmark)

    Bacher, Peder; de Saint-Aubain, Philip Anton; Christiansen, Lasse Engbo

    2016-01-01

    In this paper a method for separating spikes from a noisy data series, where the data change and evolve over time, is presented. The method is applied on measurements of the total heat load for a single family house. It relies on the fact that the domestic hot water heating is a process generating...

  19. Tremor Detection Using Parametric and Non-Parametric Spectral Estimation Methods : A Comparison with Clinical Assessment

    NARCIS (Netherlands)

    Martinez Manzanera, Octavio; Elting, Jan Willem; van der Hoeven, Johannes H; Maurits, Natasha M

    2016-01-01

    In the clinic, tremor is diagnosed during a time-limited process in which patients are observed and the characteristics of tremor are visually assessed. For some tremor disorders, a more detailed analysis of these characteristics is needed. Accelerometry and electromyography can be used to obtain a

  20. Bayesian Non-Parametric Mixtures of GARCH(1,1) Models

    Directory of Open Access Journals (Sweden)

    John W. Lau

    2012-01-01

    Full Text Available Traditional GARCH models describe volatility levels that evolve smoothly over time, generated by a single GARCH regime. However, nonstationary time series data may exhibit abrupt changes in volatility, suggesting changes in the underlying GARCH regimes. Further, the number and times of regime changes are not always obvious. This article outlines a nonparametric mixture of GARCH models that is able to estimate the number and times of volatility regime changes by mixing over the Poisson-Kingman process. This process is a generalisation of the Dirichlet process typically used in nonparametric models; for time-dependent data it provides a richer clustering structure, and its application to time series data is novel. Inference is Bayesian, and a Markov chain Monte Carlo algorithm to explore the posterior distribution is described. The methodology is illustrated on the Standard and Poor's 500 financial index.
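Within each regime of such a mixture, volatility follows the standard GARCH(1,1) recursion; a minimal sketch of a single regime (a textbook recursion, not the article's mixture model; parameter values are illustrative):

```python
import numpy as np

def garch11_variance(returns, omega, alpha, beta):
    """Conditional variance recursion of a single GARCH(1,1) regime:
    sigma2[t] = omega + alpha * r[t-1]**2 + beta * sigma2[t-1],
    initialised at the unconditional variance omega / (1 - alpha - beta)."""
    r = np.asarray(returns, float)
    sigma2 = np.empty(len(r))
    sigma2[0] = omega / (1.0 - alpha - beta)
    for t in range(1, len(r)):
        sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
    return sigma2

# with zero returns, the variance decays geometrically toward omega/(1-beta)
s = garch11_variance(np.zeros(3), omega=0.1, alpha=0.1, beta=0.8)
```

The nonparametric mixture described above lets the parameter triple (omega, alpha, beta) jump between such regimes, with the Poisson-Kingman prior governing how many regimes the data support.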

  1. A new non-parametric stationarity test of time series in the time domain

    KAUST Repository

    Jin, Lei; Wang, Suojin; Wang, Haiyan

    2014-01-01

    © 2015 The Royal Statistical Society and Blackwell Publishing Ltd. We propose a new double-order selection test for checking second-order stationarity of a time series. To develop the test, a sequence of systematic samples is defined via Walsh

  2. Distributed Non-Parametric Representations for Vital Filtering: UW at TREC KBA 2014

    Science.gov (United States)

    2014-11-01

    ...information about the entity: every new document would drive an update to the entity profile, strongly suggesting vitalness.

  3. Comparing non-parametric methods for ungrouping coarsely aggregated age-specific distributions

    DEFF Research Database (Denmark)

    Rizzi, Silvia; Thinggaard, Mikael; Vaupel, James W.

    2016-01-01

    Demographers have often access to vital statistics that are less than ideal for the purpose of their research. In many instances demographic data are reported in coarse histograms, where the values given are only the summation of true latent values, thereby making detailed analysis troublesome. O...

  4. A non-parametric/parametric analysis of the universal soil loss equation

    NARCIS (Netherlands)

    Sonneveld, B.G.J.S.; Nearing, M.A.

    2002-01-01

    Due to its modest data demands and transparent model structure, the Universal Soil Loss Equation (USLE) remains the most popular tool for water erosion hazard assessment. However, the model has several shortcomings, two of which are likely to have prominent implications for the model results. First,

  5. A non-parametric/parametric analysis of the universal soil loss equation

    NARCIS (Netherlands)

    Sonneveld, B.G.J.S.; Nearing, M.A.

    2003-01-01

    Due to its modest data demands and transparent model structure, the Universal Soil Loss Equation (USLE) remains the most popular tool for water erosion hazard assessment. However, the model has several shortcomings, two of which are likely to have prominent implications for the model results. First,

  6. Non-parametric causality detection: An application to social media and financial data

    Science.gov (United States)

    Tsapeli, Fani; Musolesi, Mirco; Tino, Peter

    2017-10-01

    According to behavioral finance, stock market returns are influenced by emotional, social and psychological factors. Several recent works support this theory by providing evidence of correlation between stock market prices and collective sentiment indexes measured using social media data. However, a pure correlation analysis is not sufficient to prove that stock market returns are influenced by such emotional factors since both stock market prices and collective sentiment may be driven by a third unmeasured factor. Controlling for factors that could influence the study by applying multivariate regression models is challenging given the complexity of stock market data. False assumptions about the linearity or non-linearity of the model and inaccuracies on model specification may result in misleading conclusions. In this work, we propose a novel framework for causal inference that does not require any assumption about a particular parametric form of the model expressing statistical relationships among the variables of the study and can effectively control a large number of observed factors. We apply our method in order to estimate the causal impact that information posted in social media may have on stock market returns of four big companies. Our results indicate that social media data not only correlate with stock market returns but also influence them.

  7. Classification of thermal-hydraulic scenarios using non-parametric techniques

    International Nuclear Information System (INIS)

    Villamizar, M.; Martorell, S.; Sanchez-Saez, F.; Villanueva, J. F.; Carlos, S.; Sanchez, A.

    2013-01-01

    The objective of this study is to apply a probabilistic neural network (PNN) that classifies cladding-temperature trajectories, from the beginning of the accident to the stabilization of the plant, on the basis of a given set of inputs; starting from these groups, models are then established that predict the peak cladding temperature.

  8. A Non-parametric Method for Calculating Conditional Stressed Value at Risk

    Directory of Open Access Journals (Sweden)

    Kohei Marumo

    2017-01-01

    Full Text Available We consider the Value at Risk (VaR of a portfolio under stressed conditions. In practice, the stressed VaR (sVaR is commonly calculated using a data set that includes the stressed period. It tells us how much the risk amount increases if we use the stressed data set. In this paper, we consider the VaR under stress scenarios. Technically, this can be done by deriving the distribution of profit or loss conditioned on the value of risk factors. We use two methods: one based on a linear model and one based on the Hermite expansion discussed by Marumo and Wolff (2013, 2016. Numerical examples show that the method using the Hermite expansion is capable of capturing non-linear effects such as correlation collapse and volatility clustering, which are often observed in the markets.
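The linear-model variant can be sketched as follows (the Hermite-expansion method is not reproduced here). Assuming a single risk factor and conditionally Gaussian P&L, the stressed VaR is read off the conditional distribution at the stressed factor value; the data and parameter values are synthetic:

```python
import numpy as np
from statistics import NormalDist

def conditional_var(pl, factor, factor_stress, alpha=0.99):
    # Linear model: pl ~ intercept + slope * factor + Gaussian residual.
    slope, intercept = np.polyfit(factor, pl, 1)
    resid = pl - (intercept + slope * factor)
    sigma = resid.std(ddof=2)
    # VaR = minus the conditional alpha-tail quantile at the stressed value.
    mean_stressed = intercept + slope * factor_stress
    return -(mean_stressed + sigma * NormalDist().inv_cdf(1 - alpha))

rng = np.random.default_rng(2)
f = rng.normal(size=5000)                          # risk factor (e.g. index return)
pl = 2.0 * f + rng.normal(scale=0.5, size=5000)    # portfolio profit/loss
svar = conditional_var(pl, f, factor_stress=-3.0)  # VaR given a -3 sigma stress
```

With slope 2, residual sd 0.5 and a stress of -3, the theoretical value is -(-6 + 0.5 * z_{0.01}) ≈ 7.16.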

  9. Measuring the Influence of Information Networks on Transaction Costs Using a Non-parametric Regression Technique

    DEFF Research Database (Denmark)

    Henningsen, Geraldine; Henningsen, Arne; Henning, Christian

    2011-01-01

    to nonpublic information. Our analysis shows that information networks have an impact on the level of TAC. Many resources that are sacrificed for TAC are inputs that also enter the technical production process. As most production data do not separate between these two usages of inputs, high transaction costs...

  10. Quantifying biometric life insurance risks with non-parametric smoothing methods

    NARCIS (Netherlands)

    Tomas, J.

    2013-01-01

    Julien Tomas describes distribution-free (non-parametric) smoothing methods for the mortality experience of life insurance portfolios. Like parametric methods, these methods also tend to produce biased estimates, but in such a way that a larger bias can be traded off against a lower

  11. Non-parametric classification of esophagus motility by means of neural networks

    DEFF Research Database (Denmark)

    Thøgersen, C; Rasmussen, C; Rutz, K

    1997-01-01

    Automatic long-term recording of esophageal pressures by means of intraluminal transducers is used increasingly for evaluation of esophageal function. Most automatic analysis techniques are based on detection of derived parameters from the time series by means of arbitrary rule-based criteria... The aim of the present work has been to test the ability of neural networks to identify abnormal contraction patterns in patients with non-obstructive dysphagia (NOBD). Nineteen volunteers and 22 patients with NOBD underwent simultaneous recordings of four pressures in the esophagus for at least 23 hours...

  12. Equipment Health Monitoring with Non-Parametric Statistics for Online Early Detection and Scoring of Degradation

    Science.gov (United States)

    2014-10-02

    defined by Eqs. (3)–(4) (Greenwell & Finch, 2004; Kar & Mohanty, 2006). The p-value provides the metric for novelty scoring: p = Q_KS(z) = 2 Σ_{j=1}^∞ (−1)^(j−1) exp(−2 j² z²) ... provides early detection of degradation and the ability to score its significance in order to inform maintenance planning and consequently reduce disruption ... to yield actionable information, signals are typically processed from raw measurements into a reduced-dimension novelty summary value that may be more easily
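The Kolmogorov-Smirnov tail function quoted in the fragment above is straightforward to evaluate; a short sketch follows. The conversion from a two-sample KS distance to the argument z uses the standard small-sample correction, which may differ from the report's exact Eqs. (3)-(4):

```python
import math

def qks(z, terms=100):
    # Kolmogorov-Smirnov tail function:
    # Q_KS(z) = 2 * sum_{j>=1} (-1)^(j-1) * exp(-2 j^2 z^2)
    return 2.0 * sum((-1) ** (j - 1) * math.exp(-2.0 * j * j * z * z)
                     for j in range(1, terms + 1))

def ks_pvalue(d, n, m):
    # p-value for a two-sample KS distance d with sample sizes n and m,
    # via the asymptotic distribution with a common finite-sample correction.
    ne = n * m / (n + m)
    return qks((math.sqrt(ne) + 0.12 + 0.11 / math.sqrt(ne)) * d)
```

A small p (large z) flags a distribution shift between the reference and current windows, i.e. a potentially degraded regime.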

  13. Non-parametric probabilistic forecasts of wind power: required properties and evaluation

    DEFF Research Database (Denmark)

    Pinson, Pierre; Nielsen, Henrik Aalborg; Møller, Jan Kloppenborg

    2007-01-01

    of a single or a set of quantile forecasts. The required and desirable properties of such probabilistic forecasts are defined and a framework for their evaluation is proposed. This framework is applied for evaluating the quality of two statistical methods producing full predictive distributions from point...
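A standard building block for evaluating such quantile forecasts is the pinball (quantile) loss; the paper's full framework involves more than this (reliability, sharpness), but the basic score can be sketched as:

```python
import numpy as np

def pinball_loss(y, q_pred, tau):
    # Average pinball loss for a predicted tau-quantile; lower is better.
    diff = y - q_pred
    return float(np.mean(np.maximum(tau * diff, (tau - 1) * diff)))

y = np.array([1.0, 2.0, 3.0, 4.0])
loss = pinball_loss(y, np.full(4, 2.5), tau=0.5)  # = 0.5 * mean absolute error
```

For tau = 0.5 the pinball loss reduces to half the mean absolute error, which is why median forecasts are often scored with MAE.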

  14. Non-parametric photic entrainment of Djungarian hamsters with different rhythmic phenotypes

    Czech Academy of Sciences Publication Activity Database

    Schöttner, Konrad; Hauer, J.; Weinert, D.

    2016-01-01

    Vol. 33, No. 5 (2016), pp. 506-519 ISSN 0742-0528 Institutional support: RVO:60077344 Keywords: delayed activity onset * Djungarian hamster * free-running period Subject RIV: ED - Physiology Impact factor: 2.562, year: 2016

  15. Using a Praxeology Approach to the Rational Choice of Specialized Software in the Preparation of the Computer Science Teacher

    Directory of Open Access Journals (Sweden)

    Elena Semenikhina

    2018-02-01

    Full Text Available From the perspective of praxeology, the article considers the formation of future teachers' ability to choose the software that proves most rational for solving a given professional task (illustrated by the preparation of computer science teachers to use dynamic mathematics software (DMS. The formation of such abilities is based on the formula "one task - different software". The methodology of organizing the corresponding experimental study at Sumy State Pedagogical University (Ukraine is described. The results of the statistical analysis of the data, based on the non-parametric sign test for dependent samples, are given. The analysis shows that this approach yields positive dynamics in the level of future computer science teachers' preparation at the 0.05 significance level.

  16. [Approaches to medical training among physicians who teach; analysis of two different educational strategies].

    Science.gov (United States)

    Loría-Castellanos, Jorge; Rivera-Ibarra, Doris Beatriz; Márquez-Avila, Guadalupe

    2009-01-01

    To compare the outreach of a promotional educational strategy that focuses on active participation with that of a more traditional approach to medical training. A quasi-experimental design was approved by the research committee. We compared the outreach of two different approaches to medical training. We administered a validated instrument that included 72 items analyzing statements used to measure educational tasks, in the form of duplets, through 3 indicators. A group of seven physicians actively participating in teaching activities was stratified according to teaching approach: one traditional, the other a promotional strategy aimed at increasing participation. All participants signed informed consent before answering the research instruments. Statistical analysis was done using non-parametric tests. Mann-Whitney results did not show differences among the groups in the preliminary analysis. A second analysis with the same test after the interventions found significant differences (p ≤ 0.018) in favor of those subjects who had participated in the promotional approach, mainly in the indicator measuring "consequence". The Wilcoxon test showed that all participants in the promotional approach increased significantly (p ≤ 0.018) in the 3 main indicators as compared with the control group. A promotional strategy aimed at increasing physician participation constitutes a more profitable approach when compared with traditional teaching methods.

  17. Self-consistent approach to the electronic problem in disordered solids

    International Nuclear Information System (INIS)

    Taguena-Martinez, J.; Barrio, R.A.; Martinez, E.; Yndurain, F.

    1984-01-01

    A simple formalism is developed that allows us to perform a self-consistent, non-parametrized calculation for a non-periodic system by finding the thermodynamically averaged Green's function of a cluster-Bethe-lattice system. (Author) [pt

  18. Modeling gene expression measurement error: a quasi-likelihood approach

    Directory of Open Access Journals (Sweden)

    Strimmer Korbinian

    2003-03-01

    Full Text Available Abstract Background Using suitable error models for gene expression measurements is essential in the statistical analysis of microarray data. However, the true probabilistic model underlying gene expression intensity readings is generally not known. Instead, in currently used approaches some simple parametric model is assumed (usually a transformed normal distribution or the empirical distribution is estimated. However, both these strategies may not be optimal for gene expression data, as the non-parametric approach ignores known structural information whereas the fully parametric models run the risk of misspecification. A further related problem is the choice of a suitable scale for the model (e.g. observed vs. log-scale. Results Here a simple semi-parametric model for gene expression measurement error is presented. In this approach inference is based on an approximate likelihood function (the extended quasi-likelihood. Only partial knowledge about the unknown true distribution is required to construct this function. In the case of gene expression this information is available in the form of the postulated (e.g. quadratic variance structure of the data. As the quasi-likelihood behaves (almost like a proper likelihood, it allows for the estimation of calibration and variance parameters, and it is also straightforward to obtain corresponding approximate confidence intervals. Unlike most other frameworks, it also allows analysis on any preferred scale, i.e. both on the original linear scale as well as on a transformed scale. It can also be employed in regression approaches to model systematic (e.g. array or dye effects. Conclusions The quasi-likelihood framework provides a simple and versatile approach to analyze gene expression data that does not make any strong distributional assumptions about the underlying error model. For several simulated as well as real data sets it provides a better fit to the data than competing models. In an example it also
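Quasi-likelihood estimation proper is more involved, but the postulated quadratic variance structure var(y) = sigma^2 + alpha * mu^2 can be illustrated with a simple moment-based sketch on synthetic replicate data (all parameter values are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(3)
true_sigma2, true_alpha = 0.4, 0.02
mu = np.linspace(1.0, 50.0, 40)      # "true" expression levels
reps = 200                           # replicate measurements per gene
noise_sd = np.sqrt(true_sigma2 + true_alpha * mu ** 2)
y = mu[:, None] + rng.normal(size=(40, reps)) * noise_sd[:, None]

s2 = y.var(axis=1, ddof=1)                           # per-gene sample variance
alpha_hat, sigma2_hat = np.polyfit(mu ** 2, s2, 1)   # regress s^2 on mu^2
```

The sample variances grow quadratically with the mean, and a regression of s^2 on mu^2 recovers the variance parameters that the quasi-likelihood framework would estimate more efficiently.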

  19. The quantile regression approach to efficiency measurement: insights from Monte Carlo simulations.

    Science.gov (United States)

    Liu, Chunping; Laporte, Audrey; Ferguson, Brian S

    2008-09-01

    In the health economics literature there is an ongoing debate over approaches used to estimate the efficiency of health systems at various levels, from the level of the individual hospital - or nursing home - up to that of the health system as a whole. The two most widely used approaches to evaluating the efficiency with which various units deliver care are non-parametric data envelopment analysis (DEA) and parametric stochastic frontier analysis (SFA). Productivity researchers tend to have very strong preferences over which methodology to use for efficiency estimation. In this paper, we use Monte Carlo simulation to compare the performance of DEA and SFA in terms of their ability to accurately estimate efficiency. We also evaluate quantile regression as a potential alternative approach. A Cobb-Douglas production function, random error terms and a technical inefficiency term with different distributions are used to calculate the observed output. The results, based on these experiments, suggest that neither DEA nor SFA can be regarded as clearly dominant, and that, depending on the quantile estimated, the quantile regression approach may be a useful addition to the armamentarium of methods for estimating technical efficiency.
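As a rough sketch of the quantile-regression approach to frontier-style estimation, the pinball loss can be minimized by iteratively reweighted least squares (a simplification; production code would use a linear-programming solver). The Cobb-Douglas data below are synthetic, with a one-sided inefficiency term:

```python
import numpy as np

def quantile_regression(X, y, tau=0.5, iters=100, eps=1e-6):
    # Minimize the pinball loss rho_tau(r) = r * (tau - 1[r < 0]) by IRLS:
    # rho_tau(r) = c(r) * |r| with c = tau for r > 0 and 1 - tau otherwise,
    # so reweight a least-squares fit with w = c(r) / |r| and iterate.
    Xb = np.column_stack([np.ones(len(y)), X])
    beta = np.linalg.lstsq(Xb, y, rcond=None)[0]
    for _ in range(iters):
        r = y - Xb @ beta
        w = np.where(r > 0, tau, 1 - tau) / np.maximum(np.abs(r), eps)
        A = Xb.T @ (Xb * w[:, None])
        b = Xb.T @ (w * y)
        beta = np.linalg.solve(A, b)
    return beta

rng = np.random.default_rng(4)
logx = rng.uniform(0.0, 3.0, 2000)                 # log input (Cobb-Douglas)
u = rng.exponential(0.3, 2000)                     # one-sided inefficiency
logy = 1.0 + 0.6 * logx - u + rng.normal(scale=0.1, size=2000)
beta = quantile_regression(logx, logy, tau=0.9)    # high quantile ~ frontier
```

The high-quantile fit tracks the production frontier while still recovering the true elasticity (0.6), which is the intuition behind using quantile regression for efficiency measurement.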

  20. Species richness in soil bacterial communities: a proposed approach to overcome sample size bias.

    Science.gov (United States)

    Youssef, Noha H; Elshahed, Mostafa S

    2008-09-01

    Estimates of species richness based on 16S rRNA gene clone libraries are increasingly utilized to gauge the level of bacterial diversity within various ecosystems. However, previous studies have indicated that regardless of the utilized approach, species richness estimates obtained are dependent on the size of the analyzed clone libraries. We here propose an approach to overcome sample size bias in species richness estimates in complex microbial communities. Parametric (Maximum likelihood-based and rarefaction curve-based) and non-parametric approaches were used to estimate species richness in a library of 13,001 near full-length 16S rRNA clones derived from soil, as well as in multiple subsets of the original library. Species richness estimates obtained increased with the increase in library size. To obtain a sample size-unbiased estimate of species richness, we calculated the theoretical clone library sizes required to encounter the estimated species richness at various clone library sizes, used curve fitting to determine the theoretical clone library size required to encounter the "true" species richness, and subsequently determined the corresponding sample size-unbiased species richness value. Using this approach, sample size-unbiased estimates of 17,230, 15,571, and 33,912 were obtained for the ML-based, rarefaction curve-based, and ACE-1 estimators, respectively, compared to bias-uncorrected values of 15,009, 11,913, and 20,909.
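Non-parametric richness estimators of the kind discussed above are easy to sketch. The example below uses the classic Chao1 lower bound rather than the ACE-1 estimator from the paper, purely for brevity:

```python
def chao1(counts):
    # Chao1 lower-bound richness: S_obs + F1^2 / (2 * F2), where F1 and F2
    # are the numbers of species observed exactly once and exactly twice.
    s_obs = sum(1 for c in counts if c > 0)
    f1 = sum(1 for c in counts if c == 1)
    f2 = sum(1 for c in counts if c == 2)
    if f2 == 0:
        return s_obs + f1 * (f1 - 1) / 2.0   # bias-corrected form when F2 = 0
    return s_obs + f1 * f1 / (2.0 * f2)

richness = chao1([1, 1, 1, 1, 2, 2, 3, 5, 8, 10])   # 10 observed species
```

With 4 singletons and 2 doubletons the estimate is 10 + 16/4 = 14 species; like the estimators in the paper, it rises as the library captures more rare taxa, which is exactly the sample-size dependence the authors correct for.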

  1. Family members' involvement in psychiatric care: experiences of the healthcare professionals' approach and feeling of alienation.

    Science.gov (United States)

    Ewertzon, M; Lützén, K; Svensson, E; Andershed, B

    2010-06-01

    The involvement of family members in psychiatric care is important for the recovery of persons with psychotic disorders and subsequently reduces the burden on the family. Earlier qualitative studies suggest that the participation of family members can be limited by how they experience the professionals' approach, which suggests a connection to the concept of alienation. Thus, the aim of this study was to investigate, in a national sample, family members' experiences of the approach of psychiatric healthcare professionals. Data were collected with the Family Involvement and Alienation Questionnaire. The median and quartiles were used to describe the distributions, and data were analysed with non-parametric statistical methods. Seventy family members of persons receiving psychiatric care participated in the study. The results indicate that a majority of the participants reported experiencing a negative approach from the professionals, indicating a lack of confirmation and cooperation. The results also indicate that a majority of the participants felt powerlessness and social isolation with respect to the care being provided, indicating feelings of alienation. A significant but weak association was found between the family members' experiences of the professionals' approach and their feelings of alienation.

  2. An obstructive sleep apnea detection approach using kernel density classification based on single-lead electrocardiogram.

    Science.gov (United States)

    Chen, Lili; Zhang, Xi; Wang, Hui

    2015-05-01

    Obstructive sleep apnea (OSA) is a common sleep disorder that often remains undiagnosed, leading to an increased risk of developing cardiovascular diseases. Polysomnography (PSG) is currently used as the gold standard for screening OSA. However, because it is time consuming, expensive and causes discomfort, alternative techniques based on a reduced set of physiological signals have been proposed to solve this problem. This study proposes a convenient non-parametric kernel density-based approach for detection of OSA using single-lead electrocardiogram (ECG) recordings. Selected physiologically interpretable features are extracted from segmented RR intervals, which are obtained from ECG signals. These features are fed into the kernel density classifier to detect apnea events, and the bandwidth for the density of each class (normal or apnea) is automatically chosen through an iterative bandwidth selection algorithm. To validate the proposed approach, RR intervals were extracted from ECG signals of 35 subjects obtained from a sleep apnea database ( http://physionet.org/cgi-bin/atm/ATM ). The results indicate that the kernel density classifier, with two features for apnea event detection, achieves a mean accuracy of 82.07 %, with mean sensitivity of 83.23 % and mean specificity of 80.24 %. Compared with other existing methods, the proposed kernel density approach achieves comparably good performance using fewer features without significantly losing discriminant power, which indicates that it could be widely used for home-based screening or diagnosis of OSA.
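The kernel-density classifier at the heart of this approach can be sketched in one dimension. The paper selects bandwidths with an iterative algorithm; the sketch below substitutes Silverman's rule of thumb, and the feature values are synthetic:

```python
import numpy as np

def silverman_bw(x):
    # Rule-of-thumb bandwidth (stands in for the paper's iterative selection).
    return 1.06 * x.std(ddof=1) * len(x) ** (-0.2)

def kde_logpdf(train, x, h):
    # Log of a 1-D Gaussian kernel density estimate at the points x.
    z = (x[:, None] - train[None, :]) / h
    dens = np.mean(np.exp(-0.5 * z ** 2), axis=1) / (h * np.sqrt(2 * np.pi))
    return np.log(dens + 1e-300)

def classify(train_normal, train_apnea, x):
    # Assign each sample to the class with the higher estimated density.
    ln = kde_logpdf(train_normal, x, silverman_bw(train_normal))
    la = kde_logpdf(train_apnea, x, silverman_bw(train_apnea))
    return (la > ln).astype(int)          # 1 = apnea

rng = np.random.default_rng(5)
normal = rng.normal(0.8, 0.1, 300)        # synthetic RR-interval feature
apnea = rng.normal(1.2, 0.15, 300)
test_x = np.concatenate([rng.normal(0.8, 0.1, 100), rng.normal(1.2, 0.15, 100)])
truth = np.array([0] * 100 + [1] * 100)
acc = float(np.mean(classify(normal, apnea, test_x) == truth))
```

Because the decision rule compares estimated densities directly, no parametric form is assumed for either class, which is the appeal of the method for physiological features.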

  3. An information-theoretic approach to motor action decoding with a reconfigurable parallel architecture.

    Science.gov (United States)

    Craciun, Stefan; Brockmeier, Austin J; George, Alan D; Lam, Herman; Príncipe, José C

    2011-01-01

    Methods for decoding movements from neural spike counts using adaptive filters often rely on minimizing the mean-squared error. However, for non-Gaussian distribution of errors, this approach is not optimal for performance. Therefore, rather than using probabilistic modeling, we propose an alternate non-parametric approach. In order to extract more structure from the input signal (neuronal spike counts) we propose using minimum error entropy (MEE), an information-theoretic approach that minimizes the error entropy as part of an iterative cost function. However, the disadvantage of using MEE as the cost function for adaptive filters is the increase in computational complexity. In this paper we present a comparison between the decoding performance of the analytic Wiener filter and a linear filter trained with MEE, which is then mapped to a parallel architecture in reconfigurable hardware tailored to the computational needs of the MEE filter. We observe considerable speedup from the hardware design. The adaptation of filter weights for the multiple-input, multiple-output linear filters, necessary in motor decoding, is a highly parallelizable algorithm. It can be decomposed into many independent computational blocks with a parallel architecture readily mapped to a field-programmable gate array (FPGA) and scales to large numbers of neurons. By pipelining and parallelizing independent computations in the algorithm, the proposed parallel architecture has sublinear increases in execution time with respect to both window size and filter order.
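The MEE cost can be made concrete via the quadratic Rényi information potential of the errors, whose pairwise-kernel sum also makes the O(N²) computational cost mentioned above visible. The sketch below simply checks that a well-fit linear filter attains a higher information potential (lower error entropy) than a poor one; the data are synthetic and this is not the authors' FPGA implementation:

```python
import numpy as np

def information_potential(e, sigma=1.0):
    # Quadratic Renyi information potential of the errors:
    # V(e) = mean_{i,j} exp(-(e_i - e_j)^2 / (4 sigma^2)).
    # Maximizing V is equivalent to minimizing the Renyi error entropy; the
    # pairwise sum is the source of MEE's O(N^2) cost per weight update.
    d = e[:, None] - e[None, :]
    return float(np.mean(np.exp(-d ** 2 / (4 * sigma ** 2))))

rng = np.random.default_rng(6)
X = rng.normal(size=(200, 3))             # input features (synthetic spike counts)
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + rng.normal(scale=0.1, size=200)

w_ls = np.linalg.lstsq(X, y, rcond=None)[0]   # analytic (Wiener-type) fit
v_bad = information_potential(y)              # errors of the zero filter
v_good = information_potential(y - X @ w_ls)  # errors of the fitted filter
```

An MEE-trained filter ascends this potential by gradient steps over the pairwise kernel terms, which is exactly the independent, parallelizable block structure the paper maps onto an FPGA.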

  4. The Wally plot approach to assess the calibration of clinical prediction models.

    Science.gov (United States)

    Blanche, Paul; Gerds, Thomas A; Ekstrøm, Claus T

    2017-12-06

    A prediction model is calibrated if, roughly, for any percentage x we can expect that x subjects out of 100 experience the event among all subjects that have a predicted risk of x%. Typically, the calibration assumption is assessed graphically, but in practice it is often challenging to judge whether a "disappointing" calibration plot reflects a genuine departure from the calibration assumption or just "bad luck" due to sampling variability. To address this issue, we propose a graphical approach that visualizes how much a calibration plot agrees with the calibration assumption. The approach is mainly based on the idea of generating new plots which mimic the available data under the calibration assumption. The method handles the common non-trivial situations in which the data contain censored observations and occurrences of competing events, building on ideas from constrained non-parametric maximum likelihood estimation. Two examples from large cohort data illustrate our proposal. The 'wally' R package is provided to make the methodology easily usable.
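The core "mimic" idea (regenerate outcomes under the calibration assumption and see what calibration plots then look like) can be sketched without the censoring and competing-risk machinery that the wally package handles:

```python
import numpy as np

def calibration_curve(p, y, bins=10):
    # Mean predicted risk vs. observed event frequency per risk decile.
    edges = np.quantile(p, np.linspace(0.0, 1.0, bins + 1))
    idx = np.clip(np.searchsorted(edges, p, side="right") - 1, 0, bins - 1)
    pred = np.array([p[idx == b].mean() for b in range(bins)])
    obs = np.array([y[idx == b].mean() for b in range(bins)])
    return pred, obs

rng = np.random.default_rng(7)
p = rng.uniform(0.01, 0.99, 20000)   # predicted risks from some model
# "Mimic" outcomes: simulate events under the calibration assumption, then
# inspect how much a perfectly calibrated plot still wiggles by chance alone.
y_mimic = rng.binomial(1, p)
pred, obs = calibration_curve(p, y_mimic)
```

Comparing the real calibration plot against several such mimic plots shows whether its wiggles exceed what sampling variability alone would produce.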

  5. A Gaussian process regression based hybrid approach for short-term wind speed prediction

    International Nuclear Information System (INIS)

    Zhang, Chi; Wei, Haikun; Zhao, Xin; Liu, Tianhong; Zhang, Kanjian

    2016-01-01

    Highlights: • A novel hybrid approach is proposed for short-term wind speed prediction. • This method combines the parametric AR model with the non-parametric GPR model. • The relative importance of different inputs is considered. • Different types of covariance functions are considered and combined. • It can provide both accurate point forecasts and satisfactory prediction intervals. - Abstract: This paper proposes a hybrid model based on autoregressive (AR) model and Gaussian process regression (GPR) for probabilistic wind speed forecasting. In the proposed approach, the AR model is employed to capture the overall structure from wind speed series, and the GPR is adopted to extract the local structure. Additionally, automatic relevance determination (ARD) is used to take into account the relative importance of different inputs, and different types of covariance functions are combined to capture the characteristics of the data. The proposed hybrid model is compared with the persistence model, artificial neural network (ANN), and support vector machine (SVM) for one-step ahead forecasting, using wind speed data collected from three wind farms in China. The forecasting results indicate that the proposed method can not only improve point forecasts compared with other methods, but also generate satisfactory prediction intervals.
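The GPR component of the hybrid can be sketched with a plain RBF-kernel posterior mean (ARD and the AR component are omitted; the data are a synthetic smooth series standing in for wind-speed structure):

```python
import numpy as np

def gpr_predict(X, y, Xs, length=1.0, noise=0.1):
    # Posterior mean of a zero-mean Gaussian process with an RBF kernel.
    def k(a, b):
        return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length ** 2)
    K = k(X, X) + noise ** 2 * np.eye(len(X))
    return k(Xs, X) @ np.linalg.solve(K, y)

# Synthetic smooth series standing in for the local wind-speed structure.
X = np.linspace(0.0, 6.0, 50)
y = np.sin(X)
Xs = np.linspace(0.5, 5.5, 20)
pred = gpr_predict(X, y, Xs, length=1.0, noise=0.05)
```

In the paper's hybrid, an AR model first captures the global structure and the GPR models what remains; the same posterior formula also yields a predictive variance, which is what makes the prediction intervals possible.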

  6. Quality assessment of Isfahan Medical Faculty web site electronic services and prioritizing solutions using analytic hierarchy process approach.

    Science.gov (United States)

    Hajrahimi, Nafiseh; Dehaghani, Sayed Mehdi Hejazi; Hajrahimi, Nargess; Sarmadi, Sima

    2014-01-01

    Implementing information technology in the best possible way can bring many advantages such as applying electronic services and facilitating tasks. Therefore, assessment of service providing systems is a way to improve the quality and elevate these systems, including e-commerce, e-government, e-banking, and e-learning. This study was aimed to evaluate the electronic services on the website of Isfahan University of Medical Sciences in order to propose solutions to improve them. Furthermore, we aim to rank the solutions based on the factors that enhance the quality of electronic services by using the analytic hierarchy process (AHP) method. A non-parametric test was used to assess the quality of electronic services. The assessment of propositions was based on the Aqual model and they were prioritized using the AHP approach. The AHP approach was used because it directly applies experts' deductions in the model and leads to more objective results in the analysis and in prioritizing the risks. After evaluating the quality of the electronic services, a multi-criteria decision-making framework was used to prioritize the proposed solutions. Non-parametric tests and the AHP approach were applied using Expert Choice software. The results showed that students were satisfied with most of the indicators. Only a few indicators received low satisfaction from students, including design attractiveness, the amount of explanation and detail of information, honesty and responsiveness of authorities, and the role of e-services in the user's relationship with the university. After interviews with Information and Communications Technology (ICT) experts at the university, measurement criteria and solutions to improve the quality were collected. The best solutions were selected with the Expert Choice (EC) software. According to the results, the solution "controlling and improving the process for handling user complaints" is of the utmost importance; authorities should implement it on the website and place great importance on keeping this process up to date.

  8. Censoring: a new approach for detection limits in total-reflection X-ray fluorescence

    International Nuclear Information System (INIS)

    Pajek, M.; Kubala-Kukus, A.; Braziewicz, J.

    2004-01-01

    It is shown that the detection limits in total-reflection X-ray fluorescence (TXRF), which restrict quantification of very low concentrations of trace elements in samples, can be accounted for using the statistical concept of censoring. We demonstrate that incomplete TXRF measurements containing so-called 'nondetects', i.e. non-measured concentrations falling below the detection limits and represented by the estimated detection limit values, can be viewed as left random-censored data, which can be further analyzed using the Kaplan-Meier (KM) method correcting for nondetects. Within this approach, which uses the Kaplan-Meier product-limit estimator to obtain the cumulative distribution function corrected for nondetects, the mean and median of the detection-limit-censored concentrations can be estimated in a non-parametric way. Monte Carlo simulations show that the Kaplan-Meier approach yields highly accurate estimates of the mean and median concentrations, within a few percent of the simulated, uncensored data. This means that the uncertainties of the KM-estimated mean and median are in fact limited only by the number of studied samples and not by the applied correction procedure for nondetects itself. On the other hand, it is observed that, in cases where the concentration of a given element is not measured in all the samples, simple approaches to estimating a mean concentration from the data yield erroneous, systematically biased results. The discussed random left-censoring approach was applied to analyze TXRF detection-limit-censored concentration measurements of trace elements in biomedical samples. We emphasize that the Kaplan-Meier approach allows one to estimate mean concentrations lying substantially below the mean level of the detection limits. Consequently, this approach provides new access to lower effective detection limits for the TXRF method, which is of prime interest for
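The Kaplan-Meier treatment of nondetects can be sketched with the usual flipping trick: reflecting left-censored concentrations about a constant M turns them into right-censored data, to which the product-limit estimator applies (the concentration values below are illustrative, not from the paper):

```python
import numpy as np

def km_survival(times, observed):
    # Kaplan-Meier product-limit estimator for right-censored data:
    # returns the event times and S(t) just after each event time.
    order = np.argsort(times)
    t, d = times[order], np.asarray(observed)[order]
    n = len(t)
    s, uniq, surv = 1.0, [], []
    i = 0
    while i < n:
        j, deaths = i, 0
        while j < n and t[j] == t[i]:
            deaths += d[j]
            j += 1
        if deaths:
            s *= 1.0 - deaths / (n - i)   # n - i subjects still at risk
            uniq.append(t[i])
            surv.append(s)
        i = j
    return np.array(uniq), np.array(surv)

def km_mean(uniq, surv):
    # Mean as the area under the (step-function) survival curve.
    t = np.concatenate([[0.0], uniq])
    s = np.concatenate([[1.0], surv])
    return float(np.sum(s[:-1] * np.diff(t)))

# Flip left-censored concentrations about M > max value, so that nondetects
# ("< DL") become right-censored observations at M - DL.
conc = np.array([1.2, 0.8, 2.5, 3.1, 0.5])   # detected concentrations
dl = np.array([0.3, 0.4])                    # nondetects, at detection limits
M = 10.0
times = np.concatenate([M - conc, M - dl])
observed = np.concatenate([np.ones(len(conc)), np.zeros(len(dl))])
uniq, surv = km_survival(times, observed)
mean_conc = M - km_mean(uniq, surv)          # nondetect-corrected mean
```

With no censoring the KM mean reduces to the ordinary sample mean; with nondetects present it pulls the estimate below the mean of the detected values alone, as expected.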

  9. Assessing the effectiveness of sustainable land management policies for combating desertification: A data mining approach.

    Science.gov (United States)

    Salvati, L; Kosmas, C; Kairis, O; Karavitis, C; Acikalin, S; Belgacem, A; Solé-Benet, A; Chaker, M; Fassouli, V; Gokceoglu, C; Gungor, H; Hessel, R; Khatteli, H; Kounalaki, A; Laouina, A; Ocakoglu, F; Ouessar, M; Ritsema, C; Sghaier, M; Sonmez, H; Taamallah, H; Tezcan, L; de Vente, J; Kelly, C; Colantoni, A; Carlucci, M

    2016-12-01

    This study investigates the relationship between fine resolution, local-scale biophysical and socioeconomic contexts within which land degradation occurs, and the human responses to it. The research draws on experimental data collected under different territorial and socioeconomic conditions at 586 field sites in five Mediterranean countries (Spain, Greece, Turkey, Tunisia and Morocco). We assess the level of desertification risk under various land management practices (terracing, grazing control, prevention of wildland fires, soil erosion control measures, soil water conservation measures, sustainable farming practices, land protection measures and financial subsidies) taken as possible responses to land degradation. A data mining approach, incorporating principal component analysis, non-parametric correlations, multiple regression and canonical analysis, was developed to identify the spatial relationship between land management conditions, the socioeconomic and environmental context (described using 40 biophysical and socioeconomic indicators) and desertification risk. Our analysis identified a number of distinct relationships between the level of desertification experienced and the underlying socioeconomic context, suggesting that the effectiveness of responses to land degradation is strictly dependent on the local biophysical and socioeconomic context. Assessing the latent relationship between land management practices and the biophysical/socioeconomic attributes characterizing areas exposed to different levels of desertification risk proved to be an indirect measure of the effectiveness of field actions contrasting land degradation. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. A neural-fuzzy approach to classify the ecological status in surface waters

    International Nuclear Information System (INIS)

    Ocampo-Duque, William; Schuhmacher, Marta; Domingo, Jose L.

    2007-01-01

    A methodology based on a hybrid approach that combines fuzzy inference systems and artificial neural networks has been used to classify the ecological status of surface waters. This methodology is proposed to deal efficiently with the non-linearity and the highly subjective nature of the variables involved in this serious problem. Ecological status has been assessed with biological, hydro-morphological, and physicochemical indicators. A data set collected from 378 sampling sites in the Ebro river basin was used to train and validate the hybrid model. Up to 97.6% of sampling sites were correctly classified with neural-fuzzy models. This performance proved very competitive when compared with other classification algorithms: with non-parametric classification-regression trees and probabilistic neural networks, the predictive capacities were 90.7% and 97.0%, respectively. The proposed methodology can support decision-makers in the evaluation and classification of ecological status, as required by the EU Water Framework Directive. - Fuzzy inference systems can be used as environmental classifiers

  11. Energy consumption and economic growth: Parametric and non-parametric causality testing for the case of Greece

    International Nuclear Information System (INIS)

    Dergiades, Theologos; Martinopoulos, Georgios; Tsoulfidis, Lefteris

    2013-01-01

    The objective of this paper is to contribute towards the understanding of the linear and non-linear causal linkages between energy consumption and economic activity, making use of annual time series data for Greece over the period 1960–2008. Our study has two salient features: first, total energy consumption has been adjusted for qualitative differences among its constituent components through the thermodynamics of energy conversion. In doing so, we rule out the possibility of misleading inference with respect to causality due to aggregation bias. Second, the investigation of the causal linkage between economic growth and quality-adjusted total energy consumption is conducted within a non-linear context. Our empirical results reveal significant unidirectional linear and non-linear causal linkages running from total useful energy to economic growth. These findings may provide valuable information for designing more effective energy policies with respect to both the consumption of energy and environmental protection. - Highlights: ► The energy consumption and economic growth nexus is investigated for Greece. ► A quality-adjusted energy series is used in our analysis. ► The causality testing procedure is conducted within a non-linear context. ► A causality running from energy consumption to economic growth is verified

  12. Causal independence between energy consumption and economic growth in Liberia: Evidence from a non-parametric bootstrapped causality test

    International Nuclear Information System (INIS)

    Wesseh, Presley K.; Zoumara, Babette

    2012-01-01

    This contribution investigates causal interdependence between energy consumption and economic growth in Liberia and proposes the application of a bootstrap methodology. To better reflect causality, employment is incorporated as an additional variable. The study demonstrates evidence of distinct bidirectional Granger causality between energy consumption and economic growth. Additionally, the results show that employment in Liberia Granger-causes economic growth, a relationship that holds irrespective of the short run or long run. Evidence from a Monte Carlo experiment reveals that the asymptotic Granger causality test suffers from a size distortion problem for Liberian data, suggesting that the bootstrap technique employed in this study is more appropriate. Given the empirical results, the implication is that energy expansion policies, such as energy subsidies or low energy tariffs, would be necessary to cope with the demand exerted as a result of economic growth in Liberia. Furthermore, the contribution of employment generation to the Liberian economy may be partly determined by an adequate energy supply. Therefore, it seems fully justified to conclude that a quick shift towards energy production based on clean energy sources may significantly slow down economic growth in Liberia. Hence, the government's target to implement a long-term strategy to make Liberia a carbon-neutral country, and eventually less carbon dependent by 2050, is understandable. - Highlights: ► Causality between energy consumption and economic growth in Liberia investigated. ► There is bidirectional causality between energy consumption and economic growth. ► Energy expansion policies are necessary to cope with demand from economic growth. ► Asymptotic Granger causality test suffers size distortion problem for Liberian data. ► The bootstrap methodology employed in our study is more appropriate.
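The bootstrap idea above can be illustrated with a single-lag Granger test whose null distribution is generated by resampling residuals of the restricted model, instead of relying on the asymptotic F distribution. This is a rough Python/NumPy sketch, not the authors' procedure; all names are hypothetical.

```python
import numpy as np

def granger_f(y, x, lag=1):
    """F-statistic for 'x Granger-causes y' with a single lag."""
    yt, y1, x1 = y[lag:], y[:-lag], x[:-lag]
    n = len(yt)
    Xr = np.column_stack([np.ones(n), y1])        # restricted: past of y only
    Xu = np.column_stack([np.ones(n), y1, x1])    # unrestricted: adds past of x
    rss = lambda X: np.sum((yt - X @ np.linalg.lstsq(X, yt, rcond=None)[0]) ** 2)
    rss_r, rss_u = rss(Xr), rss(Xu)
    return (rss_r - rss_u) / (rss_u / (n - Xu.shape[1]))

def bootstrap_granger(y, x, lag=1, n_boot=500, seed=0):
    """Bootstrap p-value: regenerate y under the no-causality null by
    resampling residuals of the restricted autoregression."""
    rng = np.random.default_rng(seed)
    f_obs = granger_f(y, x, lag)
    yt, y1 = y[lag:], y[:-lag]
    Xr = np.column_stack([np.ones(len(yt)), y1])
    beta = np.linalg.lstsq(Xr, yt, rcond=None)[0]
    resid = yt - Xr @ beta
    count = 0
    for _ in range(n_boot):
        y_null = np.empty_like(y)
        y_null[:lag] = y[:lag]
        e = rng.choice(resid, size=len(yt), replace=True)
        for t in range(lag, len(y)):
            y_null[t] = beta[0] + beta[1] * y_null[t - lag] + e[t - lag]
        if granger_f(y_null, x, lag) >= f_obs:
            count += 1
    return f_obs, (count + 1) / (n_boot + 1)
```

Because the null distribution is built from the data themselves, the resulting test is less sensitive to the small-sample size distortion that the abstract reports for the asymptotic test.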

  13. A cross-country non parametric estimation of the returns to factors of production and the elasticity of scale

    Directory of Open Access Journals (Sweden)

    Adalmir Marquetti

    2009-06-01

    and 1995. The results support the hypotheses of constant returns to scale to all factors and decreasing returns to accumulable factors. The low capital-labor ratio countries show important differences in factor elasticities relative to other countries. The augmentation of the production function by human capital did not reduce the elasticity of physical capital as suggested by Mankiw, Romer and Weil (1992). Moreover, it is investigated whether the factor shares are really equal to their output elasticities. The wage share rises with the capital-labor ratio, and the sum of the output elasticities of labor and human capital is below the wage share for high capital-labor ratio countries, with the inverse holding for low capital-labor ratio countries. This indicates the presence of externalities, imperfect competition, or that the marginal theory of distribution is inaccurate.

  14. Advances in data-driven optimization of parametric and non-parametric feedforward control designs with industrial applications

    NARCIS (Netherlands)

    Tousain, R.L.; Meulen, van der S.H.; Hof, van den P.M.J.; Scherer, C.; Heuberger, P.S.C.

    2009-01-01

    The performance of many industrial control systems is determined to a large extent by the quality of both setpoint and disturbance feedforward signals. The quality that is required for a high tracking performance is generally not achieved when the controller parameters are determined on the basis of

  15. A non-parametric, supervised classification of vegetation types on the Kaibab National Forest using decision trees

    Science.gov (United States)

    Suzanne M. Joy; R. M. Reich; Richard T. Reynolds

    2003-01-01

    Traditional land classification techniques for large areas that use Landsat Thematic Mapper (TM) imagery are typically limited to the fixed spatial resolution of the sensors (30m). However, the study of some ecological processes requires land cover classifications at finer spatial resolutions. We model forest vegetation types on the Kaibab National Forest (KNF) in...

  16. The Effect of Twenty Years of Exclosure on Parametric and Non-parametric Diversity Indices in Chadegan Rangelands- Isfahan

    Directory of Open Access Journals (Sweden)

    A. Sheikhzadeh

    2016-12-01

    Exclosure is considered a management method to improve vegetation. This study aimed to evaluate the effects of exclosure on species diversity at the Zayanderod dam station in Chadegan, Isfahan. The study area was stratified based on the various management types and slope directions, and samples were collected randomly in each layer. Four perpendicular transects, 500 m in length, were established along the gradients. Fifteen plots (1×1.5 m) were established along each transect. The cover percentage, density and scientific name of the perennial species and the management condition were recorded in each plot. The Simpson and Shannon diversity indices, the Margalef and Menhinick richness indices, the Simpson evenness index, and parametric methods (frequency curves) were calculated for the grazed and ungrazed areas. An independent t-test was used to compare the diversity indices. CCA analysis was used to evaluate the relationships between species and management factors and the diversity indices. The results showed that although the diversity, richness and evenness indices in the exclosure area were higher than in the grazed area, there was no significant difference between the diversity and evenness indices of the two areas. The log-normal was the best-fitting significant distribution in the study area, which represents relatively stable communities. The ordination results showed that the grazing and exclosure areas are well separated from each other and confirmed the higher richness at the exclosure site.
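The indices named in the abstract are all simple functions of per-species abundances. A minimal Python/NumPy sketch (hypothetical helper, standard textbook formulas; the Simpson evenness variant used by the authors may differ):

```python
import numpy as np

def diversity_indices(counts):
    """Diversity measures from a vector of per-species abundances
    for one sample plot. Uses natural logs throughout."""
    counts = np.asarray(counts, dtype=float)
    counts = counts[counts > 0]
    n = counts.sum()               # total individuals
    s = counts.size                # species richness
    p = counts / n                 # relative abundances
    shannon = -np.sum(p * np.log(p))          # Shannon H'
    simpson = 1.0 - np.sum(p ** 2)            # Simpson diversity 1 - D
    margalef = (s - 1) / np.log(n)            # Margalef richness
    menhinick = s / np.sqrt(n)                # Menhinick richness
    evenness = (1.0 / np.sum(p ** 2)) / s     # Simpson evenness E_1/D
    return {"shannon": shannon, "simpson": simpson, "margalef": margalef,
            "menhinick": menhinick, "evenness": evenness}
```

For a perfectly even community of S species, Shannon H' equals ln S and Simpson evenness equals 1, which is a convenient sanity check.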

  17. A simple method for optimising transformation of non-parametric data: an illustration by reference to cortisol assays.

    Science.gov (United States)

    Clark, James E; Osborne, Jason W; Gallagher, Peter; Watson, Stuart

    2016-07-01

    Neuroendocrine data are typically positively skewed and rarely conform to the expectations of a Gaussian distribution. This can be a problem when attempting to analyse results within the framework of the general linear model, which relies on the assumption that residuals in the data are normally distributed. One frequently used method for handling violations of this assumption is to transform variables to bring residuals into closer alignment with assumptions (as residuals are not directly manipulated). This is often attempted through ad hoc traditional transformations such as square root, log and inverse. However, Box and Cox observed that these are all special cases of power transformations and proposed a more flexible method of transformation for researchers to optimise alignment with assumptions. The goal of this paper is to demonstrate the benefits of the infinitely flexible Box-Cox transformation on neuroendocrine data using syntax in SPSS. When applied to positively skewed data typical of neuroendocrine data, the majority (~2/3) of cases were brought into strict alignment with a Gaussian distribution (i.e. a non-significant Shapiro-Wilk test). Those unable to meet this criterion showed substantial improvement in distributional properties. The biggest challenge was distributions with a high ratio of kurtosis to skewness. We discuss how these cases might be handled, and we highlight some of the broader issues associated with transformation. Copyright © 2016 John Wiley & Sons, Ltd.
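The optimisation the paper performs in SPSS can be sketched in a few lines of Python/NumPy: transform with the Box-Cox family and pick the exponent λ that maximises the profile log-likelihood (a grid search here, for transparency; SPSS and SciPy use continuous optimisers). Function names are hypothetical.

```python
import numpy as np

def _bc(x, lam):
    """Box-Cox transform: (x^lam - 1)/lam, with log(x) as the lam -> 0 limit."""
    return np.log(x) if abs(lam) < 1e-12 else (x ** lam - 1.0) / lam

def boxcox_loglik(x, lam):
    """Profile log-likelihood of lam for positive data x (normal model)."""
    xt = _bc(x, lam)
    return -0.5 * len(x) * np.log(xt.var()) + (lam - 1.0) * np.log(x).sum()

def boxcox_optimal(x, grid=None):
    """Grid-search the exponent maximising the profile log-likelihood."""
    if grid is None:
        grid = np.linspace(-2.0, 2.0, 401)
    lls = [boxcox_loglik(x, lam) for lam in grid]
    lam = float(grid[int(np.argmax(lls))])
    return lam, _bc(x, lam)
```

For log-normally distributed data the optimum should land near λ = 0 (the log transform), recovering the familiar special case from the flexible family.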

  18. Accurate Traffic Flow Prediction in Heterogeneous Vehicular Networks in an Intelligent Transport System Using a Supervised Non-Parametric Classifier

    Directory of Open Access Journals (Sweden)

    Hesham El-Sayed

    2018-05-01

    Heterogeneous vehicular networks (HETVNETs) evolve from vehicular ad hoc networks (VANETs), which allow vehicles to always be connected so as to obtain safety services within intelligent transportation systems (ITSs). The services and data provided by HETVNETs should be neither interrupted nor delayed. Therefore, Quality of Service (QoS) improvement of HETVNETs is one of the topics attracting the attention of researchers and the manufacturing community. Several methodologies and frameworks have been devised by researchers to address QoS-prediction service issues. In this paper, to improve QoS, we evaluate various traffic characteristics of HETVNETs and propose a new supervised learning model to capture knowledge of all possible traffic patterns. This model is a refinement of support vector machine (SVM) kernels with a radial basis function (RBF). The proposed model produces better results than SVMs and outperforms other prediction methods used in a traffic context, as it has lower computational complexity and higher prediction accuracy.

  19. A non-parametric mixture model for genome-enabled prediction of genetic value for a quantitative trait.

    Science.gov (United States)

    Gianola, Daniel; Wu, Xiao-Lin; Manfredi, Eduardo; Simianer, Henner

    2010-10-01

    A Bayesian nonparametric form of regression based on Dirichlet process priors is adapted to the analysis of quantitative traits possibly affected by cryptic forms of gene action, and to the context of SNP-assisted genomic selection, where the main objective is to predict a genomic signal on phenotype. The procedure clusters unknown genotypes into groups with distinct genetic values, but in a setting in which the number of clusters is unknown a priori, so that standard methods for finite mixture analysis do not work. The central assumption is that genetic effects follow an unknown distribution with some "baseline" family, which is a normal process in the cases considered here. A Bayesian analysis based on the Gibbs sampler produces estimates of the number of clusters, posterior means of genetic effects, a measure of credibility in the baseline distribution, as well as estimates of parameters of the latter. The procedure is illustrated with a simulation representing two populations. In the first one, there are 3 unknown QTL, with additive, dominance and epistatic effects; in the second, there are 10 QTL with additive, dominance and additive × additive epistatic effects. In the two populations, baseline parameters are inferred correctly. The Dirichlet process model infers the number of unique genetic values correctly in the first population, but it understates the number in the second one; here, the true number of clusters is over 900, and the model gives a posterior mean estimate of about 140, probably because more replication of genotypes is needed for correct inference. The impact on inferences of the prior distribution of a key parameter (M), and of the extent of replication, was examined via an analysis of mean body weight in 192 paternal half-sib families of broiler chickens, where each sire was genotyped for nearly 7,000 SNPs. In this small sample, it was found that inference about the number of clusters was affected by the prior distribution of M. For a set of combinations of parameters of a given prior distribution, the effects of the prior dissipated when the number of replicate samples per genotype was increased. Thus, the Dirichlet process model seems to be useful for gauging the number of QTLs affecting the trait: if the number of clusters inferred is small, probably just a few QTLs code for the trait. If the number of clusters inferred is large, this may imply that standard parametric models based on the baseline distribution may suffice. However, priors may be influential, especially if sample size is not large and if only a few genotypic configurations have replicate phenotypes in the sample.
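The role of the mass parameter M highlighted above can be seen directly in the clustering prior a Dirichlet process implies, the Chinese restaurant process (CRP): larger M yields more clusters a priori. A minimal Python/NumPy simulation of the prior alone (not the authors' Gibbs sampler; names hypothetical):

```python
import numpy as np

def crp_partition(n, M, seed=0):
    """Draw one partition of n items from the Chinese restaurant process,
    the clustering prior implied by a Dirichlet process with mass M.
    Item i joins an existing cluster with probability proportional to its
    size, or opens a new cluster with probability proportional to M."""
    rng = np.random.default_rng(seed)
    sizes = []                        # current cluster sizes
    labels = np.empty(n, dtype=int)
    for i in range(n):
        probs = np.array(sizes + [M], dtype=float)
        k = rng.choice(len(probs), p=probs / probs.sum())
        if k == len(sizes):
            sizes.append(1)           # open a new cluster
        else:
            sizes[k] += 1
        labels[i] = k
    return labels, len(sizes)
```

The expected number of clusters grows roughly as M·ln(1 + n/M), which is why prior choices for M matter so much for the cluster-count inferences discussed in the abstract.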

  20. A comparison of selected parametric and non-parametric imputation methods for estimating forest biomass and basal area

    Science.gov (United States)

    Donald Gagliasso; Susan Hummel; Hailemariam. Temesgen

    2014-01-01

    Various methods have been used to estimate the amount of above ground forest biomass across landscapes and to create biomass maps for specific stands or pixels across ownership or project areas. Without an accurate estimation method, land managers might end up with incorrect biomass estimate maps, which could lead them to make poorer decisions in their future...

  1. SPECIES-SPECIFIC FOREST VARIABLE ESTIMATION USING NON-PARAMETRIC MODELING OF MULTI-SPECTRAL PHOTOGRAMMETRIC POINT CLOUD DATA

    Directory of Open Access Journals (Sweden)

    J. Bohlin

    2012-07-01

    The recent development of software for automatic photogrammetric processing of multi-spectral aerial imagery, and the growing nation-wide availability of Digital Elevation Model (DEM) data, are about to revolutionize data capture for forest management planning in Scandinavia. Using only already-available aerial imagery and ALS-assessed DEM data, raster estimates of the forest variables mean tree height, basal area, total stem volume, and species-specific stem volumes were produced and evaluated. The study was conducted at a coniferous hemi-boreal test site in southern Sweden (lat. 58° N, long. 13° E). Digital aerial images from the Zeiss/Intergraph Digital Mapping Camera system were used to produce 3D point-cloud data with spectral information. Metrics were calculated for 696 field plots (10 m radius) from point-cloud data and used in k-MSN to estimate forest variables. For these stands, tree height ranged from 1.4 to 33.0 m (mean 18.1 m), stem volume from 0 to 829 m3 ha-1 (mean 249 m3 ha-1) and basal area from 0 to 62.2 m2 ha-1 (mean 26.1 m2 ha-1), with a mean stand size of 2.8 ha. Estimates made using digital aerial images corresponding to the standard acquisition of the Swedish National Land Survey (Lantmäteriet) showed RMSEs (in percent of the surveyed stand mean) of 7.5% for tree height, 11.4% for basal area, 13.2% for total stem volume, 90.6% for pine stem volume, 26.4% for spruce stem volume, and 72.6% for deciduous stem volume. The results imply that photogrammetric matching of digital aerial images has significant potential for operational use in forestry.
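The imputation step can be illustrated with plain k-nearest-neighbour averaging on standardized plot metrics. This is a simplified stand-in for k-MSN, which weights the distance by canonical correlations between metrics and field variables rather than using the raw Euclidean distance shown here; all names are hypothetical.

```python
import numpy as np

def knn_impute(X_train, y_train, X_target, k=5):
    """Predict forest variables for target cells as the mean of the k
    field plots with the most similar point-cloud metrics (Euclidean
    distance on standardized metrics)."""
    mu, sd = X_train.mean(axis=0), X_train.std(axis=0)
    Xt = (X_train - mu) / sd                      # standardize training metrics
    Xq = (X_target - mu) / sd                     # same scaling for targets
    d = np.linalg.norm(Xt[None, :, :] - Xq[:, None, :], axis=2)
    idx = np.argsort(d, axis=1)[:, :k]            # k nearest field plots
    return y_train[idx].mean(axis=1)
```

With k = 1 this reduces to most-similar-neighbour imputation, which preserves the joint distribution of the field variables at the cost of higher variance.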

  2. Characterizing rainfall of hot arid region by using time-series modeling and sustainability approaches: a case study from Gujarat, India

    Science.gov (United States)

    Machiwal, Deepesh; Kumar, Sanjay; Dayal, Devi

    2016-05-01

    This study aimed at characterizing rainfall dynamics in a hot arid region of Gujarat, India by employing time-series modeling techniques and a sustainability approach. Five characteristics, i.e., normality, stationarity, homogeneity, presence/absence of trend, and persistence, of the 34-year (1980-2013) annual rainfall time series of ten stations were identified/detected by applying multiple parametric and non-parametric statistical tests. Furthermore, the study proposes the novel use of a sustainability concept for evaluating rainfall time series and demonstrates the concept, for the first time, by identifying the most sustainable rainfall series following the reliability (Ry), resilience (Re), and vulnerability (Vy) approach. Box-whisker plots, normal probability plots, and histograms indicated that the annual rainfall of the Mandvi and Dayapar stations is relatively more positively skewed and non-normal compared with that of other stations, owing to the presence of severe outliers and extremes. Results of the Shapiro-Wilk test and Lilliefors test revealed that the annual rainfall series of all stations deviated significantly from a normal distribution. Two parametric t-tests and the non-parametric Mann-Whitney test indicated significant non-stationarity in the annual rainfall of Rapar station, where the rainfall was also found to be non-homogeneous based on the results of four parametric homogeneity tests. Four trend tests indicated significantly increasing rainfall trends at the Rapar and Gandhidham stations. The autocorrelation analysis suggested the presence of statistically significant persistence in the rainfall series of Bhachau (3-year time lag), Mundra (1- and 9-year time lags), Nakhatrana (9-year time lag), and Rapar (3- and 4-year time lags). Results of the sustainability approach indicated that the annual rainfall of the Mundra and Naliya stations (Ry = 0.50 and 0.44; Re = 0.47 and 0.47; Vy = 0.49 and 0.46, respectively) are the most sustainable and dependable
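Of the non-parametric trend tests typically used in such studies, the Mann-Kendall test is the most common; a minimal Python/NumPy implementation (standard no-ties variance formula; not the authors' code) looks like this:

```python
import math
import numpy as np

def mann_kendall(x):
    """Mann-Kendall trend test. Returns (S, Z, two-sided p).
    Uses the no-ties variance formula and a normal approximation."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = 0.0
    for i in range(n - 1):
        s += np.sign(x[i + 1:] - x[i]).sum()   # sum of pairwise signs
    var = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var)           # continuity correction
    elif s < 0:
        z = (s + 1) / math.sqrt(var)
    else:
        z = 0.0
    p = math.erfc(abs(z) / math.sqrt(2.0))     # two-sided normal p-value
    return s, z, p
```

A positive S with a small p indicates a significantly increasing trend, which is the pattern the study reports for the Rapar and Gandhidham stations.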

  3. MULTI-TEMPORAL LAND USE ANALYSIS OF AN EPHEMERAL RIVER AREA USING AN ARTIFICIAL NEURAL NETWORK APPROACH ON LANDSAT IMAGERY

    Directory of Open Access Journals (Sweden)

    M. Aquilino

    2014-01-01

    The historical archive of LANDSAT imagery dating back to the launch of ERTS in 1972 provides a comprehensive and permanent data source for tracking change on the planet's land surface. In this case study, imagery acquisition dates of 1987, 2002 and 2011 were selected to cover a time trend of 24 years. Land cover categories were based on classes outlined by the Curve Number method, with the aim of characterizing land use according to the level of surface imperviousness. After comparing two land use classification methods, i.e. the Maximum Likelihood Classifier (MLC) and the Multi-Layer Perceptron (MLP) neural network, the Artificial Neural Network (ANN) approach was found to be the most reliable and efficient method in the absence of ground reference data. The ANN approach has a distinct advantage over statistical classification methods in that it is non-parametric and requires little or no a priori knowledge of the distribution model of the input data. The results quantify land cover change patterns in the river basin area under study and demonstrate the potential of multitemporal LANDSAT data to provide an accurate and cost-effective means to map and analyse land cover changes over time that can be used as input in land management and policy decision-making.

  4. A nonparametric approach to medical survival data: Uncertainty in the context of risk in mortality analysis

    International Nuclear Information System (INIS)

    Janurová, Kateřina; Briš, Radim

    2014-01-01

    Medical survival right-censored data of about 850 patients are evaluated to analyze the uncertainty related to the risk of mortality on one hand, and to compare two basic surgery techniques in the context of risk of mortality on the other. The colorectal data come from patients who underwent colectomy in the University Hospital of Ostrava. Two basic operating techniques are used for the colectomy: either traditional (open) or minimally invasive (laparoscopic). The basic question arising at the colectomy operation is which type of operation to choose to guarantee a longer overall survival time. Two non-parametric approaches have been used to quantify the probability of mortality with uncertainties. In fact, the complement of the probability to one, i.e. the survival function with corresponding confidence levels, is calculated and evaluated. The first approach considers standard nonparametric estimators resulting from both the Kaplan–Meier estimator of the survival function in connection with Greenwood's formula and the Nelson–Aalen estimator of the cumulative hazard function, including a confidence interval for the survival function. The second, innovative approach, represented by Nonparametric Predictive Inference (NPI), uses lower and upper probabilities for quantifying uncertainty and provides a model of a predictive survival function instead of the population survival function. The traditional log-rank test on one hand and the nonparametric predictive comparison of two groups of lifetime data on the other have been compared to evaluate the risk of mortality in the context of the mentioned surgery techniques. The size of the difference between the two groups of lifetime data has been considered and analyzed as well. Both nonparametric approaches led to the same conclusion: that the minimally invasive operating technique guarantees the patient a significantly longer survival time in comparison with the traditional operating technique
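The first of the two estimators mentioned, Kaplan-Meier, is short enough to sketch directly. A minimal Python/NumPy version (names hypothetical; no Greenwood confidence bands, which the study adds on top):

```python
import numpy as np

def kaplan_meier(time, event):
    """Kaplan-Meier survival curve for right-censored data.
    `event` is 1 for an observed death, 0 for a censored observation.
    Returns (event times, survival probability after each event time)."""
    time = np.asarray(time, dtype=float)
    event = np.asarray(event, dtype=int)
    uniq = np.unique(time[event == 1])     # distinct observed event times
    s, surv = 1.0, []
    for t in uniq:
        n_at_risk = np.sum(time >= t)                  # still under observation
        d = np.sum((time == t) & (event == 1))         # deaths at time t
        s *= 1.0 - d / n_at_risk                       # multiply hazard factors
        surv.append(s)
    return uniq, np.array(surv)
```

Censored observations never drop the curve; they only shrink the risk set at later event times, which is exactly how right-censoring is handled non-parametrically.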

  5. Stochastic identification of temperature effects on the dynamics of a smart composite beam: assessment of multi-model and global model approaches

    International Nuclear Information System (INIS)

    Hios, J D; Fassois, S D

    2009-01-01

    The temperature effects on the dynamics of a smart composite beam are experimentally studied via conventional multi-model and novel global model identification approaches. The multi-model approaches are based on non-parametric and parametric VARX representations, whereas the global model approaches are based on novel constant coefficient pooled (CCP) and functionally pooled (FP) VARX parametric representations. The analysis indicates that the obtained multi-model and global model representations are in rough overall agreement. Nevertheless, the latter simultaneously use all available data records offering more compact descriptions of the dynamics, improved numerical robustness and estimation accuracy, which is reflected in significantly reduced modal parameter uncertainties. Although the CCP-VARX representations provide only 'averaged' descriptions of the structural dynamics over temperature, their FP-VARX counterparts allow for the explicit, analytical modeling of temperature dependence exhibiting a 'smooth' deterministic dependence of the dynamics on temperature which is compatible with the physics of the problem. In accordance with previous studies, the obtained natural frequencies decrease with temperature in a weakly nonlinear or approximately linear fashion. The damping factors are less affected, although their dependence on temperature may be of a potentially more complex nature

  6. An information-theoretic approach to assess practical identifiability of parametric dynamical systems.

    Science.gov (United States)

    Pant, Sanjay; Lombardi, Damiano

    2015-10-01

    A new approach for assessing parameter identifiability of dynamical systems in a Bayesian setting is presented. The concept of Shannon entropy is employed to measure the inherent uncertainty in the parameters. The expected reduction in this uncertainty is seen as the amount of information one expects to gain about the parameters due to the availability of noisy measurements of the dynamical system. Such expected information gain is interpreted in terms of the variance of a hypothetical measurement device that can measure the parameters directly, and is related to practical identifiability of the parameters. If the individual parameters are unidentifiable, correlation between parameter combinations is assessed through conditional mutual information to determine which sets of parameters can be identified together. The information-theoretic quantities of entropy and information are evaluated numerically through a combination of Monte Carlo and k-nearest-neighbour methods in a non-parametric fashion. Unlike many methods to evaluate identifiability proposed in the literature, the proposed approach takes the measurement noise into account and is not restricted to any particular noise structure. Whilst computationally intensive for large dynamical systems, it is easily parallelisable and is non-intrusive as it does not necessitate re-writing of the numerical solvers of the dynamical system. The application of such an approach is presented for a variety of dynamical systems, ranging from systems governed by ordinary differential equations to partial differential equations, and, where possible, validated against results previously published in the literature. Copyright © 2015 Elsevier Inc. All rights reserved.
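The core quantity, entropy estimated non-parametrically from Monte Carlo samples, can be illustrated with a histogram estimator. This is a simpler alternative to the k-nearest-neighbour estimator the paper uses, shown here only to make the "information gain = entropy reduction" idea concrete; the function name is hypothetical.

```python
import numpy as np

def hist_entropy(samples, bins=50):
    """Histogram estimate of the differential entropy (in nats) of a
    1-D sample; a simple stand-in for the k-NN estimator in the paper."""
    p, edges = np.histogram(samples, bins=bins, density=True)
    w = np.diff(edges)            # bin widths
    mask = p > 0
    # H ~ -sum p_i * log(p_i) * width_i  over non-empty bins
    return -np.sum(p[mask] * np.log(p[mask]) * w[mask])
```

If prior uncertainty about a parameter is N(0, 1) and the posterior after a measurement is N(0, 0.5), the expected information gain is the entropy difference, ln 2 ≈ 0.69 nats; the estimator recovers this from samples alone, without assuming any parametric form.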

  7. [Severe idiopathic scoliosis. Does the approach and the instruments used modify the results?].

    Science.gov (United States)

    Sánchez-Márquez, J M; Sánchez Pérez-Grueso, F J; Pérez Martín-Buitrago, M; Fernández-Baíllo, N; García-Fernández, A; Quintáns-Rodríguez, J

    2014-01-01

    The aim of this work is to evaluate and compare the radiographic results and complications of the surgical treatment of adolescents with idiopathic scoliosis greater than 75 degrees, using a double approach (DA) or an isolated posterior approach with hybrid instruments (posterior hybrid [PH]), or with «all-pedicle screws» (posterior screws [PS]). A retrospective review was performed on 69 patients with idiopathic scoliosis greater than 75°, with a follow-up of more than 2 years, to analyze the flexibility of the curves, the correction obtained, and the complications depending on the type of surgery. The Kruskal-Wallis test for non-parametric variables was used for the statistical analysis. There were no statistically significant differences between the 3 patient groups in the pre-surgical Cobb angle values (DA=89°, PH=83°, PS=83°), in the immediate post-surgical (DA=34°, PH=33°, PS=30°), nor at the end of follow-up (DA=36°, PH=36°, PS=33°) (P>.05). The percentage correction (DA=60%, PH=57%, PS=60%) was similar between groups (P>.05). The percentage of complications associated with the procedure was 20.8% in DA, 10% in PH and 20% in PS. Two patients in the PS group showed changes, with no neurological lesions, in the spinal cord monitoring, and one patient in the same group suffered a delayed and transient incomplete lesion. No significant differences were observed in the correction of severe idiopathic scoliosis between patients operated using the double or isolated posterior approach, regardless of the type of instrumentation used. Copyright © 2013 SECOT. Published by Elsevier Espana. All rights reserved.
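The Kruskal-Wallis test used above for comparing the three surgical groups is easy to state explicitly. A minimal Python/NumPy version (no tie correction, which shrinks H slightly when ties are present; names hypothetical):

```python
import numpy as np

def kruskal_wallis(*groups):
    """Kruskal-Wallis H statistic for k independent samples.
    Compare H against a chi-square distribution with k-1 degrees of
    freedom; ties get average ranks, but no tie correction is applied."""
    arrays = [np.asarray(g, dtype=float) for g in groups]
    all_vals = np.concatenate(arrays)
    n = len(all_vals)
    order = np.argsort(all_vals, kind="mergesort")
    ranks = np.empty(n)
    ranks[order] = np.arange(1, n + 1)
    for v in np.unique(all_vals):            # assign average rank to ties
        m = all_vals == v
        ranks[m] = ranks[m].mean()
    h, start = 0.0, 0
    for g in arrays:                         # between-group rank dispersion
        r = ranks[start:start + len(g)]
        h += len(g) * (r.mean() - (n + 1) / 2.0) ** 2
        start += len(g)
    return 12.0 / (n * (n + 1)) * h
```

With three fully separated groups of three observations the statistic reaches its maximum of 7.2, while perfectly interleaved groups give a value near zero.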

  8. Maternal and infant activity: Analytic approaches for the study of circadian rhythm.

    Science.gov (United States)

    Thomas, Karen A; Burr, Robert L; Spieker, Susan

    2015-11-01

    The study of infant and mother circadian rhythm entails the choice of instruments appropriate for use in the home environment as well as the selection of an analytic approach that characterizes circadian rhythm. While actigraphy monitoring suits the needs of home study, limited studies have examined mother and infant rhythm derived from actigraphy. Among this existing research a variety of analyses have been employed to characterize 24-h rhythm, reducing the ability to evaluate and synthesize findings. Few studies have examined the correspondence of mother and infant circadian parameters for the most frequently cited approaches: cosinor, non-parametric circadian rhythm analysis (NPCRA), and autocorrelation function (ACF). The purpose of this research was to examine analytic approaches in the study of mother and infant circadian activity rhythm. Forty-three healthy mother and infant pairs were studied in the home environment over a 72-h period at infant age 4, 8, and 12 weeks. Activity was recorded continuously using actigraphy monitors and mothers completed a diary. Parameters of circadian rhythm were generated from cosinor analysis, NPCRA, and ACF. The correlation among measures of rhythm center (cosinor mesor, NPCRA mid level), strength or fit of the 24-h period (cosinor magnitude and R(2), NPCRA amplitude and relative amplitude (RA)), phase (cosinor acrophase, NPCRA M10 and L5 midpoint), and rhythm stability and variability (NPCRA interdaily stability (IS) and intradaily variability (IV), ACF) was assessed, and the effect size (eta(2)) for change over time was additionally evaluated. Results suggest that cosinor analysis, NPCRA, and autocorrelation provide several comparable parameters of infant and maternal circadian rhythm center, fit, and phase. IS and IV were strongly correlated with the 24-h cycle fit. The circadian parameters analyzed offer separate insight into rhythm and differing effect size for the detection of change over time. Findings inform selection of analysis and
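Of the three approaches compared, single-component cosinor analysis has the simplest closed form: the cosine model is linearized and solved by ordinary least squares. A minimal Python/NumPy sketch (names hypothetical):

```python
import numpy as np

def cosinor_fit(t, y, period=24.0):
    """Single-component cosinor: y ~ mesor + amplitude*cos(2*pi*t/period + phase).
    Linearized as mesor + a*cos(wt) + b*sin(wt) and solved by least squares.
    Returns (mesor, amplitude, acrophase in radians)."""
    w = 2.0 * np.pi * np.asarray(t, dtype=float) / period
    X = np.column_stack([np.ones_like(w), np.cos(w), np.sin(w)])
    mesor, a, b = np.linalg.lstsq(X, np.asarray(y, dtype=float), rcond=None)[0]
    amplitude = np.hypot(a, b)          # sqrt(a^2 + b^2)
    acrophase = np.arctan2(-b, a)       # phase of the fitted cosine
    return mesor, amplitude, acrophase
```

The mesor, amplitude, and acrophase returned here correspond directly to the rhythm center, fit, and phase parameters whose cross-method correlations the study examines.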

  9. Exploring prediction uncertainty of spatial data in geostatistical and machine learning Approaches

    Science.gov (United States)

    Klump, J. F.; Fouedjio, F.

    2017-12-01

    Geostatistical methods such as kriging with external drift, as well as machine learning techniques such as quantile regression forest, have been used intensively for modelling spatial data. In addition to providing predictions for target variables, both approaches are able to deliver a quantification of the uncertainty associated with the prediction at a target location. Geostatistical approaches are, in essence, well suited to providing such prediction uncertainties, and their behaviour is well understood. However, they often require significant data pre-processing and rely on assumptions that are rarely met in practice. Machine learning algorithms such as random forest regression, on the other hand, require less data pre-processing and are non-parametric. This makes the application of machine learning algorithms to geostatistical problems an attractive proposition. The objective of this study is to compare kriging with external drift and quantile regression forest with respect to their ability to deliver reliable prediction uncertainties for spatial data. In our comparison we use both simulated and real-world datasets. Apart from classical performance indicators, the comparisons make use of accuracy plots, probability interval width plots, and visual examination of the uncertainty maps provided by the two approaches. By comparing random forest regression to kriging we found that both methods produced comparable maps of estimated values for our variables of interest. However, the measure of uncertainty provided by random forest seems to be quite different from the measure of uncertainty provided by kriging. In particular, the lack of spatial context can give misleading results in areas without ground truth data. These preliminary results raise questions about assessing the risks associated with decisions based on the predictions from geostatistical and machine learning algorithms in a spatial context, e.g. mineral exploration.
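The way quantile regression forest obtains prediction intervals, reading empirical quantiles off the training observations that fall near the query point, can be caricatured with a k-nearest-neighbour version. This is a deliberately crude analogue, not quantile regression forest itself, and shares its weakness noted above: without spatial structure, it knows nothing about areas far from ground-truth data. Names are hypothetical.

```python
import numpy as np

def knn_quantile_interval(X_train, y_train, x_query, k=50, q=(0.05, 0.95)):
    """Empirical prediction interval at x_query from the target values of
    the k nearest training points (crude analogue of how quantile
    regression forest reads quantiles from the observations in each leaf)."""
    d = np.linalg.norm(X_train - x_query, axis=1)   # distances to query
    nn = y_train[np.argsort(d)[:k]]                 # targets of k neighbours
    return np.quantile(nn, q)
```

Kriging, by contrast, derives its interval from an explicit spatial covariance model, so its uncertainty grows away from observed locations; the neighbour-based interval above does not, which is one way to see why the two uncertainty measures differ.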

  10. Improving the reliability of POD curves in NDI methods using a Bayesian inversion approach for uncertainty quantification

    Science.gov (United States)

    Ben Abdessalem, A.; Jenson, F.; Calmon, P.

    2016-02-01

    This contribution provides an example of the possible advantages of adopting a Bayesian inversion approach to uncertainty quantification in nondestructive inspection methods. In such problems, the uncertainty associated with the random parameters is not always known and needs to be characterised from scattering signal measurements. The uncertainties may then be correctly propagated in order to determine a reliable probability of detection curve. To this end, we establish a general Bayesian framework based on a non-parametric maximum likelihood function formulation and some priors from expert knowledge. However, the presented inverse problem is time-consuming and computationally intensive. To cope with this difficulty, we replace the real model with a surrogate in order to speed up model evaluation and make the problem computationally feasible. Least-squares support vector regression is adopted as the metamodelling technique due to its robustness in dealing with non-linear problems. We illustrate the usefulness of this methodology through the inspection of a tube with an enclosed defect using an ultrasonic method.
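The workflow described, Bayesian characterisation of parameters with an expensive forward model replaced by a metamodel, can be sketched with a random-walk Metropolis sampler. Everything here is hypothetical: a cheap quadratic stands in for the LS-SVR surrogate, and a Gaussian likelihood with a standard-normal prior is assumed.

```python
import math
import random

def surrogate(theta):
    # Cheap stand-in for the expensive forward model (here a quadratic;
    # the paper uses least-squares support vector regression instead).
    return 1.0 + 0.5 * theta + 0.2 * theta ** 2

def log_posterior(theta, data, sigma=0.1):
    # Gaussian likelihood around the surrogate prediction + N(0, 1) prior.
    ll = sum(-0.5 * ((y - surrogate(theta)) / sigma) ** 2 for y in data)
    return ll - 0.5 * theta ** 2

def metropolis(data, n_steps=5000, step=0.3, seed=1):
    """Random-walk Metropolis sampling of the posterior over theta."""
    rng = random.Random(seed)
    theta, lp = 0.0, log_posterior(0.0, data)
    chain = []
    for _ in range(n_steps):
        prop = theta + rng.gauss(0.0, step)
        lp_prop = log_posterior(prop, data)
        if math.log(rng.random()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        chain.append(theta)
    return chain
```

With measurements generated at a true parameter value, the chain (after burn-in) concentrates around that value; the posterior spread is the characterised parameter uncertainty that would then be propagated into the POD curve.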

  11. Measuring multi-joint stiffness during single movements: numerical validation of a novel time-frequency approach.

    Science.gov (United States)

    Piovesan, Davide; Pierobon, Alberto; DiZio, Paul; Lackner, James R

    2012-01-01

    This study presents and validates a time-frequency technique for measuring 2-dimensional multijoint arm stiffness throughout a single planar movement as well as during static posture. It is proposed as an alternative to current regressive methods, which require numerous repetitions to obtain average stiffness on a small segment of the hand trajectory. The method is based on the analysis of the reassigned spectrogram of the arm's response to impulsive perturbations and can estimate arm stiffness on a trial-by-trial basis. Analytic and empirical methods are first derived and tested through modal analysis on synthetic data. The technique's accuracy and robustness are assessed by modeling the estimation of stiffness time profiles changing at different rates and affected by different noise levels. Our method obtains results comparable with two well-known regressive techniques. We also test how the technique can identify the viscoelastic component of non-linear and higher than second order systems with a non-parametric approach. The technique proposed here is highly robust to noise and can be used easily for both postural and movement tasks. Estimations of stiffness profiles are possible with only one perturbation, making our method a useful tool for estimating limb stiffness during motor learning and adaptation tasks, and for understanding the modulation of stiffness in individuals with neurodegenerative diseases.
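For intuition on recovering stiffness from the response to an impulsive perturbation, here is a toy single-degree-of-freedom example (not the reassigned-spectrogram method of the paper): the damped oscillation frequency is read off from zero crossings and converted to stiffness via k = m·ωn², with the damping ratio assumed known for simplicity.

```python
import math

def damped_response(m, k, c, t):
    """Closed-form impulse response x(t) of m*x'' + c*x' + k*x = 0,
    with x(0) = 0 and x'(0) = 1/m (unit impulse)."""
    wn = math.sqrt(k / m)
    zeta = c / (2.0 * math.sqrt(k * m))
    wd = wn * math.sqrt(1.0 - zeta ** 2)
    return math.exp(-zeta * wn * t) * math.sin(wd * t) / (m * wd)

def estimate_stiffness(m, samples, dt, zeta_known=0.0):
    """Recover k from the zero-crossing period of the sampled response.

    zeta_known: damping ratio, assumed known here (a simplification;
    in practice it must be estimated too, e.g. from amplitude decay)."""
    crossings = []
    for i in range(1, len(samples)):
        if samples[i - 1] < 0.0 <= samples[i]:  # upward zero crossing
            # linear interpolation for the sub-sample crossing time
            frac = -samples[i - 1] / (samples[i] - samples[i - 1])
            crossings.append((i - 1 + frac) * dt)
    periods = [b - a for a, b in zip(crossings, crossings[1:])]
    wd = 2.0 * math.pi / (sum(periods) / len(periods))
    wn = wd / math.sqrt(1.0 - zeta_known ** 2)
    return m * wn * wn
```

This illustrates why a single perturbation can suffice in principle; the paper's time-frequency method additionally tracks how the instantaneous frequency, and hence stiffness, changes during movement.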

  12. Measuring multi-joint stiffness during single movements: numerical validation of a novel time-frequency approach.

    Directory of Open Access Journals (Sweden)

    Davide Piovesan

    Full Text Available This study presents and validates a time-frequency technique for measuring 2-dimensional multijoint arm stiffness throughout a single planar movement as well as during static posture. It is proposed as an alternative to current regressive methods, which require numerous repetitions to obtain average stiffness on a small segment of the hand trajectory. The method is based on the analysis of the reassigned spectrogram of the arm's response to impulsive perturbations and can estimate arm stiffness on a trial-by-trial basis. Analytic and empirical methods are first derived and tested through modal analysis on synthetic data. The technique's accuracy and robustness are assessed by modeling the estimation of stiffness time profiles changing at different rates and affected by different noise levels. Our method obtains results comparable with two well-known regressive techniques. We also test how the technique can identify the viscoelastic component of non-linear and higher than second order systems with a non-parametric approach. The technique proposed here is highly robust to noise and can be used easily for both postural and movement tasks. Estimations of stiffness profiles are possible with only one perturbation, making our method a useful tool for estimating limb stiffness during motor learning and adaptation tasks, and for understanding the modulation of stiffness in individuals with neurodegenerative diseases.

  13. Application of the positive matrix factorization approach to identify heavy metal sources in sediments. A case study on the Mexican Pacific Coast.

    Science.gov (United States)

    González-Macías, C; Sánchez-Reyna, G; Salazar-Coria, L; Schifter, I

    2014-01-01

    During the last two decades, sediments collected in different water bodies of the Tehuantepec Basin, located in the southeast of the Mexican Pacific Coast, showed that concentrations of heavy metals may pose a risk to the environment and human health. The extractable organic matter, geoaccumulation index, and enrichment factors were quantified for arsenic, cadmium, copper, chromium, nickel, lead, vanadium, zinc, and the fine-grained sediment fraction. The non-parametric SiZer method was applied to assess the statistical significance of the reconstructed metal variation over time. This inference method appears to be particularly natural and well suited to temperature and other environmental reconstructions. In this approach, a collection of smooths of the reconstructed metal concentrations is considered simultaneously, and inferences about the significance of the metal trends can be made with respect to time. Hence, the database represents a consolidated set of available and validated water and sediment data for an urban industrialized area, which is very useful as a case study site. The positive matrix factorization approach was used for identification and source apportionment of the anthropogenic heavy metals in the sediments. Regionally, metals and organic matter are depleted relative to crustal abundance in a range of 45-55 %, while there is an inorganic enrichment from lithogenous/anthropogenic sources of around 40 %. Only extractable organic matter, Pb, As, and Cd can be related to non-crustal sources, suggesting that additional input cannot be explained by local runoff or erosion processes.
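The geoaccumulation index and enrichment factor quantified in the study are standard formulas; a sketch with hypothetical concentrations (the background values must come from local crustal reference data):

```python
import math

def geoaccumulation_index(c_sample, c_background):
    """Mueller's Igeo = log2(C / (1.5 * B)); the factor 1.5 allows for
    natural fluctuation of the background concentration B."""
    return math.log2(c_sample / (1.5 * c_background))

def enrichment_factor(c_metal, c_ref, b_metal, b_ref):
    """EF, normalised to a conservative reference element (e.g. Al or Fe):
    (C_metal / C_ref) in the sample over the same ratio in the background."""
    return (c_metal / c_ref) / (b_metal / b_ref)
```

Igeo > 0 indicates enrichment above 1.5× background; EF values well above 1 point to a non-crustal (anthropogenic) contribution, consistent with the Pb, As, and Cd findings above.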

  14. ADVANCED EARTH OBSERVATION APPROACH FOR MULTISCALE FOREST ECOSYSTEM SERVICES MODELING AND MAPPING (MIMOSE)

    Directory of Open Access Journals (Sweden)

    G. Chirici

    2014-04-01

    Full Text Available In the last decade ecosystem services (ES) have been proposed as a method for quantifying the multifunctional role of forest ecosystems. Their spatial distribution on large areas is frequently limited by the lack of information, because field data collection with traditional methods requires much effort in terms of time and cost. In this contribution we propose a methodology (namely, MultIscale Mapping Of ecoSystem servicEs - MIMOSE) based on the integration of remotely sensed images and field observations to produce a wall-to-wall geodatabase of forest parcels accompanied by several types of information useful as a basis for future trade-off analysis of different ES. Here, we present the application of the MIMOSE approach to a study area of 443,758 hectares coincident with the administrative region of Molise in Central Italy. The procedure is based on a local high-resolution forest types map integrated with information on the main forest management approaches. Through the non-parametric k-Nearest Neighbors technique, we produced a growing stock volume map integrating a local forest inventory with multispectral IRS LISS III satellite imagery. With the growing stock volume map we derived a forest age map for even-aged forest types. This information was later used to automatically create a vector forest parcel map by multidimensional image segmentation, which was finally populated with information useful for spatial ES estimation. The contribution briefly introduces the MIMOSE methodology and presents the preliminary results we achieved, which constitute the basis for a future implementation of ES modeling.
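The k-Nearest Neighbors imputation used to produce the growing stock volume map can be sketched as follows; the feature vectors and plot volumes are synthetic stand-ins for the IRS LISS III band values and inventory plots:

```python
def knn_predict(train_x, train_y, query, k=3):
    """Predict a target (e.g. growing stock volume, m3/ha) for a pixel from
    its k nearest inventory plots in feature (spectral band) space."""
    dists = sorted(
        (sum((a - b) ** 2 for a, b in zip(x, query)), y)
        for x, y in zip(train_x, train_y)
    )
    neighbours = [y for _, y in dists[:k]]
    return sum(neighbours) / k
```

Applied pixel by pixel over the covariate raster, this yields the wall-to-wall volume surface that the parcel segmentation is then populated from.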

  15. Examining the Feasibility and Utility of Estimating Partial Expected Value of Perfect Information (via a Nonparametric Approach) as Part of the Reimbursement Decision-Making Process in Ireland: Application to Drugs for Cancer.

    Science.gov (United States)

    McCullagh, Laura; Schmitz, Susanne; Barry, Michael; Walsh, Cathal

    2017-11-01

    In Ireland, all new drugs for which reimbursement by the healthcare payer is sought undergo a health technology assessment by the National Centre for Pharmacoeconomics. The National Centre for Pharmacoeconomics estimates the expected value of perfect information but not the partial expected value of perfect information (owing to the computational expense associated with typical methodologies). The objective of this study was to examine the feasibility and utility of estimating the partial expected value of perfect information via a computationally efficient, non-parametric regression approach. This was a retrospective analysis of evaluations of drugs for cancer that had been submitted to the National Centre for Pharmacoeconomics (January 2010 to December 2014 inclusive). Drugs were excluded if cost-effective at the submitted price. Drugs were excluded if concerns existed regarding the validity of the applicants' submissions or if cost-effectiveness model functionality did not allow the required modifications to be made. For each included drug (n = 14), the value of information was estimated at the final reimbursement price, at a threshold equivalent to the incremental cost-effectiveness ratio at that price. The expected value of perfect information was estimated from probabilistic analysis. The partial expected value of perfect information was estimated via a non-parametric approach. Input parameters with a population value of at least €1 million were identified as potential targets for research. All partial estimates were determined within minutes. Thirty parameters (across nine models) each had a value of at least €1 million. These were categorised. Collectively, survival analysis parameters were valued at €19.32 million, health state utility parameters at €15.81 million and parameters associated with the cost of treating adverse effects at €6.64 million. Those associated with drug acquisition costs and with the cost of care were valued at €6.51 million and €5.71
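The expected value of perfect information follows directly from probabilistic-analysis samples, and the partial (per-parameter) value can be approximated with a simple non-parametric regression of net benefit on that parameter. The binned-means regression below is an illustrative stand-in; the paper does not specify which non-parametric smoother was used.

```python
def evpi(nb_samples):
    """EVPI from PSA output. nb_samples: one net-benefit tuple per
    simulation, one entry per decision option."""
    n = len(nb_samples)
    mean_per_decision = [sum(col) / n for col in zip(*nb_samples)]
    value_current_info = max(mean_per_decision)
    value_perfect_info = sum(max(row) for row in nb_samples) / n
    return value_perfect_info - value_current_info

def evppi_binned(theta, nb_samples, n_bins=10):
    """Crude regression-based EVPPI for one parameter theta: fit each
    decision's net benefit as a function of theta via binned means."""
    n = len(theta)
    order = sorted(range(n), key=lambda i: theta[i])
    size = n // n_bins
    fitted = [None] * n
    for b in range(n_bins):
        idx = order[b * size:(b + 1) * size] if b < n_bins - 1 else order[b * size:]
        means = [sum(nb_samples[i][d] for i in idx) / len(idx)
                 for d in range(len(nb_samples[0]))]
        for i in idx:
            fitted[i] = means
    baseline = max(sum(col) / n for col in zip(*nb_samples))
    return sum(max(f) for f in fitted) / n - baseline
```

By construction the partial value never exceeds the total EVPI, which is the property that makes per-parameter estimates useful for prioritising research targets.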

  16. Testing for heteroscedasticity in jumpy and noisy high-frequency data: A resampling approach

    DEFF Research Database (Denmark)

    Christensen, Kim; Hounyo, Ulrich; Podolskij, Mark

    -frequency data. We document the importance of jump-robustness, when measuring heteroscedasticity in practice. We also find that a large fraction of variation in intraday volatility is accounted for by seasonality. This suggests that, once we control for jumps and deflate asset returns by a non-parametric estimate...
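Jump-robust measurement of return variation is commonly done by contrasting realized variance with bipower variation; a minimal sketch on a toy intraday return series (not the paper's resampling test itself):

```python
import math

def realized_variance(returns):
    """Sum of squared intraday returns; captures diffusion AND jumps."""
    return sum(r * r for r in returns)

def bipower_variation(returns):
    """Barndorff-Nielsen & Shephard bipower variation: robust to jumps,
    because a jump is multiplied by a small adjacent diffusive return."""
    mu1 = math.sqrt(2.0 / math.pi)  # E|Z| for standard normal Z
    return (1.0 / mu1 ** 2) * sum(
        abs(a) * abs(b) for a, b in zip(returns[:-1], returns[1:]))
```

The gap between the two statistics estimates the jump contribution; in a series with one large jump, realized variance is inflated while bipower variation stays near the diffusive level.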

  17. Segmenting Multiple Sclerosis Lesions using a Spatially Constrained K-Nearest Neighbour approach

    DEFF Research Database (Denmark)

    Lyksborg, Mark; Larsen, Rasmus; Sørensen, Per Soelberg

    2012-01-01

    We propose a method for the segmentation of Multiple Sclerosis lesions. The method is based on probability maps derived from a K-Nearest Neighbours classification. These are used as a non-parametric likelihood in a Bayesian formulation with a prior that assumes connectivity of neighbouring voxels. ...

  18. The Effects of Learning Activities Corresponding with Students’ Learning Styles on Academic Success and Attitude within the Scope of Constructivist Learning Approach: The Case of the Concepts of Function and Derivative

    Directory of Open Access Journals (Sweden)

    Kemal Özgen

    2014-04-01

    Full Text Available The aim of this study was to identify the effects of learning activities according to students’ learning styles on students’ academic success and attitude towards mathematics within the scope of the constructivist learning approach. The study had a quasi-experimental research design based on the pretest-posttest model with a control group. The participants of the study were students studying at a state high school in the 2010-2011 academic year. As part of the study, activities which were suitable to the students’ learning styles were developed within the scope of the constructivist learning approach in line with McCarthy’s 4MAT system with 8 steps of learning and used for the learning of the concepts of function and derivative. Data were collected using data collection tools such as a personal information form, non-routine problems, and a mathematics attitude scale. Descriptive and non-parametric statistics were used for the analysis of quantitative data. Data analysis indicated that the learning process in which activities appropriate for students’ learning styles were used contributed to an increase in the students’ academic success and problem-solving skills. Yet, there was no statistically significant difference in students’ attitudes towards mathematics. Key Words: Constructivist learning approach, learning style, learning activity, success, attitude

  19. A new approach to hierarchical data analysis: Targeted maximum likelihood estimation for the causal effect of a cluster-level exposure.

    Science.gov (United States)

    Balzer, Laura B; Zheng, Wenjing; van der Laan, Mark J; Petersen, Maya L

    2018-01-01

    We often seek to estimate the impact of an exposure naturally occurring or randomly assigned at the cluster-level. For example, the literature on neighborhood determinants of health continues to grow. Likewise, community randomized trials are applied to learn about real-world implementation, sustainability, and population effects of interventions with proven individual-level efficacy. In these settings, individual-level outcomes are correlated due to shared cluster-level factors, including the exposure, as well as social or biological interactions between individuals. To flexibly and efficiently estimate the effect of a cluster-level exposure, we present two targeted maximum likelihood estimators (TMLEs). The first TMLE is developed under a non-parametric causal model, which allows for arbitrary interactions between individuals within a cluster. These interactions include direct transmission of the outcome (i.e. contagion) and influence of one individual's covariates on another's outcome (i.e. covariate interference). The second TMLE is developed under a causal sub-model assuming the cluster-level and individual-specific covariates are sufficient to control for confounding. Simulations compare the alternative estimators and illustrate the potential gains from pairing individual-level risk factors and outcomes during estimation, while avoiding unwarranted assumptions. Our results suggest that estimation under the sub-model can result in bias and misleading inference in an observational setting. Incorporating working assumptions during estimation is more robust than assuming they hold in the underlying causal model. We illustrate our approach with an application to HIV prevention and treatment.

  20. A novel approach for analyzing data on recurrent events with duration to estimate the combined cumulative rate of both variables over time

    Directory of Open Access Journals (Sweden)

    Sudipta Bhattacharya

    2018-06-01

    Full Text Available Recurrent adverse events, once they occur, often continue for some duration of time in clinical trials; the number of events along with their durations is clinically considered a measure of the severity of the disease under study. While there are methods available for analyzing recurrent events or durations, or for analyzing both side by side, no effort has been made so far to combine them and present them as a single measure. However, this single-valued combined measure may help clinicians assess the overall effect of recurrence, comprising both events and durations. A non-parametric approach is adopted here to develop an estimator for estimating the combined rate of both the recurrence of events and the event continuation, that is, the duration per event. The proposed estimator produces a single numerical value, the interpretation and meaningfulness of which are discussed through the analysis of a real-life clinical dataset. The algebraic expression of the variance is derived, asymptotic normality of the estimator is noted, and a demonstration is provided of how the estimator can be used in statistical hypothesis testing. Further possible development of the estimator is also noted, to adjust for the dependence of event occurrences on the history of the process generating recurrent events through covariates, and for the case of dependent censoring. Keywords: Recurrent events, Duration per event, Intensity, Nelson-Aalen estimator
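The Nelson-Aalen estimator named in the keywords, the standard non-parametric estimate of a cumulative hazard/rate, can be sketched in a few lines; the event times and censoring indicators below are hypothetical:

```python
def nelson_aalen(times, events):
    """Nelson-Aalen estimate of the cumulative hazard.
    times: observation times; events: 1 = event observed, 0 = censored.
    Returns a step function as a list of (time, cumulative hazard) pairs."""
    order = sorted(zip(times, events))
    at_risk = len(order)
    cum_hazard, curve = 0.0, []
    i = 0
    while i < len(order):
        t = order[i][0]
        d = n_at_t = 0
        while i < len(order) and order[i][0] == t:  # group ties at time t
            d += order[i][1]
            n_at_t += 1
            i += 1
        if d:
            cum_hazard += d / at_risk  # increment: events / number at risk
            curve.append((t, cum_hazard))
        at_risk -= n_at_t
    return curve
```

The estimator proposed in the paper extends this style of counting-process reasoning so that each event increment also reflects the event's duration.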

  1. Survival dimensionality reduction (SDR): development and clinical application of an innovative approach to detect epistasis in presence of right-censored data

    Directory of Open Access Journals (Sweden)

    Beretta Lorenzo

    2010-08-01

    Full Text Available Abstract Background Epistasis is recognized as a fundamental part of the genetic architecture of individuals. Several computational approaches have been developed to model gene-gene interactions in case-control studies; however, none of them is suitable for time-dependent analysis. Herein we introduce the Survival Dimensionality Reduction (SDR) algorithm, a non-parametric method specifically designed to detect epistasis in lifetime datasets. Results The algorithm requires neither specification of the underlying survival distribution nor of the underlying interaction model, and proved sufficiently powerful to detect a set of causative genes in synthetic epistatic lifetime datasets with a limited number of samples and a high degree of right-censorship (up to 70%). The SDR method was then applied to a series of 386 Dutch patients with active rheumatoid arthritis that were treated with anti-TNF biological agents. Among a set of 39 candidate genes, none of which showed a detectable marginal effect on anti-TNF responses, the SDR algorithm did find that the rs1801274 SNP in the FcγRIIa gene and the rs10954213 SNP in the IRF5 gene non-linearly interact to predict clinical remission after anti-TNF biologicals. Conclusions Simulation studies and application in a real-world setting support the capability of the SDR algorithm to model epistatic interactions in candidate-gene studies in the presence of right-censored data. Availability: http://sourceforge.net/projects/sdrproject/

  2. Efficiency of mutual funds in Croatia: a DEA-based approach applied in the pre-crisis, crisis and post crisis period

    Directory of Open Access Journals (Sweden)

    Margareta Gardijan

    2017-01-01

    Full Text Available The aim of this paper is to estimate the overall performance of mutual funds in Croatia in terms of their relative efficiency based on several performance indicators using data envelopment analysis (DEA). DEA is a non-parametric method that can provide an overall relative efficiency score of a certain fund given a number of risk, cost, reward or profitability measures. Since traditional mutual fund performance indicators are mostly based on the CAPM paradigm, which demands rigid assumptions and questionable benchmarks, we endeavor to overcome the limitations of such an approach by considering more appropriate risk and reward measures, such as Expected Shortfall, stochastic dominance and higher-order moments. In this way, we developed an adjusted DEA-based mutual fund performance index. The efficiency scores obtained from the DEA model help in identifying efficient funds and ranking the funds based on certain criteria. DEA also identifies mutual fund(s) that can be benchmarks for other mutual funds that have similar investment strategies. These results were compared to various traditional indicators of absolute and relative risk-adjusted performance of mutual funds. The analysis was divided into three periods: the pre-crisis period, the crisis period and the post-crisis period, with different conclusions for mutual fund performances in Croatia. The analysis includes altogether 60 UCITS funds in Croatia, in the period from the beginning of 2005 until the end of 2015, and was conducted on daily data of share prices, available from the Bloomberg terminal.

  3. Measuring the CO2 shadow price for wastewater treatment: A directional distance function approach

    International Nuclear Information System (INIS)

    Molinos-Senante, María; Hanley, Nick; Sala-Garrido, Ramón

    2015-01-01

    Highlights: • The shadow price of CO2 informs about the marginal abatement cost of this pollutant. • The shadow price of CO2 is estimated for wastewater treatment plants. • The shadow prices depend on the setting of the directional vectors of the distance function. • Sewage sludge treatment technology affects the CO2 shadow price. Abstract: The estimation of the value of carbon emissions has become a major research and policy topic since the establishment of the Kyoto Protocol. The shadow price of CO2 provides information about the marginal abatement cost of this pollutant. It is an essential element in guiding environmental policy issues, since the CO2 shadow price can be used when fixing carbon tax rates, in environmental cost-benefit analysis and in ascertaining an initial market price for a trading system. The water industry could play an important role in the reduction of greenhouse gas (GHG) emissions. This paper estimates the shadow price of CO2 for a sample of wastewater treatment plants (WWTPs), using a parametric quadratic directional distance function. Following this, in a sensitivity analysis, the paper evaluates the impact of different settings of directional vectors on the shadow prices. Applying the Mann–Whitney and Kruskal–Wallis non-parametric tests, factors affecting CO2 prices are investigated. The variation of CO2 shadow prices across the WWTPs evaluated argues in favour of a market-based approach to CO2 mitigation as opposed to command-and-control regulation. The paper argues that the estimation of the shadow price of CO2 for non-power enterprises can provide incentives for reducing GHG emissions.
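The Mann-Whitney test used above to compare shadow prices across groups of plants can be sketched with the standard normal approximation (adequate for moderate samples without heavy ties; the inputs below are hypothetical):

```python
import math

def mann_whitney_u(x, y):
    """Mann-Whitney U statistic with a two-sided p-value from the
    normal approximation (no tie correction in sigma)."""
    nx, ny = len(x), len(y)
    # U = number of (x, y) pairs with x > y, counting ties as 1/2.
    u = sum((xi > yj) + 0.5 * (xi == yj) for xi in x for yj in y)
    mu = nx * ny / 2.0
    sigma = math.sqrt(nx * ny * (nx + ny + 1) / 12.0)
    z = (u - mu) / sigma
    p = math.erfc(abs(z) / math.sqrt(2.0))  # two-sided
    return u, p
```

A small p-value indicates that the shadow-price distributions of the two groups (e.g. two sludge treatment technologies) differ systematically.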

  4. Pedagogical approaches

    DEFF Research Database (Denmark)

    Lund Larsen, Lea

    questions are: How (much) and what do teachers learn from experience? And how do teachers of adults develop their pedagogical approach? I examine the field of adult learners from the teachers’ perspective. Firstly, I identify some of the commonly described characteristics of adults as learners...... in formal settings, but in most teaching settings, the teachers act alone and develop their pedagogical approaches/- teaching strategies with no synchronous sparring from a colleague. Adult learners have particular needs and characteristics that their teachers must be able to address (cf. Knowles...

  5. Capability approach

    DEFF Research Database (Denmark)

    Jensen, Niels Rosendal; Kjeldsen, Christian Christrup

    The textbook is the first comprehensive Danish presentation of the Capability Approach developed by Amartya Sen and Martha Nussbaum. The book contains a presentation and discussion of Sen's and Nussbaum's theoretical platform. The book includes examples from education and education policy, pedagogy, and care....

  6. Developing awareness of sustainability in nursing and midwifery using a scenario-based approach: Evidence from a pre and post educational intervention study.

    Science.gov (United States)

    Richardson, Janet; Grose, Jane; Bradbury, Martyn; Kelsey, Janet

    2017-07-01

    The delivery of healthcare has an impact on the environment and contributes to climate change. As a consequence, the way in which nurses and midwives use and dispose of natural resources in clinical practice, and the subsequent impact on the environment, should be an integral component of nursing and midwifery education. Opportunities need to be found to embed such issues into nursing curricula, thus bringing sustainability issues 'closer to home' and making them more relevant for clinical practice. The study was designed to measure the impact of a sustainability-focussed, scenario-based learning educational intervention on the attitudes and knowledge of student nurses and midwives. Pretest/posttest intervention study using scenario-based learning as the educational intervention. The Sustainability Attitudes in Nursing Survey (SANS_2) was used as the outcome measure. Clinical skills session in a UK University School of Nursing and Midwifery. 676 second-year undergraduate nursing and midwifery students. The 7-point scale SANS survey was completed before and after the teaching session; standard non-parametric analysis compared pre- and post-intervention scores. Changes were observed in attitude towards climate change and sustainability and to the inclusion of these topics within the nursing curricula (p=0.000). Participants demonstrated greater knowledge of natural resource use and the cost of waste disposal following the session (p=0.000). Participants also reported that sessions were realistic, and levels of agreement with statements supporting the value of the session and the interactive nature of delivery were higher following the session. Using a scenario-based learning approach with nursing and midwifery students can change attitudes and knowledge towards sustainability and climate change. Embedding this approach in the context of clinical skills provides a novel and engaging approach that is both educationally sound and clinically relevant. Copyright © 2017

  7. Narrative approaches

    DEFF Research Database (Denmark)

    Stelter, Reinhard

    2012-01-01

    Narrative coaching is representative of the new wave – or third generation – of coaching practice. The theory and practice of narrative coaching takes into account the social and cultural conditions of late modern society, and must be seen as intertwined with them. Some initial conceptualizations...... of narrative coaching were developed by David Drake (2006, 2007, 2008, 2009) in the USA and Australia, by Ho Law in the UK (Law, 2007a + b; Law & Stelter, 2009) and by Reinhard Stelter (2007, 2009, 2012, in preparation; Stelter & Law, 2010) in Denmark. In the following chapter the aim is to present coaching...... as a narrative-collaborative practice, an approach that is based on phenomenology, social constructionism and narrative theory. Seeing narrative coaching as a collaborative practice also leads to reflecting on the relationship between coach and coachee(s) in a new way, where both parts contribute to the dialogue...

  8. Oriented Approach

    Directory of Open Access Journals (Sweden)

    Seyed Mohammad Moghimi

    2013-12-01

    Full Text Available Promoting productivity is one of the goals of using information technology in organizations. The purpose of this research is examining the impact of IT on organizational productivity and recognizing its mechanisms based on a process-oriented approach. To this end, by reviewing the literature of the subject, a number of impacts of IT on organizational processes were identified. Then, through interviews with IT experts, seven main factors were selected and presented in a conceptual model. This model was tested through a questionnaire in 148 industrial companies. Data analysis shows that the impact of IT on productivity can be included in eight major categories: increasing automation, tracking, communication, improvement, flexibility, analytics, coordination and monitoring in organizational processes. Finally, to improve the impact of information technology on organizational productivity, some suggestions are presented.

  9. Predictive capacity of a non-radioisotopic local lymph node assay using flow cytometry, LLNA:BrdU-FCM: Comparison of a cutoff approach and inferential statistics.

    Science.gov (United States)

    Kim, Da-Eun; Yang, Hyeri; Jang, Won-Hee; Jung, Kyoung-Mi; Park, Miyoung; Choi, Jin Kyu; Jung, Mi-Sook; Jeon, Eun-Young; Heo, Yong; Yeo, Kyung-Wook; Jo, Ji-Hoon; Park, Jung Eun; Sohn, Soo Jung; Kim, Tae Sung; Ahn, Il Young; Jeong, Tae-Cheon; Lim, Kyung-Min; Bae, SeungJin

    2016-01-01

    In order for a novel test method to be applied for regulatory purposes, its reliability and relevance, i.e., reproducibility and predictive capacity, must be demonstrated. Here, we examine the predictive capacity of a novel non-radioisotopic local lymph node assay, LLNA:BrdU-FCM (5-bromo-2'-deoxyuridine-flow cytometry), with a cutoff approach and inferential statistics as a prediction model. 22 reference substances in OECD TG429 were tested with a concurrent positive control, 25% hexylcinnamaldehyde (PC), and the stimulation index (SI) representing the fold increase in lymph node cells over the vehicle control was obtained. The optimal cutoff SI (2.7 ≤ cutoff < 3.5), with respect to predictive capacity, was obtained by a receiver operating characteristic curve, which produced 90.9% accuracy for the 22 substances. To address the inter-test variability in responsiveness, SI values standardized with the PC were employed to obtain the optimal percentage cutoff (42.6 ≤ cutoff < 57.3% of PC), which produced 86.4% accuracy. A test substance may be diagnosed as a sensitizer if a statistically significant increase in SI is elicited. The parametric one-sided t-test and non-parametric Wilcoxon rank-sum test produced 77.3% accuracy. Similarly, a test substance could be defined as a sensitizer if the SI means of the vehicle control and of the low, middle, and high concentrations were statistically significantly different, which was tested using ANOVA or Kruskal-Wallis with post hoc analysis, Dunnett or DSCF (Dwass-Steel-Critchlow-Fligner), respectively, depending on the equal variance test, producing 81.8% accuracy. The absolute SI-based cutoff approach produced the best predictive capacity; however, the discordant decisions between prediction models need to be examined further. Copyright © 2015 Elsevier Inc. All rights reserved.
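The cutoff selection described, choosing the SI threshold that maximises accuracy against known sensitizer labels, reduces to a sweep over candidate cutoffs (the essence of reading the optimum off a ROC curve). The SI values and labels below are hypothetical, not the TG429 reference data:

```python
def best_cutoff(si_values, labels):
    """Sweep candidate cutoffs (the observed SI values themselves) and
    return the one maximising accuracy, predicting sensitizer iff
    SI >= cutoff. labels: 1 = known sensitizer, 0 = non-sensitizer."""
    best = (0.0, None)
    for cut in sorted(set(si_values)):
        acc = sum((si >= cut) == bool(lab)
                  for si, lab in zip(si_values, labels)) / len(labels)
        if acc > best[0]:
            best = (acc, cut)
    return best  # (accuracy, cutoff)
```

With more substances, ties between adjacent cutoffs produce a half-open optimal interval, which is why the paper reports ranges such as 2.7 ≤ cutoff < 3.5 rather than a single value.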

  10. Systematic pain assessment in nursing homes: a cluster-randomized trial using mixed-methods approach.

    Science.gov (United States)

    Mamhidir, Anna-Greta; Sjölund, Britt-Marie; Fläckman, Birgitta; Wimo, Anders; Sköldunger, Anders; Engström, Maria

    2017-02-28

    Chronic pain affects nursing home residents' daily life. Pain assessment is central to adequate pain management. The overall aim was to investigate the effects of a pain management intervention on nursing home residents and to describe staff's experiences of the intervention. A cluster-randomized trial and a mixed-methods approach. Nursing homes were randomly assigned to the intervention or comparison group. The intervention group, after theoretical and practical training sessions, performed systematic pain assessments using predominantly observational scales, with external and internal facilitators supporting the implementation. No measures were taken in the comparison group; pain management continued as before, but after the study corresponding training was provided. Resident data were collected at baseline and at two follow-ups using validated scales and record reviews. Nurse group interviews were carried out twice. Primary outcome measures were wellbeing and proxy-measured pain. Secondary outcome measures were ADL-dependency and pain documentation. Using both non-parametric statistics on the residential level and generalized estimating equation (GEE) models to take clustering effects into account, the results revealed non-significant interaction effects for the primary outcome measures, while for ADL-dependency using Katz-ADL there was a significant interaction effect. Comparison group (n = 66 residents) Katz-ADL values showed increased dependency over time, while the intervention group demonstrated no significant change over time (n = 98). In the intervention group, 13/44 residents showed decreased pain scores over the period, 14/44 had no pain score changes ≥ 30% in either direction measured with Doloplus-2. Furthermore, 17/44 residents showed increased pain scores ≥ 30% over time, indicating pain/risk for pain; 8 identified at the first assessment and 9 were new, i.e. developed pain over time. No significant changes in the use of drugs were found in any of

  11. The evolution of illness phases in schizophrenia: A non-parametric item response analysis of the Positive and Negative Syndrome Scale

    Directory of Open Access Journals (Sweden)

    Anzalee Khan

    2014-06-01

    Conclusion: Findings confirm differences in symptom presentation and predominance of particular domains in subpopulations of schizophrenia. Identifying symptom domains characteristic of subpopulations may be more useful in assessing efficacy endpoints than total or subscale scores.

  12. A novel method for non-parametric identification of nonlinear restoring forces in nonlinear vibrations from noisy response data: A conservative system

    International Nuclear Information System (INIS)

    Jang, T. S.; Kwon, S. H.; Han, S. L.

    2009-01-01

    A novel procedure is proposed to identify the functional form of nonlinear restoring forces in the nonlinear oscillatory motion of a conservative system. Although the identification problem has a unique solution, its formulation results in a Volterra-type integral equation of the first kind: because the integral equation is of the first kind, the solution lacks stability, and the problem at hand is thus ill-posed. Inevitable small errors during the identification procedure can make the prediction of nonlinear restoring forces useless. In this study, we overcome the difficulty by using Landweber's regularization, a stabilization technique. The capability of the proposed procedure is investigated through numerical examples
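
An illustrative sketch (not the authors' implementation) of the idea behind Landweber regularization: after discretizing a first-kind integral equation into a linear system A x = b, the iteration converges gradually toward the least-squares solution, and stopping early acts as regularization against noise. The toy matrix and solution below are made up.

```python
# Landweber iteration for the (discretized) ill-posed problem A x = b:
#   x_{k+1} = x_k + omega * A^T (b - A x_k),  with omega < 2 / ||A^T A||.

def matvec(A, x):
    return [sum(a * v for a, v in zip(row, x)) for row in A]

def transpose(A):
    return [list(col) for col in zip(*A)]

def landweber(A, b, omega, n_iter):
    At = transpose(A)
    x = [0.0] * len(A[0])
    for _ in range(n_iter):
        r = [bi - yi for bi, yi in zip(b, matvec(A, x))]   # residual b - A x
        g = matvec(At, r)                                   # gradient A^T r
        x = [xi + omega * gi for xi, gi in zip(x, g)]
    return x

# Small synthetic problem with known solution x_true = [1.0, 2.0].
A = [[1.0, 0.5], [0.5, 1.0]]
x_true = [1.0, 2.0]
b = matvec(A, x_true)
x_est = landweber(A, b, omega=0.5, n_iter=500)
print(x_est)
```

With noisy data b, one would stop the iteration early (discrepancy principle) rather than run it to convergence.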

  13. Binding affinity toward human prion protein of some anti-prion compounds - Assessment based on QSAR modeling, molecular docking and non-parametric ranking.

    Science.gov (United States)

    Kovačević, Strahinja; Karadžić, Milica; Podunavac-Kuzmanović, Sanja; Jevrić, Lidija

    2018-01-01

    The present study is based on the quantitative structure-activity relationship (QSAR) analysis of binding affinity toward human prion protein (huPrP^C) of quinacrine, pyridine dicarbonitrile, diphenylthiazole and diphenyloxazole analogs applying different linear and non-linear chemometric regression techniques, including univariate linear regression, multiple linear regression, partial least squares regression and artificial neural networks. The QSAR analysis distinguished molecular lipophilicity as an important factor that contributes to the binding affinity. Principal component analysis was used in order to reveal similarities or dissimilarities among the studied compounds. The analysis of in silico absorption, distribution, metabolism, excretion and toxicity (ADMET) parameters was conducted. The ranking of the studied analogs on the basis of their ADMET parameters was done applying the sum of ranking differences, a relatively new chemometric method. The main aim of the study was to reveal the most important molecular features whose changes lead to the changes in the binding affinities of the studied compounds. Another point of view on the binding affinity of the most promising analogs was established by application of molecular docking analysis. The results of the molecular docking were proven to be in agreement with the experimental outcome. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Comparing parametric and non-parametric classifiers for remote sensing of tree species across a land use gradient in a Savanna landscape

    CSIR Research Space (South Africa)

    Cho, Moses A

    2012-11-01

    Full Text Available ) and Random Forest (RF)). The spectral data used consisted of 8 WorldView-2 multispectral bands simulated from a 72-band VNIR image acquired over the study areas using the Carnegie Airborne Observatory (CAO) system. With the exception of SAM, the nonparametric...

  15. PARAMETRIC AND NON PARAMETRIC (MARS: MULTIVARIATE ADAPTIVE REGRESSION SPLINES) LOGISTIC REGRESSIONS FOR PREDICTION OF A DICHOTOMOUS RESPONSE VARIABLE WITH AN EXAMPLE FOR PRESENCE/ABSENCE OF AMPHIBIANS

    Science.gov (United States)

    The purpose of this report is to provide a reference manual that could be used by investigators for making informed use of logistic regression using two methods (standard logistic regression and MARS). The details for analyses of relationships between a dependent binary response ...

  16. The impact of ICT on educational performance and its efficiency in selected EU and OECD countries: a non-parametric analysis

    OpenAIRE

    Aristovnik, Aleksander

    2012-01-01

    The purpose of the paper is to review some previous researches examining ICT efficiency and the impact of ICT on educational output/outcome as well as different conceptual and methodological issues related to performance measurement. Moreover, a definition, measurements and the empirical application of a model measuring the efficiency of ICT use and its impact at national levels will be considered. For this purpose, the Data Envelopment Analysis (DEA) technique is presented and then applied t...

  17. Are Public-Private Partnerships a Source of Greater Efficiency in Water Supply? Results of a Non-Parametric Performance Analysis Relating to the Italian Industry

    Directory of Open Access Journals (Sweden)

    Corrado lo Storto

    2013-12-01

    Full Text Available This article reports the outcome of a performance study of the water service provision industry in Italy. The study evaluates the efficiency of 21 “private or public-private” equity and 32 “public” equity water service operators and investigates controlling factors. In particular, the influence that the operator typology and service management nature - private vs. public - has on efficiency is assessed. The study employed a two-stage Data Envelopment Analysis methodology. In the first stage, the operational efficiency of water supply operators is calculated by implementing a conventional BCC DEA model that uses both physical infrastructure and financial input and output variables to explore economies of scale. In the second stage, bootstrapped DEA and Tobit regression are performed to estimate the influence that a number of environmental factors have on water supplier efficiency. The results show that the integrated water provision industry in Italy is characterized by operational inefficiencies of service operators, and scale and agglomeration economies may have a non-negligible effect on efficiency. In addition, the operator typology and its geographical location affect efficiency.
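
To illustrate the DEA building block, here is a minimal sketch of an input-oriented envelopment model. Note the assumptions: this is a constant-returns CCR variant, not the BCC model with bootstrapping and Tobit regression used in the study, and the one-input/one-output toy data are invented.

```python
# Input-oriented CCR DEA: for unit j0, minimize theta subject to
#   sum_j lam_j * x_j <= theta * x_j0   (inputs of a composite peer)
#   sum_j lam_j * y_j >= y_j0           (outputs of the composite peer)
#   lam_j >= 0.
from scipy.optimize import linprog

def ccr_efficiency(inputs, outputs, j0):
    n = len(inputs)                      # number of units (DMUs)
    m = len(inputs[0])                   # number of inputs
    s = len(outputs[0])                  # number of outputs
    c = [1.0] + [0.0] * n                # decision vars: [theta, lam_1..lam_n]
    A_ub, b_ub = [], []
    for i in range(m):                   # input constraints
        A_ub.append([-inputs[j0][i]] + [inputs[j][i] for j in range(n)])
        b_ub.append(0.0)
    for r in range(s):                   # output constraints, >= flipped to <=
        A_ub.append([0.0] + [-outputs[j][r] for j in range(n)])
        b_ub.append(-outputs[j0][r])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.fun

# Toy data: one input (operating cost) and one output (water delivered).
x = [[2.0], [4.0], [6.0]]
y = [[2.0], [3.0], [3.0]]
scores = [ccr_efficiency(x, y, j) for j in range(len(x))]
print(scores)   # unit 0 is the efficient benchmark
```

In a second stage, the efficiency scores could then be regressed on environmental factors, as the study does with Tobit regression.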

  18. A statistical approach to bioclimatic trend detection in the airborne pollen records of Catalonia (NE Spain)

    Science.gov (United States)

    Fernández-Llamazares, Álvaro; Belmonte, Jordina; Delgado, Rosario; De Linares, Concepción

    2014-04-01

    Airborne pollen records are a suitable indicator for the study of climate change. The present work focuses on the role of annual pollen indices for the detection of bioclimatic trends through the analysis of the aerobiological spectra of 11 taxa of great biogeographical relevance in Catalonia over an 18-year period (1994-2011), by means of different parametric and non-parametric statistical methods. Among others, two non-parametric rank-based statistical tests were performed for detecting monotonic trends in time series data of the selected airborne pollen types, and we have observed that they have similar power in detecting trends. Except for those cases in which the pollen data can be well-modeled by a normal distribution, it is better to apply non-parametric statistical methods to aerobiological studies. Our results provide a reliable representation of the pollen trends in the region and suggest that greater pollen quantities are being liberated to the atmosphere in recent years, especially by Mediterranean taxa such as Pinus, Total Quercus and Evergreen Quercus, although the trends may differ geographically. Longer aerobiological monitoring periods are required to corroborate these results and survey the increasing levels of certain pollen types that could exert an impact on public health.
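
A sketch of one widely used non-parametric rank-based trend test, the Mann-Kendall test (the abstract does not name the specific tests used, so this is an assumption; the annual-index numbers below are invented):

```python
# Mann-Kendall test for a monotonic trend in a time series.
import math

def mann_kendall(series):
    """Return (S, Z, p_two_sided) for a series without tied values."""
    n = len(series)
    # S counts concordant minus discordant pairs (i < j).
    s = sum((series[j] > series[i]) - (series[j] < series[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0     # variance when there are no ties
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    p = math.erfc(abs(z) / math.sqrt(2))         # two-sided normal p-value
    return s, z, p

# A steadily increasing annual pollen index should yield a significant trend.
annual_index = [100, 112, 108, 125, 130, 141, 138, 155, 160, 171]
S, Z, p = mann_kendall(annual_index)
print(S, Z, p)
```

Because the test uses only the ranks of the observations, it does not require the normality assumption mentioned in the abstract.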

  19. Approaches for the accurate definition of geological time boundaries

    Science.gov (United States)

    Schaltegger, Urs; Baresel, Björn; Ovtcharova, Maria; Goudemand, Nicolas; Bucher, Hugo

    2015-04-01

    of the ash, therefore masking the true age of deposition. Trace element ratios such as Th/U, Yb/Gd, as well as Hf isotope analysis of dated zircon can be used to decipher the temporal evolution of the magmatic system before the eruption and deposition of the studied ashes, and resolve the complex system behaviour of the zircons. b) Changes in the source of the magma may happen between the deposition of two stratigraphically consecutive ash beds. They result in the modification of the trace element signature of zircon, but also of apatite (Ca5(PO4)3(F,Cl,OH)). Trace element characteristics in apatite (e.g. Mg, Mn, Fe, F, Cl, Ce, and Y) are a reliable tool for distinguishing chemically similar groups of apatite crystals to unravel the geochemical fingerprint of one single ash bed. By establishing this fingerprint, ash beds of geographically separated geologic sections can be correlated even if they have not all been dated by U-Pb techniques. c) The ultimate goal of quantitative stratigraphy is to establish an age model that predicts the age of a synchronous time line with an associated 95% confidence interval for any such line within a stratigraphic sequence. We show how a Bayesian, non-parametric interpolation approach can be applied to very complex data sets and leads to a well-defined age solution, possibly identifying changes in sedimentation rate. The age of a geological time boundary bracketed by dated samples in such an age model can be defined with an associated uncertainty.

  20. Screen Wars, Star Wars, and Sequels: Nonparametric Reanalysis of Movie Profitability

    OpenAIRE

    W. D. Walls

    2012-01-01

    In this paper we use nonparametric statistical tools to quantify motion-picture profit. We quantify the unconditional distribution of profit, the distribution of profit conditional on stars and sequels, and we also model the conditional expectation of movie profits using a non-parametric data-driven regression model. The flexibility of the non-parametric approach accommodates the full range of possible relationships among the variables without prior specification of a functional form, thereb...
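
One simple data-driven regression estimator of the kind described, requiring no prior functional form, is the Nadaraya-Watson kernel smoother. The sketch below is illustrative only (the paper's exact estimator and data are not reproduced; the toy relationship and bandwidth are invented):

```python
# Nadaraya-Watson kernel regression: a locally weighted average that
# estimates a conditional expectation such as E[profit | budget].
import math

def nw_regress(x_train, y_train, x0, bandwidth):
    """Gaussian-kernel weighted average of y around x0."""
    weights = [math.exp(-0.5 * ((x - x0) / bandwidth) ** 2) for x in x_train]
    return sum(w * y for w, y in zip(weights, y_train)) / sum(weights)

# Noise-free toy relationship y = x^2; the estimator itself assumes no
# functional form and simply smooths the observed pairs.
xs = [i * 0.25 for i in range(17)]          # 0.0 .. 4.0
ys = [x * x for x in xs]
estimate = nw_regress(xs, ys, x0=2.0, bandwidth=0.3)
print(estimate)                              # near 4, plus a small smoothing bias
```

The bandwidth controls the bias-variance trade-off; in practice it would be chosen by cross-validation rather than fixed by hand.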

  1. Density forecasts of crude-oil prices using option-implied and ARCH-type models

    DEFF Research Database (Denmark)

    Høg, Esben; Tsiaras, Leonicas

    2011-01-01

    of derivative contracts. Risk-neutral densities, obtained from panels of crude-oil option prices, are adjusted to reflect real-world risks using either a parametric or a non-parametric calibration approach. The relative performance of the models is evaluated for the entire support of the density, as well...... obtained by option prices and non-parametric calibration methods over those constructed using historical returns and simulated ARCH processes. © 2010 Wiley Periodicals, Inc. Jrl Fut Mark...

  2. Estimation of Esfarayen Farmers Risk Aversion Coefficient and Its Influencing Factors (Nonparametric Approach

    Directory of Open Access Journals (Sweden)

    Z. Nematollahi

    2016-03-01

    Full Text Available Introduction: Due to the existence of risk and uncertainty in agriculture, risk management is crucial for agricultural management. The present study was therefore designed to determine the risk aversion coefficient of Esfarayen farmers. Materials and Methods: The following approaches have been used to assess risk attitudes: (1) direct elicitation of utility functions, (2) experimental procedures in which individuals are presented with hypothetical questionnaires regarding risky alternatives with or without real payments, and (3) inference from observation of economic behavior. In this paper, we focused on approach (3), inference from observation of economic behavior, based on the assumption of a relationship between the actual behavior of a decision maker and the behavior predicted from empirically specified models. A new non-parametric method and the QP method were used to calculate the coefficient of risk aversion. We maximized the decision maker's expected utility with the E-V formulation (Freund, 1956). Ideally, in constructing a QP model, the variance-covariance matrix should be formed for each individual farmer. For this purpose, a sample of 100 farmers was selected using random sampling, and their data on 14 products from the years 2008-2012 were assembled. The lowlands of Esfarayen were used since, within this area, production possibilities are rather homogeneous. Results and Discussion: The results of this study showed that there was low correlation between some of the activities, which implies opportunities for income stabilization through diversification. With respect to transitory income, Ra values vary from 0.000006 to 0.000361, and the absolute coefficient of risk aversion in our sample was 0.00005. The estimated Ra values vary considerably from farm to farm. The estimated Ra for the subsample of 'non-wealthy' farmers was 0.00010. The subsample with farmers in the 'wealthy' group had an
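
The E-V criterion behind the QP formulation (Freund, 1956) can be sketched as maximizing E[income] - (Ra/2) * Var[income]. The two-crop example below is hypothetical (not the Esfarayen data) and uses a grid search in place of a proper QP solver:

```python
# E-V utility maximization over the share w of a risky crop A vs a safe
# crop B, assuming zero covariance between the two crop incomes.

def best_share(mean_a, var_a, mean_b, var_b, ra, steps=1000):
    """Grid-search the share w of crop A maximizing E - (Ra/2)*Var."""
    best_w, best_u = 0.0, float("-inf")
    for k in range(steps + 1):
        w = k / steps
        e = w * mean_a + (1 - w) * mean_b                # expected income
        v = w * w * var_a + (1 - w) * (1 - w) * var_b    # income variance
        u = e - 0.5 * ra * v                             # E-V utility
        if u > best_u:
            best_w, best_u = w, u
    return best_w

# Crop A: higher mean, higher variance. Crop B: safer.
w_low = best_share(10.0, 4.0, 8.0, 1.0, ra=0.5)   # mildly risk-averse
w_high = best_share(10.0, 4.0, 8.0, 1.0, ra=2.0)  # strongly risk-averse
print(w_low, w_high)   # the more risk-averse farmer plants less of crop A
```

Solving this analytically gives the interior optimum w* = 0.2 + 0.4/Ra for these numbers, so the grid search should return 1.0 (boundary) and 0.4 respectively.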

  3. A discriminative model-constrained EM approach to 3D MRI brain tissue classification and intensity non-uniformity correction

    International Nuclear Information System (INIS)

    Wels, Michael; Hornegger, Joachim; Zheng Yefeng; Comaniciu, Dorin; Huber, Martin

    2011-01-01

    We describe a fully automated method for tissue classification, which is the segmentation into cerebral gray matter (GM), cerebral white matter (WM), and cerebral spinal fluid (CSF), and intensity non-uniformity (INU) correction in brain magnetic resonance imaging (MRI) volumes. It combines supervised MRI modality-specific discriminative modeling and unsupervised statistical expectation maximization (EM) segmentation into an integrated Bayesian framework. While both the parametric observation models and the non-parametrically modeled INUs are estimated via EM during segmentation itself, a Markov random field (MRF) prior model regularizes segmentation and parameter estimation. Firstly, the regularization takes into account knowledge about spatial and appearance-related homogeneity of segments in terms of pairwise clique potentials of adjacent voxels. Secondly and more importantly, patient-specific knowledge about the global spatial distribution of brain tissue is incorporated into the segmentation process via unary clique potentials. They are based on a strong discriminative model provided by a probabilistic boosting tree (PBT) for classifying image voxels. It relies on the surrounding context and alignment-based features derived from a probabilistic anatomical atlas. The context considered is encoded by 3D Haar-like features of reduced INU sensitivity. Alignment is carried out fully automatically by means of an affine registration algorithm minimizing cross-correlation. Both types of features do not immediately use the observed intensities provided by the MRI modality but instead rely on specifically transformed features, which are less sensitive to MRI artifacts. Detailed quantitative evaluations on standard phantom scans and standard real-world data show the accuracy and robustness of the proposed method. They also demonstrate relative superiority in comparison to other state-of-the-art approaches to this kind of computational task: our method achieves average
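
A radically simplified sketch of the unsupervised half of such a pipeline: EM fitting of a two-component Gaussian mixture to 1D intensities. The full method additionally uses discriminative PBT potentials, an MRF prior and non-parametric INU estimation, none of which are reproduced here; the intensity clusters are synthetic stand-ins for two tissue classes.

```python
# EM for a two-component 1D Gaussian mixture with equal priors.
import math, random

def em_gmm_1d(data, n_iter=50):
    mu = [min(data), max(data)]          # crude initialization
    sigma = [1.0, 1.0]
    for _ in range(n_iter):
        # E-step: responsibility of component 0 for each observation.
        resp = []
        for x in data:
            p = [math.exp(-0.5 * ((x - mu[k]) / sigma[k]) ** 2) / sigma[k]
                 for k in (0, 1)]
            resp.append(p[0] / (p[0] + p[1]))
        # M-step: re-estimate means and standard deviations.
        for k, r in ((0, resp), (1, [1 - g for g in resp])):
            n_k = sum(r)
            mu[k] = sum(g * x for g, x in zip(r, data)) / n_k
            var = sum(g * (x - mu[k]) ** 2 for g, x in zip(r, data)) / n_k
            sigma[k] = max(math.sqrt(var), 1e-3)
    return mu, sigma

random.seed(0)
# Two well-separated intensity clusters standing in for two tissue classes.
data = ([random.gauss(30.0, 2.0) for _ in range(200)] +
        [random.gauss(70.0, 2.0) for _ in range(200)])
mu, sigma = em_gmm_1d(data)
print(sorted(mu))
```

In the paper's framework, the E-step posteriors would additionally be weighted by the PBT discriminative term and regularized by the MRF clique potentials.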

  4. Life Span Developmental Approach

    OpenAIRE

    Ali Eryilmaz

    2011-01-01

    The Life Span Developmental Approach examines the development of individuals from birth to death. The life span developmental approach is a multi-disciplinary approach, related to disciplines such as psychology, psychiatry, sociology, anthropology and geriatrics, which holds that development is not completed in adulthood but continues throughout the life course. Development is a complex process that includes dying and death. This approach carefully investigates the development of...

  5. The sustainable livelihoods approach

    DEFF Research Database (Denmark)

    Oelofse, Myles; Jensen, Henning Høgh

    2008-01-01

    food chain has on producers and their families, an analysis was conducted of the use of the Sustainable Livelihoods Approach (SLA). The SLA provides a holistic and integrative approach which researchers can use as the overriding frame for their research. The application of the approach is recommended...

  6. A study of different approaches for multi-scale sensitivity analysis of the TALL-3D experiment using thermal-hydraulic computer codes

    International Nuclear Information System (INIS)

    Geffray, Clotaire; Macian-Juan, Rafael

    2014-01-01

    In the context of the FP7 European THINS Project, complex thermal-hydraulic phenomena relevant for the Generation IV of nuclear reactors are investigated. KTH (Sweden) built the TALL-3D facility to investigate the transition from forced to natural circulation of the Lead-Bismuth Eutectic (LBE) in a pool connected to a 3-leg primary circuit with two heaters and a heat exchanger. The simulation of such 3D phenomena is a challenging task. GRS (Germany) developed the coupling between the Computational Fluid Dynamics (CFD) code ANSYS CFX and the System Analysis code ATHLET. Such coupled codes combine the advantages of CFD, which allow a fine resolution of 3D phenomena, and of System Analysis codes, which are fast running. TUM (Germany) is responsible for the Uncertainty and Sensitivity Analysis of the coupled ATHLET-CFX model in the THINS Project. The influence of modeling uncertainty on simulation results needs to be assessed to characterize and to improve the model and, eventually, to assess its performance against experimental data. TUM has developed a computational framework capable of propagating model input uncertainty through coupled codes. This framework can also be used to apply different approaches for the assessment of the influence of the uncertain input parameters on the model output (Sensitivity Analysis). The work reported in this paper focuses on three methods for the assessment of the sensitivity of the results to the modeling uncertainty. The first method (Morris) allows for the computation of the Elementary Effects resulting from the input parameters. This method is widely used to perform Screening Analysis. The second method (Spearman's rank correlation) relies on regression-based non-parametric measures. This method is suitable if the relation between the input and the output variables is at least monotonic, with the advantage of a low computational cost. The last method (Sobol') computes so-called total effect indices which account for
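
The Morris screening idea can be sketched as follows. The model below is a cheap stand-in for the coupled ATHLET-CFX simulator (an invented function, not the THINS model), and the one-point-per-parameter perturbation scheme is a simplification of full Morris trajectories:

```python
# Morris elementary effects: EE_i = (f(x + delta*e_i) - f(x)) / delta,
# averaged in absolute value (mu*) over random base points.
import random

def model(x):
    """Stand-in for the expensive simulation: x0 matters, x1 barely does."""
    return 2.0 * x[0] + 0.1 * x[1] ** 2

def morris_screening(f, dim, n_trajectories=50, delta=0.1, seed=1):
    rng = random.Random(seed)
    mu_star = [0.0] * dim
    for _ in range(n_trajectories):
        base = [rng.random() for _ in range(dim)]
        f0 = f(base)
        for i in range(dim):
            pert = list(base)
            pert[i] += delta                       # perturb one input at a time
            mu_star[i] += abs(f(pert) - f0) / delta
    return [m / n_trajectories for m in mu_star]

effects = morris_screening(model, dim=2)
print(effects)   # input 0 dominates input 1
```

A large mu* flags a parameter as influential and worth carrying into the more expensive Spearman or Sobol' analyses.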

  7. Alternative Auditing Approaches

    Energy Technology Data Exchange (ETDEWEB)

    Kandt, Alicen J [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-09-15

    This presentation for the 2017 Energy Exchange in Tampa, Florida, offers information about advanced auditing technologies and techniques including alternative auditing approaches and considerations and caveats.

  8. Evaluating six soft approaches

    DEFF Research Database (Denmark)

    Sørensen, Lene; Vidal, Rene Victor Valqui

    2006-01-01

    ’s interactive planning principles to be supported by soft approaches in carrying out the principles in action. These six soft approaches are suitable for supporting various steps of the strategy development and planning process. These are the SWOT analysis, the Future Workshop, the Scenario methodology......, Strategic Option Development and Analysis, Strategic Choice Approach and Soft Systems Methodology. Evaluations of each methodology are carried out using a conceptual framework in which the organisation, the result, the process and the technology of the specific approach are taken into consideration. Using...

  9. Evaluating Six Soft Approaches

    DEFF Research Database (Denmark)

    Sørensen, Lene Tolstrup; Valqui Vidal, René Victor

    2008-01-01

    's interactive planning principles to be supported by soft approaches in carrying out the principles in action. These six soft approaches are suitable for supporting various steps of the strategy development and planning process. These are the SWOT analysis, the Future Workshop, the Scenario methodology......, Strategic Option Development and Analysis, Strategic Choice Approach and Soft Systems Methodology. Evaluations of each methodology are carried out using a conceptual framework in which the organisation, the result, the process and the technology of the specific approach are taken into consideration. Using...

  10. Evaluating six soft approaches

    DEFF Research Database (Denmark)

    Sørensen, Lene Tolstrup; Vidal, Rene Victor Valqui

    2008-01-01

    's interactive planning principles to be supported by soft approaches in carrying out the principles in action. These six soft approaches are suitable for supporting various steps of the strategy development and planning process. These are the SWOT analysis, the Future Workshop, the Scenario methodology......, Strategic Option Development and Analysis, Strategic Choice Approach and Soft Systems Methodology. Evaluations of each methodology are carried out using a conceptual framework in which the organisation, the result, the process and the technology of the specific approach are taken into consideration. Using...

  11. A new approach for product cost estimation using data envelopment analysis

    Directory of Open Access Journals (Sweden)

    Adil Salam

    2012-10-01

    Full Text Available Cost estimation of new products has always been difficult as only few design, manufacturing and operational features will be known. In these situations, parametric or non-parametric methods are commonly used to estimate the cost of a product given the corresponding cost drivers. The parametric models use an a priori determined cost function where the parameters of the function are evaluated from historical data. Non-parametric methods, on the other hand, attempt to fit curves to the historic data without a predetermined function. In both methods, it is assumed that the historic data used in the analysis is a true representation of the relation between the cost drivers and the corresponding costs. However, because of efficiency variations of the manufacturers and suppliers, changes in supplier selections, market fluctuations, and several other reasons, certain costs in the historic data may be too high whereas other costs may represent better deals for their corresponding cost drivers. Thus, it may be important to rank the historic data and identify benchmarks and estimate the target costs of the product based on these benchmarks. In this paper, a novel adaptation of cost drivers and cost data is introduced in order to use data envelopment analysis for the purpose of ranking cost data and identifying benchmarks, and then estimating the target costs of a new product based on these benchmarks. An illustrative case study has been presented for the cost estimation of landing gears of an aircraft manufactured by an aerospace company located in Montreal, Canada.

  12. Stuttering-Psycholinguistic Approach

    Science.gov (United States)

    Hategan, Carolina Bodea; Anca, Maria; Prihoi, Lacramioara

    2012-01-01

    This research promotes the psycholinguistic paradigm, focusing on delimiting several specific particularities of stuttering pathology. A structural approach across the components of language highlights both the recurrent aspects found in the specialized national and international literature and the dependence of psycholinguistic approaches on the features of the…

  13. Approaches to understand culture

    DEFF Research Database (Denmark)

    Rasmussen, Lauge Baungaard; Rauner, Felix

    1996-01-01

    Different approaches to understand the concept of culture are presented and evaluated. The author's concept of culture is defined. Different aspects of the concept are discussed.

  14. Flipped Classroom Approach

    Science.gov (United States)

    Ozdamli, Fezile; Asiksoy, Gulsum

    2016-01-01

    Flipped classroom is an active, student-centered approach that was formed to increase the quality of the period within class. This approach, whose applications have mostly been in the Physical Sciences, has also recently attracted the attention of educators and researchers in different disciplines. Flipped classroom learning, which is spreading rapidly…

  15. The Geodynamic Approach

    DEFF Research Database (Denmark)

    Steenfelt, Jørgen S.; Ibsen, Lars Bo

    1996-01-01

    The Danish National lecture: The Geodynamic approach - problem or possibility? - mirrors the authors' involvement in projects and research focusing on the impact of the geodynamic approach. The lecture discusses the why and how of some of the geotechnical anomalies and the differences in traditional

  16. Medicinal plants in the cultural landscape of a Mapuche-Tehuelche community in arid Argentine Patagonia: an eco-sensorial approach.

    Science.gov (United States)

    Molares, Soledad; Ladio, Ana

    2014-08-26

    The taste and smell of medicinal plants and their relation to the cultural landscape of a Mapuche-Tehuelche community in the Patagonian steppe were investigated. We assume that the landscape, as a source of therapeutic resources, is perceived, classified and named according to different symbolic, ecological and utilitarian criteria, which are influenced by the chemosensorial appearance of the medicinal plants valued by inhabitants. Information relating to the cultural landscape experienced by 18 inhabitants, representing 85% of the families, in terms of medicinal plants, knowledge of species and their organoleptic perception, was obtained through participant observation, interviews and free listing. The data were examined using qualitative and quantitative approaches, including discourse analysis and non-parametric statistics. Informants use 121 medicinal species, obtained from both wild and non-wild environments, most of which (66%) present aroma and/or taste. It was found that the plants with the highest use consensus, used for digestive, respiratory, cardio-vascular, analgesic-anti-inflammatory, obstetric-gynaecological and genito-urinary complaints, have the highest frequencies of citations reporting flavor, and those with the highest frequencies relating to digestive, analgesic-anti-inflammatory and cultural syndromes present the highest frequencies of aroma. Flavor and/or aroma are interpreted as strong or soft, and the strongest are associated with the treatment of supernatural ailments. Also, taste is a distinctive trait for most of the species collected in all natural units of the landscape, while aroma is more closely associated with species growing at higher altitudes. The local pharmacopeia is also enriched with plants that come from more distant phytogeographical environments, such as the Andean forest and the Patagonian Monte, which are obtained through barter with neighboring populations. Herbal products are also obtained in regional shops. 
The practices of

  17. Combining empirical approaches and error modelling to enhance predictive uncertainty estimation in extrapolation for operational flood forecasting. Tests on flood events on the Loire basin, France.

    Science.gov (United States)

    Berthet, Lionel; Marty, Renaud; Bourgin, François; Viatgé, Julie; Piotte, Olivier; Perrin, Charles

    2017-04-01

    An increasing number of operational flood forecasting centres assess the predictive uncertainty associated with their forecasts and communicate it to the end users. This information can match the end users' needs (i.e. prove useful for efficient crisis management) only if it is reliable: reliability is therefore a key quality for operational flood forecasts. In 2015, the French flood forecasting national and regional services (Vigicrues network; www.vigicrues.gouv.fr) implemented a framework to compute quantitative discharge and water level forecasts and to assess the predictive uncertainty. Among the possible technical options to achieve this goal, a statistical analysis of past forecasting errors of deterministic models has been selected (QUOIQUE method, Bourgin, 2014). It is a data-based and non-parametric approach based on as few assumptions as possible about the mathematical structure of the forecasting errors. In particular, a very simple assumption is made regarding the predictive uncertainty distributions for large events outside the range of the calibration data: the multiplicative error distribution is assumed to be constant, whatever the magnitude of the flood. Indeed, the predictive distributions may not be reliable in extrapolation. However, estimating the predictive uncertainty for these rare events is crucial when major floods are of concern. In order to improve the forecast reliability for major floods, an attempt at combining the operational strength of the empirical statistical analysis with a simple error model is made. Since the heteroscedasticity of forecast errors can considerably weaken the predictive reliability for large floods, this error modelling is based on the log-sinh transformation, which was shown to significantly reduce the heteroscedasticity of the transformed error in a simulation context, even for flood peaks (Wang et al., 2012). Exploratory tests on some operational forecasts issued during the recent floods experienced in
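
A sketch of the log-sinh transformation of Wang et al. (2012) and its inverse, as used to reduce the heteroscedasticity of forecast errors before modelling them (the parameter values here are arbitrary illustrations, not calibrated values):

```python
# Log-sinh transform: z = (1/b) * log(sinh(a + b*y)), requiring a + b*y > 0.
# For small y it compresses like a log (shrinking variance); for large y it
# behaves like a shift (keeping variance), which damps heteroscedasticity.
import math

def log_sinh(y, a, b):
    return math.log(math.sinh(a + b * y)) / b

def inv_log_sinh(z, a, b):
    """Inverse transform: y = (asinh(exp(b*z)) - a) / b."""
    return (math.asinh(math.exp(b * z)) - a) / b

a, b = 0.1, 0.01
for discharge in (5.0, 50.0, 500.0):
    z = log_sinh(discharge, a, b)
    assert abs(inv_log_sinh(z, a, b) - discharge) < 1e-8   # round trip
print(log_sinh(5.0, a, b), log_sinh(500.0, a, b))
```

Error distributions fitted in the transformed space are mapped back to discharge space through the inverse transform when issuing predictive intervals.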

  18. Life Span Developmental Approach

    Directory of Open Access Journals (Sweden)

    Ali Eryilmaz

    2011-03-01

    Full Text Available The Life Span Developmental Approach examines the development of individuals from birth to death. The life span developmental approach is a multi-disciplinary approach, related to disciplines such as psychology, psychiatry, sociology, anthropology and geriatrics, which holds that development is not completed in adulthood but continues throughout the life course. Development is a complex process that includes dying and death. This approach carefully investigates the development of individuals with respect to developmental stages. It suggests that scientific disciplines should not explain developmental facts only in terms of age changes. Along with aging, cognitive, biological, and socioemotional development throughout life should also be considered to provide a reasonable and acceptable context, guideposts, and reasonable expectations for the person. There are three important issues that the life span developmental approach deals with. These are nature vs nurture, continuity vs discontinuity, and change vs stability. Researchers using the life span developmental approach gather and produce knowledge on these three most important domains of individual development with their unique scientific methodology.

  19. Otoplasty: A graduated approach.

    Science.gov (United States)

    Foda, H M

    1999-01-01

    Numerous otoplastic techniques have been described for the correction of protruding ears. Technique selection in otoplasty should be done only after careful analysis of the abnormal anatomy responsible for the protruding ear deformity. A graduated surgical approach is presented which is designed to address all contributing factors to the presenting auricular deformity. The approach starts with the more conservative cartilage-sparing suturing techniques, then proceeds to incorporate other more aggressive cartilage weakening maneuvers. Applying this approach resulted in better long-term results with less postoperative lateralization than that encountered on using the cartilage-sparing techniques alone.

  20. Introducing Systems Approaches

    Science.gov (United States)

    Reynolds, Martin; Holwell, Sue

    Systems Approaches to Managing Change brings together five systems approaches to managing complex issues, each having a proven track record of over 25 years. The five approaches are: System Dynamics (SD), developed originally in the late 1950s by Jay Forrester; Viable Systems Model (VSM), developed originally in the late 1960s by Stafford Beer; Strategic Options Development and Analysis (SODA, with cognitive mapping), developed originally in the 1970s by Colin Eden; Soft Systems Methodology (SSM), developed originally in the 1970s by Peter Checkland; and Critical Systems Heuristics (CSH), developed originally in the late 1970s by Werner Ulrich.

  1. Flipped Classroom Approach

    Directory of Open Access Journals (Sweden)

    Fezile Ozdamli

    2016-07-01

    Full Text Available Flipped classroom is an active, student-centered approach that was formed to increase the quality of in-class time. This approach, whose applications have so far been mostly in the physical sciences, has recently also attracted the attention of educators and researchers in other disciplines. Although flipped classroom learning is spreading rapidly around the world, it is not yet well recognized in our country. The aim of this study is therefore to draw attention to its potential in the field of education and to make it better known among educators and researchers. To this end, the study explains what the flipped classroom approach is, presents flipped classroom technology models, and discusses its advantages and limitations.

  2. Multidisciplinary Approaches to Allergies

    NARCIS (Netherlands)

    Gao, Zhongshan; Shen, Hua-Hao; Zheng, M.; Frewer, L.J.; Gilissen, L.J.W.J.

    2012-01-01

    Allergy is an immunological disease caused by multiple factors and characterized by variability, specificity and complexity. "Multidisciplinary Approaches to Allergies" covers diverse aspects ranging from basic molecular mechanisms to societal issues within the framework of multidisciplinary

  3. Behavioral based safety approaches

    International Nuclear Information System (INIS)

    Maria Michael Raj, I.

    2009-01-01

    Approach towards the establishment of positive safety culture at Heavy Water Plant, Tuticorin includes the adoption of several important methodologies focused on human behavior and culminates with achievement of Total Safety Culture where Quality and Productivity are integrated with Safety

  4. The transformativity approach

    DEFF Research Database (Denmark)

    Holm, Isak Winkel; Lauta, Kristian Cedervall

    2017-01-01

    During the last five to ten years, a considerable body of research has begun to explore how disasters, real and imagined, trigger social transformations. Even if the contributions to this research stem from a multitude of academic disciplines, we argue in the article, they constitute an identifiable and promising approach for future disaster research. We suggest naming it the transformativity approach. Whereas the vulnerability approach explores the social causation of disasters, the transformativity approach reverses the direction of the gaze and investigates the social transformation brought about by disasters. Put simply, the concept of vulnerability is about the upstream causes of disaster and the concept of transformativity about the downstream effects. By discussing three recent contributions (by the historian Greg Bankoff, the legal sociologist Michelle Dauber...

  5. RNA/PNA Approach

    Indian Academy of Sciences (India)

    In this approach we want to develop structural analogue of the leader that might have higher affinity towards the Phosphoprotein, but would impair the dimerization process and viral leader RNA binding.

  6. a Capability approach

    African Journals Online (AJOL)

    efforts towards gender equality in education as a means of achieving social justice. ... should mean that a lot of capability approach-oriented commentators are ... processes, their forms of exercising power, and their rules, unwritten cultures, ...

  7. A Theoretical Approach

    African Journals Online (AJOL)

    NICO

    L-rhamnose and L-fucose: A Theoretical Approach ... L-rhamnose and L-fucose, by means of the Monte Carlo conformational search method. The energy of the conformers ..... which indicates an increased probability for the occurrence of.

  8. Approach To Absolute Zero

    Indian Academy of Sciences (India)

    more and more difficult to remove heat as one approaches absolute zero. This is the ... A new and active branch of engineering ... This temperature is called the critical temperature, Tc. For sulfur dioxide the critical ..... adsorbent charcoal.

  9. Revitalizing the setting approach

    DEFF Research Database (Denmark)

    Bloch, Paul; Toft, Ulla; Reinbach, Helene Christine

    2014-01-01

    Background: The concept of health promotion rests on aspirations aiming at enabling people to increase control over and improve their health. Health promotion action is facilitated in settings such as schools, homes and work places. As a contribution to the promotion of healthy lifestyles, we have further developed the setting approach in an effort to harmonise it with contemporary realities (and complexities) of health promotion and public health action. The paper introduces a modified concept, the supersetting approach, which builds on the optimised use of diverse and valuable resources embedded in local community settings and on the strengths of social interaction and local ownership as drivers of change processes. Interventions based on a supersetting approach are first and foremost characterised by being integrated, but also participatory, empowering, context-sensitive and knowledge-based. The supersetting approach is based on ecological and whole-systems thinking, and stipulates important principles and values of integration, participation, empowerment, context and knowledge-based development.

  10. Flipped Classroom Approach

    OpenAIRE

    Fezile Ozdamli; Gulsum Asiksoy

    2016-01-01

    Flipped classroom is an active, student-centered approach that was formed to increase the quality of in-class time. This approach, whose applications have so far been mostly in the physical sciences, has recently also attracted the attention of educators and researchers in other disciplines. Although flipped classroom learning is spreading rapidly around the world, it is not yet well recognized in our country. That is why the aim of this study is to draw attention to its potential in the field of education and pr...

  11. STUTTERING-PSYCHOLINGUISTIC APPROACH

    OpenAIRE

    Maria Anca; Carolina Bodea Haţegan; Lăcrămioara Prihoi

    2012-01-01

    This research promotes the psycholinguistic paradigm, focusing on delimiting several specific particularities of stuttering pathology. The structural approach, applied to the sides of language, proves both the recurrent aspects found within the specialized national and international literature and the dependence of psycholinguistic approaches on the features of the linguistic material. Thus, the conclusions of this research study offer the possibility of promoting cross-cultural and cross-linguistic researches, with t...

  12. An approach to measurement

    International Nuclear Information System (INIS)

    Gudder, S.P.

    1984-01-01

    A new approach to measurement theory is presented. The definition of measurement is motivated by direct laboratory procedures as they are carried out in practice. The theory is developed within the quantum logic framework. The work clarifies an important problem in the quantum logic approach; namely, where the Hilbert space comes from. The relationship between measurements and observables is considered, and a Hilbert space embedding theorem is presented. Charge systems are also discussed. (author)

  13. The Knowledge Governance Approach

    DEFF Research Database (Denmark)

    Foss, Nicolai J.

    An attempt is made to characterize a `knowledge governance approach' as a distinctive, emerging field that cuts across the fields of knowledge management, organisation studies, strategy and human resource management. Knowledge governance is taken up with how the deployment of administrative...... with diverse capabilities of handling these transactions. Various open research issues that a knowledge governance approach may illuminate are sketched. Although knowledge governance draws clear inspiration from organizational economics and `rational' organization theory, it recognizes that knowledge......

  14. A BSC-DEA approach to measure the relative efficiency of service industry: A case study of banking sector

    Directory of Open Access Journals (Sweden)

    M. B. Aryanezhad

    2011-04-01

    Full Text Available Performance evaluation plays an important role in identifying the faults and difficulties of any organization, as well as in attempts to increase capabilities and improve activities. Data envelopment analysis (DEA), as a non-parametric method, has been one of the most important management tools for measuring efficiency. In this paper, we propose a method that utilizes the balanced scorecard (BSC) as a tool for designing the performance evaluation indices of an organization. The integrated BSC-DEA approach has been applied as an empirical case to a major private bank, and the results are analyzed.
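DEA efficiency scores of the kind reported in this record can be computed by solving one small linear program per decision-making unit (DMU). The following sketch uses hypothetical bank-branch data and the standard input-oriented CCR envelopment model (not necessarily the exact model used in the paper):

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(X, Y):
    """Input-oriented CCR efficiency for each DMU.
    X: (n_dmus, n_inputs), Y: (n_dmus, n_outputs)."""
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        # decision variables: theta, lambda_1..lambda_n
        c = np.zeros(1 + n)
        c[0] = 1.0                       # minimize theta
        A_ub, b_ub = [], []
        for i in range(m):               # sum_j lam_j * x_ij <= theta * x_io
            A_ub.append(np.r_[-X[o, i], X[:, i]])
            b_ub.append(0.0)
        for r in range(s):               # sum_j lam_j * y_rj >= y_ro
            A_ub.append(np.r_[0.0, -Y[:, r]])
            b_ub.append(-Y[o, r])
        res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                      bounds=[(0, None)] * (1 + n))
        scores.append(res.x[0])
    return np.array(scores)

# hypothetical branches: inputs (staff, cost), outputs (loans, deposits)
X = np.array([[5., 10.], [8., 12.], [6., 18.]])
Y = np.array([[20., 30.], [25., 28.], [15., 40.]])
print(np.round(dea_ccr_efficiency(X, Y), 3))
```

Each score lies in (0, 1]; a DMU scoring 1 lies on the efficient frontier, and in the CCR model at least one DMU is always efficient.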

  15. Theoretical Approaches to Coping

    Directory of Open Access Journals (Sweden)

    Sofia Zyga

    2013-01-01

    Full Text Available Introduction: Dealing with stress requires conscious effort; it cannot be equated with an individual's spontaneous reactions. The intentional management of stress must not be confused with defense mechanisms. Coping differs from adjustment in that the latter is more general, has a broader meaning and includes diverse ways of facing a difficulty. Aim: An exploration of the definition of the term "coping" and the function of the coping process, as well as its differentiation from other similar concepts, through a literature review. Methodology: Three theoretical approaches to coping are introduced: the psychoanalytic approach; the approach by characteristics; and the Lazarus and Folkman interactive model. Results: The strategic methods of the coping approaches are described, and the article ends with a review of the approaches, including the functioning of the stress-coping process, the classification of types of coping strategies in stress-inducing situations, and a criticism of coping approaches. Conclusions: The comparison of coping in different situations is difficult, if not impossible. The coping process is a slow process, so an individual may select one method of coping under one set of circumstances and a different strategy at some other time. Such selection of strategies takes place as the situation changes.

  16. Market and Style Timing: German Equity and Bond Funds

    OpenAIRE

    Hayley, S.; Nitzsche, D.; Cuthbertson, K.

    2016-01-01

    We apply parametric and non-parametric estimates to test market and style timing ability of individual German equity and bond mutual funds using a sample of over 500 equity and 350 bond funds, over the period 1990-2009. For equity funds, both approaches indicate no successful market timers in the 1990-1999 or 2000-2009 periods, but in 2000-2009 the non-parametric approach gives fewer unsuccessful market timers than the parametric approach. There is evidence of successful style timing using th...
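A common parametric market-timing test of the kind referred to here is the Treynor–Mazuy regression, which adds a quadratic market term; a significantly positive coefficient on the squared term indicates timing ability. A minimal sketch on simulated data (illustrative only, not the authors' exact specification):

```python
import numpy as np

rng = np.random.default_rng(0)
T = 2000
mkt = rng.normal(0.005, 0.04, T)                  # excess market returns
# simulated fund with genuine timing skill: exposure rises with the market
fund = 0.001 + 1.0 * mkt + 2.0 * mkt**2 + rng.normal(0, 0.005, T)

# Treynor-Mazuy: r_fund = alpha + beta * r_mkt + gamma * r_mkt^2 + eps
X = np.column_stack([np.ones(T), mkt, mkt**2])
alpha, beta, gamma = np.linalg.lstsq(X, fund, rcond=None)[0]
print(f"alpha={alpha:.4f}  beta={beta:.2f}  gamma={gamma:.2f}")
```

A positive gamma suggests the fund raises market exposure ahead of up-markets; the non-parametric alternatives mentioned in the record test the same idea without assuming this quadratic functional form.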

  17. Approaching a Postcolonial Arctic

    DEFF Research Database (Denmark)

    Jensen, Lars

    2016-01-01

    This article explores different postcolonially configured approaches to the Arctic. It begins by considering the Arctic as a region, an entity, and how the customary political science informed approaches are delimited by their focus on understanding the Arctic as a region at the service of the contemporary neoliberal order. It moves on to explore how different parts of the Arctic are inscribed in a number of sub-Arctic nation-state binds, focusing mainly on Canada and Denmark. The article argues that the postcolonial can be understood as a prism or a methodology that asks pivotal questions to all approaches to the Arctic. Yet the postcolonial itself is characterised by limitations, not least in this context its lack of interest in the Arctic, and its bias towards conventional forms of representation in art. The article points to the need to develop a more integrated critique of colonial and neo...

  18. Life History Approach

    DEFF Research Database (Denmark)

    Olesen, Henning Salling

    2015-01-01

    as in everyday life. Life histories represent lived lives past, present and anticipated future. As such they are interpretations of individuals' experiences of the way in which societal dynamics take place in the individual body and mind, either by the individual him/herself or by another biographer. The Life History approach developed from interpreting autobiographical and later certain other forms of language interactive material as moments of life history, i.e. it is basically a hermeneutic approach. Talking about a psycho-societal approach indicates the ambition of attacking the dichotomy of the social and the psychic, both in the interpretation procedure and in some main theoretical understandings of language, body and mind. My article will present the reflections on the use of life history based methodology in learning and education research as a kind of learning story of research work.

  19. Technical approach document

    International Nuclear Information System (INIS)

    1989-12-01

    The Uranium Mill Tailings Radiation Control Act (UMTRCA) of 1978, Public Law 95-604 (PL95-604), grants the Secretary of Energy the authority and responsibility to perform such actions as are necessary to minimize radiation health hazards and other environmental hazards caused by inactive uranium mill sites. This Technical Approach Document (TAD) describes the general technical approaches and design criteria adopted by the US Department of Energy (DOE) in order to implement remedial action plans (RAPs) and final designs that comply with EPA standards. It does not address the technical approaches necessary for aquifer restoration at processing sites; a guidance document, currently in preparation, will describe aquifer restoration concerns and technical protocols. This document is a second revision to the original document issued in May 1986; the revision has been made in response to changes to the groundwater standards of 40 CFR 192, Subparts A--C, proposed by EPA as draft standards. New sections were added to define the design approaches and designs necessary to comply with the groundwater standards. These new sections are in addition to changes made throughout the document to reflect current procedures, especially in cover design, water resources protection, and alternate site selection; only minor revisions were made to some of the sections. Section 3.0 is a new section defining the approach taken in the design of disposal cells; Section 4.0 has been revised to include design of vegetated covers; Section 8.0 discusses design approaches necessary for compliance with the groundwater standards; and Section 9.0 is a new section dealing with nonradiological hazardous constituents. 203 refs., 18 figs., 26 tabs.

  20. Homogenization approach in engineering

    International Nuclear Information System (INIS)

    Babuska, I.

    1975-10-01

    Homogenization is an approach which studies the macrobehavior of a medium by its microproperties. Problems with a microstructure play an essential role in such fields as mechanics, chemistry, physics, and reactor engineering. Attention is concentrated on a simple specific model problem to illustrate results and problems typical of the homogenization approach. Only the diffusion problem is treated here, but some statements are made about the elasticity of composite materials. The differential equation is solved for linear cases with and without boundaries and for the nonlinear case. 3 figures, 1 table
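For the 1D diffusion problem treated in this record, the homogenized coefficient has a well-known closed form: for -(a(x/ε) u')' = f with a periodic coefficient a, the effective coefficient is the harmonic mean of a over one cell, not the arithmetic mean. A small numerical illustration with hypothetical layer values:

```python
import numpy as np

# 1D diffusion through a periodic layered medium: the homogenized
# coefficient is the harmonic mean of a over one cell,
# a_eff = (mean of 1/a)^(-1), weighted by the layers' volume fractions.
def a_eff_1d(a_values, volume_fractions):
    a = np.asarray(a_values, float)
    phi = np.asarray(volume_fractions, float)
    return 1.0 / np.sum(phi / a)

# two alternating layers, 50/50, coefficients 1 and 100
print(a_eff_1d([1.0, 100.0], [0.5, 0.5]))   # far below the arithmetic mean 50.5
```

The low-conductivity layer dominates: the effective coefficient is about 1.98, illustrating why macrobehavior cannot be read off from a naive average of microproperties.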

  1. [The novel approaches to the rehabilitation of the patients presenting with gastroesophageal reflux disease and co-morbid pathology].

    Science.gov (United States)

    Komleva, N E; Marjanovsky, A A; Danilov, A N; Agasarov, L G

    placental compositum, co-enzyme compositum, and ubichinon compositum (Biologische Heilmittel Heel GmbH, Germany). The statistical analysis was performed with the use of the non-parametric methods based on the Statistica application software package ('StatSoft Inc.', USA). The present article reports the results of the study that demonstrate significant strong positive correlation between the clinical manifestations of gastroesophageal reflux disease (the frequency and severity of symptoms) and thoracalgia and provides a rationale for the assessment of the vertebro-neurological status of the thoracic spine in the patients exhibiting the clinical signs of gastroesophageal reflux disease. The study substantiated the inclusion of pharmacopuncture with placenta compositum, co-enzyme compositum, and ubichinon compositum anti-homotoxic medications in the complex rehabilitation programs for the patients presenting with the clinical signs of gastroesophageal reflux disease and the concomitant thoracalgia symptoms. To assess the effectiveness of the proposed method for the treatment of this condition, such diagnostic criteria as the vertebro-neurological symptoms, coefficient of the thoracalgia-associated pain syndrome, the frequency and intensity of the gastroesophageal reflux disease symptoms, and the quality of life parameters were used. The results of the present study provide strong evidence that pharmacopuncture helps to improve the quality of life of the patients, alleviate thoracalgia symptoms and clinical signs of gastroesophageal reflux disease, and reduce the intensity of the pain syndrome caused by thoracalgia.
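The non-parametric analyses described here (rank correlation between symptom scores, pre/post treatment comparison) can be reproduced in outline with standard tests; the sketch below uses synthetic scores, not the study's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# synthetic illustration only: GERD symptom scores and thoracalgia
# pain scores for 40 hypothetical patients, positively related by construction
gerd = rng.integers(0, 10, 40)
pain = np.clip(gerd + rng.integers(-2, 3, 40), 0, None)

# rank-based correlation: no normality assumption on the scores
rho, p = stats.spearmanr(gerd, pain)
print(f"Spearman rho={rho:.2f}, p={p:.4f}")

# simulated pre/post treatment comparison with the Wilcoxon signed-rank test
post = np.clip(pain - rng.integers(0, 4, 40), 0, None)
w, p2 = stats.wilcoxon(pain, post)
print(f"Wilcoxon W={w:.1f}, p={p2:.4f}")
```

Both tests operate on ranks, which is why they suit ordinal symptom scales where parametric assumptions are doubtful.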

  2. Thematic curriculum approach

    Directory of Open Access Journals (Sweden)

    Šefer Jasmina P.

    2003-01-01

    Full Text Available Thematic curriculum combines disciplines and media. The process is problem-oriented and the scenario most often follows the logic of exploring or storytelling. Those two approaches to teaching are appropriate because they fit into the interdisciplinary and creative open-ended problem solving through play insisted upon by thematic curriculum. The matrix, where seven types of abilities intersect with five types of problems according to their degree of openness, defines well the outcomes of teaching. However, it did not prove to be suitable for planning the majority of activities in thematic curriculum, for it follows with difficulty the process of exploring or storytelling, i.e. it disrupts the subject-matter coherence of thematic curriculum. Therefore, it is suggested that the matrix be used for disciplinary curriculum planning, but for thematic curriculum only in exceptional cases. The matrix should be used primarily as a framework for evaluating the distribution of various types of abilities and problem situations in teaching. The logic of diverse approaches to teaching is reflected in the manner of planning and organizing the teaching process. Conceptual, visual-graphic, structural and other aids employed during educational process planning should suit the nature of the approach chosen. On the basis of qualitative investigations of the educational process, the present paper considers various approaches to teaching, the development of various drafts for the planning of teaching, and the recognition of the logic of storytelling and exploring in thematic curriculum.

  3. Islamic approach in counseling.

    Science.gov (United States)

    Hanin Hamjah, Salasiah; Mat Akhir, Noor Shakirah

    2014-02-01

    A religious approach is one of the matters emphasized in counseling today. Many researchers find that there is a need to apply the religious element in counseling because religion is important in a client's life. The purpose of this research is to identify aspects of the Islamic approach applied in counseling clients by counselors at Pusat Kaunseling Majlis Agama Islam Negeri Sembilan (PKMAINS). In addition, this research also analyses the Islamic approach applied in counseling at PKMAINS with reference to al-Quran and al-Sunnah. This is qualitative research in the form of a case study at PKMAINS. The main method used in this research is the interview, and the research instrument used is an interview protocol. The respondents in this study include 9 counselors who serve at one of the counseling centers in Malaysia. This study also uses a questionnaire as an additional instrument, distributed to 36 clients who receive counseling services at the center. The findings of the study show that the Islamic approach applied in counseling at PKMAINS may be categorized into three main aspects: aqidah (faith), ibadah (worship/ultimate devotion and love for God) and akhlaq (moral conduct). Findings also show that the counseling in these aspects is in line with Islamic teachings as contained in al-Quran and al-Sunnah.

  4. A green chemistry approach

    Indian Academy of Sciences (India)

    Administrator

    One-pot synthesis of quinaldine derivatives by using microwave irradiation without any solvent – A green chemistry approach. JAVAD SAFARI*, SAYED HOSSEIN BANITABA and SEPEHR SADEGH SAMIEI. Department of Chemistry, The Faculty of sciences, University of Kashan, Kashan,. P.O. Box 87317-51167, I.R. Iran.

  5. Realistic Approach to Innovation.

    Science.gov (United States)

    Dawson, Garth C.

    Part of the Omaha police in-service training program was devoted to innovative approaches to solving police department problems and improving community relations. The sessions were an attempt to use the brainstorming technique to elicit new solutions to everyday problems faced by the rank-and-file members of the police department. The report…

  6. Salt repository design approach

    International Nuclear Information System (INIS)

    Matthews, S.C.

    1983-01-01

    This paper presents a summary discussion of the approaches that have been and will be taken in design of repository facilities for use with disposal of radioactive wastes in salt formations. Since specific sites have yet to be identified, the discussion is at a general level, supplemented with illustrative examples where appropriate. 5 references, 1 figure

  7. The Capability Approach

    NARCIS (Netherlands)

    I.A.M. Robeyns (Ingrid)

    2011-01-01

    In its most general description, the capability approach is a flexible and multi-purpose normative framework, rather than a precise theory of well-being, freedom or justice. At its core are two normative claims: first, the claim that the freedom to achieve well-being is of primary

  8. Orion Emergency Mask Approach

    Science.gov (United States)

    Tuan, George C.; Graf, John C.

    2009-01-01

    The emergency mask approach on Orion poses a challenge relative to the traditional Shuttle or Station approaches. Currently, in the case of a fire or toxic-spill event, the crew utilizes open-loop oxygen masks that provide the crew with oxygen to breathe but also dump the exhaled oxygen into the cabin. For Orion, with its small cabin volume, the extra oxygen will exceed the flammability limit within a short period of time unless a nitrogen purge is also provided. Another approach to a fire or toxic-spill event is the use of filtering emergency masks. These masks utilize some form of chemical bed to scrub the air clean of toxics, providing the crew with safe breathing air for a period without elevating the oxygen level in the cabin. Using the masks and a form of smoke-eater filter, it may be possible to clean the cabin completely, or to a level allowing safe transition to a space suit to perform a cabin purge. Issues with filters in the past have been reaction time, breakthrough, and high breathing resistance. Developments in a new form of chemical filter have shown promise to make the filtering approach feasible.

  9. Approach to neonatal sepsis

    Directory of Open Access Journals (Sweden)

    Shankar Narayan

    2015-01-01

    The treatment includes supportive care along with administration of appropriate antibiotics. Adjuvant treatment includes IVIG, GCSF, exchange transfusion and pentoxifylline administration. This paper aims to present an algorithmic approach to neonatal sepsis to expedite the diagnosis along with providing appropriate and adequate treatment.

  10. Approaches to acceptable risk

    International Nuclear Information System (INIS)

    Whipple, C.

    1997-01-01

    Several alternative approaches to address the question "How safe is safe enough?" are reviewed and an attempt is made to apply the reasoning behind these approaches to the issue of acceptability of radiation exposures received in space. The approaches to the issue of the acceptability of technological risk described here are primarily analytical, and are drawn from examples in the management of environmental health risks. These include risk-based approaches, in which specific quantitative risk targets determine the acceptability of an activity, and cost-benefit and decision analysis, which generally focus on the estimation and evaluation of risks, benefits and costs, in a framework that balances these factors against each other. These analytical methods tend by their quantitative nature to emphasize the magnitude of risks, costs and alternatives, and to downplay other factors, especially those that are not easily expressed in quantitative terms, that affect acceptance or rejection of risk. Such other factors include the issues of risk perceptions and how and by whom risk decisions are made.

  11. Health: An Ecosystem Approach

    International Development Research Centre (IDRC) Digital Library (Canada)

    In the early days, the research supported was largely biomedical: vaccines, ... editor of Découvrir magazine in Montréal, who produced the first draft of the book, ...... Their transdisciplinary approach included gender-specific parameters. ..... Over time, some progress was made against malaria, but the war was far from won.

  12. NEW APPROACHES: Toppling trains

    Science.gov (United States)

    Parry, Malcolm

    1998-03-01

    This article explains a novel way of approaching centripetal force: theory is used to predict an orbital period at which a toy train will topple from a circular track. The demonstration has proved useful in A-level, GNVQ and undergraduate Physics and Engineering schemes.
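The abstract does not give the derivation, but a plausible rigid-body model makes the prediction concrete: the train tips about its outer rail when the overturning moment of the centripetal demand, m v² h / r, exceeds the restoring gravitational moment, m g (w/2), where w is the track gauge and h the centre-of-mass height. A sketch with hypothetical toy-train dimensions (all numbers made up):

```python
import math

def critical_period(r, gauge, h_cm, g=9.81):
    """Shortest orbital period before the train tips about its outer rail.
    Tipping condition: m*v^2*h/r > m*g*(gauge/2), i.e. v^2 > g*r*gauge/(2*h);
    the critical period is T = 2*pi*r / v_crit."""
    v_crit = math.sqrt(g * r * gauge / (2 * h_cm))
    return 2 * math.pi * r / v_crit

# hypothetical numbers: 0.5 m track radius, 32 mm gauge, centre of mass 25 mm up
print(f"{critical_period(0.5, 0.032, 0.025):.2f} s")
```

Orbiting faster than this (a shorter period) makes the toy train topple outward, which is the demonstration the article describes.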

  13. Approach to Absolute Zero

    Indian Academy of Sciences (India)

    Approach to Absolute Zero Below 10 milli-Kelvin. R Srinivasan. Series Article, Resonance – Journal of Science Education, Volume 2, Issue 10, October 1997, pp 8–16. Permanent link: https://www.ias.ac.in/article/fulltext/reso/002/10/0008-0016

  14. Fuel cycle oriented approach

    International Nuclear Information System (INIS)

    Petit, A.

    1987-01-01

    The term fuel cycle oriented approach is currently used to designate two quite different things: the attempt to consider all or part of a national fuel cycle as one material balance area (MBA) or to consider individual MBAs existing in a state while designing a unique safeguards approach for each and applying the principle of nondiscrimination to fuel cycles as a whole, rather than to individual facilities. The merits of such an approach are acceptability by the industry and comparison with the contemplated establishment of long-term criteria. The following points concern the acceptability by the industry: (1) The main interest of the industry is to keep an open international market and therefore, to have effective and efficient safeguards. (2) The main concerns of the industry regarding international safeguards are economic burden, intrusiveness, and discrimination. Answers to these legitimate concerns, which retain the benefits of a fuel cycle oriented approach, are needed. More specifically, the problem of reimbursing the operator the costs that he has incurred for the safeguards must be considered

  15. Financial Management: An Organic Approach

    Science.gov (United States)

    Laux, Judy

    2013-01-01

    Although textbooks present corporate finance using a topical approach, good financial management requires an organic approach that integrates the various assignments financial managers confront every day. Breaking the tasks into meaningful subcategories, the current article offers one approach.

  16. Ecosystem approach in education

    Science.gov (United States)

    Nabiullin, Iskander

    2017-04-01

    Environmental education is a base for sustainable development. Therefore, in our school we pay great attention to environmental education. Environmental education in our school is based on ecosystem approach. What is an ecosystem approach? Ecosystem is a fundamental concept of ecology. Living organisms and their non-living environments interact with each other as a system, and the biosphere planet functions as a global ecosystem. Therefore, it is necessary for children to understand relationships in ecosystems, and we have to develop systems thinking in our students. Ecosystem approach and systems thinking should help us to solve global environmental problems. How do we implement the ecosystem approach? Students must understand that our biosphere functions as a single ecosystem and even small changes can lead to environmental disasters. Even the disappearance of one plant or animal species can lead to irreversible consequences. So in the classroom we learn the importance of each living organism for the nature. We pay special attention to endangered species, which are listed in the Red Data List. Kids are doing projects about these organisms, make videos, print brochures and newspapers. Fieldwork also plays an important role for ecosystem approach. Every summer, we go out for expeditions to study species of plants and animals listed in the Red Data List of Tatarstan. In class, students often write essays on behalf of any endangered species of plants or animals, this also helps them to understand the importance of each living organism in nature. Each spring we organise a festival of environmental projects among students. Groups of 4-5 students work on a solution of environmental problems, such as water, air or soil pollution, waste recycling, the loss of biodiversity, etc. Participants shoot a clip about their project, print brochures. Furthermore, some of the students participate in national and international scientific Olympiads with their projects. 
In addition to

  17. Proportional representation apportionment methods and their applications

    CERN Document Server

    Pukelsheim, Friedrich

    2017-01-01

    The book offers an in-depth study of the translation of vote counts into seat numbers in proportional representation systems  – an approach guided by practical needs. It also provides plenty of empirical instances illustrating the results. It analyzes in detail the 2014 elections to the European Parliament in the 28 member states, as well as the 2009 and 2013 elections to the German Bundestag. This second edition is a complete revision and expanded version of the first edition published in 2014, and many empirical election results that serve as examples have been updated. Further, a final chapter has been added assembling biographical sketches and authoritative quotes from individuals who pioneered the development of apportionment methodology. The mathematical exposition and the interrelations with political science and constitutional jurisprudence make this an apt resource for interdisciplinary courses and seminars on electoral systems and apportionment methods.
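The divisor (highest-averages) methods analyzed in the book translate vote counts into seat numbers by repeatedly awarding the next seat to the party with the highest current quotient. A compact sketch of two classical methods, D'Hondt and Sainte-Laguë, on made-up vote totals:

```python
from typing import Dict

def divisor_apportionment(votes: Dict[str, int], seats: int,
                          method: str = "dhondt") -> Dict[str, int]:
    """Highest-averages apportionment.
    'dhondt' uses divisors 1, 2, 3, ...; 'sainte_lague' uses 1, 3, 5, ..."""
    first, inc = {"dhondt": (1, 1), "sainte_lague": (1, 2)}[method]
    alloc = {p: 0 for p in votes}
    for _ in range(seats):
        # next seat goes to the party with the highest current quotient
        winner = max(votes, key=lambda p: votes[p] / (first + inc * alloc[p]))
        alloc[winner] += 1
    return alloc

votes = {"A": 340_000, "B": 280_000, "C": 160_000, "D": 60_000}
print(divisor_apportionment(votes, 7, "dhondt"))        # favours larger parties
print(divisor_apportionment(votes, 7, "sainte_lague"))  # more proportional
```

On these totals D'Hondt gives A=3, B=3, C=1, D=0, while Sainte-Laguë gives A=3, B=2, C=1, D=1, illustrating the bias difference the book quantifies. (Ties, thresholds, and district structure are omitted here.)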

  18. Domain Approach: An Alternative Approach in Moral Education

    Science.gov (United States)

    Vengadasalam, Chander; Mamat, Wan Hasmah Wan; Mail, Fauziah; Sudramanian, Munimah

    2014-01-01

    This paper discusses the use of the domain approach in moral education in an upper secondary school in Malaysia. Moral Education needs a creative and an innovative approach. Therefore, a few forms of approaches are used in the teaching-learning of Moral Education. This research describes the use of domain approach which comprises the moral domain…

  19. Segmentation of Nonstationary Time Series with Geometric Clustering

    DEFF Research Database (Denmark)

    Bocharov, Alexei; Thiesson, Bo

    2013-01-01

    We introduce a non-parametric method for segmentation in regime-switching time-series models. The approach is based on spectral clustering of target-regressor tuples and derives a switching regression tree, where regime switches are modeled by oblique splits. Such models can be learned efficiently...... from data, where clustering is used to propose one single split candidate at each split level. We use the class of ART time series models to serve as illustration, but because of the non-parametric nature of our segmentation approach, it readily generalizes to a wide range of time-series models that go...

  20. Rail transport systems approach

    CERN Document Server

    2017-01-01

    This book shows how the systems approach is employed by scientists in various countries to solve specific problems concerning railway transport. In particular, the book describes the experiences of scientists from Romania, Germany, the Czech Republic, the UK, Russia, Ukraine, Lithuania and Poland. For many of these countries there is a problem with the historical differences between the railways. In particular, there are railways with different rail gauges, with different signaling and communication systems, with different energy supplies and, finally, with different political systems, which are reflected in the different approaches to the management of railway economies. The book’s content is divided into two main parts, the first of which provides a systematic analysis of individual means of providing and maintaining rail transport. In turn, the second part addresses infrastructure and management development, with particular attention to security issues. Though primarily written for professionals involved...

  1. Approach to reliability assessment

    International Nuclear Information System (INIS)

    Green, A.E.; Bourne, A.J.

    1975-01-01

    Experience has shown that reliability assessments can play an important role in the early design and subsequent operation of technological systems where reliability is at a premium. The approaches to and techniques for such assessments, which have been outlined in the paper, have been successfully applied in a variety of applications ranging from individual equipment to large and complex systems. The general approach involves the logical and systematic establishment of the purpose, performance requirements and reliability criteria of systems. This is followed by an appraisal of likely system achievement based on the understanding of different types of variational behavior. A fundamental reliability model emerges from the correlation between the appropriate Q and H functions for performance requirement and achievement. This model may cover the complete spectrum of performance behavior in all the system dimensions

  2. Thermodynamics an engineering approach

    CERN Document Server

    Cengel, Yunus A

    2014-01-01

    Thermodynamics, An Engineering Approach, eighth edition, covers the basic principles of thermodynamics while presenting a wealth of real-world engineering examples so students get a feel for how thermodynamics is applied in engineering practice. This text helps students develop an intuitive understanding by emphasizing the physics and physical arguments. Cengel and Boles explore the various facets of thermodynamics through careful explanations of concepts and use of numerous practical examples and figures, having students develop necessary skills to bridge the gap between knowledge and the confidence to properly apply their knowledge. McGraw-Hill is proud to offer Connect with the eighth edition of Cengel/Boles, Thermodynamics, An Engineering Approach. This innovative and powerful new system helps your students learn more efficiently and gives you the ability to assign homework problems simply and easily. Problems are graded automatically, and the results are recorded immediately. Track individual stude...

  3. Electronics a systems approach

    CERN Document Server

    Storey, Neil

    2017-01-01

    Electronics plays a central role in our everyday lives. It is at the heart of almost all of today's essential technology, from mobile phones to computers and from cars to power stations. As such, all engineers, scientists and technologists need to have a fundamental understanding of this exciting subject, and for many this will just be the beginning. Now in its sixth edition, Electronics: A Systems Approach provides an outstanding introduction to this fast-moving and important field. Comprehensively revised and updated to cover the latest developments in the world of electronics, the text continues to use Neil Storey's established and well-respected systems approach. It introduces the basic concepts first before progressing to a more advanced analysis, enabling you to contextualise what a system is designed to achieve before tackling the intricacies of designing or analysing its various components with confidence. This book is accompanied by a website which contains over 100 video tutorials to help explain ke...

  4. Transaction based approach

    Science.gov (United States)

    Hunka, Frantisek; Matula, Jiri

    2017-07-01

    The transaction-based approach is utilized in some methodologies in business process modeling. Essential parts of these transactions are human beings; the notion of an agent or actor role is usually used for them. Using a particular example, the paper describes the possibilities of the Design Engineering Methodology for Organizations (DEMO) and the Resource-Event-Agent (REA) methodology. Whereas the DEMO methodology can be regarded as a generic methodology having its foundation in the theory of Enterprise Ontology, the REA methodology is regarded as a domain-specific methodology and has its origin in accountancy systems. The result of these approaches is that the DEMO methodology captures everything that happens in reality with good empirical evidence, whereas the REA methodology captures only changes connected with economic events. Economic events represent either a change of the property rights to an economic resource or the consumption or production of economic resources. This follows from the essence of economic events and their connection to economic resources.

  5. Approaching comparative company law

    OpenAIRE

    Donald, David C.

    2008-01-01

    This paper identifies some common errors that occur in comparative law, offers some guidelines to help avoid such errors, and provides a framework for entering into studies of the company laws of three major jurisdictions. The first section illustrates why a conscious approach to comparative company law is useful. Part I discusses some of the problems that can arise in comparative law and offers a few points of caution that can be useful for practical, theoretical and legislative comparative ...

  6. Approaching Service Innovation Patterns

    OpenAIRE

    Andrea NAGY

    2013-01-01

    The present paper aims at analyzing the types of innovation in the field of services. First, the concept of innovation is defined and second, field literature is reviewed from the perspective of service innovation. The main types of innovation are identified based on several attempts at defining innovation, the most notable being Schumpeter’s. Thus, it is possible to approach concepts such as product and process innovation, incremental and radical innovation. Another aim has been to regard se...

  7. Robust Approaches to Forecasting

    OpenAIRE

    Jennifer Castle; David Hendry; Michael P. Clements

    2014-01-01

    We investigate alternative robust approaches to forecasting, using a new class of robust devices, contrasted with equilibrium correction models. Their forecasting properties are derived facing a range of likely empirical problems at the forecast origin, including measurement errors, impulses, omitted variables, unanticipated location shifts and incorrectly included variables that experience a shift. We derive the resulting forecast biases and error variances, and indicate when the methods ar...

  8. The Branding Management Approaches

    Institute of Scientific and Technical Information of China (English)

    YAKOUBI; Mohamed Lamine

    2014-01-01

    [Abstract] We will try to present, through the display of the various branding concepts and theories, the different branding management approaches. This will present the different visions of the discipline depending on the author, to try to demonstrate their differences at first and their complementarities at last, and to help the different branding management practitioners (brand managers, marketing managers, advertisers, media-planners...) apprehend the right brand positioning strategy to engage.

  9. Towards a Dual Approach

    DEFF Research Database (Denmark)

    Holli, Anne Maria; Harder, Mette Marie Stæhr

    2016-01-01

    Drawing on insights from state feminism and legislative studies on parliamentary committees, this article develops a dual approach for the comparative analysis of committees on gender equality. Empirically, it compares the standing committees on gender equality in Denmark and Finland, two Nordic...... as measured by legislative outputs and oversight differs, however, in line with differing committee system characteristics: the Finnish committee has more impact on legislative outputs while the Danish committee has more impact on overseeing government....

  10. APPROACHES FOR SUSTAINABLE MANUFACTURING

    Institute of Scientific and Technical Information of China (English)

    GÜNTHER Seliger; SEBASTIAN Kernbaum; MARCO Zettl

    2007-01-01

    Sustainable development is a holistic approach harmonizing ecological, economical and socio-political needs with respect to the superior objective of enhancing human living standards. In doing so, the availability of natural resources and the conservation of ecosystems have to be considered so that future generations have the possibility to meet their own needs. Long-term economical development demands the transition from a source-sink economy to a cycle economy as a result of limited resources, limited environmental capacities to absorb waste and emissions, as well as the increasing needs of a growing population. A reference model for sustainability in manufacturing is presented and used to illustrate sustainable approaches with respect to management, technology, process and product. Adaptation of products and components is a vital element for supporting efficient reuse of products and components; consequently, adaptation contributes to the ambitious goals of sustainability. Technological enablers for adaptation, such as modularity and information and communication technology, are introduced as examples. Moreover, approaches for disseminating knowledge in sustainability are given.

  11. Peritonitis: laparoscopic approach

    Directory of Open Access Journals (Sweden)

    Agresta Ferdinando

    2006-03-01

    Full Text Available Abstract Background Laparoscopy has become the preferred surgical approach to a number of different diseases because it allows a correct diagnosis and treatment at the same time. In abdominal emergencies, both components of treatment – exploration to identify the causative pathology and performance of an appropriate operation – can often be accomplished via laparoscopy. There is still debate over peritonitis as a contraindication to this kind of approach. The aim of the present work is to illustrate retrospectively the results of a case-control experience of laparoscopic vs. open surgery for abdominal peritonitis emergencies carried out at our institution. Methods Between January 1992 and January 2002 a total of 935 patients (mean age 42.3 ± 17.2 years) underwent emergent and/or urgent surgery. Among them, 602 (64.3%) were operated on laparoscopically (of whom 112, or 18.7%, had peritonitis), according to the presence of a surgical team trained in laparoscopy. Patients with a history of malignancy, more than two previous major abdominal surgeries or massive bowel distension were not treated laparoscopically. Peritonitis was not considered a contraindication to laparoscopy. Results The conversion rate was 23.2% in patients with peritonitis and was mainly due to the presence of dense intra-abdominal adhesions. Major complications ranged as high as 5.3%, with a postoperative mortality of 1.7%. A definitive diagnosis was accomplished in 85.7% (96 patients) of cases, and 90.6% (87) of these patients were treated successfully by laparoscopy. Conclusion Even if limited by its retrospective nature, the present experience leads us to consider the laparoscopic approach to abdominal peritonitis emergencies as safe and effective as conventional surgery; it offers a higher diagnostic yield and allows for less trauma and a more rapid postoperative recovery. Such features make laparoscopy a challenging alternative to open surgery in the management algorithm for abdominal

  12. Standardised approach to optimisation

    International Nuclear Information System (INIS)

    Warren-Forward, Helen M.; Beckhaus, Ronald

    2004-01-01

    Optimisation of radiographic images is said to have been obtained if the patient has received an acceptable level of dose and the image is of diagnostic value. In the near future, it will probably be recommended that radiographers measure patient doses and compare them to reference levels. The aim of this paper is to describe a standardised approach to optimisation of radiographic examinations in a diagnostic imaging department. A three-step approach is outlined with specific examples for some common examinations (chest, abdomen, pelvis and lumbar spine series). Step One: patient doses are calculated. Step Two: doses are compared to existing reference levels and the technique used is compared to image quality criteria. Step Three: appropriate action is taken if doses are above the reference level. Results: Average entrance surface doses for two rooms were as follows: AP abdomen (6.3 mGy and 3.4 mGy); AP lumbar spine (6.4 mGy and 4.1 mGy); AP pelvis (4.8 mGy and 2.6 mGy); and PA chest (0.19 mGy and 0.20 mGy). Comparison with the Commission of the European Communities (CEC) recommended techniques identified large differences in the applied potential: the kVp values in this study were significantly lower (by up to 10 kVp) than the CEC recommendations. The results of this study indicate that there is a need to monitor radiation doses received by patients undergoing diagnostic radiography examinations. Not only has the assessment allowed valuable comparison with international diagnostic reference levels and radiography good practice, it has also demonstrated large variations in mean doses being delivered from different rooms of the same radiology department. Following the simple three-step approach advocated in this paper should either provide evidence that departments are practising the ALARA principle or assist in making suitable changes to current practice. Copyright (2004) Australian Institute of Radiography
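    The three-step procedure in this record can be sketched in a few lines of code. This is an illustrative sketch only: the reference levels below are placeholder values, not the diagnostic reference levels used in the study, and the function name `review_room` is an assumption.

```python
# Placeholder diagnostic reference levels (DRLs), in mGy; the real values
# used in the study are not reproduced here.
REFERENCE_LEVELS_MGY = {
    "AP abdomen": 10.0,
    "AP lumbar spine": 10.0,
    "AP pelvis": 10.0,
    "PA chest": 0.3,
}

def review_room(mean_doses_mgy):
    """Step One: doses are measured (passed in as mean entrance surface
    doses per examination). Step Two: each dose is compared to its
    reference level. Step Three: examinations above the reference level
    are flagged for corrective action."""
    actions = []
    for exam, dose in mean_doses_mgy.items():
        drl = REFERENCE_LEVELS_MGY[exam]
        if dose > drl:
            actions.append((exam, dose, drl))
    return actions
```

    A room whose mean doses all sit below the assumed reference levels yields an empty action list; any examination above its level is returned with the dose and the level it exceeded.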

  13. Experimental approaches and applications

    CERN Document Server

    Crasemann, Bernd

    1975-01-01

    Atomic Inner-Shell Processes, Volume II: Experimental Approaches and Applications focuses on the physics of atomic inner shells, with emphasis on experimental aspects including the use of radioactive atoms for studies of atomic transition probabilities. Surveys of modern techniques of electron and photon spectrometry are also presented, and selected practical applications of inner-shell processes are outlined. Comprised of six chapters, this volume begins with an overview of the general principles underlying the experimental techniques that make use of radioactive isotopes for inner-sh

  14. Craniopharyngioma - Transnasal Endoscopic Approach

    Directory of Open Access Journals (Sweden)

    Sanjeev Bhagat

    2011-01-01

    Full Text Available Craniopharyngiomas are slow-growing tumours arising from remnants of the craniopharyngeal duct that occupy the sellar region. Patients may remain asymptomatic for a long duration or present with headache or visual disturbances. Surgery is the mainstay of treatment. Traditionally these tumours have been removed by neurosurgeons through the cranial approach, but the advent of nasal endoscopes has opened new avenues for ENT surgeons to treat such patients. We hereby present a case of craniopharyngioma successfully treated by trans-nasal hypophysectomy.

  15. URBAN POLITICS: KEY APPROACHES

    Directory of Open Access Journals (Sweden)

    Ledyaeva Ol'ga Mikhaylovna

    2012-10-01

    Full Text Available Several approaches that underlie urban politics are discussed in the paper. They include neo-liberalism, political economy discourse, elitist/pluralist debates, and postmodernism. The neoliberal approach focuses on the limited role of the state and individual responsibility. The legal framework protects both the rights and responsibilities of individuals and regulates the operation of the market. It is the market that fosters individual choices and provides goods and services by virtue of the processes which are flexible, efficient and transparent. The political economy approaches (regulation theory, public choice theory, neo-Marxism explain urban politics via the analysis of national and international economic processes and changes in contemporary capitalism. Changes in national and international economies determine what solutions are possible. The discourse has been influenced by the debate on globalization of capital and labour markets. Modern elitism and neopluralism are represented by theories of "growth machines" and "urban regimes". The former focuses on bargaining alliances between political and business leaders in order to manage the urban system and to promote its growth. The latter develops neopluralist explanations of power within local communities with an emphasis on the fragmented nature of the government where local authorities lack comprehensive governing powers. Postmodernism views the city as the site of the crisis of late capitalism which leads to segregation of neighbourhoods onto prosperous areas and ghettoes. In contrast to the modern city, the postmodern city is not defined by its industrial base; rather, it is determined by its consumerist environment of malls and museums, characterized by revivalist architecture. At the same time, the suburban shopping mall and a motorway network make nonsense of the idea of the city as a unique and well-defined space. 
These and other approaches encompass a wide spectrum of possibilities

  16. New Encrypted Steganography Approach

    Directory of Open Access Journals (Sweden)

    Saba Mohammed Husain‎

    2017-12-01

    Full Text Available The proposed research provides an approach for hiding an encrypted text inside a digital image. The text is encrypted in a complex manner: the PlayFair method is used to encrypt the clear text and, to increase security, the ciphertext letters are placed on a geometric shape clockwise and then read off as lines, producing a new ciphertext. This new ciphertext is converted to ASCII code and then to binary, and the bits are hidden in the least significant bits of the image. The results, as measured by the PSNR scale, were good.
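    The final hiding step described in this record is standard least-significant-bit (LSB) embedding. The sketch below illustrates only that step, under stated assumptions: pixel data is mocked as a bytearray rather than read with an image library, and the function names `embed` and `extract` are illustrative, not from the paper.

```python
def embed(pixels: bytearray, payload: bytes) -> bytearray:
    """Hide payload bits in the least significant bit of each pixel value."""
    # Most-significant bit first within each payload byte.
    bits = [(byte >> i) & 1 for byte in payload for i in range(7, -1, -1)]
    if len(bits) > len(pixels):
        raise ValueError("payload too large for cover image")
    stego = bytearray(pixels)
    for idx, bit in enumerate(bits):
        stego[idx] = (stego[idx] & 0xFE) | bit  # overwrite the LSB only
    return stego

def extract(pixels: bytearray, n_bytes: int) -> bytes:
    """Recover n_bytes of payload from the pixel LSBs."""
    out = bytearray()
    for i in range(n_bytes):
        byte = 0
        for bit_pos in range(8):
            byte = (byte << 1) | (pixels[i * 8 + bit_pos] & 1)
        out.append(byte)
    return bytes(out)
```

    Because only the lowest bit of each pixel changes, no pixel value moves by more than 1, which is why LSB embedding tends to score well on PSNR-style distortion measures.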

  17. The Capability Approach

    OpenAIRE

    Robeyns, Ingrid

    2011-01-01

    In its most general description, the capability approach is a flexible and multi-purpose normative framework, rather than a precise theory of well-being, freedom or justice. At its core are two normative claims: first, the claim that the freedom to achieve well-being is of primary moral importance, and second, that freedom to achieve well-being is to be understood in terms of people’s capabilities, that is, their real opportunities to do and be what they have reason to value. Thi...

  18. The collaboratory approach

    International Nuclear Information System (INIS)

    Peskin, A.M.

    1997-01-01

    A "collaboratory" has been defined as a center without walls, in which researchers can perform their work without regard to geographical location. To an increasing degree, engineering design and development is also taking the form of far-flung collaborations among divisions of a plant, subcontractors, university consultants and customers. It has long been recognized that quality engineering education presents the student with an environment that duplicates as much as possible that which the graduate will encounter in industry. To that end, it is important that engineering schools begin to introduce the collaboratory approach in its preparation, and even use it in delivery of subject matter to students

  19. Diagnostic Approach to Myelopathies

    International Nuclear Information System (INIS)

    Granados Sanchez, Ana Maria; Garcia Posada, Lina Maria; Ortega Toscano, Cesar Andres; Lopez Lopez, Alejandra

    2011-01-01

    Myelopathy is a broad term that refers to spinal cord involvement of multiple etiologies. Spinal cord diseases often have devastating consequences, ranging from quadriplegia and paraplegia to severe sensory deficits due to its confinement in a very small area. Many of these diseases are potentially reversible if they are recognized on time, hence the importance of recognizing the significance of magnetic resonance imaging when approaching a multifactorial disease considered as one of the most critical neurological emergencies, where prognosis depends on an early and accurate diagnosis.

  20. HEDR modeling approach

    International Nuclear Information System (INIS)

    Shipler, D.B.; Napier, B.A.

    1992-07-01

    This report details the conceptual approaches to be used in calculating radiation doses to individuals throughout the various periods of operations at the Hanford Site. The report considers the major environmental transport pathways--atmospheric, surface water, and ground water--and proposes an appropriate modeling technique for each. The modeling sequence chosen for each pathway depends on the available data on doses, the degree of confidence justified by such existing data, and the level of sophistication deemed appropriate for the particular pathway and time period being considered

  1. The collaboratory approach

    Energy Technology Data Exchange (ETDEWEB)

    Peskin, A.M.

    1997-04-01

    A "collaboratory" has been defined as a center without walls, in which researchers can perform their work without regard to geographical location. To an increasing degree, engineering design and development is also taking the form of far-flung collaborations among divisions of a plant, subcontractors, university consultants and customers. It has long been recognized that quality engineering education presents the student with an environment that duplicates as much as possible that which the graduate will encounter in industry. To that end, it is important that engineering schools begin to introduce the collaboratory approach in its preparation, and even use it in delivery of subject matter to students.

  2. [Nursing: the meaning of this profession to nurses. A first approach].

    Science.gov (United States)

    Luchesi, Luciana Barizon; Santos, Claudia Benedita dos

    2005-01-01

    In an attempt to understand, tell and, why not, participate a little in the history of Nursing, we proposed to study the prejudices and negative stereotypes that have permeated this profession over time. This is a before-and-after experimental study in a population of adolescents regularly enrolled in the eleventh grade of a Brazilian public school. The intervention took the form of a lecture about the profession, and a questionnaire with closed questions was applied before and after the lecture. Conclusions were based on the results of the binomial and McNemar's non-parametric tests for the significance of changes. Although a statistically significant presence of prejudice and negative stereotypes was not found, the results of the intervention were in line with expectations, since the changes (or tendencies towards change) took place exactly in those subgroups that showed a greater frequency of stereotypes.

  3. Statistical approach for selection of regression model during validation of bioanalytical method

    Directory of Open Access Journals (Sweden)

    Natalija Nakov

    2014-06-01

    Full Text Available The selection of an adequate regression model is the basis for obtaining accurate and reproducible results during bioanalytical method validation. Given the wide concentration range frequently present in bioanalytical assays, heteroscedasticity of the data may be expected. Several weighted linear and quadratic regression models were evaluated during the selection of the adequate curve fit using non-parametric statistical tests: the one-sample rank test and the Wilcoxon signed rank test for two independent groups of samples. The results obtained with the one-sample rank test could not give statistical justification for the selection of linear vs. quadratic regression models, because only slight differences between the errors (presented through the relative residuals, RR) were obtained. Estimation of the significance of the differences in the RR was achieved using the Wilcoxon signed rank test, where the linear and quadratic regression models were treated as two independent groups. The application of this simple non-parametric statistical test provides statistical confirmation of the choice of an adequate regression model.
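    The Wilcoxon signed-rank statistic used in this record can be computed in a few lines. The sketch below is a minimal pure-Python version for paired relative residuals from two candidate curve fits; the residual values are illustrative, not from the study, and a real analysis would typically call `scipy.stats.wilcoxon`, which also returns a p-value.

```python
def wilcoxon_statistic(rr_model_a, rr_model_b):
    """Wilcoxon signed-rank statistic W for paired relative residuals."""
    # Paired differences; zero differences carry no sign and are dropped.
    diffs = [a - b for a, b in zip(rr_model_a, rr_model_b) if a != b]
    # Rank the absolute differences (1-based), averaging ranks over ties.
    order = sorted(range(len(diffs)), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * len(diffs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and abs(diffs[order[j + 1]]) == abs(diffs[order[i]]):
            j += 1
        avg_rank = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg_rank
        i = j + 1
    # Sum the ranks of positive and negative differences separately;
    # the test statistic is the smaller of the two sums.
    w_plus = sum(r for d, r in zip(diffs, ranks) if d > 0)
    w_minus = sum(r for d, r in zip(diffs, ranks) if d < 0)
    return min(w_plus, w_minus)
```

    A small W relative to the critical value for the given sample size indicates a systematic difference between the two models' residuals, supporting one curve fit over the other.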

  4. Efficient estimation of an additive quantile regression model

    NARCIS (Netherlands)

    Cheng, Y.; de Gooijer, J.G.; Zerom, D.

    2011-01-01

    In this paper, two non-parametric estimators are proposed for estimating the components of an additive quantile regression model. The first estimator is a computationally convenient approach which can be viewed as a more viable alternative to existing kernel-based approaches. The second estimator

  5. Statistical significance of trends in monthly heavy precipitation over the US

    KAUST Repository

    Mahajan, Salil; North, Gerald R.; Saravanan, R.; Genton, Marc G.

    2011-01-01

    -parametric and parametric bootstrapping techniques. The results from the two Monte Carlo approaches are found to be similar to each other, and also to the traditional non-parametric Kendall's τ test, implying the robustness of the approach. Two different observational data
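    The kind of Monte Carlo significance test described in this record can be sketched as follows. This is an illustrative sketch only: it computes Kendall's τ for a time series and approximates its null distribution by shuffling the series (a permutation scheme standing in for the study's bootstrapping techniques); the series, resample count, and function names are assumptions, and ties are ignored for simplicity.

```python
import random

def kendall_tau(y):
    """Kendall's tau of a series against its time index:
    (concordant - discordant pairs) / total pairs, ties scoring zero."""
    n = len(y)
    s = 0
    for i in range(n):
        for j in range(i + 1, n):
            s += (y[j] > y[i]) - (y[j] < y[i])
    return 2 * s / (n * (n - 1))

def trend_p_value(y, n_resamples=2000, seed=0):
    """Fraction of shuffled series whose |tau| matches or exceeds the
    observed |tau| -- a Monte Carlo approximation of the p-value under
    the no-trend null hypothesis."""
    rng = random.Random(seed)
    observed = abs(kendall_tau(y))
    hits = 0
    for _ in range(n_resamples):
        shuffled = y[:]
        rng.shuffle(shuffled)
        if abs(kendall_tau(shuffled)) >= observed:
            hits += 1
    return hits / n_resamples
```

    For a strictly increasing series τ equals 1 and almost no shuffled series matches it, so the Monte Carlo p-value is near zero; for noise with no trend the p-value is large.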

  6. Pharmacogenetics approach to therapeutics.

    Science.gov (United States)

    Koo, Seok Hwee; Lee, Edmund Jon Deoon

    2006-01-01

    1. Pharmacogenetics refers to the study of genetically controlled variations in drug response. Functional variants caused by single nucleotide polymorphisms (SNPs) in genes encoding drug-metabolising enzymes, transporters, ion channels and drug receptors have been known to be associated with interindividual and interethnic variation in drug response. Genetic variations in these genes play a role in influencing the efficacy and toxicity of medications. 2. Rapid, precise and cost-effective high-throughput technological platforms are essential for performing large-scale mutational analysis of genetic markers involved in the aetiology of variable responses to drug therapy. 3. The application of a pharmacogenetics approach to therapeutics in general clinical practice is still far from being achieved today owing to various constraints, such as limited accessibility of technology, inadequate knowledge, ambiguity of the role of variants and ethical concerns. 4. Drug actions are determined by the interplay of several genes encoding different proteins involved in various biochemical pathways. With rapidly emerging SNP discovery technological platforms and widespread knowledge on the role of SNPs in disease susceptibility and variability in drug response, the pharmacogenetics approach to therapeutics is anticipated to take off in the not-too-distant future. This will present profound clinical, economic and social implications for health care.

  7. Proteomic approach to nanotoxicity.

    Science.gov (United States)

    Matysiak, Magdalena; Kapka-Skrzypczak, Lucyna; Brzóska, Kamil; Gutleb, Arno C; Kruszewski, Marcin

    2016-03-30

    In recent years a large number of engineered nanomaterials (NMs) have been developed with promising technical benefits for consumers and medical appliances. In addition to already known potentially advantageous biological properties (antibiotic, antifungal and antiviral activity) of NMs, many new medical applications of NMs are foreseen, such as drug carriers, contrast agents, radiopharmaceuticals and many others. However, there is increasing concern about potential environmental and health effects due to NM exposure. An increasing body of evidence suggests that NMs may trigger undesirable hazardous interactions with biological systems with the potential to generate harmful effects. In this review we summarized the current state of knowledge on proteomic approaches to nanotoxicity, including protein corona formation and the in vitro and in vivo effects of exposure to NMs on the proteome of different classes of organisms, from bacteria and plants to mammals. The effects of NMs on the proteome of environmentally relevant organisms are also described. Despite the benefit that the development of nanotechnology may bring to society, there are still major gaps in knowledge on the influence of nanomaterials on human health and the environment. Thus, it seems necessary to conduct further interdisciplinary research to fill the knowledge gaps in NM toxicity, using more holistic approaches than offered by conventional biological techniques. “OMICS” techniques will certainly help researchers in this field. In this paper we summarized the current state of knowledge of the effects of nanoparticles on the proteome of different organisms, including those commonly used as environmentally relevant indicator organisms.

  8. Endoscopic approach to achalasia

    Science.gov (United States)

    Müller, Michaela; Eckardt, Alexander J; Wehrmann, Till

    2013-01-01

    Achalasia is a primary esophageal motor disorder. The etiology is still unknown and therefore all treatment options are strictly palliative with the intention to weaken the lower esophageal sphincter (LES). Current established endoscopic therapeutic options include pneumatic dilation (PD) or botulinum toxin injection. Both treatment approaches have an excellent symptomatic short term effect, and lead to a reduction of LES pressure. However, the long term success of botulinum toxin (BT) injection is poor with symptom recurrence in more than 50% of the patients after 12 mo and in nearly 100% of the patients after 24 mo, which commonly requires repeat injections. In contrast, after a single PD 40%-60% of the patients remain asymptomatic for ≥ 10 years. Repeated on demand PD might become necessary and long term remission can be achieved with this approach in up to 90% of these patients. The main positive predictors for a symptomatic response to PD are an age > 40 years, a LES-pressure reduction to 40 years, was nearly equivalent to surgery. A new promising technique might be peroral endoscopic myotomy, although long term results are needed and practicability as well as safety issues must be considered. Treatment with a temporary self expanding stent has been reported with favorable outcomes, but the data are all from one study group and must be confirmed by others before definite recommendations can be made. In addition to its use as a therapeutic tool, endoscopy also plays an important role in the diagnosis and surveillance of patients with achalasia. PMID:23951393

  9. Interstage Flammability Analysis Approach

    Science.gov (United States)

    Little, Jeffrey K.; Eppard, William M.

    2011-01-01

    The Interstage of the Ares I launch vehicle houses several key components which are on standby during First Stage operation: the Reaction Control System (ReCS), the Upper Stage (US) Thrust Vector Control (TVC) and the J-2X with the Main Propulsion System (MPS) propellant feed system. Potentially dangerous leaks of propellants could therefore develop. The Interstage leaks analysis addresses the concern that localized mixing of hydrogen and oxygen gases could produce deflagration zones in the Interstage of the Ares I launch vehicle during First Stage operation. This report details the approach taken to accomplish the analysis. Specified leakage profiles and actual flammability results are not presented due to proprietary and security restrictions. The interior volume formed by the Interstage walls, bounding interfaces with the Upper and First Stages, and surrounding the J-2X engine was modeled using Loci-CHEM to assess the potential for flammable gas mixtures to develop during First Stage operations. The transient analysis included a derived flammability indicator based on mixture ratios to maintain achievable simulation times. Validation of results was based on a comparison to Interstage pressure profiles outlined in prior NASA studies. The approach proved useful in bounding the flammability risk in support of program hazard reviews.
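    The abstract's "derived flammability indicator based on mixture ratios" is not specified (the actual Loci-CHEM formulation is proprietary). A generic sketch of such an indicator might use the widely published flammability limits of hydrogen in air (roughly 4-75 vol%) and the stoichiometric fraction (about 29.6 vol%); the function name and the linear ramp shape below are illustrative assumptions, not the study's method:

    ```python
    # Hypothetical mixture-ratio flammability indicator; NOT the proprietary
    # Loci-CHEM formulation from the study. Limits are the commonly published
    # flammability limits of hydrogen in air, as volume fractions.
    H2_LFL, H2_UFL = 0.04, 0.75  # lower / upper flammability limits
    H2_STOICH = 0.296            # stoichiometric H2 fraction in air: 2*0.21/(2*0.21+1)

    def flammability_indicator(x_h2: float) -> float:
        """Return 0 outside the flammable range, ramping linearly to 1 at the
        stoichiometric mixture (an assumed, illustrative indicator shape)."""
        if x_h2 < H2_LFL or x_h2 > H2_UFL:
            return 0.0
        if x_h2 <= H2_STOICH:
            return (x_h2 - H2_LFL) / (H2_STOICH - H2_LFL)
        return (H2_UFL - x_h2) / (H2_UFL - H2_STOICH)
    ```

    Evaluating such an indicator cell by cell over a CFD mixture field is cheap, which is consistent with the abstract's remark that the indicator was chosen to keep transient simulation times achievable.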

  10. Ethnographic Approaches in Primatology.

    Science.gov (United States)

    Dore, Kerry M; Radford, Lucy; Alexander, Sherrie; Waters, Siân

    2018-01-01

    The shared evolutionary histories and anatomical similarities between humans and non-human primates create dynamic interconnections between these alloprimates. In this foreword to Folia Primatologica's special issue on "Ethnographic Approaches in Primatology," we review the ethnographic method and existing literature at the intersection of primatology and ethnography. We summarize, compare and contrast the 5 contributions to this special issue to highlight why the human-non-human primate interface is a compelling area to investigate via ethnographic approaches and to encourage increased incorporation of ethnography into the discipline of primatology. Ethnography is a valuable and increasingly popular tool with its use no longer limited to anthropological practitioners investigating traditional, non-Western peoples. Scholars from many disciplines now use ethnographic methods to investigate all members of our globalised world, including non-humans. As our closest living relatives, non-human primates (hereafter "primates") are compelling subjects and thus appear in a range of contexts within ethnographic investigations. The goal of this special issue is to highlight the trajectory of research at the intersection of primatology and ethnography and to illustrate the importance of ethnographic methods for the advancement of primatology as a discipline. © 2018 S. Karger AG, Basel.

  11. Approaches for Stereo Matching

    Directory of Open Access Journals (Sweden)

    Takouhi Ozanian

    1995-04-01

    Full Text Available This review focuses on the last decade's development of computational stereopsis for recovering three-dimensional information. The main components of stereo analysis are exposed: image acquisition and camera modeling, feature selection, feature matching and disparity interpretation. A brief survey is given of the well-known feature selection approaches, and the estimation parameters for this selection are mentioned. The difficulties in identifying corresponding locations in the two images are explained. Methods for effectively constraining the search for the correct solution of the correspondence problem are discussed, as are strategies for the whole matching process. Reasons for the occurrence of matching errors are considered. Some recently proposed approaches, employing new ideas in the modeling of stereo matching in terms of energy minimization, are described. Acknowledging the importance of computation time for real-time applications, special attention is paid to parallelism as a way to achieve the required level of performance. The development of trinocular stereo analysis as an alternative to the conventional binocular one is described. Finally, a classification based on the test images used for verification of stereo matching algorithms is supplied.
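    A minimal, illustrative instance of the matching stage described above is winner-take-all block matching with a sum-of-absolute-differences (SAD) cost, one of the simplest of the many strategies such a review surveys. This sketch assumes rectified images and makes none of the refinements (sub-pixel estimation, ordering constraints, occlusion handling) that the survey discusses:

    ```python
    import numpy as np

    def disparity_sad(left, right, max_disp, win=3):
        """Dense disparity map by winner-take-all SAD block matching.

        Assumes rectified images where left[y, x] corresponds to
        right[y, x - d] for some disparity d in [0, max_disp]. A toy sketch:
        no sub-pixel refinement, uniqueness constraint, or occlusion handling.
        """
        h, w = left.shape
        pad = win // 2
        # Edge-replicate padding so windows near the border stay valid.
        L = np.pad(left.astype(float), pad, mode="edge")
        R = np.pad(right.astype(float), pad, mode="edge")
        disp = np.zeros((h, w), dtype=int)
        for y in range(h):
            for x in range(w):
                patch = L[y:y + win, x:x + win]
                best_cost, best_d = np.inf, 0
                for d in range(min(max_disp, x) + 1):
                    cand = R[y:y + win, x - d:x - d + win]
                    cost = np.abs(patch - cand).sum()
                    if cost < best_cost:
                        best_cost, best_d = cost, d
                disp[y, x] = best_d
        return disp
    ```

    On a synthetic pair where the right image is the left image shifted by a constant disparity, the interior of the recovered map equals that shift, which is the basic sanity check used with test images.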

  12. Technical approach document

    International Nuclear Information System (INIS)

    1988-04-01

    This document describes the general technical approaches and design criteria adopted by the US Department of Energy (DOE) in order to implement Remedial Action Plans (RAPs) and final designs that comply with EPA standards. This document is a revision of the original document. Major revisions were made to the sections on riprap selection and sizing, and on ground water; only minor revisions were made to the remainder of the document. The US Nuclear Regulatory Commission (NRC) has prepared a Standard Review Plan (NRC-SRP) which describes factors to be considered by the NRC in approving the RAP. Sections 3.0, 4.0, 5.0, and 7.0 of this document are arranged under the same headings as those used in the NRC-SRP. This approach is adopted in order to facilitate joint use of the documents. Section 2.0 (not included in the NRC-SRP) discusses design considerations; Section 3.0 describes surface-water hydrology and erosion control; Section 4.0 describes geotechnical aspects of pile design; Section 5.0 discusses the Alternate Site Selection Process; Section 6.0 deals with radiological issues (in particular, the design of the radon barrier); Section 7.0 discusses protection of groundwater resources; and Section 8.0 discusses site design criteria for the RAC

  13. Fournier's gangrene current approaches.

    Science.gov (United States)

    Ozkan, Omer F; Koksal, Neset; Altinli, Ediz; Celik, Atilla; Uzun, Mehmet A; Cıkman, Oztekin; Akbas, Alpaslan; Ergun, Ersin; Kiraz, Hasan A; Karaayvaz, Muammer

    2016-10-01

    Fournier's gangrene is a rare infectious disease with high mortality, characterised by fulminant necrotising fasciitis involving the genital and perineal regions. The objective of this study is to analyse the demographics, clinical features and treatment approaches as well as outcomes of Fournier's gangrene. Data were collected retrospectively from medical records and operative notes. Patient data were analysed by demographics, aetiological factors, clinical features, treatment approaches and outcomes. Twelve patients (five female and seven male) were enrolled in this study. The most common aetiology was perianal abscess (41·6%). Wound cultures showed a mixture of microorganisms in six (50%) patients. For faecal diversion, colostomy was performed in six cases (50%), while Flexi-Seal was used in two cases (16·6%). In four patients (33·4%), no faecal diversion was performed. A negative pressure wound therapy (NPWT) system was effective in the last four patients (33·4%). The mean hospitalisation period was 18 days in patients who used NPWT and 20 days in the others. NPWT in Fournier's gangrene is a safe dressing method. It promotes granulation formation. Flexi-Seal faecal management is an alternative method to colostomy and provides protection from its associated complications. The combination of the two devices (Flexi-Seal and NPWT) is an effective and comfortable method in the management of Fournier's gangrene in appropriate patients. © 2014 The Authors. International Wound Journal © 2014 Medicalhelplines.com Inc and John Wiley & Sons Ltd.

  14. Breakfast: a multidisciplinary approach.

    Science.gov (United States)

    Affinita, Antonio; Catalani, Loredana; Cecchetto, Giovanna; De Lorenzo, Gianfranco; Dilillo, Dario; Donegani, Giorgio; Fransos, Lucia; Lucidi, Fabio; Mameli, Chiara; Manna, Elisa; Marconi, Paolo; Mele, Giuseppe; Minestroni, Laura; Montanari, Massimo; Morcellini, Mario; Rovera, Giuseppe; Rotilio, Giuseppe; Sachet, Marco; Zuccotti, Gian Vincenzo

    2013-07-10

    The role of breakfast as an essential part of a healthy diet has been promoted only recently, even though breakfast practices have been known since the Middle Ages. The growing scientific evidence on this topic is extremely sector-based; nevertheless, breakfast can be regarded from different points of view and from different areas of expertise. This approach, which takes into account history, sociology, anthropology, medicine, psychology and pedagogy, is useful for better understanding the value of this meal in our culture. The aim of this paper was to analyse breakfast-related issues based on a multidisciplinary approach, with input by specialists from different fields of learning. Breakfast is now recommended as part of a diet because it is associated with healthier macro- and micronutrient intakes, body mass index and lifestyle. Moreover, recent studies have shown that breakfast improves cognitive function, intuitive perception and academic performance. Research demonstrates the importance of providing breakfast not only to children but to adults and the elderly too. Despite the important role breakfast plays in maintaining health, epidemiological data from industrialised countries reveal that many individuals either eat a nutritionally unhealthy breakfast or skip it completely. The historical, bio-psychological and educational value of breakfast in our culture is extremely important and should be recognized and stressed by the scientific community. Efforts should be made to promote this practice for individual health and well-being.

  15. Systemic approaches to biodegradation.

    Science.gov (United States)

    Trigo, Almudena; Valencia, Alfonso; Cases, Ildefonso

    2009-01-01

    Biodegradation, the ability of microorganisms to remove complex chemicals from the environment, is a multifaceted process in which many biotic and abiotic factors are implicated. The recent accumulation of knowledge about the biochemistry and genetics of the biodegradation process, and its categorization and formalization in structured databases, has opened the door to systems biology approaches, where the interactions of the involved parts are the main subject of study and the system is analysed as a whole. The global analysis of the biodegradation metabolic network is beginning to produce knowledge about its structure, behaviour and evolution, such as its scale-free structure or its intrinsic robustness. Moreover, these approaches are also developing into useful tools, such as predictors of compounds' degradability or the assisted design of artificial pathways. However, it is the environmental application of high-throughput technologies from genomics, metagenomics, proteomics and metabolomics that harbours the most promising opportunities to understand the biodegradation process, while at the same time posing tremendous challenges from the data management and data mining points of view.

  16. Advanced intelligence and mechanism approach

    Institute of Scientific and Technical Information of China (English)

    ZHONG Yixin

    2007-01-01

    Advanced intelligence will feature intelligence research in the next 50 years. An understanding of the concept of advanced intelligence, as well as its importance, is provided first, followed by a detailed analysis of an approach, the mechanism approach, suitable to advanced intelligence research. The mutual relationship among the mechanism approach, the traditional approaches in artificial intelligence research, and cognitive informatics is then discussed. It is interesting to discover that the mechanism approach is well suited to advanced intelligence research and constitutes a unified form of the existing approaches to artificial intelligence.

  17. Proteomics - new analytical approaches

    International Nuclear Information System (INIS)

    Hancock, W.S.

    2001-01-01

    Full text: Recent developments in the sequencing of the human genome have indicated that the number of coding gene sequences may be as few as 30,000. It is clear, however, that the complexity of the human species is dependent on the much greater diversity of the corresponding protein complement. Estimates of the diversity (discrete protein species) of the human proteome range from 200,000 to 300,000 at the lower end to 2,000,000 to 3,000,000 at the high end. In addition, proteomics (the study of the protein complement to the genome) has been subdivided into two main approaches. Global proteomics refers to a high throughput examination of the full protein set present in a cell under a given environmental condition. Focused proteomics refers to a more detailed study of a restricted set of proteins that are related to a specified biochemical pathway or subcellular structure. While many of the advances in proteomics will be based on the sequencing of the human genome, de novo characterization of protein microheterogeneity (glycosylation, phosphorylation and sulfation as well as the incorporation of lipid components) will be required in disease studies. To characterize these modifications it is necessary to digest the protein mixture with an enzyme to produce the corresponding mixture of peptides. In a process analogous to sequencing of the genome, shot-gun sequencing of the proteome is based on the characterization of the key fragments produced by such a digest. Thus, a glycopeptide and hence a specific glycosylation motif will be identified by a unique mass and then a diagnostic MS/MS spectrum. Mass spectrometry will be the preferred detector in these applications because of the unparalleled information content provided by one or more dimensions of mass measurement. In addition, highly efficient separation processes are an absolute requirement for advanced proteomic studies. For example, a combination of the orthogonal approaches, HPLC and HPCE, can be very powerful.

  18. Making literature reviews more reliable through application of lessons from systematic reviews.

    Science.gov (United States)

    Haddaway, N R; Woodcock, P; Macura, B; Collins, A

    2015-12-01

    Review articles can provide valuable summaries of the ever-increasing volume of primary research in conservation biology. Where findings may influence important resource-allocation decisions in policy or practice, there is a need for a high degree of reliability when reviewing evidence. However, traditional literature reviews are susceptible to a number of biases during the identification, selection, and synthesis of included studies (e.g., publication bias, selection bias, and vote counting). Systematic reviews, pioneered in medicine and translated into conservation in 2006, address these issues through a strict methodology that aims to maximize transparency, objectivity, and repeatability. Systematic reviews will always be the gold standard for reliable synthesis of evidence. However, traditional literature reviews remain popular and will continue to be valuable where systematic reviews are not feasible. Where traditional reviews are used, lessons can be taken from systematic reviews and applied to traditional reviews in order to increase their reliability. Key aspects of systematic review methods that can be used in a context-specific manner in traditional reviews include focusing on mitigating bias; increasing transparency, consistency, and objectivity; and critically appraising the evidence and avoiding vote counting. In situations where conducting a full systematic review is not feasible, the proposed approach to reviewing evidence in a more systematic way can substantially improve the reliability of review findings, providing a time- and resource-efficient means of maximizing the value of traditional reviews. These methods are aimed particularly at those conducting literature reviews where systematic review is not feasible, for example, for graduate students, single reviewers, or small organizations. © 2015 Society for Conservation Biology.

  19. Stochastic approach to microphysics

    Energy Technology Data Exchange (ETDEWEB)

    Aron, J.C.

    1987-01-01

    The presently widespread idea of "vacuum population", together with the quantum concept of vacuum fluctuations, leads to assuming a random level below that of matter. This stochastic approach starts with a reminder of the author's previous work, first on the relation of diffusion laws to the foundations of microphysics, and then on the hadron spectrum. Following the latter, a random quark model is advanced; it gives quark pairs properties similar to those of a harmonic oscillator or an elastic string, imagined as an explanation of their asymptotic freedom and their confinement. The stochastic study of such interactions as electron-nucleon, jets in e⁺e⁻ collisions, or pp → π⁰ + X, gives form factors closely consistent with experiment. The conclusion is an epistemological comment (complementarity between stochastic and quantum domains, the E.P.R. paradox, etc.).

  20. Mouthsticks - A Participatory Approach.

    Science.gov (United States)

    Ernst, Waltraud; Nussbaum, Gerhard; Berger, Veronika M; Major, Zoltan

    2017-01-01

    Mouthsticks are quite an old kind of assistive technology (AT), yet they remain to this day the Swiss army knives among AT. Unfortunately the popularity of mouthsticks decreased massively during the 1990s, with the result that knowledge about how to produce good mouthsticks was lost and that there are hardly any adaptable mouthsticks available on the market. This paper discusses the development of a personalized mouthstick with the involvement of end users - people with severe physical disabilities - and occupational therapists as experts of everyday use. A participatory approach was chosen. The results of the analysis of a standardized questionnaire, group discussions and a collaborative workshop with IT designers, polymer engineers, end users, occupational therapists and gender and diversity researchers are presented and discussed. This confirmed the necessity of developing a personalized mouthstick.

  1. Globalization - Different approaches

    Directory of Open Access Journals (Sweden)

    Viorica Puscaciu

    2014-11-01

    Full Text Available In this paper we investigate different approaches to the globalization phenomenon. Despite geographical distances, the links between people grow ever stronger in different ways and on different planes: through technology, political, economic, cultural and world events, and many other aspects. We examine the link between globalization and democracy and its impact on the most important social and economic matters. We also consider the impact of the internet revolution and its corollary, e-commerce, with its sometimes unpredictable consequences. Another problem analysed is that of governments trying, and sometimes succeeding, to control the money, products, people and ideas that move freely across national frontiers, thereby slowing or even stopping progress. Nevertheless, this global interaction between people also creates phenomena of insecurity in different ways: terrorism, trafficking of arms and drugs, economic aggressions harming the environment, and other inconvenient facts and situations.

  2. Engineering students' sustainability approaches

    Science.gov (United States)

    Haase, S.

    2014-05-01

    Sustainability issues are increasingly important in engineering work all over the world. This article explores systematic differences in self-assessed competencies, interests, importance, engagement and practices of newly enrolled engineering students in Denmark in relation to environmental and non-environmental sustainability issues. The empirical base of the article is a nation-wide, web-based survey sent to all newly enrolled engineering students in Denmark commencing their education in the fall term 2010. The response rate was 46%. The survey focused on a variety of different aspects of what can be conceived as sustainability. By means of cluster analysis, three engineering student approaches to sustainability are identified and described. The article provides knowledge on the different prerequisites of engineering students in relation to the role of sustainability in engineering. This information is important input to educators trying to target new engineering students and contribute to the provision of engineers equipped to meet sustainability challenges.
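    The cluster analysis step reported above is not described in detail in the abstract. A generic sketch of how such student profiles might be derived from survey scores is plain Lloyd's k-means with a deterministic farthest-point initialisation; the variable names and the choice of algorithm are assumptions for illustration, not the study's documented method:

    ```python
    import numpy as np

    def kmeans(X, k, n_iter=50):
        """Lloyd's k-means with deterministic farthest-point initialisation."""
        X = np.asarray(X, dtype=float)
        # Farthest-point init: start at X[0], then repeatedly add the point
        # farthest from the centroids chosen so far.
        centroids = [X[0]]
        for _ in range(k - 1):
            dists = np.min(
                np.linalg.norm(X[:, None, :] - np.array(centroids)[None, :, :], axis=2),
                axis=1,
            )
            centroids.append(X[dists.argmax()])
        centroids = np.array(centroids)
        for _ in range(n_iter):
            # Assign each point to its nearest centroid, then recompute means.
            d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
            labels = d.argmin(axis=1)
            for j in range(k):
                members = X[labels == j]
                if len(members):
                    centroids[j] = members.mean(axis=0)
        return centroids, labels
    ```

    Applied to rows of standardized survey responses, the three resulting labels would play the role of the article's three student "approaches to sustainability".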

  3. Cognitive approaches to emotions.

    Science.gov (United States)

    Oatley, Keith; Johnson-Laird, P N

    2014-03-01

    Cognitive approaches offer clear links between how emotions are thought about in everyday life and how they are investigated psychologically. Cognitive researchers have focused on how emotions are caused when events or other people affect concerns and on how emotions influence processes such as reasoning, memory, and attention. Three representative cognitive theories of emotion continue to develop productively: the action-readiness theory, the core-affect theory, and the communicative theory. Some principles are common to them and divergences can be resolved by future research. Recent explanations have included how emotions structure social relationships, how they function in psychological illnesses, and how they are central to music and fiction. Copyright © 2013 Elsevier Ltd. All rights reserved.

  4. Radiolab - three different approaches

    DEFF Research Database (Denmark)

    Lønstrup, Ansa

    2012-01-01

    Radiolab – three different approaches. The three papers in this ‘suite’ have a special background and context. At the 2010 conference SoundActs in Aarhus the three panellists were each given the task to provide a paper with an analysis of the same sound object, thus exhibiting and contrasting... ..., who methodologically operates within three levels of investigation: 1) the syntax, 2) the semantics and 3) the ontology level. Accordingly, this analysis is conducted as if the sound object were performed by a vocal ensemble oscillating ‘between a musical and a speech act’. Torben Sangild’s paper... ... via his own repeated listening process – as a scholarly-analytical analysis of the subjective act of creating meaning. He draws on presumptions and prejudices, demonstrating the impossibility of a purely structural listening. The analysis relates these hermeneutical reflections to formal musicological...

  5. Slurry pipeline design approach

    Energy Technology Data Exchange (ETDEWEB)

    Betinol, Roy; Navarro R, Luis [Brass Chile S.A., Santiago (Chile)

    2009-12-19

    Compared to other engineering technologies, the design of commercial long-distance slurry pipelines is a relatively new engineering concept, which gained recognition in the mid-1960s. Slurry pipelines were first introduced to reduce the cost of transporting coal to power generating units. Since then this technology has caught on worldwide for transporting other minerals such as limestone, copper, zinc and iron. In South America, the use of pipelines is commonly practiced in the transport of copper (Chile, Peru and Argentina), iron (Chile and Brazil), zinc (Peru) and bauxite (Brazil). As more mining operations expand and new mine facilities are opened, the design of long-distance slurry pipelines will continue to present a commercially viable option. The intent of this paper is to present the design process and discuss new techniques and approaches used today to ensure a better, safer and more economical slurry pipeline. (author)

  6. Bioengineering a conceptual approach

    CERN Document Server

    Pavlovic, Mirjana

    2015-01-01

    This book explores critical principles and new concepts in bioengineering, integrating the biological, physical and chemical laws and principles that provide a foundation for the field. Both biological and engineering perspectives are included, with key topics such as the physical-chemical properties of cells, tissues and organs; principles of molecules; composition and interplay in physiological scenarios; and the complex physiological functions of heart, neuronal cells, muscle cells and tissues. Chapters evaluate the emerging fields of nanotechnology, drug delivery concepts, biomaterials, and regenerative therapy. The leading individuals and events are introduced along with their critical research. Bioengineering: A Conceptual Approach is a valuable resource for professionals or researchers interested in understanding the central elements of bioengineering. Advanced-level students in biomedical engineering and computer science will also find this book valuable as a secondary textbook or reference.

  7. Integration a functional approach

    CERN Document Server

    Bichteler, Klaus

    1998-01-01

    This book covers Lebesgue integration and its generalizations from Daniell's point of view, modified by the use of seminorms. Integrating functions rather than measuring sets is posited as the main purpose of measure theory. From this point of view Lebesgue's integral can be had as a rather straightforward, even simplistic, extension of Riemann's integral; and its aims, definitions, and procedures can be motivated at an elementary level. The notion of measurability, for example, is suggested by Littlewood's observations rather than being conveyed authoritatively through definitions of (sigma)-algebras and good-cut-conditions, the latter of which are hard to justify and thus appear mysterious, even nettlesome, to the beginner. The approach taken provides the additional benefit of cutting the labor in half. The use of seminorms, ubiquitous in modern analysis, speeds things up even further. The book is intended for the reader who has some experience with proofs, a beginning graduate student for example. It might...

  8. [Psychosomatic approach to encopresis].

    Science.gov (United States)

    Boige, N; Missonnier, S; Bellaïche, M; Foucaud, P

    1999-12-01

    Encopresis most often results from functional constipation and a behaviour disorder characterised by retention of faeces. Rarely it is a passive or active expulsion of normal faeces. It indicates a failure in the education of sphincter control, often with a preferential development of autoerotic versus relational investments. A depressive component is frequent. We propose a bidisciplinary approach with a somatic and psychological evaluation of the encopretic child from the first visit. The physical examination assesses constipation and stercoral stasis. Associated psychopathological symptoms or a pathogenic psychosocial situation must be sought. The therapeutic means must be directed towards the different etiologic features. Explanations of the physiopathology of the symptom and discussion with the child and the parents on the origin of the dysfunction must be accomplished first. A medical treatment of the constipation is generally indicated. Psychotherapy is initiated according to the background and associated psychopathological symptoms.

  9. Microscopic approach to polaritons

    DEFF Research Database (Denmark)

    Skettrup, Torben

    1981-01-01

    ...contrary to experimental experience. In order to remove this absurdity the semiclassical approach must be abandoned and the electromagnetic field quantized. A simple microscopic polariton model is then derived. From this the wave function for the interacting exciton-photon complex is obtained... ...of light of the crystal. The introduction of damping smears out the excitonic spectra. The wave function of the polariton, however, turns out to be very independent of damping up to large damping values. Finally, this simplified microscopic polariton model is compared with the exact solutions obtained for the macroscopic polariton model by Hopfield. It is seen that standing photon and exciton waves must be included in an exact microscopic polariton model. However, it is concluded that for practical purposes, only the propagating waves are of importance, and the simple microscopic polariton wave function derived...

  10. Investigational Approaches for Mesothelioma

    International Nuclear Information System (INIS)

    Surmont, Veerle F.; Thiel, Eric R. E. van; Vermaelen, Karim; Meerbeeck, Jan P. van

    2011-01-01

    Malignant pleural mesothelioma (MPM) is a rare, aggressive tumor with a poor prognosis. In view of the poor survival benefit from first-line chemotherapy and the lack of subsequent effective treatment options, there is a strong need for the development of more effective treatment approaches for patients with MPM. This review will provide a comprehensive state of the art of new investigational approaches for mesothelioma. In an introductory section, the etiology, epidemiology, natural history, and standard of care treatment for MPM will be discussed. This review provides an update of the major clinical trials that impact mesothelioma treatment, discusses the impact of novel therapeutics, and provides perspective on where the clinical research in mesothelioma is moving. The evidence was collected by a systematic analysis of the literature (2000–2011) using the databases Medline (National Library of Medicine, USA), Embase (Elsevier, Netherlands), Cochrane Library (Great Britain), National Guideline Clearinghouse (USA), HTA Database (International Network of Agencies for Health Technology Assessment – INAHTA), NIH database (USA), International Pleural Mesothelioma Program – WHOLIS (WHO Database), with the following keywords and filters: mesothelioma, guidelines, treatment, surgery, chemotherapy, radiotherapy, review, investigational, drugs. Currently different targeted therapies and biologicals are under investigation for MPM. It is important that molecular biologic research should first focus on mesothelioma-specific pathways and biomarkers in order to obtain more effective treatment options for this disease. The use of array technology will certainly be an important gain in the identification of new potential prognostic biomarkers or important pathways in MPM pathogenesis. A central mesothelioma virtual tissue bank may well contribute to the ultimate goal of identifying druggable targets and developing personalized treatment for MPM patients.

  11. Investigational approaches for mesothelioma

    Directory of Open Access Journals (Sweden)

    Veerle F Surmont

    2011-08-01

    Full Text Available MPM is a rare, aggressive tumour with a poor prognosis. In view of the poor survival benefit from first-line chemotherapy and the lack of subsequent effective treatment options, there is a strong need for the development of more effective treatment approaches for patients with MPM. This review will provide a comprehensive state of the art of new investigational approaches for mesothelioma. In an introductory section, the aetiology, epidemiology, natural history and standard of care treatment for MPM will be discussed. This review provides an update of the major clinical trials that impact mesothelioma treatment, discusses the impact of novel therapeutics and provides perspective on where the clinical research in mesothelioma is moving. The evidence was collected by a systematic analysis of the literature (2000–2011) using the databases Medline (National Library of Medicine, USA), Embase (Elsevier, Netherlands), Cochrane Library (Great Britain), National Guideline Clearinghouse (USA), HTA Database (International Network of Agencies for Health Technology Assessment – INAHTA), NIH database (USA), International Pleural Mesothelioma Program – WHOLIS (WHO Database), with the following keywords and filters: mesothelioma, guidelines, treatment, surgery, chemotherapy, radiotherapy, review, investigational, drugs. Currently different targeted therapies and biologicals are under investigation for MPM. It is important that molecular biologic research should first focus on mesothelioma-specific pathways and biomarkers in order to obtain more effective treatment options for this disease. The use of array technology will certainly be an important gain in the identification of new potential prognostic biomarkers or important pathways in MPM pathogenesis. A central mesothelioma virtual tissue bank may well contribute to the ultimate goal of identifying druggable targets and developing personalized treatment for MPM patients.

  12. Using multiple decrement models to estimate risk and morbidity from specific AIDS illnesses. Multicenter AIDS Cohort Study (MACS).

    Science.gov (United States)

    Hoover, D R; Peng, Y; Saah, A J; Detels, R R; Day, R S; Phair, J P

A simple non-parametric approach is developed to simultaneously estimate net incidence and morbidity time from specific AIDS illnesses in populations at high risk for death from these illnesses and other causes. The disease-death process has four stages that can be recast as two sandwiching three-state multiple decrement processes. Non-parametric estimation of net incidence and morbidity time with error bounds is achieved from these sandwiching models through modification of methods from Aalen and Greenwood, and through bootstrapping. An application to immunosuppressed HIV-1 infected homosexual men reveals that cytomegalovirus disease, Kaposi's sarcoma and Pneumocystis pneumonia are likely to occur and cause significant morbidity time.
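The estimator described in this abstract is a competing-risks construction. As a rough illustration only (not the authors' code, and with made-up data), a minimal Aalen-Johansen-style cumulative incidence estimator can be sketched as:

```python
from collections import Counter

def cumulative_incidence(times, causes, cause_of_interest):
    """Non-parametric cumulative incidence for one cause in the
    presence of competing risks (Aalen-Johansen-style estimator).
    times:  event/censoring time per subject
    causes: 0 = censored, k > 0 = failure from cause k
    Returns a list of (time, CIF) points at event times."""
    n = len(times)
    events = sorted(zip(times, causes))
    surv = 1.0          # overall survival S(t-) just before t
    cif = 0.0           # cumulative incidence for the chosen cause
    at_risk = n
    out = []
    i = 0
    while i < len(events):
        t = events[i][0]
        # gather all subjects sharing this event time
        d = Counter()
        j = i
        while j < len(events) and events[j][0] == t:
            d[events[j][1]] += 1
            j += 1
        n_events = sum(v for k, v in d.items() if k > 0)
        if n_events > 0:
            # cause-specific hazard times prior overall survival
            cif += surv * d[cause_of_interest] / at_risk
            surv *= 1.0 - n_events / at_risk
            out.append((t, cif))
        at_risk -= (j - i)
        i = j
    return out

# Hypothetical data: cause 1 = illness of interest, 2 = competing death
times  = [2, 3, 3, 5, 7, 8, 10, 12]
causes = [1, 2, 1, 0, 1, 2, 0, 1]
for t, f in cumulative_incidence(times, causes, 1):
    print(t, round(f, 3))
```

Censored subjects (cause 0) leave the risk set without contributing to any cause's incidence; the bootstrap error bounds mentioned in the abstract would be obtained by resampling subjects and recomputing the curve.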

  13. Structural Sustainability - Heuristic Approach

    Science.gov (United States)

    Rostański, Krzysztof

    2017-10-01

    Nowadays, we are faced with a challenge of having to join building structures with elements of nature, which seems to be the paradigm of modern planning and design. The questions arise, however, with reference to the following categories: the leading idea, the relation between elements of nature and buildings, the features of a structure combining such elements and, finally, our perception of this structure. If we consider both the overwhelming globalization and our attempts to preserve local values, the only reasonable solution is to develop naturalistic greenery. It can add its uniqueness to any building and to any developed area. Our holistic model, presented in this paper, contains the above mentioned categories within the scope of naturalism. The model is divided into principles, actions related, and possible effects to be obtained. It provides a useful tool for determining the ways and priorities of our design. Although it is not possible to consider all possible actions and solutions in order to support sustainability in any particular design, we can choose, however, a proper mode for our design according to the local conditions by turning to the heuristic method, which helps to choose priorities and targets. Our approach is an attempt to follow the ways of nature as in the natural environment it is optimal solutions that appear and survive, idealism being the domain of mankind only. We try to describe various natural processes in a manner comprehensible to us, which is always a generalization. Such definitions, however, called artificial by naturalists, are presented as art or the current state of knowledge by artists and engineers. Reality, in fact, is always more complicated than its definitions. The heuristic method demonstrates the way how to optimize our design. It requires that all possible information about the local environment should be gathered, as the more is known, the fewer mistakes are made. Following the unquestionable principles, we can

  14. Approach to team skills training

    International Nuclear Information System (INIS)

    Koontz, J.L.; Roe, M.L.; Gaddy, C.D.

    1987-01-01

The US commercial nuclear power industry has recognized the importance of team skills in control room operation. The desire to combine training of team interaction skills, like communications, with technical knowledge of reactor operations requires a unique approach to training. An NRC-sponsored study identified a five-phase approach to team skills training designed to be consistent with the systems approach to training currently endorsed by the NRC Policy Statement on Training and Qualification. This paper describes an approach to team skills training with emphasis on the nuclear power plant control room crew.

  15. Developing regulatory approaches

    International Nuclear Information System (INIS)

    Axelsson, Lars

    2012-01-01

    Lars Axelsson presented SSM progress on oversight of LMfS/SC since the Chester 1 Workshop in 2007. Current SSM approaches for safety culture oversight include targeted safety management and safety culture inspections, compliance inspections which cover aspects of safety management/safety culture and multi-disciplinary team inspections. Examples of themes for targeted inspections include management of ambiguous operational situations or other weak signals, understanding of and attitudes to Human Performance tools, the Safety Department's role and authority and Leadership for safety. All regulatory activities provide inputs for the SSM yearly safety evaluation of each licensee. A form has been developed to capture safety culture observations from inspections and other interactions with licensees. Analysis will be performed to identify patterns and provide information to support planning of specific Safety Culture activities. Training has been developed for regulatory staff to enhance the quality of regulatory interventions on safety culture. This includes a half-day seminar to provide an overview of safety culture, and a workshop which provides more in-depth discussion on cultural issues and how to capture those during regulatory activities. Future plans include guidance for inspectors, and informal seminars on safety culture with licensees

  16. Light cone approach

    International Nuclear Information System (INIS)

    Brodsky, Stan

    1993-01-01

One of the most challenging problems in theoretical high energy physics is to compute the bound state structure of the proton and other hadrons from quantum chromodynamics (QCD), the field theory of quarks and gluons. The goal is not only to calculate the spectrum of hadron masses from first principles, but also to derive the momentum and spin distributions of the quarks and gluons which control high energy hadron interactions. One approach to these difficult calculations is to simulate QCD on an artificial lattice. Recently, several new methods based on ''light-cone'' quantization have been proposed as alternatives to lattice theory for solving non-perturbative problems in QCD and other field theories. The basic idea is a generalization of Heisenberg's pioneering matrix formulation of quantum mechanics: if one could numerically diagonalize the matrix of the Hamiltonian representing the underlying QCD interaction, then the resulting eigenvalues would give the hadron spectrum, while the corresponding eigenstates would describe each hadron in terms of its quark and gluon degrees of freedom

  17. Combined approach for gynecomastia

    Directory of Open Access Journals (Sweden)

    El-Sabbagh, Ahmed Hassan

    2016-02-01

Full Text Available Background: Gynecomastia is a deformity of the male chest. Treatment of gynecomastia has varied from direct surgical excision to other techniques (mainly liposuction) to a combination of both. Skin excision is done according to the grade. In this study, experience of using liposuction as an adjuvant to surgical excision is described. Patients and methods: Between September 2012 and April 2015, a total of 14 patients were treated with liposuction and surgical excision through a periareolar incision. Preoperative evaluation was done in all cases to exclude any underlying cause of gynecomastia. Results: All fourteen patients were treated bilaterally (28 breast tissues). Their ages ranged between 13 and 33 years. Two patients were classified as grade I, and four as grade IIa, IIb or III, respectively. The first 3 patients showed seroma. Partial superficial epidermolysis of the areola occurred in 2 cases. Superficial infection of the incision occurred in one case and was treated conservatively. Conclusion: All grades of gynecomastia were managed by the same approach. Skin excision was added for a patient who had severe skin excess with limited activity and bad skin complexion. No cases required another setting or asked for a 2nd opinion.

  18. Combined approach for gynecomastia.

    Science.gov (United States)

    El-Sabbagh, Ahmed Hassan

    2016-01-01

Gynecomastia is a deformity of the male chest. Treatment of gynecomastia has varied from direct surgical excision to other techniques (mainly liposuction) to a combination of both. Skin excision is done according to the grade. In this study, experience of using liposuction as an adjuvant to surgical excision was described. Between September 2012 and April 2015, a total of 14 patients were treated with liposuction and surgical excision through a periareolar incision. Preoperative evaluation was done in all cases to exclude any underlying cause of gynecomastia. All fourteen patients were treated bilaterally (28 breast tissues). Their ages ranged between 13 and 33 years. Two patients were classified as grade I, and four as grade IIa, IIb or III, respectively. The first 3 patients showed seroma. Partial superficial epidermolysis of the areola occurred in 2 cases. Superficial infection of the incision occurred in one case and was treated conservatively. All grades of gynecomastia were managed by the same approach. Skin excision was added for a patient who had severe skin excess with limited activity and bad skin complexion. No cases required another setting or asked for a 2nd opinion.

  19. Approaches to refractory epilepsy

    Directory of Open Access Journals (Sweden)

    Jerome Engel

    2014-01-01

    Full Text Available Epilepsy is one of the most common serious neurological conditions, and 30 to 40% of people with epilepsy have seizures that are not controlled by medication. Patients are considered to have refractory epilepsy if disabling seizures continue despite appropriate trials of two antiseizure drugs, either alone or in combination. At this point, patients should be referred to multidisciplinary epilepsy centers that perform specialized diagnostic testing to first determine whether they are, in fact, pharmacoresistant, and then, if so, offer alternative treatments. Apparent pharmacoresistance can result from a variety of situations, including noncompliance, seizures that are not epileptic, misdiagnosis of the seizure type or epilepsy syndrome, inappropriate use of medication, and lifestyle issues. For patients who are pharmacoresistant, surgical treatment offers the best opportunity for complete freedom from seizures. Surgically remediable epilepsy syndromes have been identified, but patients with more complicated epilepsy can also benefit from surgical treatment and require more specialized evaluation, including intracranial EEG monitoring. For patients who are not surgical candidates, or who are unwilling to consider surgery, a variety of other alternative treatments can be considered, including peripheral or central neurostimulation, ketogenic diet, and complementary and alternative approaches. When such alternative treatments are not appropriate or effective, quality of life can still be greatly improved by the psychological and social support services offered by multidisciplinary epilepsy centers. A major obstacle remains the fact that only a small proportion of patients with refractory epilepsy are referred for expert evaluation and treatment.

  20. Approaches to adenomyomectomy

    Directory of Open Access Journals (Sweden)

    Serene Thain

    2015-08-01

Full Text Available Adenomyosis is a common gynecological condition that affects women, causing menstrual disturbances, pain, and subfertility. Adenomyomectomy as an alternative to hysterectomy has been widely performed in those who have not completed childbearing or those refusing a hysterectomy for a variety of reasons. Whatever the surgical route, the challenges of adenomyomectomy include possible misdiagnosis, defining the extent of resection, technical difficulties, dealing with the associated complications, and managing the risks of uterine rupture during a subsequent pregnancy. The principles of surgery mimic those of myomectomy, but the evolution of adenomyomectomy has been relatively unexciting, with a general paucity of published data to date. Laparoscopic techniques have generally proven feasible, avoiding the risks of open surgery while conferring the benefits of microsurgery. Limitations in tactile feedback and access constraints have been the main drawbacks via this route. Meticulous stitching and repair is still of paramount importance in these operations. Preoperative gonadotropin-releasing hormone agonists have proven effective in shrinking the disease and reducing blood loss during surgery, whereas postoperative use has resulted in a dramatic reduction in symptoms. Uterine artery ligation techniques have also been shown to be useful adjuncts, although we still need to be mindful of the potential effects in those desiring fertility. Furthermore, there is still no foolproof way of predicting those at risk of uterine rupture after adenomyomectomy. Hence, a nonprescriptive approach in managing adenomyomas is advised, where proper patient selection and counseling are important.

  1. The narrative approach to personalisation

    Science.gov (United States)

    Conlan, Owen; Staikopoulos, Athanasios; Hampson, Cormac; Lawless, Séamus; O'keeffe, Ian

    2013-06-01

    This article describes the narrative approach to personalisation. This novel approach to the generation of personalised adaptive hypermedia experiences employs runtime reconciliation between a personalisation strategy and a number of contextual models (e.g. user and domain). The approach also advocates the late binding of suitable content and services to the generated personalised pathway resulting in an interactive composition that comprises services as well as content. This article provides a detailed definition of the narrative approach to personalisation and showcases the approach through the examination of two use-cases: the personalised digital educational games developed by the ELEKTRA and 80Days projects; and the personalised learning activities realised as part of the AMAS project. These use-cases highlight the general applicability of the narrative approach and how it has been applied to create a diverse range of real-world systems.

  2. Field theory approach to gravitation

    International Nuclear Information System (INIS)

    Yilmaz, H.

    1978-01-01

A number of authors have considered the possibility of formulating a field-theory approach to gravitation with the claim that such an approach would uniquely lead to Einstein's theory of general relativity. In this article it is shown that the field theory approach is more generally applicable and uniqueness cannot be claimed. Theoretical and experimental reasons are given showing that the Einsteinian limit appears to be unviable

  3. The paramedian supracerebellar infratentorial approach.

    Science.gov (United States)

    La Pira, Biagia; Sorenson, Thomas; Quillis-Quesada, Vicent; Lanzino, Giuseppe

    2017-08-01

    Lesions of the superior cerebellar surface, pineal region, lateral and dorsal midbrain and mesial temporal lobe are challenging to treat and often require neurosurgical intervention. The paramedian variation of the supracerebellar infratentorial approach utilizes the downward slope of the cerebellum to facilitate exposure and the lower density of cerebellar bridging veins away from the midline decreases the need to sacrifice larger venous channels. We also discuss our experiences with the approach, and some of the drawbacks and nuances that we have encountered as it has evolved over the years. This approach is versatile and effective and the authors' surgical approach of choice for resecting these challenging lesions.

  4. An Approach to Interface Synthesis

    DEFF Research Database (Denmark)

    Madsen, Jan; Hald, Bjarne

    1995-01-01

    Presents a novel interface synthesis approach based on a one-sided interface description. Whereas most other approaches consider interface synthesis as optimizing a channel to existing client/server modules, we consider the interface synthesis as part of the client/server module synthesis (which...... may contain the re-use of existing modules). The interface synthesis approach describes the basic transformations needed to transform the server interface description into an interface description on the client side of the communication medium. The synthesis approach is illustrated through a point...

  5. Molecular approach of uranyl/mineral surfaces: experimental approach

    International Nuclear Information System (INIS)

    Drot, R.

    2009-01-01

The author reports an experimental approach in which different spectroscopic approaches are coupled (laser spectroscopy, X-ray absorption spectroscopy, vibrational spectroscopy) to investigate the mechanisms controlling actinide sorption processes by different substrates, in order to assess radioactive waste storage site safety. Different substrates have been considered: monocrystalline or powdered TiO2, montmorillonite, and gibbsite

  6. MBO: A Rational Approach and a Comparative Frameworks Approach

    Science.gov (United States)

    Harries, T. W.

    1974-01-01

    Considering an organizational phenomenon from more than one theoretical perspective may be more fruitful than using a single rational approach. There is a danger that the restriction of information generation caused by the single approach may produce a false certainty engendered in part through the methodology itself. (Author/WM)

  7. Learning Mixtures of Polynomials of Conditional Densities from Data

    DEFF Research Database (Denmark)

    L. López-Cruz, Pedro; Nielsen, Thomas Dyhre; Bielza, Concha

    2013-01-01

Mixtures of polynomials (MoPs) are a non-parametric density estimation technique for hybrid Bayesian networks with continuous and discrete variables. We propose two methods for learning MoP approximations of conditional densities from data. Both approaches are based on learning MoP approximatio...

  8. Random survival forests for competing risks

    DEFF Research Database (Denmark)

    Ishwaran, Hemant; Gerds, Thomas A; Kogalur, Udaya B

    2014-01-01

    We introduce a new approach to competing risks using random forests. Our method is fully non-parametric and can be used for selecting event-specific variables and for estimating the cumulative incidence function. We show that the method is highly effective for both prediction and variable selection...

  9. Experimental Comparison of Signal Subspace Based Noise Reduction Methods

    DEFF Research Database (Denmark)

    Hansen, Peter Søren Kirk; Hansen, Per Christian; Hansen, Steffen Duus

    1999-01-01

    The signal subspace approach for non-parametric speech enhancement is considered. Several algorithms have been proposed in the literature but only partly analyzed. Here, the different algorithms are compared, and the emphasis is put onto the limiting factors and practical behavior of the estimators...

  10. Testing over-representation of observations in subsets of a DEA technology

    DEFF Research Database (Denmark)

    Asmild, Mette; Hougaard, Jens Leth; Olesen, Ole Bent

    2013-01-01

    This paper proposes a test for whether data are over-represented in a given production zone, i.e. a subset of a production possibility set which has been estimated using the non-parametric Data Envelopment Analysis (DEA) approach. A binomial test is used that relates the number of observations...
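The binomial test the abstract mentions is straightforward to sketch; the zone proportion and counts below are illustrative assumptions, not values from the paper:

```python
from math import comb

def binom_sf(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p): the one-sided p-value for
    observing k or more 'successes' (observations in the zone)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical: 30 of 100 DMUs fall in a zone expected to hold 20% of them.
p_value = binom_sf(30, 100, 0.20)
print(round(p_value, 4))  # small p-value suggests over-representation
```

Under the null hypothesis that each of the n observations falls in the zone independently with probability p, a small upper-tail probability indicates over-representation in that zone.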

  11. Multi-site solar power forecasting using gradient boosted regression trees

    DEFF Research Database (Denmark)

    Persson, Caroline Stougård; Bacher, Peder; Shiga, Takahiro

    2017-01-01

    The challenges to optimally utilize weather dependent renewable energy sources call for powerful tools for forecasting. This paper presents a non-parametric machine learning approach used for multi-site prediction of solar power generation on a forecast horizon of one to six hours. Historical pow...
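Gradient boosted regression trees can be illustrated with a deliberately tiny sketch: depth-1 stumps boosted on a single made-up feature. A real multi-site solar forecasting setup would use a full library and numerical weather predictions as inputs; everything below is a toy assumption:

```python
def fit_stump(x, residuals):
    """Best single split minimizing squared error (a depth-1 tree)."""
    best = None
    order = sorted(range(len(x)), key=lambda i: x[i])
    for cut in range(1, len(x)):
        left = [residuals[order[i]] for i in range(cut)]
        right = [residuals[order[i]] for i in range(cut, len(x))]
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        err = sum((v - lm) ** 2 for v in left) + sum((v - rm) ** 2 for v in right)
        thr = (x[order[cut - 1]] + x[order[cut]]) / 2
        if best is None or err < best[0]:
            best = (err, thr, lm, rm)
    _, thr, lm, rm = best
    return lambda v, t=thr, a=lm, b=rm: a if v <= t else b

def gbrt(x, y, n_trees=50, lr=0.1):
    """Boost stumps on the residuals, shrunk by the learning rate."""
    pred = [sum(y) / len(y)] * len(x)
    base = pred[0]
    stumps = []
    for _ in range(n_trees):
        resid = [yi - pi for yi, pi in zip(y, pred)]
        s = fit_stump(x, resid)
        stumps.append(s)
        pred = [pi + lr * s(xi) for pi, xi in zip(pred, x)]
    return lambda v: base + lr * sum(s(v) for s in stumps)

# Hypothetical toy data: "irradiance" -> "power"
x = [1, 2, 3, 4, 5, 6, 7, 8]
y = [0.1, 0.3, 0.5, 0.9, 1.4, 1.8, 2.3, 2.9]
model = gbrt(x, y)
print(round(model(5), 2))
```

Each round fits a weak learner to the current residuals and adds a shrunken copy of it, so training error decreases monotonically; the probabilistic forecasts in the paper would come from fitting such models to several quantiles rather than the mean.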

  12. Diagnostic tools for nearest neighbors techniques when used with satellite imagery

    Science.gov (United States)

    Ronald E. McRoberts

    2009-01-01

    Nearest neighbors techniques are non-parametric approaches to multivariate prediction that are useful for predicting both continuous and categorical forest attribute variables. Although some assumptions underlying nearest neighbor techniques are common to other prediction techniques such as regression, other assumptions are unique to nearest neighbor techniques....

  13. Applied cost allocation

    DEFF Research Database (Denmark)

    Bogetoft, Peter; Hougaard, Jens Leth; Smilgins, Aleksandrs

    2016-01-01

    This paper deals with empirical computation of Aumann–Shapley cost shares for joint production. We show that if one uses a mathematical programing approach with its non-parametric estimation of the cost function there may be observations in the data set for which we have multiple Aumann–Shapley p...

  14. Probabilistic model of random uncertainties in structural dynamics for mis-tuned bladed disks; Modele probabiliste des incertitudes en dynamique des structures pour le desaccordage des roues aubagees

    Energy Technology Data Exchange (ETDEWEB)

    Capiez-Lernout, E.; Soize, Ch. [Universite de Marne la Vallee, Lab. de Mecanique, 77 (France)

    2003-10-01

The mis-tuning of blades is frequently the cause of spatial localizations of the dynamic forced response in the turbomachinery industry. The random character of mis-tuning requires the construction of probabilistic models of random uncertainties. A usual parametric probabilistic description considers the mis-tuning through the Young modulus of each blade. This model consists of mis-tuning the blade eigenfrequencies, assuming the blade modal shapes unchanged. Recently a new approach, known as a non-parametric model of random uncertainties, has been introduced for modelling random uncertainties in elasto-dynamics. This paper proposes the construction of a non-parametric model which is coherent with all the uncertainties which characterize mis-tuning. As mis-tuning is a phenomenon which is independent from one blade to another, the structure is considered as an assemblage of substructures. The mean reduced matrix model required by the non-parametric approach is thus constructed by dynamic sub-structuring. A comparison is also made to study the influence of the non-parametric approach relative to a usual parametric model adapted to mis-tuning. A numerical example is presented. (authors)

  15. Model-based Estimation of High Frequency Jump Diffusions with Microstructure Noise and Stochastic Volatility

    NARCIS (Netherlands)

    Bos, Charles S.

    2008-01-01

    When analysing the volatility related to high frequency financial data, mostly non-parametric approaches based on realised or bipower variation are applied. This article instead starts from a continuous time diffusion model and derives a parametric analog at high frequency for it, allowing

  16. Thermafil: A New Clinical Approach Due to New Dimensional Evaluations

    Science.gov (United States)

    Vittoria, G.; Pantaleo, G.; Blasi, A.; Spagnuolo, G.; Iandolo, A.; Amato, M.

    2018-01-01

Background: Many techniques exist to obturate root canals, but lateral condensation of gutta-percha is the most widely used. An important aspect of Thermafil is the error margin tolerated by the manufacturer in the production of plastic carriers. In the literature, there is no evidence about the discrepancy percentage between different carriers. It is established that the error margin is 0.5% for gutta-percha and 0.2% for metal files (ISO standards). Objective: The aim of this study was to evaluate the real dimensions of Thermafil plastic carriers observed under the stereo microscope, measuring the dimensional discrepancy between them. Methods: For this study, 80 new Thermafil obturators (Dentsply Maillefer) were selected: 40 Thermafil 0.25 and 40 Thermafil 0.30. Using a 60X stereo microscope, the dimensions of the plastic carrier tips were measured. The dimensions of the plastic carrier were also measured after a heating cycle. A ZL GAL 11TUSM (Zetaline stereo evolution) microscope was used to observe the samples. Measurements were made through dedicated software (Image Focus). All samples were analysed at 60X. Results: A non-parametric paired test (Wilcoxon test) was used to compare baseline and after-heating values; p-values ≤ 0.05 were assumed to be statistically significant. Conclusion: The samples we measured showed a mean diameter of 0.27 mm for Thermafil 25 and 0.33 mm for Thermafil 30. We measured a dimensional variation of 8% in the 25 group, while in the 30 group the maximum variation found was 4%, which is why we propose a new protocol of obturation with Thermafil. We can also conclude that a single heating process does not clinically affect the plastic carrier dimensions. PMID:29541263
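The Wilcoxon paired test named in the abstract can be sketched with a normal approximation (exact small-sample p-values and the tie correction for the variance are omitted; the data are invented):

```python
from math import sqrt, erf

def wilcoxon_signed_rank(before, after):
    """Wilcoxon signed-rank test, normal approximation, two-sided.
    Zero differences are dropped; tied |differences| get average ranks.
    (No tie correction to the variance; sketch only.)"""
    diffs = [b - a for b, a in zip(before, after) if b != a]
    n = len(diffs)
    ranked = sorted(diffs, key=abs)
    # assign average ranks to runs of tied absolute differences
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j < n and abs(ranked[j]) == abs(ranked[i]):
            j += 1
        avg = (i + 1 + j) / 2          # mean of 1-based ranks i+1 .. j
        for k in range(i, j):
            ranks[k] = avg
        i = j
    w_plus = sum(r for d, r in zip(ranked, ranks) if d > 0)
    mu = n * (n + 1) / 4
    sigma = sqrt(n * (n + 1) * (2 * n + 1) / 24)
    z = (w_plus - mu) / sigma
    p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return w_plus, p
```

For the small sample sizes typical of studies like this one, an exact or tie-corrected implementation (e.g. from a statistics library) would normally be preferred over the normal approximation.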

  17. Operations management approach to hospitals.

    Science.gov (United States)

    Harvey, J; Duguay, C R

    1988-06-01

    An operations management systems approach can be a useful tool for coordinating and planning in a complex organization. The authors argue for adapting such an approach to health care from the manufacturing industries in order to facilitate strategy formulation, communication and implementation.

  18. The CAPM approach to materiality

    OpenAIRE

    Hadjieftychiou, Aristarchos

    1993-01-01

    Materiality is a pervasive accounting concept that has defied a precise quantitative definition. The Capital Asset Pricing Model (CAPM) approach to materiality provides a means for determining the limits that bound materiality. Also, the approach makes it possible to locate the point estimate within these limits based on certain assumptions.

  19. Cognitive Approaches to Automated Instruction.

    Science.gov (United States)

    Regian, J. Wesley, Ed.; Shute, Valerie J., Ed.

    This book contains a snapshot of state-of-the-art research on the design of automated instructional systems. Selected cognitive psychologists were asked to describe their approach to instruction and cognitive diagnosis, the theoretical basis of the approach, its utility and applicability, and the knowledge engineering or task analysis methods…

  20. Extending cosmology: the metric approach

    OpenAIRE

    Mendoza, S.

    2012-01-01

    Comment: 2012, Extending Cosmology: The Metric Approach, Open Questions in Cosmology; Review article for an Intech "Open questions in cosmology" book chapter (19 pages, 3 figures). Available from: http://www.intechopen.com/books/open-questions-in-cosmology/extending-cosmology-the-metric-approach

  1. [Alternative approaches in thyroid surgery].

    Science.gov (United States)

    Maurer, E; Wächter, S; Bartsch, D K

    2017-08-01

In thyroid surgery, multiple different cervical minimally invasive (partly endoscopically assisted) and extracervical endoscopic (partly robot-assisted) approaches have been developed in the last 20 years. The aim of all these alternative approaches to the thyroid gland is optimization of the cosmetic result. The indications for the use of alternative and conventional approaches are in principle the same. Important requirements for the use of alternative methods are nevertheless broad experience in conventional thyroid operations and adequate patient selection under consideration of the size of the thyroid and the underlying pathology. Contraindications for the use of alternative approaches are a large thyroid gland including local symptoms, advanced carcinomas, reoperations and previous radiation of the anterior neck. The current article gives an overview of the clinically implemented alternative approaches for thyroid surgery, of which the majority must still be considered experimental. The alternative approaches to the thyroid gland can be divided into cervical minimally invasive, extracervical endoscopic (robot-assisted) and transoral operations (natural orifice transluminal endoscopic surgery, NOTES). Since conventional thyroid operations are standardized procedures with low complication rates, alternative approaches to the thyroid gland are viewed critically in Germany. The request for a perfect cosmetic result should not outweigh patient safety. Only a few alternative approaches (e.g. MIVAT, RAT) can yet be considered a safe addition in experienced hands in highly selected patients.

  2. Artificial intelligence approaches in statistics

    International Nuclear Information System (INIS)

    Phelps, R.I.; Musgrove, P.B.

    1986-01-01

    The role of pattern recognition and knowledge representation methods from Artificial Intelligence within statistics is considered. Two areas of potential use are identified and one, data exploration, is used to illustrate the possibilities. A method is presented to identify and separate overlapping groups within cluster analysis, using an AI approach. The potential of such ''intelligent'' approaches is stressed

  3. Towards a Standard Architecture for Digital Voting Systems - Defining a Generalized Ballot Schema

    DEFF Research Database (Denmark)

    Cochran, Dermot Robert

    2015-01-01

Many electoral jurisdictions have their own distinctive voting schemes. There is no clear consensus about the ideal voting scheme for fair elections. Various systems for electronic and online voting have been either proposed or developed, but these systems tend to be aimed at particular vote counting schemes, e.g. plurality. This paper proposes a way to decouple the ballot casting process from the vote counting scheme, using a generalized ballot schema.
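One way to read the proposed decoupling: if a ballot is stored as an ordered preference list, several counting schemes can consume the same cast ballots. The sketch below illustrates the idea only; it is not the schema from the paper:

```python
from dataclasses import dataclass
from collections import Counter

@dataclass
class Ballot:
    """Generalized ballot: an ordered list of preferences.
    A plurality ballot is simply a ranking of length one."""
    preferences: list

def count_plurality(ballots):
    """One counting scheme among many over the same ballot structure."""
    return Counter(b.preferences[0] for b in ballots if b.preferences)

def count_borda(ballots, candidates):
    """Another scheme over the identical ballots (Borda count)."""
    scores = Counter()
    n = len(candidates)
    for b in ballots:
        for rank, cand in enumerate(b.preferences):
            scores[cand] += n - 1 - rank
    return scores

ballots = [Ballot(["A", "B", "C"]), Ballot(["B", "A", "C"]), Ballot(["A", "C", "B"])]
print(count_plurality(ballots).most_common(1))          # [('A', 2)]
print(count_borda(ballots, ["A", "B", "C"]).most_common(1))  # [('A', 5)]
```

Because the ballot structure is scheme-agnostic, switching from plurality to Borda (or any other rule) changes only the counting function, not the casting or storage format.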

  4. A practical approach for the validation and clinical implementation of a high-sensitivity cardiac troponin I assay across a North American city

    Directory of Open Access Journals (Sweden)

    Peter A. Kavsak

    2015-04-01

Full Text Available Objectives: Despite several publications on the analytical performance of high-sensitivity cardiac troponin (hs-cTn) assays, there has been little information on how laboratories should validate and implement these assays into clinical service. Our study provides a practical approach for the validation and implementation of a hs-cTn assay across a large North American city. Design and methods: Validation for the Abbott ARCHITECT hs-cTnI assay (across 5 analyzers) consisted of verification of the limit of blank (LoB), precision (i.e., coefficient of variation; CV) testing at the reported limit of detection (LoD) and within and outside the 99th percentile, linearity testing, and a cTnI versus hs-cTnI patient comparison within and between analyzers (Passing and Bablok and non-parametric analyses). Education, clinical communications, and memorandums were issued in advance to inform all staff across the city, with a reminder the day before the live date sent to important users. All hospitals switched to the hs-cTnI assay concurrently (the contemporary cTnI assay being removed), with laboratory staff instructed to repeat samples previously measured with the contemporary cTnI assay on the hs-cTnI assay only by physician request. Results: Across the 5 analyzers and 6 reagent packs the overall LoB was 0.6 ng/L (n=60) with a CV of 33% at an overall mean of 1.2 ng/L (n=60; reported LoD=1.0 ng/L), with linearity demonstrated from 45,005 ng/L to 1.1 ng/L. Precision testing with a normal patient-pool QC material (mean range across 5 analyzers 3.9–4.4 ng/L) yielded CVs from 7% to 10% (within-run) and from 7% to 18% (between-run), with the high patient-pool QC material (mean range across 5 analyzers 29.6–36.3 ng/L) yielding CVs from 2% to 5% (within-run) and from 4% to 8% (between-run). There was agreement between hs-cTnI and cTnI with the patient samples (slope ranges: 0.89–1.03; intercept ranges: 1.9–3
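The precision figures quoted above are coefficients of variation. For reference, a CV computed from replicate QC measurements looks like this (the replicate values are invented, not the study's data):

```python
from statistics import mean, stdev

def cv_percent(replicates):
    """Coefficient of variation: sample SD as a percentage of the mean."""
    return 100 * stdev(replicates) / mean(replicates)

# Hypothetical within-run QC replicates for a normal pool, in ng/L
qc = [4.1, 3.9, 4.4, 4.0, 3.8, 4.2]
print(round(cv_percent(qc), 1))  # → 5.3
```

A within-run CV uses replicates measured in one analytical run; a between-run CV pools one measurement from each of many runs, which is why the between-run figures in the abstract are generally larger.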

  5. A systems approach to obesity

    Science.gov (United States)

    Bartsch, Sarah M.; Mui, Yeeli; Haidari, Leila A.; Spiker, Marie L.; Gittelsohn, Joel

    2017-01-01

    Obesity has become a truly global epidemic, affecting all age groups, all populations, and countries of all income levels. To date, existing policies and interventions have not reversed these trends, suggesting that innovative approaches are needed to transform obesity prevention and control. There are a number of indications that the obesity epidemic is a systems problem, as opposed to a simple problem with a linear cause-and-effect relationship. What may be needed to successfully address obesity is an approach that considers the entire system when making any important decision, observation, or change. A systems approach to obesity prevention and control has many benefits, including the potential to further understand indirect effects or to test policies virtually before implementing them in the real world. Discussed here are 5 key efforts to implement a systems approach for obesity prevention: 1) utilize more global approaches; 2) bring new experts from disciplines that do not traditionally work with obesity to share experiences and ideas with obesity experts; 3) utilize systems methods, such as systems mapping and modeling; 4) modify and combine traditional approaches to achieve a stronger systems orientation; and 5) bridge existing gaps between research, education, policy, and action. This article also provides an example of how a systems approach has been used to convene a multidisciplinary team and conduct systems mapping and modeling as part of an obesity prevention program in Baltimore, Maryland. PMID:28049754

  6. Approaches to Scandinavian Crime Fiction

    DEFF Research Database (Denmark)

    Agger, Gunhild

    2010-01-01

    The working paper discusses some of the major approaches to Scandinavian crime fiction in the light of the dominant features of crime culture, e.g. the broad exposure of crime fiction via different platforms and media. In this connection, the concept of mediatization is considered, as well as the approach of genre typology and the concept of evil: seemingly disparate concepts and approaches, but all related to the complex processes in the borderlands between crime fiction and society. Using examples from Scandinavian crime fiction, I discuss whether or not the growing proximity to international genres, ways of production and standards increasingly removes Scandinavian crime fiction from its original attractions.

  7. Vanpooling: the three major approaches

    Energy Technology Data Exchange (ETDEWEB)

    Sears, P.M.

    1979-08-01

    The manual provides technical assistance to existing or prospective vanpool sponsors. It is designed to help them promote vanpooling in its three major approaches: employer sponsored, third party sponsored, and driver owned and operated. The first chapter is an overview of vanpooling and a second chapter, on vanpool marketing, is addressed to ridesharing coordinators and others whose responsibilities include the promotion of vanpooling. Some fact sheets on the three approaches provide convenient summaries of the needs and opportunities of each approach and suggest solutions to practical problems likely to be encountered in starting new vanpool programs.

  8. Approaches to ultrafast neutron detectors

    International Nuclear Information System (INIS)

    Wang, C.L.; Kalibjian, R.; Singh, M.S.

    1984-01-01

    We discuss two approaches to obtaining detectors of very high temporal resolution. In the first approach, a uranium-coated cathode is used in a streak tube configuration. Secondary electrons accompanying the fission fragments from a neutron-uranium reaction are accelerated, focussed, energy-analyzed through a pinhole, and streaked. Calculations show that 20 ps time resolution can be obtained. In the second approach, a uranium-coated cathode is integrated into a transmission line. State-of-the-art technology indicates that a time resolution of 20 ps can be obtained by gating the cathode with a fast electric pulse

  9. Groundbreaking approach to disaster relief

    OpenAIRE

    2008-01-01

    The humanitarian response to Cyclone Nargis, which struck Myanmar on 2 and 3 May, heralds a fundamentally new approach to relief coordination. As a result, a unique survey showed what really happened to the survivors. Sarah Cumberland reports.

  10. Radiological Approach to Forefoot Pain

    Directory of Open Access Journals (Sweden)

    Sai Chung Ho

    2015-06-01

    Full Text Available Forefoot pain is a common clinical complaint in orthopaedic practice. In this article, we discuss the anatomy of the forefoot, clinical and radiological approaches to forefoot pain, and common painful forefoot disorders and their associated radiological features.

  11. Theoretical approaches to elections defining

    OpenAIRE

    Natalya V. Lebedeva

    2011-01-01

    Theoretical approaches to elections defining develop the nature, essence and content of elections, help to determine their place and a role as one of the major national law institutions in democratic system.

  12. Theoretical approaches to elections defining

    Directory of Open Access Journals (Sweden)

    Natalya V. Lebedeva

    2011-01-01

    Full Text Available Theoretical approaches to elections defining develop the nature, essence and content of elections, help to determine their place and a role as one of the major national law institutions in democratic system.

  13. History: A Great Lives Approach

    Science.gov (United States)

    Jarvis, F. Washington

    1973-01-01

    After examining the drawbacks of some of the currently popular teaching methods, the author proposes an approach to the teaching of high school history focusing on the matter of history -- the lives of men and ideas of the past. (SM)

  14. Dynamic Approaches for Multichoice Solutions

    Directory of Open Access Journals (Sweden)

    Yu-Hsien Liao

    2011-01-01

    Full Text Available Based on alternative reduced games, several dynamic approaches are proposed to show how the three extended Shapley values can be reached dynamically from arbitrary efficient payoff vectors on multichoice games.

  15. Cell approach to glass transition

    International Nuclear Information System (INIS)

    Aste, Tomaso; Coniglio, Antonio

    2003-01-01

    We present a novel theoretical approach to understanding the complex dynamics of glass-forming liquids, granular packings and amorphous solids. This theory, which is an elaboration of the free volume and inherent structure approaches, allows one to retrieve the thermodynamical properties of these systems from studies of geometrical and topological properties of local, static configurations alone. When applied to hard-sphere systems, the present theory reproduces with a good quantitative agreement the equation of state for the crystalline and the disordered glassy phases. Moreover, we find that, as the density approaches a critical value close to the random close-packing density, the configurational entropy approaches zero and the large relaxation time diverges according to the Vogel-Fulcher behaviour, following also the Adam-Gibbs relation
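    The Vogel-Fulcher behaviour mentioned in this record has the form τ(T) = τ₀ exp[B/(T − T₀)], where the relaxation time diverges as T approaches T₀. A small sketch of that divergence, with illustrative parameter values that are not fitted constants from the paper:

    ```python
    import math

    # Vogel-Fulcher(-Tammann) relaxation time; tau0, B and T0 are assumed
    # illustrative values, not results from the study.
    def vft_tau(T, tau0=1e-13, B=1500.0, T0=200.0):
        return tau0 * math.exp(B / (T - T0))

    # The relaxation time grows by many orders of magnitude as T -> T0.
    for T in (400.0, 300.0, 220.0, 205.0):
        print(f"T = {T:5.1f}  tau = {vft_tau(T):.3e} s")
    ```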

  16. Four Approaches to Entrepreneurship II.

    Science.gov (United States)

    Meyer, Earl C.; Nauta, Tom

    1994-01-01

    Four approaches to teaching advanced entrepreneurship in current use are as follows: (1) advanced options such as franchises and buyouts and international entrepreneurship; (2) preentrepreneurship courses; (3) starting a business; and (4) structured experience. (JOW)

  17. Abstract Cauchy problems three approaches

    CERN Document Server

    Melnikova, Irina V

    2001-01-01

    Although the theory of well-posed Cauchy problems is reasonably well understood, ill-posed problems, which arise in numerous mathematical models in physics, engineering, and finance, can be approached in a variety of ways. Historically, there have been three major strategies for dealing with such problems: semigroup, abstract distribution, and regularization methods. Semigroup and distribution methods restore well-posedness, in a modern weak sense. Regularization methods provide approximate solutions to ill-posed problems. Although these approaches were extensively developed over the last decades by many researchers, nowhere could one find a comprehensive treatment of all three approaches. Abstract Cauchy Problems: Three Approaches provides an innovative, self-contained account of these methods and, furthermore, demonstrates and studies some of the profound connections between them. The authors discuss the application of different methods not only to the Cauchy problem that is not well-posed in the classical sense, b...

  18. Nonlinear Approaches in Engineering Applications

    CERN Document Server

    Jazar, Reza

    2012-01-01

    Nonlinear Approaches in Engineering Applications focuses on nonlinear phenomena that are common in the engineering field. The nonlinear approaches described in this book provide a sound theoretical base and practical tools to design and analyze engineering systems with high efficiency and accuracy and with less energy and downtime. Presented here are nonlinear approaches in areas such as dynamic systems, optimal control and approaches in nonlinear dynamics and acoustics. Coverage encompasses a wide range of applications and fields including mathematical modeling and nonlinear behavior as applied to microresonators, nanotechnologies, nonlinear behavior in soil erosion, nonlinear population dynamics, and optimization in reducing vibration and noise as well as vibration in triple-walled carbon nanotubes. This book also: Provides a complete introduction to nonlinear behavior of systems and the advantages of nonlinearity as a tool for solving engineering problems Includes applications and examples drawn from the el...

  19. Cancer and Complementary Health Approaches

    Science.gov (United States)

    ... According to the 2007 National Health Interview Survey (NHIS), which included a comprehensive survey on the use ... their use of complementary health approaches. In the NHIS, survey respondents who had been diagnosed with cancer ...

  20. Fibromyalgia and Complementary Health Approaches

    Science.gov (United States)

    ... Musculoskeletal and Skin Diseases Web site . What the Science Says About Complementary Health Approaches for Fibromyalgia Mind ... Complementary and alternative medical therapies in fibromyalgia . Current Pharmaceutical Design . 2006;12(1):47–57. Sherman KJ, ...

  1. Anterior approach for knee arthrography

    International Nuclear Information System (INIS)

    Zurlo, J.V.; Towers, J.D.; Golla, S.

    2001-01-01

    Objective. To develop a new method of magnetic resonance arthrography (MRA) of the knee using an anterior approach analogous to the portals used for knee arthroscopy.Design. An anterior approach to the knee joint was devised mimicking anterior portals used for knee arthroscopy. Seven patients scheduled for routine knee MRA were placed in a decubitus position and under fluoroscopic guidance a needle was advanced from a position adjacent to the patellar tendon into the knee joint. After confirmation of the needle tip location, a dilute gadolinium solution was injected.Results and conclusion. All the arthrograms were technically successful. The anterior approach to knee MRA has greater technical ease than the traditional approach with little patient discomfort. (orig.)

  2. Approaches for assessing sustainable remediation

    DEFF Research Database (Denmark)

    Søndergaard, Gitte Lemming; Binning, Philip John; Bjerg, Poul Løgstrup

    Sustainable remediation seeks to reduce direct contaminant point source impacts on the environment, while minimizing the indirect cost of remediation to the environment, society and economy. This paper presents an overview of available approaches for assessing the sustainability of alternative remediation strategies for a contaminated site. Most approaches use multi-criteria assessment methods (MCA) to structure a decision support process. Different combinations of environmental, social and economic criteria are employed, and are assessed either in qualitative or quantitative forms with various tools such as life cycle assessment and cost benefit analysis. Stakeholder involvement, which is a key component of sustainable remediation, is conducted in various ways. Some approaches involve stakeholders directly in the evaluation or weighting of criteria, whereas other approaches only indirectly...

  3. System approach to chemistry course

    OpenAIRE

    Lorina E. Kruglova; Valentina G. Derendyaeva

    2010-01-01

    The article considers raising the profile of chemistry in the training of engineers and constructors, presents the system approach to the chemistry course, and singles out the most important modules of the general chemistry course for the construction industry.

  4. A Four-Dimensional Approach

    African Journals Online (AJOL)

    ... of East Asian Students in English-speaking Countries: A Four-Dimensional ... country's language greatly shapes all aspects of the student's international education ... Taking this ecological approach will help clearly define the role that home ...

  5. Qualitative Approaches to Evaluating Education.

    Science.gov (United States)

    Fetterman, David M.

    1988-01-01

    Discusses qualitative research and its application to educational evaluation. Approaches discussed include the following: (1) ethnography; (2) naturalistic inquiry; (3) generic pragmatic (sociological) inquiry; (4) connoisseurship/criticism; (5) metaphors; and (6) phenomenography. (FMW)

  6. Transmandibular approach to total maxillectomy

    OpenAIRE

    Tiwari, R. M.

    2001-01-01

    Total maxillectomy through the transfacial approach has been practiced in the treatment of cancer for more than a decade. Its role in T3-T4 tumors extending posteriorly through the bony wall is questionable, since an oncologically radical procedure is often not possible. Recurrences in the infratemporal fossa are common. Despite the addition of radiotherapy, five-year survivals have not significantly improved. The transmandibular approach to total maxillectomy overcomes this shortcoming by including ...

  7. Microscopic approach to nuclear anharmonicities

    International Nuclear Information System (INIS)

    Matsuo, Masayuki; Shimizu, Yoshifumi; Matsuyanagi, Kenichi

    1985-01-01

    Present status of microscopic study of nuclear anharmonicity phenomena is reviewed from the viewpoint of the time-dependent Hartree-Bogoliubov approach. Both classical- and quantum-mechanical aspects of this approach are discussed. The Bohr-Mottelson-type collective Hamiltonian for anharmonic gamma vibrations is microscopically derived by means of the self-consistent-collective-coordinate method, and applied to the problem of two-phonon states of 168Er. (orig.)

  8. Bounding approaches to system identification

    CERN Document Server

    Norton, John; Piet-Lahanier, Hélène; Walter, Éric

    1996-01-01

    In response to the growing interest in bounding error approaches, the editors of this volume offer the first collection of papers to describe advances in techniques and applications of bounding of the parameters, or state variables, of uncertain dynamical systems. Contributors explore the application of the bounding approach as an alternative to the probabilistic analysis of such systems, relating its importance to robust control-system design.

  9. Approaches toward learning in physiotherapy

    Directory of Open Access Journals (Sweden)

    L. Keiller

    2013-11-01

    Full Text Available The aim of this study was to investigate the approaches toward learning of undergraduate Physiotherapy students in a PBL module, to enhance facilitation of learning at the Stellenbosch University, Division of Physiotherapy in South Africa. This quantitative, descriptive study utilized the revised Two-Factor Study Process Questionnaire (R-SPQ-2F) to evaluate the study cohort's approaches toward learning in the module. Results of the data instruments were analysed statistically and discussed in a descriptive manner. A statistically significantly greater number of students adopted a deep approach toward learning at the commencement of the academic year. Students showed a trend toward an increase in their intrinsic interest in the learning material as the module progressed. Students in the Applied Physiotherapy module (ATP) started to shift their focus from a surface learning approach to a deep learning approach. Further research is needed to determine the long-term changes in approach toward learning and the possible determinants of these changes. This can be done in conjunction with the implementation of quality assurance mechanisms for learning material and earlier preparation of students for the change in the learning environment.

  10. Approach

    Directory of Open Access Journals (Sweden)

    Guido Pinto Aguirre

    2008-01-01

    Full Text Available The purpose of this paper is to investigate and re-estimate the effects of breastfeeding patterns, women's health and nutritional status, and energy expenditure on the duration of the return of postpartum fertility (that is, the return of postpartum menstruation), using all the relevant information in the longitudinal study of the Institute of Nutrition of Central America and Panama and a more suitable estimation procedure (hazard models). The data come from the longitudinal study carried out in Guatemala between 1967 and 1979. This article uses a multi-state hazard model that recognizes different paths and states in the process of the return of postpartum fertility. The model rests on the existence of five states: full breastfeeding, partial breastfeeding, weaning, infant mortality, and menstruation. It also explicitly includes maternal nutrition and women's energy expenditure as strategic elements of the model. The study found that the effects of breastfeeding patterns, maternal nutrition and women's work patterns (energy expenditure) on fertility in rural areas of Guatemala are strong and significant. The contribution of this article is to show that applying multi-state hazard models provides estimates that are consistent with hypotheses relating breastfeeding patterns, maternal nutritional status and external maternal stressors to processes that accelerate (or decelerate) the return of normal menstrual cycles.

  11. Strain expansion-reduction approach

    Science.gov (United States)

    Baqersad, Javad; Bharadwaj, Kedar

    2018-02-01

    Validating numerical models is one of the main aspects of engineering design. However, correlating millions of degrees of freedom of numerical models to the few degrees of freedom of test models is challenging. Reduction/expansion approaches have traditionally been used to match these degrees of freedom. However, the conventional reduction/expansion approaches are limited to displacement, velocity or acceleration data. While in many cases only strain data are accessible (e.g. when a structure is monitored using strain gages), the conventional approaches are not capable of expanding strain data. To bridge this gap, the current paper outlines a reduction/expansion technique to reduce/expand strain data. In the proposed approach, strain mode shapes of a structure are extracted using the finite element method or the digital image correlation technique. The strain mode shapes are used to generate a transformation matrix that can expand the limited set of measurement data. The proposed approach can be used to correlate experimental and analytical strain data. Furthermore, the proposed technique can be used to expand real-time operating data for structural health monitoring (SHM). In order to verify the accuracy of the approach, the proposed technique was used to expand the limited set of real-time operating data in a numerical model of a cantilever beam subjected to various types of excitations. The proposed technique was also applied to expand real-time operating data measured using a few strain gages mounted to an aluminum beam. It was shown that the proposed approach can effectively expand the strain data at limited locations to accurately predict the strain at locations where no sensors were placed.
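    The expansion step described in this record, where strain mode shapes form a transformation matrix mapping a few gage readings to the full strain field, can be sketched as a least-squares fit in modal coordinates. All matrices below are synthetic stand-ins for FE-derived mode shapes, not the paper's data:

    ```python
    import numpy as np

    # Columns of Phi_full are strain mode shapes at all n_full locations
    # (e.g. from a finite element model); Phi_meas keeps only the rows
    # where strain gages are mounted. Values here are random placeholders.
    rng = np.random.default_rng(0)
    n_full, n_modes = 50, 3
    Phi_full = rng.standard_normal((n_full, n_modes))

    gage_rows = [5, 12, 20, 33, 41]          # measured (gage) locations
    Phi_meas = Phi_full[gage_rows, :]

    # The "true" response is a combination of the modes; we observe it
    # only at the gage locations.
    q_true = np.array([1.0, -0.5, 0.25])     # modal coordinates
    eps_meas = Phi_meas @ q_true

    # Least-squares fit of modal coordinates from the gage data, then
    # expansion to every location via the full mode-shape matrix.
    q_hat, *_ = np.linalg.lstsq(Phi_meas, eps_meas, rcond=None)
    eps_full = Phi_full @ q_hat
    ```

    With more gages than retained modes, the fit is overdetermined and noise in the measurements is averaged out rather than interpolated exactly.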

  12. Employee Reactions to Merit Pay: Cognitive Approach and Social Approach

    Science.gov (United States)

    Wang, Yingchun

    2010-01-01

    The dissertation aims to tackle one of the most pressing questions facing the merit pay system researchers and practitioners: Why do merit pay raises have such a small effect on employees' satisfaction, commitment and job performance? My approach to the study of this question is to develop explanatory frameworks from two perspectives: cognitive…

  13. [Modern treatment approaches to gambling].

    Science.gov (United States)

    Egorov, A Iu

    2014-01-01

    Compulsive gambling has received widespread attention in the last decade. Gambling has become the first non-chemical addiction to be included in the section "Addiction and related disorders" of the modern DSM-5. The review considers non-pharmacological and pharmacological approaches to the treatment of gambling. Among non-drug approaches, cognitive-behavioral therapy and the 12-step programs of the "Gamblers Anonymous" community have gained the most popularity. Among pharmacological approaches, three classes of drugs have proved effective in the treatment of gambling: antidepressants (mainly SSRIs), opiate antagonists (naltrexone and nalmefene) and mood stabilizers (valproate, lithium, topiramate). No differences in the efficacy of the three classes of psychotropic drugs have been identified. Preliminary results for N-acetylcysteine and memantine give cause for optimism.

  14. Solid mechanics a variational approach

    CERN Document Server

    Dym, Clive L

    2013-01-01

    Solid Mechanics: A Variational Approach, Augmented Edition presents a lucid and thoroughly developed approach to solid mechanics for students engaged in the study of elastic structures not seen in other texts currently on the market. This work offers a clear and carefully prepared exposition of variational techniques as they are applied to solid mechanics. Unlike other books in this field, Dym and Shames treat all the necessary theory needed for the study of solid mechanics and include extensive applications. Of particular note is the variational approach used in developing consistent structural theories and in obtaining exact and approximate solutions for many problems.  Based on both semester and year-long courses taught to undergraduate seniors and graduate students, this text is geared for programs in aeronautical, civil, and mechanical engineering, and in engineering science. The authors’ objective is two-fold: first, to introduce the student to the theory of structures (one- and two-dimensional) as ...

  15. City evacuations an interdisciplinary approach

    CERN Document Server

    Binner, Jane; Branicki, Layla; Galla, Tobias; Jones, Nick; King, James; Kolokitha, Magdalini; Smyrnakis, Michalis

    2015-01-01

    Evacuating a city is a complex problem that involves issues of governance, preparedness education, warning, information sharing, population dynamics, resilience and recovery. As natural and anthropogenic threats to cities grow, it is an increasingly pressing problem for policy makers and practitioners.   The book is the result of a unique interdisciplinary collaboration between researchers in the physical and social sciences to consider how an interdisciplinary approach can help plan for large scale evacuations.  It draws on perspectives from physics, mathematics, organisation theory, economics, sociology and education.  Importantly it goes beyond disciplinary boundaries and considers how interdisciplinary methods are necessary to approach a complex problem involving human actors and increasingly complex communications and transportation infrastructures.   Using real world case studies and modelling the book considers new approaches to evacuation dynamics.  It addresses questions of complexity, not only ...

  16. Toward predicate approaches to modality

    CERN Document Server

    Stern, Johannes

    2016-01-01

    In this volume, the author investigates, and argues for, a particular answer to the question: What is the right way to logically analyze modalities from natural language within formal languages? The answer is: by formalizing modal expressions in terms of predicates. But, as in the case of truth, the most intuitive modal principles lead to paradox once the modal notions are conceived as predicates. The book discusses the philosophical interpretation of these modal paradoxes and argues that any satisfactory approach to modality will have to face the paradoxes independently of the grammatical category of the modal notion. By systematizing modal principles with respect to their joint consistency and inconsistency, Stern provides an overview of the options and limitations of the predicate approach to modality that may serve as a useful starting point for future work on predicate approaches to modality. Stern also develops a general strategy for constructing philosophically attractive theories of modal notions conce...

  17. Thermodynamic approach to biomass gasification

    International Nuclear Information System (INIS)

    Boissonnet, G.; Seiler, J.M.

    2003-01-01

    The document presents an approach to biomass transformation in the presence of steam, hydrogen or oxygen. Calculation results based on thermodynamic equilibrium are discussed. The objective of gasification techniques is to increase the gas content in CO and H2. The maximum content of these gases is obtained when thermodynamic equilibrium is approached. Any optimisation of a process will thus tend to approach thermodynamic equilibrium conditions. On the other hand, such calculations can be used to determine the conditions which lead to an increase in the production of CO and H2. A further objective is to determine transformation enthalpies, which are an important input for process calculations. Various existing processes are assessed, and associated thermodynamic limitations are evidenced. (author)

  18. Market-based approaches to tree valuation

    Science.gov (United States)

    Geoffrey H. Donovan; David T. Butry

    2008-01-01

    A recent four-part series in Arborist News outlined different appraisal processes used to value urban trees. The final article in the series described the three generally accepted approaches to tree valuation: the sales comparison approach, the cost approach, and the income capitalization approach. The author, D. Logan Nelson, noted that the sales comparison approach...
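    Of the three valuation approaches named here, the income capitalization approach values a tree as its expected annual net benefit divided by a capitalization rate. A toy example with hypothetical figures, not values from the article:

    ```python
    # Income capitalization sketch: all figures are made-up illustrations.
    annual_net_benefit = 120.0   # e.g. energy savings + stormwater benefit, $/yr
    cap_rate = 0.05              # assumed capitalization (discount) rate

    # Capitalized value of a perpetual annual benefit stream.
    value = annual_net_benefit / cap_rate
    print(f"capitalized tree value: ${value:,.0f}")
    ```

    The sales comparison and cost approaches instead anchor the value to comparable transactions or to replacement cost, so the three methods can be cross-checked against each other.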

  19. Real analysis a constructive approach

    CERN Document Server

    Bridger, Mark

    2012-01-01

    A unique approach to analysis that lets you apply mathematics across a range of subjects. This innovative text sets forth a thoroughly rigorous modern account of the theoretical underpinnings of calculus: continuity, differentiability, and convergence. Using a constructive approach, every proof of every result is direct and ultimately computationally verifiable. In particular, existence is never established by showing that the assumption of non-existence leads to a contradiction. The ultimate consequence of this method is that it makes sense, not just to math majors but also to students from a

  20. New Approaches to Reliability Assessment

    DEFF Research Database (Denmark)

    Ma, Ke; Wang, Huai; Blaabjerg, Frede

    2016-01-01

    of energy. New approaches for reliability assessment are being taken in the design phase of power electronics systems based on the physics-of-failure in components. In this approach, many new methods, such as multidisciplinary simulation tools, strength testing of components, translation of mission profiles, and statistical analysis, are involved to enable better prediction and design of reliability for products. This article gives an overview of the new design flow in the reliability engineering of power electronics from the system-level point of view and discusses some of the emerging needs for the technology...

  1. Probabilistic approach to EMP assessment

    International Nuclear Information System (INIS)

    Bevensee, R.M.; Cabayan, H.S.; Deadrick, F.J.; Martin, L.C.; Mensing, R.W.

    1980-09-01

    The development of nuclear EMP hardness requirements must account for uncertainties in the environment, in interaction and coupling, and in the susceptibility of subsystems and components. Typical uncertainties of the last two kinds are briefly summarized, and an assessment methodology is outlined, based on a probabilistic approach that encompasses the basic concepts of reliability. It is suggested that statements of survivability be made compatible with system reliability. Validation of the approach taken for simple antenna/circuit systems is performed with experiments and calculations that involve a Transient Electromagnetic Range, numerical antenna modeling, separate device failure data, and a failure analysis computer program

  2. Need for a multidisciplinary approach

    International Nuclear Information System (INIS)

    Foersund, Hans Martin; Kristoffersen, Lasse; Skaatun, Helge; Dragsund, Egil

    2004-01-01

    Transportation of oil poses significant environmental risks. Effective reduction of those risks requires an integrated approach, including the legal and institutional aspects, waste reception facilities in ports, navigational risk assessment, oil spill modelling, environmental sensitivity mapping and oil spill contingency planning. Proper management starts with an integrated study of oil pollution risk management, covering both oil pollution as a consequence of operational discharges and as a consequence of accidents. However, as the two areas of oil pollution management are different, they require separate approaches

  3. Systems biology approach to bioremediation

    Energy Technology Data Exchange (ETDEWEB)

    Chakraborty, Romy; Wu, Cindy H.; Hazen, Terry C.

    2012-06-01

    Bioremediation has historically been approached as a ‘black box’ in terms of our fundamental understanding. Thus it succeeds and fails, seldom without a complete understanding of why. Systems biology is an integrated research approach to study complex biological systems, by investigating interactions and networks at the molecular, cellular, community, and ecosystem level. The knowledge of these interactions within individual components is fundamental to understanding the dynamics of the ecosystem under investigation. Finally, understanding and modeling functional microbial community structure and stress responses in environments at all levels have tremendous implications for our fundamental understanding of hydrobiogeochemical processes and the potential for making bioremediation breakthroughs and illuminating the ‘black box’.

  4. Elementary calculus an infinitesimal approach

    CERN Document Server

    Keisler, H Jerome

    2012-01-01

    This first-year calculus book is centered around the use of infinitesimals, an approach largely neglected until recently for reasons of mathematical rigor. It exposes students to the intuition that originally led to the calculus, simplifying their grasp of the central concepts of derivatives and integrals. The author also teaches the traditional approach, giving students the benefits of both methods.Chapters 1 through 4 employ infinitesimals to quickly develop the basic concepts of derivatives, continuity, and integrals. Chapter 5 introduces the traditional limit concept, using approximation p

  5. International differences in design approach

    International Nuclear Information System (INIS)

    Roberts, A.C.; McFarlane, J.P.

    1998-01-01

    The objective of this paper is to review a number of separate research studies and civil engineering nuclear projects from the authors' experience, with a view to examining the apparent differences in approach taken by different nationalities in the civil engineering design and specification of nuclear facilities. In particular, the development of design codes applicable to the UK nuclear power industry is reviewed and comparisons made with the highly regulated approach adopted in other major nuclear power generating countries. Significant differences resulting from the use of specific design codes and regulations are identified. (author)

  6. Synergistic approach to patient dialysate.

    Science.gov (United States)

    Dragotoiu, A; Checheriţă, A I; Ciocâlteu, A; Rizeanu, S

    2015-01-01

    The stress a patient is subjected to during dialysis treatment can be reduced by a synergistic approach from the medical team. Integrating positive psychological resources into therapy, such as active positive coping mechanisms, individual or family mental resilience, improved self-image and self-esteem, and better tolerance of frustration, can play an important part in improving the patient's quality of life and in fostering a positive approach to the situation, both for the patient and for close friends and relatives.

  7. Cleft Palate; A Multidiscipline Approach.

    Science.gov (United States)

    Stark, Richard B., Ed.

    Nineteen articles present a multidisciplinary approach to the management of facial clefts. The following subjects are discussed: the history of cleft lip and cleft palate surgery; congenital defects; classification; the operation of a cleft palate clinic; physical examination of newborns with cleft lip and/or palate; nursing care; anesthesia;…

  8. Oncoplastic Approaches to Breast Conservation

    Directory of Open Access Journals (Sweden)

    Dennis R. Holmes

    2011-01-01

    When a woman is diagnosed with breast cancer many aspects of her physical, emotional, and sexual wholeness are threatened. The quickly expanding field of oncoplastic breast surgery aims to enhance the physician commitment to restore the patient's image and self-assurance. By combining a multidisciplinary approach to diagnosis and treatment with oncoplastic surgery, successful results in the eyes of the patient and physician are significantly more likely to occur. As a way to aid oncoplastic teams in determining which approach is most suitable for their patient's tumor size, tumor location, body habitus, and desired cosmetic outcome we present a review of several oncoplastic surgical approaches. For resections located anywhere in the breast, the radial ellipse segmentectomy incision and circumareolar approach for segmental resection are discussed. For resections in the upper or central breast, crescent mastopexy, the batwing incision, the hemibatwing incision, donut mastopexy, B-flap resection, and the central quadrantectomy are reviewed. For lesions of the lower breast, the triangle incision, inframammary incision, and reduction mastopexy are discussed. Surgeons who are interested in adding oncoplastic breast conserving therapies to their skill sets are encouraged to implement these surgical techniques where applicable and to seek out breast fellowships or enhanced training when appropriate.

  9. Budgeting Approaches in Community Colleges

    Science.gov (United States)

    Palmer, James C.

    2014-01-01

    Several budgeting approaches have been initiated as alternatives to the traditional, incremental process. These include formula budgeting; zero-base budgeting; planning, programming, and budgeting systems; and responsibility center budgeting. Each is premised on assumptions about how organizations might best make resource allocation decisions.…

  10. Tiered Approach to Resilience Assessment.

    Science.gov (United States)

    Linkov, Igor; Fox-Lent, Cate; Read, Laura; Allen, Craig R; Arnott, James C; Bellini, Emanuele; Coaffee, Jon; Florin, Marie-Valentine; Hatfield, Kirk; Hyde, Iain; Hynes, William; Jovanovic, Aleksandar; Kasperson, Roger; Katzenberger, John; Keys, Patrick W; Lambert, James H; Moss, Richard; Murdoch, Peter S; Palma-Oliveira, Jose; Pulwarty, Roger S; Sands, Dale; Thomas, Edward A; Tye, Mari R; Woods, David

    2018-04-25

    Regulatory agencies have long adopted a three-tier framework for risk assessment. We build on this structure to propose a tiered approach for resilience assessment that can be integrated into the existing regulatory processes. Comprehensive approaches to assessing resilience at appropriate and operational scales, reconciling analytical complexity as needed with stakeholder needs and resources available, and ultimately creating actionable recommendations to enhance resilience are still lacking. Our proposed framework consists of tiers by which analysts can select resilience assessment and decision support tools to inform associated management actions relative to the scope and urgency of the risk and the capacity of resource managers to improve system resilience. The resilience management framework proposed is not intended to supplant either risk management or the many existing efforts of resilience quantification method development, but instead provide a guide to selecting tools that are appropriate for the given analytic need. The goal of this tiered approach is to intentionally parallel the tiered approach used in regulatory contexts so that resilience assessment might be more easily and quickly integrated into existing structures and with existing policies. Published 2018. This article is a U.S. government work and is in the public domain in the USA.

  11. Superspace approach to lattice supersymmetry

    International Nuclear Information System (INIS)

    Kostelecky, V.A.; Rabin, J.M.

    1984-01-01

    We construct a cubic lattice of discrete points in superspace, as well as a discrete subgroup of the supersymmetry group which maps this ''superlattice'' into itself. We discuss the connection between this structure and previous versions of lattice supersymmetry. Our approach clarifies the mathematical problems of formulating supersymmetric lattice field theories and suggests new methods for attacking them

  12. Stress Management: A Rational Approach.

    Science.gov (United States)

    Reeves, Cecil

    This workbook was designed for use as the primary resource tool during a 1-day participatory stress management seminar in which participants identify stressful situations, conduct analyses, and develop approaches to manage the stressful situations more effectively. Small group warm-up activities designed to introduce participants, encourage…

  13. Ecological approaches to human nutrition.

    Science.gov (United States)

    DeClerck, Fabrice A J; Fanzo, Jessica; Palm, Cheryl; Remans, Roseline

    2011-03-01

    Malnutrition affects a large number of people throughout the developing world. Approaches to reducing malnutrition rarely focus on ecology and agriculture to simultaneously improve human nutrition and environmental sustainability. However, evidence suggests that interdisciplinary approaches that combine the knowledge bases of these disciplines can serve as a central strategy in alleviating hidden hunger for the world's poorest. Our objective is to describe the role that ecological knowledge plays in alleviating hidden hunger, considering human nutrition as an overlooked ecosystem service. We review existing literature and propose a framework that expands on earlier work on econutrition. We provide novel evidence from case studies conducted by the authors in western Kenya and propose a framework for interdisciplinary collaboration to alleviate hidden hunger, increase agricultural productivity, and improve environmental sustainability. Our review supports the concept that an integrated approach will impact human nutrition. We provide evidence that increased functional agrobiodiversity can alleviate anemia, and interventions that contribute to environmental sustainability can have both direct and indirect effects on human health and nutritional well-being. Integrated and interdisciplinary approaches are critical to reaching development goals. Ecologists must begin to consider not only how their field can contribute to biodiversity conservation, but also the relationship between biodiversity and provisioning of nontraditional ecosystem services such as human health. Likewise, nutritionists and agronomists must recognize that many of the solutions to increasing human well-being and health can best be achieved by focusing on a healthy environment and the conservation of ecosystem services.

  14. A Psychoanalytic Approach to Fieldwork

    Science.gov (United States)

    Ramvi, Ellen

    2012-01-01

    This article focuses on what both psychoanalysis and anthropology have in common: the emphasis on the researcher's own experience. An ethnographic fieldwork will be used to illustrate how a psychoanalytical approach unfolds the material when studying conditions for learning from experience among teachers in two Norwegian junior high schools, and…

  15. Eschenmoser Approach to Vitamin B12

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 19; Issue 7. Eschenmoser Approach to Vitamin B12 by A/D Strategy: An Unexpected Journey. G Wayne Craig. General Article Volume 19 Issue 7 July 2014 pp 624-640.

  16. Variational approach to magnetic moments

    Energy Technology Data Exchange (ETDEWEB)

    Lipparini, E; Stringari, S; Traini, M [Dipartimento di Matematica e Fisica, Libera Universita di Trento, Italy

    1977-11-07

    Magnetic moments in nuclei with a spin unsaturated core plus or minus an extra nucleon have been studied using a restricted Hartree-Fock approach. The method yields simple explicit expressions for the deformed ground state and for magnetic moments. Different projection techniques of the HF scheme have been discussed and compared with perturbation theory.

  17. Nuclear regulation - the Canadian approach

    International Nuclear Information System (INIS)

    Jennekens, J.

    1981-09-01

    Although the Atomic Energy Control Board was established 35 years ago the basic philosophy of nuclear regulation in Canada and the underlying principles of the regulatory process remain essentially unchanged. This paper outlines the Canadian approach to nuclear regulation and explains in practical terms how the principles of regulation are applied. (author)

  18. Dental approach to craniofacial syndromes

    DEFF Research Database (Denmark)

    Kjær, Inger

    2012-01-01

    is essential for insight into craniofacial syndromes. The dentition, thus, becomes central in diagnostics and evaluation of the pathogenesis. Developmental fields can explore and advance the concept of dental approaches to craniofacial syndromes. Discussion. As deviations in teeth persist and do not reorganize...

  19. Listening Comprehension: Approach, Design, Procedure.

    Science.gov (United States)

    Richards, Jack C.

    1983-01-01

    Three dimensions in the teaching of listening comprehension are outlined: (1) a theory is presented that takes account of the cognitive processes used (approach); (2) listeners' needs are analyzed and a taxonomy of microskills and objectives for teaching them are proposed (design); and (3) classroom exercises and activities are suggested…

  20. Therapeutic approaches to genetic disorders

    African Journals Online (AJOL)

    salah

    Although prevention is the ideal goal for genetic disorders, various types of therapeutic ... The patient being ... pirical or aimed at controlling or mediating signs and symptoms without care. ... plications and gene therapy approaches .... genes family, have opened a wide and .... cancer where nanoparticles are used to.

  1. Integrated Experience Approach to Learning.

    Science.gov (United States)

    POSTLETHWAIT, S.N.; AND OTHERS

    The use of audiotutorial techniques for teaching introductory college botany is described. Specific practices used at Purdue University to illustrate different facets of the approach are analyzed. Included are independent study sessions, small assembly sessions, general assembly sessions, and home study sessions. Illustrations and specifications…

  2. Designing for Uncertainty: Three Approaches

    Science.gov (United States)

    Bennett, Scott

    2007-01-01

    Higher education wishes to get long life and good returns on its investment in learning spaces. Doing this has become difficult because rapid changes in information technology have created fundamental uncertainties about the future in which capital investments must deliver value. Three approaches to designing for this uncertainty are described…

  3. Coaching Humanistically: An Alternative Approach.

    Science.gov (United States)

    Danziger, Raymond Curtis

    1982-01-01

    Four goals for a humanistic approach to athletics are: (1) elevating perception of students' physical abilities to improve self-esteem; (2) encouraging self-actualization; (3) contributing to self-understanding; and (4) improving interpersonal relationships. Implications of these objectives for team management, competition, and the attitudes of…

  4. Gender Differences in Disciplinary Approaches.

    Science.gov (United States)

    Rodriguez, Nixaliz

    This study explored differences in disciplinary approaches of male and female teachers toward male and female children, examining the connection between educator's gender and method of disciplining urban, elementary school aged children. Participants were 20 New York State certified and licensed teachers in two elementary schools. Teacher surveys…

  5. A Professional Learning Community Approach

    African Journals Online (AJOL)

    This paper provides insights into how Life Sciences teachers in the Eastern Cape can be supported through professional learning communities (PLCs) as a potential approach to enhancing their biodiversity knowledge. PLCs are communities that provide the setting and necessary support for groups of classroom teachers to ...

  6. A Freudian Approach to Education.

    Science.gov (United States)

    Gartner, Sandra L.

    This document offers the point of view that Bruno Bettelheim's writings, based on Sigmund Freud's approach to education, suggest the most practical applications for achieving positive results within the classroom. The overall result of a student being taught all through school by the Freudian method would be an extremely positive one. Such a…

  7. International Approaches to Clinical Costing

    DEFF Research Database (Denmark)

    Chapman, Christopher; Kern, Anja; Laguecir, Aziza

    This report has been commissioned by both HFMA and Monitor and the work has been led by Imperial College Business School’s Health Management Group. The report compares current approaches to costing, primarily across Europe. The findings reveal wide-ranging practices and uses for costing data, and...

  8. An Integrated Approach to Biology

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 16; Issue 8. An Integrated Approach to Biology. Aniket Bhattacharya. General Article Volume 16 Issue 8 August 2011 pp 742-753. Permanent link: https://www.ias.ac.in/article/fulltext/reso/016/08/0742-0753 ...

  9. Algorithmic approach to diagram techniques

    International Nuclear Information System (INIS)

    Ponticopoulos, L.

    1980-10-01

    An algorithmic approach to diagram techniques of elementary particles is proposed. The definition and axiomatics of the theory of algorithms are presented, followed by the list of instructions of an algorithm formalizing the construction of graphs and the assignment of mathematical objects to them. (T.A.)

  10. Classical approach in atomic physics

    International Nuclear Information System (INIS)

    Solov'ev, E.A.

    2011-01-01

    The application of a classical approach to various quantum problems - the secular perturbation approach to quantization of a hydrogen atom in external fields and a helium atom, the adiabatic switching method for calculation of a semiclassical spectrum of a hydrogen atom in crossed electric and magnetic fields, a spontaneous decay of excited states of a hydrogen atom, Gutzwiller's approach to the Stark problem, long-lived excited states of a helium atom discovered with the help of Poincare sections, inelastic transitions in slow and fast electron-atom and ion-atom collisions - is reviewed. Further, a classical representation in quantum theory is discussed. In this representation the quantum states are treated as an ensemble of classical states. This approach opens the way to an accurate description of the initial and final states in the classical trajectory Monte Carlo (CTMC) method and a purely classical explanation of the tunneling phenomenon. The general aspects of the structure of the semiclassical series, such as renormalization-group symmetry, the criterion of accuracy, and so on, are reviewed as well. (author)

  11. Oncoplastic Approaches to Breast Conservation

    International Nuclear Information System (INIS)

    Holmes, D.R.; Schooler, W.; Smith, R.

    2011-01-01

    When a woman is diagnosed with breast cancer many aspects of her physical, emotional, and sexual wholeness are threatened. The quickly expanding field of oncoplastic breast surgery aims to enhance the physician commitment to restore the patient's image and self-assurance. By combining a multidisciplinary approach to diagnosis and treatment with oncoplastic surgery, successful results in the eyes of the patient and physician are significantly more likely to occur. As a way to aid oncoplastic teams in determining which approach is most suitable for their patient's tumor size, tumor location, body habitus, and desired cosmetic outcome we present a review of several oncoplastic surgical approaches. For resections located anywhere in the breast, the radial ellipse segmentectomy incision and circumareolar approach for segmental resection are discussed. For resections in the upper or central breast, crescent mastopexy, the batwing incision, the hemibatwing incision, donut mastopexy, B-flap resection, and the central quadrantectomy are reviewed. For lesions of the lower breast, the triangle incision, inframammary incision, and reduction mastopexy are discussed. Surgeons who are interested in adding oncoplastic breast conserving therapies to their skill sets are encouraged to implement these surgical techniques where applicable and to seek out breast fellowships or enhanced training when appropriate.

  12. HEURISTIC APPROACHES FOR PORTFOLIO OPTIMIZATION

    OpenAIRE

    Manfred Gilli, Evis Kellezi

    2000-01-01

    The paper first compares the use of optimization heuristics to the classical optimization techniques for the selection of optimal portfolios. Second, the heuristic approach is applied to problems other than those in the standard mean-variance framework where the classical optimization fails.
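    As an illustration of the kind of optimization heuristic the paper compares against classical techniques, the sketch below applies threshold accepting, one common heuristic, to a toy mean-variance portfolio problem. All numbers (expected returns, covariances, risk weight) and the neighbourhood scheme are invented for illustration and are not taken from the paper:

```python
import random

# Hypothetical 4-asset problem: expected returns and covariance matrix
# (illustrative numbers only).
mu = [0.08, 0.12, 0.10, 0.07]
cov = [
    [0.10, 0.02, 0.01, 0.00],
    [0.02, 0.12, 0.03, 0.01],
    [0.01, 0.03, 0.11, 0.02],
    [0.00, 0.01, 0.02, 0.09],
]
lam = 3.0  # risk-aversion weight in the objective

def objective(w):
    """Mean-variance objective: risk penalty minus expected return (lower is better)."""
    ret = sum(wi * mi for wi, mi in zip(w, mu))
    var = sum(w[i] * cov[i][j] * w[j] for i in range(4) for j in range(4))
    return lam * var - ret

def neighbour(w, step=0.05):
    """Shift a small amount of weight from one asset to another (keeps sum = 1, w >= 0)."""
    w = w[:]
    i, j = random.sample(range(len(w)), 2)
    d = min(step, w[i])
    w[i] -= d
    w[j] += d
    return w

def threshold_accepting(iters=5000, tau0=0.01):
    """Accept candidate moves whose objective worsens by less than a shrinking threshold."""
    random.seed(0)
    w = [0.25] * 4                       # start from the equal-weight portfolio
    best, best_f = w, objective(w)
    for k in range(iters):
        tau = tau0 * (1 - k / iters)     # threshold shrinks toward greedy search
        cand = neighbour(w)
        if objective(cand) <= objective(w) + tau:
            w = cand
        if objective(w) < best_f:
            best, best_f = w, objective(w)
    return best, best_f

w, f = threshold_accepting()
print([round(x, 3) for x in w], round(f, 4))
```

    For an unconstrained quadratic like this one, classical quadratic programming would find the exact optimum; the heuristic's appeal, as the paper argues, is that the same loop still works once cardinality limits, integer lot sizes, or other non-convex constraints are added.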

  13. Appreciating Music: An Active Approach

    Science.gov (United States)

    Levin, Andrew R.; Pargas, Roy P.

    2005-01-01

    A particularly innovative use of laptops is to enhance the music appreciation experience. Group listening and discussion, in combination with a new Web-based application, lead to deeper understanding of classical music. ["Appreciating Music: An Active Approach" was written with Joshua Austin.]

  14. Approaching Environmental Issues in Architecture

    DEFF Research Database (Denmark)

    Petersen, Mads Dines; Knudstrup, Mary-Ann

    2013-01-01

    The research presented here takes its point of departure in the design process with a specific focus on how it is approached when designing energy efficient architecture. This is done through a case-study of a design process in a Danish architectural office. This study shows the importance...

  15. Lignin Sulfonation - A different Approach

    DEFF Research Database (Denmark)

    Bjørkmann, Anders

    2001-01-01

    The research on sulfite pulping has been characterized by the attempts to explain its chemistry. The different approach presented is incited by perceptions about the (still) unsolved problem of the ultrastructural features of lignin in wood. A simple kinetic model has been chosen to describe the...

  16. Theoretical Approaches to Political Communication.

    Science.gov (United States)

    Chesebro, James W.

    Political communication appears to be emerging as a theoretical and methodological academic area of research within both speech-communication and political science. Five complementary approaches to political science (Machiavellian, iconic, ritualistic, confirmational, and dramatistic) may be viewed as a series of variations which emphasize the…

  17. Approaching Bose-Einstein Condensation

    Science.gov (United States)

    Ferrari, Loris

    2011-01-01

    Bose-Einstein condensation (BEC) is discussed at the level of an advanced course of statistical thermodynamics, clarifying some formal and physical aspects that are usually not covered by the standard pedagogical literature. The non-conventional approach adopted starts by showing that the continuum limit, in certain cases, cancels out the crucial…

  18. Corpus Approaches to Language Ideology

    Science.gov (United States)

    Vessey, Rachelle

    2017-01-01

    This paper outlines how corpus linguistics--and more specifically the corpus-assisted discourse studies approach--can add useful dimensions to studies of language ideology. First, it is argued that the identification of words of high, low, and statistically significant frequency can help in the identification and exploration of language ideologies…
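    The frequency-based identification of key words described above is commonly done with Dunning's log-likelihood statistic. The sketch below is a minimal illustration on two invented word lists; the statistic and the 3.84 significance cutoff (chi-squared, 1 d.f., p < 0.05) are standard, but the corpora and word choices are made up and nothing here is taken from the paper:

```python
import math
from collections import Counter

# Two tiny illustrative "corpora" (hypothetical sentences, not real data).
corpus_a = "we must protect our nation our people our future".split()
corpus_b = "the committee reviewed the report and the budget".split()

def log_likelihood(word, a, b):
    """Dunning log-likelihood keyness of `word` in corpus a relative to corpus b."""
    fa, fb = Counter(a)[word], Counter(b)[word]
    na, nb = len(a), len(b)
    ll = 0.0
    for f, n in ((fa, na), (fb, nb)):
        # expected frequency under the null hypothesis of equal relative use
        expected = n * (fa + fb) / (na + nb)
        if f > 0:
            ll += f * math.log(f / expected)
    return 2 * ll

# "our" occurs three times in corpus A and never in corpus B, so its score is
# high relative to shared words, though these tiny corpora leave it just
# below the 3.84 cutoff.
score = log_likelihood("our", corpus_a, corpus_b)
print(round(score, 2))  # 3.82
```

    On realistically sized corpora the same function ranks candidate ideology-bearing words for closer qualitative reading, which is the workflow the corpus-assisted discourse studies approach relies on.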

  19. Low current approach to ignition

    International Nuclear Information System (INIS)

    Cenacchi, G.; Sugiyama, L.; Airoldi, A.; Coppi, B.

    1996-01-01

    The "standard" path to achieving ignition conditions so far has been that of producing plasmas with the maximum current and poloidal field that are compatible with the applied toroidal field and the geometry of the adopted configuration (the low q_a approach). The other approach is motivated by recent experiments with reversed-shear configurations, with relatively low currents and high fields corresponding to high values of q_a (e.g., q_a ≅ 6). While the first approach can be pursued with ohmic heating alone, the second necessarily involves an auxiliary heating system. One of the advantages of this approach is that the onset of large-scale internal modes can be avoided as q(ψ) is kept above 1 over the entire plasma column. Since quite peaked density profiles are produced in the regimes where enhanced confinement is observed, the α-particle power levels at which ignition can be reached, and therefore the thermal loading on the first wall, can be reduced relative to the standard, low-q_a approach. The possibility is considered that ignition is reached in the reversed-shear, high-q_a regime and that this is followed by a transition to non-reversed profiles, or even to the low-q_a regime, assuming that the excitation of modes involving magnetic reconnection will not undermine the needed degree of confinement. These results have been demonstrated by numerical transport simulation for the Ignitor-Ult machine, but are applicable to all high-field ignition experiments
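    The trade-off between plasma current and the edge safety factor that the abstract describes can be illustrated with the simple cylindrical estimate q_a ≈ a·B_t/(R·B_p), where the edge poloidal field B_p follows from Ampère's law. The machine parameters below are illustrative round numbers for a high-field tokamak, not Ignitor design values:

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability [T*m/A]

def q_edge(a, R, B_t, I_p):
    """Cylindrical estimate of the edge safety factor q_a.

    q_a ~ a*B_t / (R*B_p), with the edge poloidal field
    B_p = mu0*I_p / (2*pi*a) from Ampere's law.
    """
    B_p = MU0 * I_p / (2 * math.pi * a)
    return (a * B_t) / (R * B_p)

# Illustrative parameters: minor radius [m], major radius [m], toroidal field [T].
a, R, B_t = 0.5, 1.3, 13.0
print(round(q_edge(a, R, B_t, 5e6), 2))  # 2.5  -> high-current, low-q_a regime
print(round(q_edge(a, R, B_t, 2e6), 2))  # 6.25 -> reduced current raises q_a
```

    Since q_a scales as 1/I_p at fixed field and geometry, cutting the current by a factor of 2.5 lifts q_a from the low-q_a regime to around the q_a ≅ 6 of the reversed-shear scenario; real designs additionally gain from elongation, which this cylindrical formula ignores.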

  20. Lean approach in knowledge work

    Directory of Open Access Journals (Sweden)

    Hanna Kropsu-Vehkapera

    2018-05-01

    Purpose: Knowledge work productivity is a key area of improvement for many organisations. The lean approach is a sustainable way to achieve operational excellence and can be applied in many areas. The purpose of this study is to examine the potential of using the lean approach to improve knowledge work practices. Design/methodology/approach: A systematic literature review has been carried out to study how the lean approach is realised in knowledge work. The research is conceptual in nature and draws upon earlier research findings. Findings: This study shows that lean research in knowledge work is an emerging area. It documents the methods and practices implemented in knowledge work to date, and presents a knowledge work continuum, which is an essential framework for effective lean deployment and for framing future research on knowledge work productivity. Research limitations/implications: This study structures the concept of knowledge work and outlines a concrete concept derived from earlier literature. It summarises the literature on lean in knowledge work and highlights which methods are used. More research is needed to understand how lean can be implemented in complex knowledge work environments, not only in repetitive knowledge work. The limitations of this research are due to the limited availability of previous research. Practical implications: In analysing the nature of knowledge work, we indicate the areas where lean methods especially apply to improving knowledge work productivity. When applying lean in a knowledge work context, the focus should be on using people better and improving information flow. Originality/value: This study focuses on adapting lean methods to a knowledge work context and summarises earlier research done in this field. It discusses the potential to improve knowledge work productivity by implementing lean methods and presents a unique knowledge work continuum to