WorldWideScience

Sample records for iii likelihood analysis

  1. Improved EDELWEISS-III sensitivity for low-mass WIMPs using a profile likelihood approach

    Energy Technology Data Exchange (ETDEWEB)

    Hehn, L. [Karlsruher Institut fuer Technologie, Institut fuer Kernphysik, Karlsruhe (Germany); Armengaud, E.; Boissiere, T. de; Gros, M.; Navick, X.F.; Nones, C.; Paul, B. [CEA Saclay, DSM/IRFU, Gif-sur-Yvette Cedex (France); Arnaud, Q. [Univ Lyon, Universite Claude Bernard Lyon 1, CNRS/IN2P3, Institut de Physique Nucleaire de Lyon, Lyon (France); Queen' s University, Kingston (Canada); Augier, C.; Billard, J.; Cazes, A.; Charlieux, F.; Jesus, M. de; Gascon, J.; Juillard, A.; Queguiner, E.; Sanglard, V.; Vagneron, L. [Univ Lyon, Universite Claude Bernard Lyon 1, CNRS/IN2P3, Institut de Physique Nucleaire de Lyon, Lyon (France); Benoit, A.; Camus, P. [Institut Neel, CNRS/UJF, Grenoble (France); Berge, L.; Chapellier, M.; Dumoulin, L.; Giuliani, A.; Le-Sueur, H.; Marnieros, S.; Olivieri, E.; Poda, D. [CSNSM, Univ. Paris-Sud, CNRS/IN2P3, Universite Paris-Saclay, Orsay (France); Bluemer, J. [Karlsruher Institut fuer Technologie, Institut fuer Kernphysik, Karlsruhe (Germany); Karlsruher Institut fuer Technologie, Institut fuer Experimentelle Kernphysik, Karlsruhe (Germany); Broniatowski, A. [CSNSM, Univ. Paris-Sud, CNRS/IN2P3, Universite Paris-Saclay, Orsay (France); Karlsruher Institut fuer Technologie, Institut fuer Experimentelle Kernphysik, Karlsruhe (Germany); Eitel, K.; Kozlov, V.; Siebenborn, B. [Karlsruher Institut fuer Technologie, Institut fuer Kernphysik, Karlsruhe (Germany); Foerster, N.; Heuermann, G.; Scorza, S. [Karlsruher Institut fuer Technologie, Institut fuer Experimentelle Kernphysik, Karlsruhe (Germany); Jin, Y. [Laboratoire de Photonique et de Nanostructures, CNRS, Route de Nozay, Marcoussis (France); Kefelian, C. [Univ Lyon, Universite Claude Bernard Lyon 1, CNRS/IN2P3, Institut de Physique Nucleaire de Lyon, Lyon (France); Karlsruher Institut fuer Technologie, Institut fuer Experimentelle Kernphysik, Karlsruhe (Germany); Kleifges, M.; Tcherniakhovski, D.; Weber, M. [Karlsruher Institut fuer Technologie, Institut fuer Prozessdatenverarbeitung und Elektronik, Karlsruhe (Germany); Kraus, H. [University of Oxford, Department of Physics, Oxford (United Kingdom); Kudryavtsev, V.A. [University of Sheffield, Department of Physics and Astronomy, Sheffield (United Kingdom); Pari, P. [CEA Saclay, DSM/IRAMIS, Gif-sur-Yvette (France); Piro, M.C. [CSNSM, Univ. Paris-Sud, CNRS/IN2P3, Universite Paris-Saclay, Orsay (France); Rensselaer Polytechnic Institute, Troy, NY (United States); Rozov, S.; Yakushev, E. [JINR, Laboratory of Nuclear Problems, Dubna, Moscow Region (Russian Federation); Schmidt, B. [Karlsruher Institut fuer Technologie, Institut fuer Kernphysik, Karlsruhe (Germany); Lawrence Berkeley National Laboratory, Berkeley, CA (United States)

    2016-10-15

    We report on a dark matter search for a Weakly Interacting Massive Particle (WIMP) in the mass range m_χ ∈ [4, 30] GeV/c² with the EDELWEISS-III experiment. A 2D profile likelihood analysis is performed on data from eight selected detectors with the lowest energy thresholds, leading to a combined fiducial exposure of 496 kg-days. External backgrounds from γ- and β-radiation, recoils from ²⁰⁶Pb and neutrons, as well as detector-intrinsic backgrounds, were modelled from data outside the region of interest and constrained in the analysis. The basic data selection and most of the background models are the same as those used in a previously published analysis based on boosted decision trees (BDT) [1]. For the likelihood approach applied in the analysis presented here, a larger signal efficiency and a subtraction of the expected background lead to a higher sensitivity, especially for the lowest WIMP masses probed. No statistically significant signal was found, and upper limits on the spin-independent WIMP-nucleon scattering cross section are set with a hypothesis test based on the profile likelihood test statistic. The 90% C.L. exclusion limit set for WIMPs with m_χ = 4 GeV/c² is 1.6 × 10⁻³⁹ cm², an improvement of a factor of seven with respect to the BDT-based analysis. For WIMP masses above 15 GeV/c² the exclusion limits found with both analyses are in good agreement. (orig.)
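
    The profile-likelihood construction described above can be illustrated with a toy counting experiment. The sketch below is a minimal illustration, not the EDELWEISS analysis: all counts, the single Gaussian-constrained background (standing in for the full multi-component background model), and the signal yield per unit cross-section are hypothetical. It profiles the background nuisance parameter at each signal cross-section and reads off the one-sided 90% C.L. upper limit where the profile likelihood ratio test statistic crosses 2.71.

        import numpy as np
        from scipy.optimize import minimize_scalar
        from scipy.stats import poisson

        # Hypothetical inputs: observed counts, Gaussian-constrained background
        # estimate, and signal events expected per unit cross-section.
        n_obs = 12
        b_nom, b_sig = 10.0, 2.0
        s_per_xs = 3.0        # events per 1e-39 cm^2 (assumed exposure x efficiency)

        def nll(xs, b):
            """Negative log-likelihood: Poisson counts plus background constraint."""
            return (-poisson.logpmf(n_obs, xs * s_per_xs + b)
                    + 0.5 * ((b - b_nom) / b_sig) ** 2)

        def profiled(xs):
            """Minimise over the nuisance parameter b at fixed signal strength."""
            return minimize_scalar(lambda b: nll(xs, b),
                                   bounds=(1e-9, 60.0), method="bounded").fun

        xs_grid = np.linspace(0.0, 5.0, 501)
        prof = np.array([profiled(xs) for xs in xs_grid])
        q = 2.0 * (prof - prof.min())     # profile likelihood ratio test statistic

        i0 = prof.argmin()                # scan upward from the best-fit point
        crossing = np.where(q[i0:] > 2.71)[0]   # one-sided 90% C.L. threshold
        limit = xs_grid[i0 + crossing[0]]
        print(f"90% C.L. upper limit: {limit:.2f} x 1e-39 cm^2 (toy numbers)")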

  2. On Maximum Likelihood Estimation for Left Censored Burr Type III Distribution

    Directory of Open Access Journals (Sweden)

    Navid Feroze

    2015-12-01

    Full Text Available Burr type III is an important distribution used to model failure time data. The paper addresses the problem of estimating the parameters of the Burr type III distribution by maximum likelihood estimation (MLE) when the samples are left censored. As closed form expressions for the MLEs of the parameters cannot be derived, approximate solutions were obtained through iterative procedures. An extensive simulation study has been carried out to investigate the performance of the estimators with respect to sample size, censoring rate and true parametric values. A real life example has also been presented. The study revealed that the proposed estimators are consistent and capable of providing efficient results under small to moderate samples.
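
    As a rough illustration of the iterative estimation the abstract refers to, the sketch below simulates a left-censored Burr type III sample and maximises the censored log-likelihood numerically. The sample size, censoring threshold and parameter values are invented for the example; the CDF F(x) = (1 + x^(-c))^(-k) is the standard Burr III form, and each censored point contributes F(T) to the likelihood.

        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(1)

        def burr3_cdf(x, c, k):
            return (1.0 + x ** (-c)) ** (-k)

        def burr3_logpdf(x, c, k):
            return (np.log(c) + np.log(k) - (c + 1.0) * np.log(x)
                    - (k + 1.0) * np.log1p(x ** (-c)))

        # Simulate by inversion, then left-censor everything below a threshold T:
        # for censored points only the count survives.
        c_true, k_true, T = 2.0, 1.5, 0.6
        u = rng.uniform(size=500)
        x = (u ** (-1.0 / k_true) - 1.0) ** (-1.0 / c_true)   # inverse CDF
        obs, n_cens = x[x >= T], int(np.sum(x < T))

        def neg_loglik(log_params):
            c, k = np.exp(log_params)      # log-parametrisation keeps c, k > 0
            return -(burr3_logpdf(obs, c, k).sum()
                     + n_cens * np.log(burr3_cdf(T, c, k)))

        fit = minimize(neg_loglik, x0=np.log([1.0, 1.0]), method="Nelder-Mead")
        print("MLE (c, k):", np.exp(fit.x), "true:", (c_true, k_true))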

  3. Unbinned likelihood analysis of EGRET observations

    International Nuclear Information System (INIS)

    Digel, Seth W.

    2000-01-01

    We present a newly developed likelihood analysis method for EGRET data that defines the likelihood function without binning the photon data or averaging the instrumental response functions. The standard likelihood analysis applied to EGRET data requires the photons to be binned spatially and in energy, and the point-spread functions to be averaged over energy and inclination angle. The full width at half maximum of the point-spread function increases by about 40% from on-axis to 30° inclination, and depending on the binning in energy it can vary by more than that within a single energy bin. The new unbinned method avoids the loss of information that binning and averaging cause, and it can properly analyze regions where EGRET viewing periods overlap and photons with different inclination angles would otherwise be combined in the same bin. In the poster, we describe the unbinned analysis method and compare its sensitivity with binned analysis for detecting point sources in EGRET data
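
    A one-dimensional caricature may make the binned/unbinned distinction concrete. The sketch below is not the EGRET likelihood (it ignores the instrument response entirely); it simply evaluates an extended unbinned log-likelihood in which every photon keeps its own coordinate. The PSF width, counts and field size are all hypothetical.

        import numpy as np
        from scipy.optimize import minimize_scalar

        rng = np.random.default_rng(7)

        # Toy field: a point source at x = 0 with a Gaussian "PSF", plus a flat
        # diffuse background over [-W, W]. Hypothetical numbers throughout.
        psf, W = 1.0, 10.0
        n_src_true, n_bkg = 40, 200
        photons = np.concatenate([rng.normal(0.0, psf, n_src_true),
                                  rng.uniform(-W, W, n_bkg)])

        def neg_extended_loglik(n_src):
            # Unbinned: each photon contributes the log of the predicted
            # intensity at its own coordinate; no averaging over a bin occurs.
            rate = (n_src * np.exp(-0.5 * (photons / psf) ** 2)
                    / (psf * np.sqrt(2.0 * np.pi)) + n_bkg / (2.0 * W))
            return (n_src + n_bkg) - np.log(rate).sum()

        best = minimize_scalar(neg_extended_loglik, bounds=(0.0, 500.0),
                               method="bounded")
        print(f"fitted source counts: {best.x:.1f} (true {n_src_true})")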

  4. Likelihood analysis of parity violation in the compound nucleus

    International Nuclear Information System (INIS)

    Bowman, D.; Sharapov, E.

    1993-01-01

    We discuss the determination of the root mean-squared matrix element M of the parity-violating interaction between compound-nuclear states using likelihood analysis. We briefly review the relevant features of the statistical model of the compound nucleus and the formalism of likelihood analysis. We then discuss the application of likelihood analysis to data on parity-violating longitudinal asymmetries. The reliability of the extracted value of the matrix element, and of the errors assigned to it, is stressed. Using experimental data and Monte Carlo techniques, we treat both the situation where the spins of the p-wave resonances are known and the situation where they are not. We conclude that likelihood analysis provides a reliable way to determine M and its confidence interval. We briefly discuss some problems associated with the normalization of the likelihood function

  5. Generalized linear models with random effects unified analysis via H-likelihood

    CERN Document Server

    Lee, Youngjo; Pawitan, Yudi

    2006-01-01

    Since their introduction in 1972, generalized linear models (GLMs) have proven useful in the generalization of classical normal models. Presenting methods for fitting GLMs with random effects to data, Generalized Linear Models with Random Effects: Unified Analysis via H-likelihood explores a wide range of applications, including combining information over trials (meta-analysis), analysis of frailty models for survival data, genetic epidemiology, and analysis of spatial and temporal models with correlated errors. Written by pioneering authorities in the field, this reference provides an introduction to various theories and examines likelihood inference and GLMs. The authors show how to extend the class of GLMs while retaining as much simplicity as possible. By maximizing and deriving other quantities from h-likelihood, they also demonstrate how to use a single algorithm for all members of the class, resulting in a faster algorithm as compared to existing alternatives. Complementing theory with examples, many of...

  6. Phylogenetic analysis using parsimony and likelihood methods.

    Science.gov (United States)

    Yang, Z

    1996-02-01

    The assumptions underlying the maximum-parsimony (MP) method of phylogenetic tree reconstruction were intuitively examined by studying the way the method works. Computer simulations were performed to corroborate the intuitive examination. Parsimony appears to involve very stringent assumptions concerning the process of sequence evolution, such as constancy of substitution rates between nucleotides, constancy of rates across nucleotide sites, and equal branch lengths in the tree. For practical data analysis, the requirement of equal branch lengths means similar substitution rates among lineages (the existence of an approximate molecular clock), relatively long interior branches, and also few species in the data. However, a small amount of evolution is neither a necessary nor a sufficient requirement of the method. The difficulties involved in the application of current statistical estimation theory to tree reconstruction were discussed, and it was suggested that the approach proposed by Felsenstein (1981, J. Mol. Evol. 17: 368-376) for topology estimation, as well as its many variations and extensions, differs fundamentally from the maximum likelihood estimation of a conventional statistical parameter. Evidence was presented showing that the Felsenstein approach does not share the asymptotic efficiency of the maximum likelihood estimator of a statistical parameter. Computer simulations were performed to study the probability that MP recovers the true tree under a hierarchy of models of nucleotide substitution; its performance relative to the likelihood method was especially noted. The results appeared to support the intuitive examination of the assumptions underlying MP. When a simple model of nucleotide substitution was assumed to generate data, the probability that MP recovers the true topology could be as high as, or even higher than, that for the likelihood method. When the assumed model became more complex and realistic, e.g., when substitution rates were

  7. THESEUS: maximum likelihood superpositioning and analysis of macromolecular structures.

    Science.gov (United States)

    Theobald, Douglas L; Wuttke, Deborah S

    2006-09-01

    THESEUS is a command line program for performing maximum likelihood (ML) superpositions and analysis of macromolecular structures. While conventional superpositioning methods use ordinary least-squares (LS) as the optimization criterion, ML superpositions provide substantially improved accuracy by down-weighting variable structural regions and by correcting for correlations among atoms. ML superpositioning is robust and insensitive to the specific atoms included in the analysis, and thus it does not require subjective pruning of selected variable atomic coordinates. Output includes both likelihood-based and frequentist statistics for accurate evaluation of the adequacy of a superposition and for reliable analysis of structural similarities and differences. THESEUS performs principal components analysis for analyzing the complex correlations found among atoms within a structural ensemble. ANSI C source code and selected binaries for various computing platforms are available under the GNU open source license from http://monkshood.colorado.edu/theseus/ or http://www.theseus3d.org.

  8. Constraint likelihood analysis for a network of gravitational wave detectors

    International Nuclear Information System (INIS)

    Klimenko, S.; Rakhmanov, M.; Mitselmakher, G.; Mohanty, S.

    2005-01-01

    We propose a coherent method for detection and reconstruction of gravitational wave signals with a network of interferometric detectors. The method is derived by using the likelihood ratio functional for unknown signal waveforms. In the likelihood analysis, the global maximum of the likelihood ratio over the space of waveforms is used as the detection statistic. We identify a problem with this approach: in the case of an aligned pair of detectors, the detection statistic depends on the cross correlation between the detectors as expected, but this dependence disappears even for infinitesimally small misalignments. We solve the problem by applying constraints on the likelihood functional and obtain a new class of statistics. The resulting method can be applied to data from a network consisting of any number of detectors with arbitrary orientations. The method allows reconstruction of the source coordinates and of the waveforms of the two polarization components of a gravitational wave. We study the performance of the method with numerical simulations and find the reconstruction of the source coordinates to be more accurate than in the standard likelihood method

  9. Secondary Analysis under Cohort Sampling Designs Using Conditional Likelihood

    Directory of Open Access Journals (Sweden)

    Olli Saarela

    2012-01-01

    Full Text Available Under cohort sampling designs, additional covariate data are collected on cases of a specific type and a randomly selected subset of noncases, primarily for the purpose of studying associations with a time-to-event response of interest. With such data available, an interest may arise to reuse them for studying associations between the additional covariate data and a secondary non-time-to-event response variable, usually collected for the whole study cohort at the outset of the study. Following earlier literature, we refer to such a situation as secondary analysis. We outline a general conditional likelihood approach for secondary analysis under cohort sampling designs and discuss the specific situations of case-cohort and nested case-control designs. We also review alternative methods based on full likelihood and inverse probability weighting. We compare the alternative methods for secondary analysis in two simulated settings and apply them in a real-data example.

  10. Likelihood functions for the analysis of single-molecule binned photon sequences

    Energy Technology Data Exchange (ETDEWEB)

    Gopich, Irina V., E-mail: irinag@niddk.nih.gov [Laboratory of Chemical Physics, National Institute of Diabetes and Digestive and Kidney Diseases, National Institutes of Health, Bethesda, MD 20892 (United States)

    2012-03-02

    Graphical abstract: Folding of a protein with attached fluorescent dyes, the underlying conformational trajectory of interest, and the observed binned photon trajectory. Highlights: • A sequence of photon counts can be analyzed using a likelihood function. • The exact likelihood function for a two-state kinetic model is provided. • Several approximations are considered for an arbitrary kinetic model. • Improved likelihood functions are obtained to treat sequences of FRET efficiencies. - Abstract: We consider the analysis of a class of experiments in which the number of photons in consecutive time intervals is recorded. A sequence of photon counts or, alternatively, of FRET efficiencies can be studied using likelihood-based methods. For a kinetic model of the conformational dynamics and state-dependent Poisson photon statistics, we develop the formalism to calculate the exact likelihood that this model describes such sequences of photons or FRET efficiencies. Explicit analytic expressions for the likelihood function for a two-state kinetic model are provided. The important special case when conformational dynamics are so slow that at most a single transition occurs in a time bin is considered. By making a series of approximations, we eventually recover the likelihood function used in hidden Markov models. In this way, not only is insight gained into the range of validity of this procedure, but an improved likelihood function can also be obtained.
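
    In the slow-dynamics limit described above, the likelihood reduces to that of a hidden Markov model with Poisson emissions, which is straightforward to write down. The sketch below evaluates that HMM-limit likelihood for a two-state model with the forward algorithm; the exchange rates, bin time and count levels are invented for illustration.

        import numpy as np
        from scipy.stats import poisson
        from scipy.linalg import expm

        # Hypothetical two-state model: exchange rates k12, k21, bin time dt,
        # and state-dependent mean photon counts per bin.
        k12, k21, dt = 0.5, 0.3, 1.0
        mean_counts = np.array([2.0, 8.0])

        K = np.array([[-k12,  k21],
                      [ k12, -k21]])           # rate matrix, columns sum to zero
        T = expm(K * dt)                       # bin-to-bin transition probabilities
        p_eq = np.array([k21, k12]) / (k12 + k21)   # equilibrium occupations

        def log_likelihood(counts):
            """Forward algorithm for the HMM limit (about one transition/bin at most)."""
            ll, alpha = 0.0, p_eq * poisson.pmf(counts[0], mean_counts)
            s = alpha.sum(); ll += np.log(s); alpha /= s
            for n in counts[1:]:
                alpha = poisson.pmf(n, mean_counts) * (T @ alpha)
                s = alpha.sum(); ll += np.log(s); alpha /= s   # avoid underflow
            return ll

        bins = np.array([1, 3, 2, 9, 7, 8, 2, 1])   # short illustrative trajectory
        print("log-likelihood:", log_likelihood(bins))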

  11. Likelihood-based Dynamic Factor Analysis for Measurement and Forecasting

    NARCIS (Netherlands)

    Jungbacker, B.M.J.P.; Koopman, S.J.

    2015-01-01

    We present new results for the likelihood-based analysis of the dynamic factor model. The latent factors are modelled by linear dynamic stochastic processes. The idiosyncratic disturbance series are specified as autoregressive processes with mutually correlated innovations. The new results lead to

  12. Maximal information analysis: I - various Wayne State plots and the most common likelihood principle

    International Nuclear Information System (INIS)

    Bonvicini, G.

    2005-01-01

    Statistical analysis using all moments of the likelihood L(y|α) (y being the data and α being the fit parameters) is presented. The relevant plots for various data fitting situations are presented. The goodness of fit (GOF) parameter (currently the χ²) is redefined as the isoprobability level in a multidimensional space. Many useful properties of statistical analysis are summarized in a new statistical principle which states that the most common likelihood, and not the tallest, is the best possible likelihood when comparing experiments or hypotheses

  13. Algorithms of maximum likelihood data clustering with applications

    Science.gov (United States)

    Giada, Lorenzo; Marsili, Matteo

    2002-12-01

    We address the problem of data clustering by introducing an unsupervised, parameter-free approach based on the maximum likelihood principle. Starting from the observation that data sets belonging to the same cluster share common information, we construct an expression for the likelihood of any possible cluster structure. The likelihood in turn depends only on the Pearson correlation coefficients of the data. We discuss clustering algorithms that provide a fast and reliable approximation to maximum likelihood configurations. Compared to standard clustering methods, our approach has the advantages that (i) it is parameter free, (ii) the number of clusters need not be fixed in advance and (iii) the interpretation of the results is transparent. In order to test our approach and compare it with standard clustering algorithms, we analyze two very different data sets: time series of financial market returns and gene expression data. We find that different maximization algorithms produce similar cluster structures whereas the outcome of standard algorithms has a much wider variability.

  14. Statistical analysis of COMPTEL maximum likelihood-ratio distributions: evidence for a signal from previously undetected AGN

    International Nuclear Information System (INIS)

    Williams, O. R.; Bennett, K.; Much, R.; Schoenfelder, V.; Blom, J. J.; Ryan, J.

    1997-01-01

    The maximum likelihood-ratio method is frequently used in COMPTEL analysis to determine the significance of a point source at a given location. In this paper we do not consider whether the likelihood-ratio at a particular location indicates a detection, but rather whether distributions of likelihood-ratios derived from many locations depart from that expected for source-free data. We have constructed distributions of likelihood-ratios by reading values from standard COMPTEL maximum likelihood-ratio maps at positions corresponding to the locations of different categories of AGN. Distributions derived from the locations of Seyfert galaxies are indistinguishable, according to a Kolmogorov-Smirnov test, from those obtained from "random" locations, but differ slightly from those obtained from the locations of flat-spectrum radio-loud quasars, OVVs, and BL Lac objects. This difference is not due to known COMPTEL sources, since regions near these sources are excluded from the analysis. We suggest that it might arise from a number of sources with fluxes below the COMPTEL detection threshold
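
    The core of the statistical argument is a two-sample comparison of likelihood-ratio values. The sketch below shows only the mechanics of such a Kolmogorov-Smirnov comparison, with made-up stand-in samples; in the actual analysis the values are read off COMPTEL maximum likelihood-ratio maps at "random" and AGN positions.

        import numpy as np
        from scipy.stats import ks_2samp

        rng = np.random.default_rng(3)

        # Stand-in samples (hypothetical distributions): likelihood ratios at
        # source-free positions versus at the positions of one AGN class whose
        # members sit slightly above the background on average.
        lr_random = rng.exponential(1.00, size=400)
        lr_agn    = rng.exponential(1.15, size=120)

        stat, pval = ks_2samp(lr_random, lr_agn)
        print(f"KS statistic {stat:.3f}, p-value {pval:.3f}")
        # A small p-value would indicate that the AGN-position distribution
        # departs from the source-free one, as reported for the blazar classes.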

  15. Practical likelihood analysis for spatial generalized linear mixed models

    DEFF Research Database (Denmark)

    Bonat, W. H.; Ribeiro, Paulo Justiniano

    2016-01-01

    We investigate an algorithm for maximum likelihood estimation of spatial generalized linear mixed models based on the Laplace approximation. We compare our algorithm with a set of alternative approaches for two datasets from the literature. The Rhizoctonia root rot and the Rongelap data are, respectively, examples of binomial and count datasets modeled by spatial generalized linear mixed models. Our results show that the Laplace approximation provides similar estimates to Markov chain Monte Carlo likelihood, Monte Carlo expectation maximization, and modified Laplace approximation. Some advantages of the Laplace approximation include the computation of the maximized log-likelihood value, which can be used for model selection and tests, and the possibility to obtain realistic confidence intervals for model parameters based on profile likelihoods. The Laplace approximation also avoids the tuning
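
    The Laplace approximation at the heart of this algorithm replaces an intractable random-effect integral by a Gaussian one centred at the mode of the integrand. A one-dimensional sketch, with a single Poisson observation and one Gaussian random effect (all values hypothetical), shows the idea and checks it against numerical quadrature:

        import numpy as np
        from scipy.optimize import minimize_scalar
        from scipy.integrate import quad
        from scipy.stats import poisson, norm

        # One cluster of a toy Poisson GLMM: y ~ Poisson(exp(beta + u)),
        # with random effect u ~ N(0, tau^2). Values are made up.
        y, beta, tau = 4, 1.0, 0.8

        def g(u):   # negative log of the marginal-likelihood integrand
            return -(poisson.logpmf(y, np.exp(beta + u)) + norm.logpdf(u, 0.0, tau))

        opt = minimize_scalar(g, bounds=(-5.0, 5.0), method="bounded")
        u_hat, h = opt.x, 1e-4
        curv = (g(u_hat + h) - 2.0 * opt.fun + g(u_hat - h)) / h ** 2   # g''(u_hat)

        # Laplace: log of integral exp(-g(u)) du ~ -g(u_hat) + 0.5*log(2*pi/g'')
        laplace = -opt.fun + 0.5 * np.log(2.0 * np.pi / curv)
        exact = np.log(quad(lambda u: np.exp(-g(u)), -8.0, 8.0)[0])
        print(f"Laplace log-marginal {laplace:.4f}  vs  quadrature {exact:.4f}")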

  16. Steady state likelihood ratio sensitivity analysis for stiff kinetic Monte Carlo simulations.

    Science.gov (United States)

    Núñez, M; Vlachos, D G

    2015-01-28

    Kinetic Monte Carlo simulation is an integral tool in the study of complex physical phenomena present in applications ranging from heterogeneous catalysis to biological systems to crystal growth and atmospheric sciences. Sensitivity analysis is useful for identifying important parameters and rate-determining steps, but the finite-difference application of sensitivity analysis is computationally demanding. Techniques based on the likelihood ratio method reduce the computational cost of sensitivity analysis by obtaining all gradient information in a single run. However, we show that disparity in time scales of microscopic events, which is ubiquitous in real systems, introduces drastic statistical noise into derivative estimates for parameters affecting the fast events. In this work, the steady-state likelihood ratio sensitivity analysis is extended to singularly perturbed systems by invoking partial equilibration for fast reactions, that is, by working on the fast and slow manifolds of the chemistry. Derivatives on each time scale are computed independently and combined into the desired sensitivity coefficients, considerably reducing the noise in derivative estimates for stiff systems. The approach is demonstrated in an analytically solvable linear system.
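
    The single-run property of the likelihood ratio method is easiest to see in a scalar toy problem rather than a full kinetic Monte Carlo simulation. Below, the derivative of an expectation with respect to an exponential event rate is estimated by weighting the observable with the score function; the rate and observable are hypothetical choices for this sketch.

        import numpy as np

        rng = np.random.default_rng(11)

        # Likelihood-ratio (score function) sensitivity: for waiting times
        # X ~ Exp(rate k), d/dk E[f(X)] = E[f(X) * d log p(X;k)/dk], with
        # d log p/dk = 1/k - X, so derivatives come from one batch of samples.
        k = 2.0
        x = rng.exponential(1.0 / k, size=200_000)
        f = (x > 1.0).astype(float)           # observable: P(X > 1)

        score = 1.0 / k - x
        deriv_lr = np.mean(f * score)         # LR estimate of dP(X > 1)/dk
        deriv_exact = -np.exp(-k)             # d/dk exp(-k) for this observable
        print(f"LR estimate {deriv_lr:.4f}, exact {deriv_exact:.4f}")

    For stiff systems the variance of this estimator blows up for parameters tied to the fast events, which is precisely the problem the paper addresses by separating the fast and slow time scales.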

  17. On the likelihood of detecting gravitational waves from Population III compact object binaries

    Science.gov (United States)

    Belczynski, Krzysztof; Ryu, Taeho; Perna, Rosalba; Berti, Emanuele; Tanaka, Takamitsu L.; Bulik, Tomasz

    2017-11-01

    We study the contribution of binary black hole (BH-BH) mergers from the first, metal-free stars in the Universe (Pop III) to gravitational wave detection rates. Our study combines initial conditions for the formation of Pop III stars based on N-body simulations of binary formation (including rates, binary fraction, initial mass function, orbital separation and eccentricity distributions) with an updated model of stellar evolution specific to Pop III stars. We find that the merger rate of these Pop III BH-BH systems is relatively small (≲ 0.1 Gpc⁻³ yr⁻¹) at low redshifts (z < 2), which excludes a significant (> 1 per cent) contribution of these stars to low-redshift BH-BH mergers. However, it remains to be tested whether (and at what level) rapidly spinning Pop III stars in the homogeneous evolution scenario can contribute to BH-BH mergers in the local Universe.

  18. Maximum Likelihood, Consistency and Data Envelopment Analysis: A Statistical Foundation

    OpenAIRE

    Rajiv D. Banker

    1993-01-01

    This paper provides a formal statistical basis for the efficiency evaluation techniques of data envelopment analysis (DEA). DEA estimators of the best practice monotone increasing and concave production function are shown to be also maximum likelihood estimators if the deviation of actual output from the efficient output is regarded as a stochastic variable with a monotone decreasing probability density function. While the best practice frontier estimator is biased below the theoretical frontier...

  19. Tapered composite likelihood for spatial max-stable models

    KAUST Repository

    Sang, Huiyan; Genton, Marc G.

    2014-05-01

    Spatial extreme value analysis is useful to environmental studies, in which extreme value phenomena are of interest and meaningful spatial patterns can be discerned. Max-stable process models are able to describe such phenomena. This class of models is asymptotically justified to characterize the spatial dependence among extremes. However, likelihood inference is challenging for such models because their corresponding joint likelihood is unavailable and only bivariate or trivariate distributions are known. In this paper, we propose a tapered composite likelihood approach by utilizing lower dimensional marginal likelihoods for inference on parameters of various max-stable process models. We consider a weighting strategy based on a "taper range" to exclude distant pairs or triples. The "optimal taper range" is selected to maximize various measures of the Godambe information associated with the tapered composite likelihood function. This method substantially reduces the computational cost and improves the efficiency over equally weighted composite likelihood estimators. We illustrate its utility with simulation experiments and an analysis of rainfall data in Switzerland.
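
    To make the weighting idea concrete without the machinery of max-stable bivariate densities, the sketch below applies a hard-taper pairwise composite likelihood to a Gaussian process stand-in (an assumption of this illustration, not the paper's model): only pairs of sites closer than the taper range contribute to the objective.

        import numpy as np
        from scipy.stats import multivariate_normal
        from scipy.spatial.distance import cdist

        rng = np.random.default_rng(5)

        # Toy spatial field: n sites with exponential correlation exp(-d/rho).
        # A Gaussian process replaces the max-stable model; only the pairwise
        # densities enter, which is the structure the taper exploits.
        n, rho_true = 30, 2.0
        sites = rng.uniform(0.0, 10.0, size=(n, 2))
        d = cdist(sites, sites)
        y = rng.multivariate_normal(np.zeros(n), np.exp(-d / rho_true))

        def tapered_cl(rho, taper_range):
            """Sum of pairwise log-densities over pairs within the taper range."""
            ll = 0.0
            for i in range(n):
                for j in range(i + 1, n):
                    if d[i, j] > taper_range:   # hard 0/1 taper weight
                        continue
                    r = np.exp(-d[i, j] / rho)
                    ll += multivariate_normal.logpdf([y[i], y[j]],
                                                     cov=[[1.0, r], [r, 1.0]])
            return ll

        rhos = np.linspace(0.5, 5.0, 19)
        for taper in (2.0, 1e9):               # short taper vs all pairs
            est = rhos[np.argmax([tapered_cl(r, taper) for r in rhos])]
            print(f"taper range {taper:g}: estimated rho = {est:.2f}")

    In the paper the taper range is not fixed by hand as here but chosen to maximize Godambe-information criteria of the tapered composite likelihood.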

  1. Optimized Large-scale CMB Likelihood and Quadratic Maximum Likelihood Power Spectrum Estimation

    Science.gov (United States)

    Gjerløw, E.; Colombo, L. P. L.; Eriksen, H. K.; Górski, K. M.; Gruppuso, A.; Jewell, J. B.; Plaszczynski, S.; Wehus, I. K.

    2015-11-01

    We revisit the problem of exact cosmic microwave background (CMB) likelihood and power spectrum estimation with the goal of minimizing computational costs through linear compression. This idea was originally proposed for CMB purposes by Tegmark et al., and here we develop it into a fully functioning computational framework for large-scale polarization analysis, adopting WMAP as a working example. We compare five different linear bases (pixel space, harmonic space, noise covariance eigenvectors, signal-to-noise covariance eigenvectors, and signal-plus-noise covariance eigenvectors) in terms of compression efficiency, and find that the computationally most efficient basis is the signal-to-noise eigenvector basis, which is closely related to the Karhunen-Loeve and Principal Component transforms, in agreement with previous suggestions. For this basis, the information in 6836 unmasked WMAP sky map pixels can be compressed into a smaller set of 3102 modes, with a maximum error increase of any single multipole of 3.8% at ℓ ≤ 32 and a maximum shift in the mean values of a joint distribution of an amplitude-tilt model of 0.006σ. This compression reduces the computational cost of a single likelihood evaluation by a factor of 5, from 38 to 7.5 CPU seconds, and it also results in a more robust likelihood by implicitly regularizing nearly degenerate modes. Finally, we use the same compression framework to formulate a numerically stable and computationally efficient variation of the Quadratic Maximum Likelihood implementation, which requires less than 3 GB of memory and 2 CPU minutes per iteration for ℓ ≤ 32, rendering low-ℓ QML CMB power spectrum analysis fully tractable on a standard laptop.
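
    The signal-to-noise eigenvector compression favoured above amounts to solving a generalized eigenproblem between the signal and noise covariances and keeping the highest-S/N modes. A small numerical sketch with random placeholder covariances (not the WMAP matrices):

        import numpy as np
        from scipy.linalg import eigh

        rng = np.random.default_rng(2)

        # Toy data model d = s + n with known signal covariance S and noise
        # covariance N; both are hypothetical stand-ins here.
        npix = 200
        A = rng.normal(size=(npix, npix))
        S = A @ A.T / npix                         # placeholder signal covariance
        N = np.diag(rng.uniform(0.5, 2.0, npix))   # heteroscedastic noise

        # Generalized eigenproblem S v = lambda N v: eigenvalues are per-mode S/N.
        evals, evecs = eigh(S, N)
        order = np.argsort(evals)[::-1]
        keep = order[:80]                          # retain the highest-S/N modes
        B = evecs[:, keep].T                       # compression operator

        d = rng.multivariate_normal(np.zeros(npix), S + N)
        d_compressed = B @ d                       # input to the likelihood/QML step
        print("compressed", npix, "->", d_compressed.size, "modes;",
              f"S/N of last kept mode: {evals[keep[-1]]:.2f}")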

  2. Maximum Likelihood and Restricted Likelihood Solutions in Multiple-Method Studies.

    Science.gov (United States)

    Rukhin, Andrew L

    2011-01-01

    A formulation of the problem of combining data from several sources is discussed in terms of random effects models. The unknown measurement precision is assumed not to be the same for all methods. We investigate maximum likelihood solutions in this model. By representing the likelihood equations as simultaneous polynomial equations, the exact form of the Groebner basis for their stationary points is derived when there are two methods. A parametrization of these solutions which allows their comparison is suggested. A numerical method for solving likelihood equations is outlined, and an alternative to the maximum likelihood method, the restricted maximum likelihood, is studied. In the situation when methods variances are considered to be known an upper bound on the between-method variance is obtained. The relationship between likelihood equations and moment-type equations is also discussed.

  3. Likelihood Analysis of Supersymmetric SU(5) GUTs

    CERN Document Server

    Bagnaschi, E.

    2017-01-01

    We perform a likelihood analysis of the constraints from accelerator experiments and astrophysical observations on supersymmetric (SUSY) models with SU(5) boundary conditions on soft SUSY-breaking parameters at the GUT scale. The parameter space of the models studied has 7 parameters: a universal gaugino mass $m_{1/2}$, distinct masses for the scalar partners of matter fermions in five- and ten-dimensional representations of SU(5), $m_5$ and $m_{10}$, and for the $\\mathbf{5}$ and $\\mathbf{\\bar 5}$ Higgs representations $m_{H_u}$ and $m_{H_d}$, a universal trilinear soft SUSY-breaking parameter $A_0$, and the ratio of Higgs vevs $\\tan \\beta$. In addition to previous constraints from direct sparticle searches, low-energy and flavour observables, we incorporate constraints based on preliminary results from 13 TeV LHC searches for jets + MET events and long-lived particles, as well as the latest PandaX-II and LUX searches for direct Dark Matter detection. In addition to previously-identified mechanisms for bringi...

  4. Neandertal admixture in Eurasia confirmed by maximum-likelihood analysis of three genomes.

    Science.gov (United States)

    Lohse, Konrad; Frantz, Laurent A F

    2014-04-01

    Although there has been much interest in estimating histories of divergence and admixture from genomic data, it has proved difficult to distinguish recent admixture from long-term structure in the ancestral population. Thus, recent genome-wide analyses based on summary statistics have sparked controversy about the possibility of interbreeding between Neandertals and modern humans in Eurasia. Here we derive the probability of full mutational configurations in nonrecombining sequence blocks under both admixture and ancestral structure scenarios. Dividing the genome into short blocks gives an efficient way to compute maximum-likelihood estimates of parameters. We apply this likelihood scheme to triplets of human and Neandertal genomes and compare the relative support for a model of admixture from Neandertals into Eurasian populations after their expansion out of Africa against a history of persistent structure in their common ancestral population in Africa. Our analysis allows us to conclusively reject a model of ancestral structure in Africa and instead reveals strong support for Neandertal admixture in Eurasia at a higher rate (3.4-7.3%) than suggested previously. Using analysis and simulations we show that our inference is more powerful than previous summary statistics and robust to realistic levels of recombination.

  5. Maximum likelihood-based analysis of photon arrival trajectories in single-molecule FRET

    Energy Technology Data Exchange (ETDEWEB)

    Waligorska, Marta [Adam Mickiewicz University, Faculty of Chemistry, Grunwaldzka 6, 60-780 Poznan (Poland); Molski, Andrzej, E-mail: amolski@amu.edu.pl [Adam Mickiewicz University, Faculty of Chemistry, Grunwaldzka 6, 60-780 Poznan (Poland)

    2012-07-25

    Highlights: • We study model selection and parameter recovery from single-molecule FRET experiments. • We examine the maximum likelihood-based analysis of two-color photon trajectories. • The number of observed photons determines the performance of the method. • For long trajectories, one can extract mean dwell times that are comparable to inter-photon times. - Abstract: When two fluorophores (donor and acceptor) are attached to an immobilized biomolecule, anti-correlated fluctuations of the donor and acceptor fluorescence caused by Foerster resonance energy transfer (FRET) report on the conformational kinetics of the molecule. Here we assess the maximum likelihood-based analysis of donor and acceptor photon arrival trajectories as a method for extracting the conformational kinetics. Using computer generated data we quantify the accuracy and precision of parameter estimates and the efficiency of the Akaike information criterion (AIC) and the Bayesian information criterion (BIC) in selecting the true kinetic model. We find that the number of observed photons is the key parameter determining parameter estimation and model selection. For long trajectories, one can extract mean dwell times that are comparable to inter-photon times.
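
    The AIC/BIC comparison assessed in the study reduces to two one-line formulas once the maximized log-likelihoods are in hand. A minimal sketch with invented numbers for two candidate kinetic models:

        import numpy as np

        # AIC = 2k - 2 lnL and BIC = k ln(n) - 2 lnL; the log-likelihoods and
        # photon count below are made-up placeholders, not values from the paper.
        n_photons = 5_000
        models = {"one-state": (-12_450.0, 2),   # (max log-likelihood, #params)
                  "two-state": (-12_380.0, 5)}

        for name, (logL, k) in models.items():
            aic = 2 * k - 2 * logL
            bic = k * np.log(n_photons) - 2 * logL
            print(f"{name:10s}  AIC={aic:9.1f}  BIC={bic:9.1f}")
        # Prefer the model with the smaller criterion; BIC penalizes extra
        # parameters more harshly as the number of photons grows.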

  6. Likelihood ratio meta-analysis: New motivation and approach for an old method.

    Science.gov (United States)

    Dormuth, Colin R; Filion, Kristian B; Platt, Robert W

    2016-03-01

    A 95% confidence interval (CI) in an updated meta-analysis may not have the expected 95% coverage. If a meta-analysis is simply updated with additional data, then the resulting 95% CI will be wrong because it will not have accounted for the fact that the earlier meta-analysis failed or succeeded in excluding the null. This situation can be avoided by using the likelihood ratio (LR) as a measure of evidence that does not depend on type-1 error. We show how an LR-based approach, first advanced by Goodman, can be used in a meta-analysis to pool data from separate studies to quantitatively assess where the total evidence points. The method works by estimating the log-likelihood ratio (LogLR) function from each study. Those functions are then summed to obtain a combined function, which is then used to retrieve the total effect estimate and a corresponding 'intrinsic' confidence interval. Using as illustrations the CAPRIE trial of clopidogrel versus aspirin in the prevention of ischemic events, and our own meta-analysis of higher potency statins and the risk of acute kidney injury, we show that the LR-based method yields the same point estimate as the traditional analysis, but with an intrinsic confidence interval that is appropriately wider than the traditional 95% CI. The LR-based method can be used to conduct both fixed-effect and random-effects meta-analyses, it can be applied to old and new meta-analyses alike, and results can be presented in a format that is familiar to a meta-analytic audience. Copyright © 2016 Elsevier Inc. All rights reserved.
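
    A minimal sketch of the pooling step: each study's log-likelihood-ratio curve for the effect parameter is approximated as a quadratic built from its estimate and standard error, the curves are summed, and an interval is read off where the combined likelihood ratio stays above a support threshold. The 1/8 cutoff is an assumption of this sketch rather than the paper's exact definition of the intrinsic interval, and the study estimates are invented, not the CAPRIE or statin data.

        import numpy as np

        # Per-study log effect estimates and standard errors (hypothetical).
        theta_hat = np.array([-0.10, -0.25, -0.05])
        se = np.array([0.08, 0.15, 0.10])

        # Quadratic LogLR curve per study, summed over studies on a grid.
        grid = np.linspace(-0.6, 0.4, 2001)
        loglr = -(((grid[:, None] - theta_hat) / se) ** 2 / 2.0).sum(axis=1)
        loglr -= loglr.max()                  # zero at the combined MLE

        mle = grid[np.argmax(loglr)]
        inside = grid[loglr >= -np.log(8.0)]  # support interval: LR >= 1/8
        print(f"pooled estimate {mle:.3f}, 1/8 support interval "
              f"[{inside[0]:.3f}, {inside[-1]:.3f}]")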

  7. Empirical likelihood

    CERN Document Server

    Owen, Art B

    2001-01-01

    Empirical likelihood provides inferences whose validity does not depend on specifying a parametric model for the data. Because it uses a likelihood, the method has certain inherent advantages over resampling methods: it uses the data to determine the shape of the confidence regions, and it makes it easy to combine data from multiple sources. It also facilitates incorporating side information, and it simplifies accounting for censored, truncated, or biased sampling. One of the first books published on the subject, Empirical Likelihood offers an in-depth treatment of this method for constructing confidence regions and testing hypotheses. The author applies empirical likelihood to a range of problems, from those as simple as setting a confidence region for a univariate mean under IID sampling, to problems defined through smooth functions of means, regression models, generalized linear models, estimating equations, or kernel smooths, and to sampling with non-identically distributed data. Abundant figures offer vi...

  8. Approximate Likelihood

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Most physics results at the LHC end in a likelihood ratio test. This includes discovery and exclusion for searches as well as mass, cross-section, and coupling measurements. The use of Machine Learning (multivariate) algorithms in HEP is mainly restricted to searches, which can be reduced to classification between two fixed distributions: signal vs. background. I will show how we can extend the use of ML classifiers to distributions parameterized by physical quantities like masses and couplings, as well as nuisance parameters associated with systematic uncertainties. This allows one to approximate the likelihood ratio while still using a high-dimensional feature vector for the data. Both the MEM and ABC approaches mentioned above aim to provide inference on model parameters (like cross-sections, masses, couplings, etc.). ABC is fundamentally tied to Bayesian inference and focuses on the “likelihood free” setting where only a simulator is available and one cannot directly compute the likelihood for the dat...

  9. Composite likelihood estimation of demographic parameters

    Directory of Open Access Journals (Sweden)

    Garrigan Daniel

    2009-11-01

    accuracy, demographic parameters from three simulated data sets that vary in the magnitude of a founder event and a skew in the effective population size of the X chromosome relative to the autosomes. The behavior of the Markov chain is also examined and shown to converge to its stationary distribution, while also showing high levels of parameter mixing. The analysis of three pairwise comparisons of sub-Saharan African human populations with non-African human populations does not provide unequivocal support for a strong non-African founder event from these nuclear data. The estimates do however suggest a skew in the ratio of X chromosome to autosome effective population size that is greater than one. However, in all three cases the 95% highest posterior density interval for this ratio does include three-fourths, the value expected under an equal breeding sex ratio. Conclusion: The implementation of composite and approximate likelihood methods in a framework that includes MCMCMC demographic parameter estimation shows great promise for being flexible and computationally efficient enough to scale up to the level of whole-genome polymorphism and divergence analysis. Further work must be done to characterize the effects of the assumption of linkage equilibrium among genomic regions that is crucial to the validity of applying the composite likelihood method.

  10. Likelihood analysis of the minimal AMSB model

    Energy Technology Data Exchange (ETDEWEB)

    Bagnaschi, E.; Weiglein, G. [DESY, Hamburg (Germany); Borsato, M.; Chobanova, V.; Lucio, M.; Santos, D.M. [Universidade de Santiago de Compostela, Santiago de Compostela (Spain); Sakurai, K. [Institute for Particle Physics Phenomenology, University of Durham, Science Laboratories, Department of Physics, Durham (United Kingdom); University of Warsaw, Faculty of Physics, Institute of Theoretical Physics, Warsaw (Poland); Buchmueller, O.; Citron, M.; Costa, J.C.; Richards, A. [Imperial College, High Energy Physics Group, Blackett Laboratory, London (United Kingdom); Cavanaugh, R. [Fermi National Accelerator Laboratory, Batavia, IL (United States); University of Illinois at Chicago, Physics Department, Chicago, IL (United States); De Roeck, A. [Experimental Physics Department, CERN, Geneva (Switzerland); Antwerp University, Wilrijk (Belgium); Dolan, M.J. [School of Physics, University of Melbourne, ARC Centre of Excellence for Particle Physics at the Terascale, Melbourne (Australia); Ellis, J.R. [King' s College London, Theoretical Particle Physics and Cosmology Group, Department of Physics, London (United Kingdom); CERN, Theoretical Physics Department, Geneva (Switzerland); Flaecher, H. [University of Bristol, H.H. Wills Physics Laboratory, Bristol (United Kingdom); Heinemeyer, S. [Campus of International Excellence UAM+CSIC, Madrid (Spain); Instituto de Fisica Teorica UAM-CSIC, Madrid (Spain); Instituto de Fisica de Cantabria (CSIC-UC), Cantabria (Spain); Isidori, G. [Physik-Institut, Universitaet Zuerich, Zurich (Switzerland); Luo, F. [Kavli IPMU (WPI), UTIAS, The University of Tokyo, Kashiwa, Chiba (Japan); Olive, K.A. [School of Physics and Astronomy, University of Minnesota, William I. Fine Theoretical Physics Institute, Minneapolis, MN (United States)

    2017-04-15

    We perform a likelihood analysis of the minimal anomaly-mediated supersymmetry-breaking (mAMSB) model using constraints from cosmology and accelerator experiments. We find that either a wino-like or a Higgsino-like neutralino LSP, χ⁰₁, may provide the cold dark matter (DM), both with similar likelihoods. The upper limit on the DM density from Planck and other experiments enforces m(χ⁰₁) ≲ 3 TeV after the inclusion of Sommerfeld enhancement in its annihilations. The measurement of the Higgs mass favours a limited range of tan β (in particular for μ > 0), but the scalar mass m₀ is poorly constrained. In the wino-LSP case, m_{3/2} is constrained to about 900 TeV and m(χ⁰₁) to 2.9 ± 0.1 TeV, whereas in the Higgsino-LSP case m_{3/2} has just a lower limit ≳ 650 TeV (≳ 480 TeV) and m(χ⁰₁) is constrained to 1.12 (1.13) ± 0.02 TeV in the μ > 0 (μ < 0) scenario. In neither case can the anomalous magnetic moment of the muon, (g−2)_μ, be improved significantly relative to its Standard Model (SM) value, nor do flavour measurements constrain the model significantly, and there are poor prospects for discovering supersymmetric particles at the LHC, though there are some prospects for direct DM detection. On the other hand, if the χ⁰₁ contributes only a fraction of the cold DM density, future LHC missing-E_T-based searches for gluinos, squarks and heavier chargino and neutralino states as well as disappearing track searches in the wino-like LSP region will be relevant, and interference effects enable BR(B_{s,d} → μ⁺μ⁻) to agree with the data better than in the SM in the case of wino-like DM with μ > 0. (orig.)

  11. Affective mapping: An activation likelihood estimation (ALE) meta-analysis.

    Science.gov (United States)

    Kirby, Lauren A J; Robinson, Jennifer L

    2017-11-01

    Functional neuroimaging has the spatial resolution to explain the neural basis of emotions. Activation likelihood estimation (ALE), as opposed to traditional qualitative meta-analysis, quantifies convergence of activation across studies within affective categories. Others have used ALE to investigate a broad range of emotions, but without the convenience of the BrainMap database. We used the BrainMap database and analysis resources to run separate meta-analyses on coordinates reported for anger, anxiety, disgust, fear, happiness, humor, and sadness. Resultant ALE maps were compared to determine areas of convergence between emotions, as well as to identify affect-specific networks. Five out of the seven emotions demonstrated consistent activation within the amygdala, whereas all emotions consistently activated the right inferior frontal gyrus, which has been implicated as an integration hub for affective and cognitive processes. These data provide the framework for models of affect-specific networks, as well as emotional processing hubs, which can be used for future studies of functional or effective connectivity. Copyright © 2015 Elsevier Inc. All rights reserved.

  12. Posterior distributions for likelihood ratios in forensic science.

    Science.gov (United States)

    van den Hout, Ardo; Alberink, Ivo

    2016-09-01

    Evaluation of evidence in forensic science is discussed using posterior distributions for likelihood ratios. Instead of eliminating the uncertainty by integrating (Bayes factor) or by conditioning on parameter values, uncertainty in the likelihood ratio is retained by parameter uncertainty derived from posterior distributions. A posterior distribution for a likelihood ratio can be summarised by the median and credible intervals. Using the posterior mean of the distribution is not recommended. An analysis of forensic data for body height estimation is undertaken. The posterior likelihood approach has been criticised both theoretically and with respect to applicability. This paper addresses the latter and illustrates an interesting application area. Copyright © 2016 The Chartered Society of Forensic Sciences. Published by Elsevier Ireland Ltd. All rights reserved.

  13. Analysis of Minute Features in Speckled Imagery with Maximum Likelihood Estimation

    Directory of Open Access Journals (Sweden)

    Alejandro C. Frery

    2004-12-01

    Full Text Available This paper deals with numerical problems arising when performing maximum likelihood parameter estimation in speckled imagery using small samples. The noise that appears in images obtained with coherent illumination, as is the case of sonar, laser, ultrasound-B, and synthetic aperture radar, is called speckle, and it can neither be assumed Gaussian nor additive. The properties of speckle noise are well described by the multiplicative model, a statistical framework from which stem several important distributions. Amongst these distributions, one is regarded as the universal model for speckled data, namely, the 𝒢⁰ law. This paper deals with amplitude data, so the 𝒢⁰_A distribution will be used. The literature reports that techniques for obtaining estimates (maximum likelihood, based on moments and on order statistics) of the parameters of the 𝒢⁰_A distribution require samples of hundreds, even thousands, of observations in order to obtain sensible values. This is verified for maximum likelihood estimation, and a proposal based on alternate optimization is made to alleviate this situation. The proposal is assessed with real and simulated data, showing that the convergence problems are no longer present. A Monte Carlo experiment is devised to estimate the quality of maximum likelihood estimators in small samples, and real data is successfully analyzed with the proposed alternated procedure. Stylized empirical influence functions are computed and used to choose a strategy for computing maximum likelihood estimates that is resistant to outliers.

  14. The behavior of the likelihood ratio test for testing missingness

    OpenAIRE

    Hens, Niel; Aerts, Marc; Molenberghs, Geert; Thijs, Herbert

    2003-01-01

    To assess the sensitivity of conclusions to model choices in the context of selection models for non-random dropout, one can oppose the different missingness mechanisms to each other, e.g. by likelihood ratio tests. The finite-sample behavior of the null distribution and the power of the likelihood ratio test are studied under a variety of missingness mechanisms. Keywords: missing data; sensitivity analysis; likelihood ratio test; missing mechanisms

  15. Bias correction in the hierarchical likelihood approach to the analysis of multivariate survival data.

    Science.gov (United States)

    Jeon, Jihyoun; Hsu, Li; Gorfine, Malka

    2012-07-01

    Frailty models are useful for measuring unobserved heterogeneity in risk of failures across clusters, providing cluster-specific risk prediction. In a frailty model, the latent frailties shared by members within a cluster are assumed to act multiplicatively on the hazard function. In order to obtain parameter and frailty variate estimates, we consider the hierarchical likelihood (H-likelihood) approach (Ha, Lee and Song, 2001. Hierarchical-likelihood approach for frailty models. Biometrika 88, 233-243) in which the latent frailties are treated as "parameters" and estimated jointly with other parameters of interest. We find that the H-likelihood estimators perform well when the censoring rate is low, however, they are substantially biased when the censoring rate is moderate to high. In this paper, we propose a simple and easy-to-implement bias correction method for the H-likelihood estimators under a shared frailty model. We also extend the method to a multivariate frailty model, which incorporates complex dependence structure within clusters. We conduct an extensive simulation study and show that the proposed approach performs very well for censoring rates as high as 80%. We also illustrate the method with a breast cancer data set. Since the H-likelihood is the same as the penalized likelihood function, the proposed bias correction method is also applicable to the penalized likelihood estimators.

  16. Empirical Likelihood in Nonignorable Covariate-Missing Data Problems.

    Science.gov (United States)

    Xie, Yanmei; Zhang, Biao

    2017-04-20

    Missing covariate data occurs often in regression analysis, which frequently arises in the health and social sciences as well as in survey sampling. We study methods for the analysis of a nonignorable covariate-missing data problem in an assumed conditional mean function when some covariates are completely observed but other covariates are missing for some subjects. We adopt the semiparametric perspective of Bartlett et al. (Improving upon the efficiency of complete case analysis when covariates are MNAR. Biostatistics 2014;15:719-30) on regression analyses with nonignorable missing covariates, in which they have introduced the use of two working models, the working probability model of missingness and the working conditional score model. In this paper, we study an empirical likelihood approach to nonignorable covariate-missing data problems with the objective of effectively utilizing the two working models in the analysis of covariate-missing data. We propose a unified approach to constructing a system of unbiased estimating equations, where there are more equations than unknown parameters of interest. One useful feature of these unbiased estimating equations is that they naturally incorporate the incomplete data into the data analysis, making it possible to seek efficient estimation of the parameter of interest even when the working regression function is not specified to be the optimal regression function. We apply the general methodology of empirical likelihood to optimally combine these unbiased estimating equations. We propose three maximum empirical likelihood estimators of the underlying regression parameters and compare their efficiencies with other existing competitors. We present a simulation study to compare the finite-sample performance of various methods with respect to bias, efficiency, and robustness to model misspecification. The proposed empirical likelihood method is also illustrated by an analysis of a data set from the US National Health and

  17. Neural Networks Involved in Adolescent Reward Processing: An Activation Likelihood Estimation Meta-Analysis of Functional Neuroimaging Studies

    Science.gov (United States)

    Silverman, Merav H.; Jedd, Kelly; Luciana, Monica

    2015-01-01

    Behavioral responses to, and the neural processing of, rewards change dramatically during adolescence and may contribute to observed increases in risk-taking during this developmental period. Functional MRI (fMRI) studies suggest differences between adolescents and adults in neural activation during reward processing, but findings are contradictory, and effects have been found in non-predicted directions. The current study uses an activation likelihood estimation (ALE) approach for quantitative meta-analysis of functional neuroimaging studies to: 1) confirm the network of brain regions involved in adolescents’ reward processing, 2) identify regions involved in specific stages (anticipation, outcome) and valence (positive, negative) of reward processing, and 3) identify differences in activation likelihood between adolescent and adult reward-related brain activation. Results reveal a subcortical network of brain regions involved in adolescent reward processing similar to that found in adults with major hubs including the ventral and dorsal striatum, insula, and posterior cingulate cortex (PCC). Contrast analyses find that adolescents exhibit greater likelihood of activation in the insula while processing anticipation relative to outcome and greater likelihood of activation in the putamen and amygdala during outcome relative to anticipation. While processing positive compared to negative valence, adolescents show increased likelihood for activation in the posterior cingulate cortex (PCC) and ventral striatum. Contrasting adolescent reward processing with the existing ALE of adult reward processing (Liu et al., 2011) reveals increased likelihood for activation in limbic, frontolimbic, and striatal regions in adolescents compared with adults. Unlike adolescents, adults also activate executive control regions of the frontal and parietal lobes. These findings support hypothesized elevations in motivated activity during adolescence. PMID:26254587

  18. Detection of COL III in Parchment by Amino Acid Analysis

    DEFF Research Database (Denmark)

    Vestergaard Poulsen Sommer, Dorte; Larsen, René

    2016-01-01

    Cultural heritage parchments made from the reticular dermis of animals have been subject to studies of deterioration and conservation by amino acid analysis. The reticular dermis contains a varying mixture of collagen I and III (COL I and III). When dealing with the results of the amino acid analyses, the COL III content has until now not been taken into account. Based on the available amino acid sequences, we present a method for determining the amount of COL III in the reticular dermis of new and historical parchments, calculated from the ratio of Ile/Val. We find COL III contents between 7 and 32 % in new parchments and between 0.2 and 40 % in the historical parchments. This is consistent with results in the literature. The varying content of COL III has a significant influence on the uncertainty of the amino acid analysis. Although we have not found a simple correlation between the COL

  19. The likelihood ratio as a random variable for linked markers in kinship analysis.

    Science.gov (United States)

    Egeland, Thore; Slooten, Klaas

    2016-11-01

    The likelihood ratio is the fundamental quantity that summarizes the evidence in forensic cases. Therefore, it is important to understand the theoretical properties of this statistic. This paper is the last in a series of three, and the first to study linked markers. We show that for all non-inbred pairwise kinship comparisons, the expected likelihood ratio in favor of a type of relatedness depends on the allele frequencies only via the number of alleles, also for linked markers, and also if the true relationship is another one than is tested for by the likelihood ratio. Exact expressions for the expectation and variance are derived for all these cases. Furthermore, we show that the expected likelihood ratio is a non-increasing function of the recombination rate as it increases from 0 to 0.5, when the actual relationship is the one investigated by the LR. Besides being of theoretical interest, exact expressions such as those obtained here can be used for software validation, as they allow one to verify correctness to arbitrary precision. The paper also presents results and advice of practical importance. For example, we argue that the logarithm of the likelihood ratio behaves in a fundamentally different way than the likelihood ratio itself in terms of expectation and variance, in agreement with its interpretation as a weight of evidence. Equipped with the results presented and freely available software, one may check calculations and software and also perform power calculations.

  20. Use of deterministic sampling for exploring likelihoods in linkage analysis for quantitative traits.

    NARCIS (Netherlands)

    Mackinnon, M.J.; Beek, van der S.; Kinghorn, B.P.

    1996-01-01

    Deterministic sampling was used to numerically evaluate the expected log-likelihood surfaces of QTL-marker linkage models in large pedigrees with simple structures. By calculating the expected values of likelihoods, questions of power of experimental designs, bias in parameter estimates, approximate

  1. The phylogenetic likelihood library.

    Science.gov (United States)

    Flouri, T; Izquierdo-Carrasco, F; Darriba, D; Aberer, A J; Nguyen, L-T; Minh, B Q; Von Haeseler, A; Stamatakis, A

    2015-03-01

    We introduce the Phylogenetic Likelihood Library (PLL), a highly optimized application programming interface for developing likelihood-based phylogenetic inference and postanalysis software. The PLL implements appropriate data structures and functions that allow users to quickly implement common, error-prone, and labor-intensive tasks, such as likelihood calculations, model parameter as well as branch length optimization, and tree space exploration. The highly optimized and parallelized implementation of the phylogenetic likelihood function and a thorough documentation provide a framework for rapid development of scalable parallel phylogenetic software. Using two likelihood-based phylogenetic codes as examples, we show that the PLL improves the sequential performance of current software by a factor of 2-10 while requiring only 1 month of programming time for integration. We show that, when numerical scaling for preventing floating point underflow is enabled, the double precision likelihood calculations in the PLL are up to 1.9 times faster than those in BEAGLE. On an empirical DNA dataset with 2000 taxa the AVX version of PLL is 4 times faster than BEAGLE (scaling enabled and required). The PLL is available at http://www.libpll.org under the GNU General Public License (GPL). © The Author(s) 2014. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.
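
    The likelihood function that the PLL optimizes can be illustrated, independently of its API, with Felsenstein's pruning algorithm; the sketch below uses a Jukes-Cantor model on an invented three-taxon tree with a single made-up alignment column, not the PLL's own data structures.

        import numpy as np

        # Felsenstein pruning on a toy rooted tree ((A,B),C) under Jukes-Cantor.
        # Illustrates the likelihood computation that libraries such as the PLL
        # implement in highly optimized form.
        STATES = "ACGT"

        def jc_matrix(t):
            """Jukes-Cantor transition probabilities for branch length t."""
            same = 0.25 + 0.75 * np.exp(-4.0 * t / 3.0)
            diff = 0.25 - 0.25 * np.exp(-4.0 * t / 3.0)
            return np.where(np.eye(4, dtype=bool), same, diff)

        def tip_vector(base):
            v = np.zeros(4)
            v[STATES.index(base)] = 1.0
            return v

        def combine(left, t_left, right, t_right):
            """Conditional likelihoods at a node from its two children."""
            return (jc_matrix(t_left) @ left) * (jc_matrix(t_right) @ right)

        # One alignment column: A, A, G for taxa A, B, C.
        ab = combine(tip_vector("A"), 0.1, tip_vector("A"), 0.2)
        root = combine(ab, 0.05, tip_vector("G"), 0.3)
        site_likelihood = 0.25 * root.sum()   # uniform JC stationary frequencies
        print("site log-likelihood:", np.log(site_likelihood))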

  2. Morphometric Analysis of the Mandible in Subjects with Class III Malocclusion

    Directory of Open Access Journals (Sweden)

    Jin-Yun Pan

    2006-07-01

    Full Text Available This study evaluated the deformations that contribute to Class III mandibular configuration, employing geometric morphometric analysis. Lateral cephalograms of male and female groups of 100 young adults and 70 children with Class III malocclusion were compared to those of counterparts with normal occlusion. The sample included an equal number of both genders. The cephalograms were traced, and 12 homologous landmarks were identified and digitized. Average mandibular geometries were generated by means of Procrustes analysis. Thin-plate spline analysis was then applied to mandibular configurations to determine local form differences in male and female groups of adults and children with normal occlusion and Class III malocclusion. The mandibular morphology was significantly different between these two groups of male and female adults, and children (p < 0.0001). This spline analysis revealed an anteroposterior elongation of the mandible along the condylion-gnathion axis, showing an extension in the regions of the mandibular condyle and ramus, and of the anteroinferior portion of the mandibular symphysis in Class III groups. More extension was evident in Class III adults. The deformations in subjects with Class III malocclusion may represent a developmental elongation of the mandible anteroposteriorly, which leads to the appearance of a prognathic mandibular profile.
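
    Procrustes superimposition of landmark configurations, the step used here before thin-plate spline analysis, can be sketched with SciPy; the two five-landmark configurations below are invented and are not the study's cephalometric data.

        import numpy as np
        from scipy.spatial import procrustes

        # Procrustes superimposition of two hypothetical 2-D landmark
        # configurations: removes location, scale and rotation so that the
        # remaining disparity reflects shape difference only.
        normal = np.array([[0, 0], [4, 0], [4, 2], [2, 3], [0, 2]], dtype=float)
        class3 = np.array([[0, 0], [5, 0], [5, 2], [2.5, 3.2], [0, 2]], dtype=float)

        m1, m2, disparity = procrustes(normal, class3)
        print("shape disparity after superimposition:", round(disparity, 4))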

  3. Likelihood analysis of supersymmetric SU(5) GUTs

    Energy Technology Data Exchange (ETDEWEB)

    Bagnaschi, E.; Weiglein, G. [DESY, Hamburg (Germany); Costa, J.C.; Buchmueller, O.; Citron, M.; Richards, A.; De Vries, K.J. [Imperial College, High Energy Physics Group, Blackett Laboratory, London (United Kingdom); Sakurai, K. [University of Durham, Science Laboratories, Department of Physics, Institute for Particle Physics Phenomenology, Durham (United Kingdom); University of Warsaw, Faculty of Physics, Institute of Theoretical Physics, Warsaw (Poland); Borsato, M.; Chobanova, V.; Lucio, M.; Martinez Santos, D. [Universidade de Santiago de Compostela, Santiago de Compostela (Spain); Cavanaugh, R. [Fermi National Accelerator Laboratory, Batavia, IL (United States); University of Illinois at Chicago, Physics Department, Chicago, IL (United States); Roeck, A. de [CERN, Experimental Physics Department, Geneva (Switzerland); Antwerp University, Wilrijk (Belgium); Dolan, M.J. [University of Melbourne, ARC Centre of Excellence for Particle Physics at the Terascale, School of Physics, Parkville (Australia); Ellis, J.R. [King' s College London, Theoretical Particle Physics and Cosmology Group, Department of Physics, London (United Kingdom); Theoretical Physics Department, CERN, Geneva 23 (Switzerland); Flaecher, H. [University of Bristol, H.H. Wills Physics Laboratory, Bristol (United Kingdom); Heinemeyer, S. [Campus of International Excellence UAM+CSIC, Cantoblanco, Madrid (Spain); Instituto de Fisica Teorica UAM-CSIC, Madrid (Spain); Instituto de Fisica de Cantabria (CSIC-UC), Santander (Spain); Isidori, G. [Universitaet Zuerich, Physik-Institut, Zurich (Switzerland); Olive, K.A. [University of Minnesota, William I. Fine Theoretical Physics Institute, School of Physics and Astronomy, Minneapolis, MN (United States)

    2017-02-15

    We perform a likelihood analysis of the constraints from accelerator experiments and astrophysical observations on supersymmetric (SUSY) models with SU(5) boundary conditions on soft SUSY-breaking parameters at the GUT scale. The parameter space of the models studied has seven parameters: a universal gaugino mass m{sub 1/2}, distinct masses for the scalar partners of matter fermions in five- and ten-dimensional representations of SU(5), m{sub 5} and m{sub 10}, and for the 5 and anti 5 Higgs representations m{sub H{sub u}} and m{sub H{sub d}}, a universal trilinear soft SUSY-breaking parameter A{sub 0}, and the ratio of Higgs vevs tan β. In addition to previous constraints from direct sparticle searches, low-energy and flavour observables, we incorporate constraints based on preliminary results from 13 TeV LHC searches for jets + E{sub T} events and long-lived particles, as well as the latest PandaX-II and LUX searches for direct Dark Matter detection. In addition to previously identified mechanisms for bringing the supersymmetric relic density into the range allowed by cosmology, we identify a novel u{sub R}/c{sub R} - χ{sup 0}{sub 1} coannihilation mechanism that appears in the supersymmetric SU(5) GUT model and discuss the role of ν{sub τ} coannihilation. We find complementarity between the prospects for direct Dark Matter detection and SUSY searches at the LHC. (orig.)

  4. Likelihood analysis of supersymmetric SU(5) GUTs

    Energy Technology Data Exchange (ETDEWEB)

    Bagnaschi, E. [DESY, Hamburg (Germany); Costa, J.C. [Imperial College, London (United Kingdom). Blackett Lab.; Sakurai, K. [Durham Univ. (United Kingdom). Inst. for Particle Physics Phenomonology; Warsaw Univ. (Poland). Inst. of Theoretical Physics; Collaboration: MasterCode Collaboration; and others

    2016-10-15

    We perform a likelihood analysis of the constraints from accelerator experiments and astrophysical observations on supersymmetric (SUSY) models with SU(5) boundary conditions on soft SUSY-breaking parameters at the GUT scale. The parameter space of the models studied has 7 parameters: a universal gaugino mass m{sub 1/2}, distinct masses for the scalar partners of matter fermions in five- and ten-dimensional representations of SU(5), m{sub 5} and m{sub 10}, and for the 5 and anti 5 Higgs representations m{sub H{sub u}} and m{sub H{sub d}}, a universal trilinear soft SUSY-breaking parameter A{sub 0}, and the ratio of Higgs vevs tan β. In addition to previous constraints from direct sparticle searches, low-energy and flavour observables, we incorporate constraints based on preliminary results from 13 TeV LHC searches for jets+E{sub T} events and long-lived particles, as well as the latest PandaX-II and LUX searches for direct Dark Matter detection. In addition to previously-identified mechanisms for bringing the supersymmetric relic density into the range allowed by cosmology, we identify a novel u{sub R}/c{sub R}-χ{sup 0}{sub 1} coannihilation mechanism that appears in the supersymmetric SU(5) GUT model and discuss the role of ν{sub τ} coannihilation. We find complementarity between the prospects for direct Dark Matter detection and SUSY searches at the LHC.

  5. Logic of likelihood

    International Nuclear Information System (INIS)

    Wall, M.J.W.

    1992-01-01

    The notion of "probability" is generalized to that of "likelihood", and a natural logical structure is shown to exist for any physical theory which predicts likelihoods. Two physically based axioms are given for this logical structure to form an orthomodular poset, with an order-determining set of states. The results strengthen the basis of the quantum logic approach to axiomatic quantum theory. 25 refs

  6. Use of COMCAN III in system design and reliability analysis

    International Nuclear Information System (INIS)

    Rasmuson, D.M.; Shepherd, J.C.; Marshall, N.H.; Fitch, L.R.

    1982-03-01

    This manual describes the COMCAN III computer program and its use. COMCAN III is a tool that can be used by the reliability analyst performing a probabilistic risk assessment or by the designer of a system desiring improved performance and efficiency. COMCAN III can be used to determine minimal cut sets of a fault tree, to calculate system reliability characteristics, and to perform qualitative common cause failure analysis
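
    Although the COMCAN III algorithms themselves are not reproduced here, the notion of minimal cut sets can be illustrated generically: expand OR gates as unions of cut sets, AND gates as combinations, then prune non-minimal sets. The fault tree below is invented for the example.

        from itertools import product

        # Minimal cut sets of a toy fault tree by gate expansion. The tree
        # TOP = OR(AND(P1, P2), P3, AND(P1, P3)) is invented for illustration.
        def or_gate(*inputs):
            return [cs for inp in inputs for cs in inp]

        def and_gate(*inputs):
            return [frozenset().union(*combo) for combo in product(*inputs)]

        def minimal(cut_sets):
            """Drop any cut set that is a proper superset of another."""
            unique = set(cut_sets)
            return [cs for cs in unique if not any(other < cs for other in unique)]

        basic = lambda name: [frozenset([name])]
        top = or_gate(and_gate(basic("P1"), basic("P2")),
                      basic("P3"),
                      and_gate(basic("P1"), basic("P3")))
        print(sorted(sorted(cs) for cs in minimal(top)))   # [['P1', 'P2'], ['P3']]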

  7. Rethinking ASME III seismic analysis for piping operability evaluations

    International Nuclear Information System (INIS)

    Adams, T.M.; Stevenson, J.D.

    1994-01-01

    It has been recognized since the mid-1980s that there are very large seismic margins to failure for nuclear piping systems when designed using current industry practice, design criteria, and methods. As a result of this realization there are or have been approximately eighteen initiatives within the ASME Boiler and Pressure Vessel Code, Section III, Division 1, in the form of proposed code cases and proposed code text changes designed to reduce these failure margins to more realistic values. For the most part these initiatives have concentrated on reclassifying seismic inertia stresses in the piping as secondary and increasing the allowable stress limits permitted by Section III of the ASME Boiler and Pressure Vessel Code. This paper focuses on the application of non-linear spectral analysis methods as a way to reduce the computed seismic input demand and thereby the seismic failure margins. The approach is evaluated using the benchmark procedure of the ASME Boiler and Pressure Vessel Code Section III Subgroup on Design, as proposed by the Subgroup's Special Task Group on Integrated Piping Criteria. Using this procedure, the criteria are compared to current code criteria and analysis methods, and to several other of the currently proposed Boiler and Pressure Vessel Code, Section III, changes. Finally, the applicability of non-linear spectral analysis to continued Safe Operation Evaluations is reviewed and discussed

  8. An empirical likelihood ratio test robust to individual heterogeneity for differential expression analysis of RNA-seq.

    Science.gov (United States)

    Xu, Maoqi; Chen, Liang

    2018-01-01

    The individual sample heterogeneity is one of the biggest obstacles in biomarker identification for complex diseases such as cancers. Current statistical models to identify differentially expressed genes between disease and control groups often overlook the substantial human sample heterogeneity. Meanwhile, traditional nonparametric tests lose detailed data information and sacrifice the analysis power, although they are distribution free and robust to heterogeneity. Here, we propose an empirical likelihood ratio test with a mean-variance relationship constraint (ELTSeq) for the differential expression analysis of RNA sequencing (RNA-seq). As a distribution-free nonparametric model, ELTSeq handles individual heterogeneity by estimating an empirical probability for each observation without making any assumption about read-count distribution. It also incorporates a constraint for the read-count overdispersion, which is widely observed in RNA-seq data. ELTSeq demonstrates a significant improvement over existing methods such as edgeR, DESeq, t-tests, Wilcoxon tests and the classic empirical likelihood-ratio test when handling heterogeneous groups. It will significantly advance the transcriptomics studies of cancers and other complex diseases. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
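
    ELTSeq itself adds a mean-variance (overdispersion) constraint, but its backbone, the empirical likelihood ratio test, can be sketched for the simplest case of a mean hypothesis (Owen's construction), with simulated skewed data standing in for read counts:

        import numpy as np
        from scipy.optimize import brentq
        from scipy.stats import chi2

        # Empirical likelihood ratio test of H0: E[X] = mu0 (Owen). Weights
        # p_i = 1 / (n * (1 + lam * (x_i - mu0))) maximize the empirical
        # likelihood under the mean constraint; -2 log R ~ chi-square(1).
        def el_test(x, mu0):
            d = x - mu0
            n = len(x)
            lo = (1.0 / n - 1.0) / d.max() + 1e-8   # keep all weights in (0, 1)
            hi = (1.0 / n - 1.0) / d.min() - 1e-8
            score = lambda lam: np.sum(d / (1.0 + lam * d))
            lam = brentq(score, lo, hi)             # score is monotone in lam
            stat = 2.0 * np.sum(np.log1p(lam * d))
            return stat, chi2.sf(stat, df=1)

        rng = np.random.default_rng(0)
        x = rng.gamma(shape=2.0, scale=1.5, size=60)   # skewed, like read counts
        print(el_test(x, mu0=3.0))   # true mean is 3.0: large p-value expected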

  9. Extended likelihood inference in reliability

    International Nuclear Information System (INIS)

    Martz, H.F. Jr.; Beckman, R.J.; Waller, R.A.

    1978-10-01

    Extended likelihood methods of inference are developed in which subjective information in the form of a prior distribution is combined with sampling results by means of an extended likelihood function. The extended likelihood function is standardized for use in obtaining extended likelihood intervals. Extended likelihood intervals are derived for the mean of a normal distribution with known variance, the failure-rate of an exponential distribution, and the parameter of a binomial distribution. Extended second-order likelihood methods are developed and used to solve several prediction problems associated with the exponential and binomial distributions. In particular, such quantities as the next failure-time, the number of failures in a given time period, and the time required to observe a given number of failures are predicted for the exponential model with a gamma prior distribution on the failure-rate. In addition, six types of life testing experiments are considered. For the binomial model with a beta prior distribution on the probability of nonsurvival, methods are obtained for predicting the number of nonsurvivors in a given sample size and for predicting the required sample size for observing a specified number of nonsurvivors. Examples illustrate each of the methods developed. Finally, comparisons are made with Bayesian intervals in those cases where these are known to exist

  10. Statistical modelling of survival data with random effects: h-likelihood approach

    CERN Document Server

    Ha, Il Do; Lee, Youngjo

    2017-01-01

    This book provides a groundbreaking introduction to the likelihood inference for correlated survival data via the hierarchical (or h-) likelihood in order to obtain the (marginal) likelihood and to address the computational difficulties in inferences and extensions. The approach presented in the book overcomes shortcomings in the traditional likelihood-based methods for clustered survival data such as intractable integration. The text includes technical materials such as derivations and proofs in each chapter, as well as recently developed software programs in R (“frailtyHL”), while the real-world data examples together with an R package, “frailtyHL” in CRAN, provide readers with useful hands-on tools. Reviewing new developments since the introduction of the h-likelihood to survival analysis (methods for interval estimation of the individual frailty and for variable selection of the fixed effects in the general class of frailty models) and guiding future directions, the book is of interest to research...

  11. An Activation Likelihood Estimation Meta-Analysis Study of Simple Motor Movements in Older and Young Adults

    Science.gov (United States)

    Turesky, Ted K.; Turkeltaub, Peter E.; Eden, Guinevere F.

    2016-01-01

    The functional neuroanatomy of finger movements has been characterized with neuroimaging in young adults. However, less is known about the aging motor system. Several studies have contrasted movement-related activity in older versus young adults, but there is inconsistency among their findings. To address this, we conducted an activation likelihood estimation (ALE) meta-analysis on within-group data from older adults and young adults performing regularly paced right-hand finger movement tasks in response to external stimuli. We hypothesized that older adults would show a greater likelihood of activation in right cortical motor areas (i.e., ipsilateral to the side of movement) compared to young adults. ALE maps were examined for conjunction and between-group differences. Older adults showed overlapping likelihoods of activation with young adults in left primary sensorimotor cortex (SM1), bilateral supplementary motor area, bilateral insula, left thalamus, and right anterior cerebellum. Their ALE map differed from that of the young adults in right SM1 (extending into dorsal premotor cortex), right supramarginal gyrus, medial premotor cortex, and right posterior cerebellum. The finding that older adults uniquely use ipsilateral regions for right-hand finger movements and show age-dependent modulations in regions recruited by both age groups provides a foundation by which to understand age-related motor decline and motor disorders. PMID:27799910

  12. PROCOV: maximum likelihood estimation of protein phylogeny under covarion models and site-specific covarion pattern analysis

    Directory of Open Access Journals (Sweden)

    Wang Huai-Chun

    2009-09-01

    Full Text Available Abstract Background The covarion hypothesis of molecular evolution holds that selective pressures on a given amino acid or nucleotide site are dependent on the identity of other sites in the molecule that change throughout time, resulting in changes of evolutionary rates of sites along the branches of a phylogenetic tree. At the sequence level, covarion-like evolution at a site manifests as conservation of nucleotide or amino acid states among some homologs where the states are not conserved in other homologs (or groups of homologs). Covarion-like evolution has been shown to relate to changes in functions at sites in different clades, and, if ignored, can adversely affect the accuracy of phylogenetic inference. Results PROCOV (protein covarion analysis) is a software tool that implements a number of previously proposed covarion models of protein evolution for phylogenetic inference in a maximum likelihood framework. Several algorithmic and implementation improvements in this tool over previous versions make computationally expensive tree searches with covarion models more efficient and analyses of large phylogenomic data sets tractable. PROCOV can be used to identify covarion sites by comparing the site likelihoods under the covarion process to the corresponding site likelihoods under a rates-across-sites (RAS) process. Those sites with the greatest log-likelihood difference between a 'covarion' and an RAS process were found to be of functional or structural significance in a dataset of bacterial and eukaryotic elongation factors. Conclusion Covarion models implemented in PROCOV may be especially useful for phylogenetic estimation when ancient divergences between sequences have occurred and rates of evolution at sites are likely to have changed over the tree. It can also be used to study lineage-specific functional shifts in protein families that result in changes in the patterns of site variability among subtrees.

  13. Parametric Sensitivity Analysis of the WAVEWATCH III Model

    Directory of Open Access Journals (Sweden)

    Beng-Chun Lee

    2009-01-01

    Full Text Available The parameters in numerical wave models need to be calibrated before a model can be applied to a specific region. In this study, we selected the 8 most important parameters from the source term of the WAVEWATCH III model and subjected them to sensitivity analysis to evaluate how sensitive the WAVEWATCH III model is to each of them, to determine how many of these parameters should be considered for further discussion, and to rank the significance of each parameter. After ranking each parameter by sensitivity and assessing their cumulative impact, we adopted the ARS method to search for the optimal values of those parameters to which the WAVEWATCH III model is most sensitive, by comparing modeling results with observed data at two data buoys off the coast of northeastern Taiwan; the goal being to find optimal parameter values for improved modeling of wave development. The procedure adopting optimal parameters in wave simulations did improve the accuracy of the WAVEWATCH III model in comparison to default runs based on field observations at the two buoys.

  14. Maximum likelihood-based analysis of single-molecule photon arrival trajectories

    Science.gov (United States)

    Hajdziona, Marta; Molski, Andrzej

    2011-02-01

    In this work we explore the statistical properties of the maximum likelihood-based analysis of one-color photon arrival trajectories. This approach does not involve binning and, therefore, all of the information contained in an observed photon trajectory is used. We study the accuracy and precision of parameter estimates and the efficiency of the Akaike information criterion and the Bayesian information criterion (BIC) in selecting the true kinetic model. We focus on the low excitation regime where photon trajectories can be modeled as realizations of Markov modulated Poisson processes. The number of observed photons is the key parameter in determining model selection and parameter estimation. For example, the BIC can select the true three-state model from competing two-, three-, and four-state kinetic models even for relatively short trajectories made up of 2 × 10³ photons. When the intensity levels are well-separated and 10⁴ photons are observed, the two-state model parameters can be estimated with about 10% precision and those for a three-state model with about 20% precision.
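
    The role of the BIC in this kind of model selection can be illustrated on a simplified stand-in for a photon trajectory: interarrival times drawn from a two-state emitter are fit by a one-rate exponential model and by a two-component exponential mixture (via EM), and the BIC-penalized likelihoods are compared. Rates, weights and trajectory length below are invented; this is not the Markov modulated Poisson process machinery of the paper.

        import numpy as np

        # Toy likelihood-based model selection for photon arrival data.
        rng = np.random.default_rng(3)
        n = 2000
        bright = rng.random(n) < 0.4
        dt = np.where(bright, rng.exponential(1 / 5.0, n),
                              rng.exponential(1 / 0.5, n))

        # One-state fit: the MLE rate of an exponential is 1 / mean.
        rate1 = 1.0 / dt.mean()
        ll1 = np.sum(np.log(rate1) - rate1 * dt)
        bic1 = -2 * ll1 + 1 * np.log(n)

        # Two-state fit by EM on an exponential mixture (weight w, rates r).
        w, r = 0.5, np.array([1.0, 2.0])
        for _ in range(200):
            dens = np.vstack([w * r[0] * np.exp(-r[0] * dt),
                              (1 - w) * r[1] * np.exp(-r[1] * dt)])
            resp = dens / dens.sum(axis=0)          # responsibilities (E step)
            w = resp[0].mean()                      # M step: weight and rates
            r = resp.sum(axis=1) / (resp * dt).sum(axis=1)
        dens = np.vstack([w * r[0] * np.exp(-r[0] * dt),
                          (1 - w) * r[1] * np.exp(-r[1] * dt)])
        ll2 = np.sum(np.log(dens.sum(axis=0)))
        bic2 = -2 * ll2 + 3 * np.log(n)             # 2 rates + 1 weight
        print("BIC 1-state:", round(bic1, 1), " BIC 2-state:", round(bic2, 1))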

  16. Likelihood inference for a nonstationary fractional autoregressive model

    DEFF Research Database (Denmark)

    Johansen, Søren; Ørregård Nielsen, Morten

    2010-01-01

    This paper discusses model-based inference in an autoregressive model for fractional processes which allows the process to be fractional of order d or d-b. Fractional differencing involves infinitely many past values and because we are interested in nonstationary processes we model the data X_{1},...,X_{T} given the initial values X_{-n}, n=0,1,..., as is usually done. The initial values are not modeled but assumed to be bounded. This represents a considerable generalization relative to all previous work where it is assumed that initial values are zero. For the statistical analysis we assume...... the conditional Gaussian likelihood and for the probability analysis we also condition on initial values but assume that the errors in the autoregressive model are i.i.d. with suitable moment conditions. We analyze the conditional likelihood and its derivatives as stochastic processes in the parameters, including...

  17. MXLKID: a maximum likelihood parameter identifier

    International Nuclear Information System (INIS)

    Gavel, D.T.

    1980-07-01

    MXLKID (MaXimum LiKelihood IDentifier) is a computer program designed to identify unknown parameters in a nonlinear dynamic system. Using noisy measurement data from the system, the maximum likelihood identifier computes a likelihood function (LF). Identification of system parameters is accomplished by maximizing the LF with respect to the parameters. The main body of this report briefly summarizes the maximum likelihood technique and gives instructions and examples for running the MXLKID program. MXLKID is implemented in LRLTRAN on the CDC7600 computer at LLNL. A detailed mathematical description of the algorithm is given in the appendices. 24 figures, 6 tables
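
    The general idea, maximizing a likelihood function over unknown parameters of a dynamic system, can be sketched without the MXLKID code itself; here a decay rate is recovered from noisy measurements of x' = -kx under a Gaussian noise assumption, with all numbers invented.

        import numpy as np
        from scipy.optimize import minimize_scalar

        # Maximum likelihood parameter identification for a simple dynamic
        # system: estimate the decay rate k from noisy observations.
        rng = np.random.default_rng(7)
        t = np.linspace(0.0, 5.0, 40)
        k_true, sigma = 0.8, 0.05
        y = np.exp(-k_true * t) + rng.normal(0.0, sigma, t.size)

        def neg_log_likelihood(k):
            resid = y - np.exp(-k * t)
            # Gaussian log-likelihood up to an additive constant.
            return 0.5 * np.sum(resid ** 2) / sigma ** 2

        fit = minimize_scalar(neg_log_likelihood, bounds=(0.01, 5.0),
                              method="bounded")
        print("estimated k:", round(fit.x, 3), "(true 0.8)")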

  18. Synthesis and physicochemical analysis of Sm (II, III) acetylacetone chelate complexes

    International Nuclear Information System (INIS)

    Kostyuk, N.N.; Dik, T.A.; Trebnikov, A.G.

    2004-01-01

    Sm (II, III) acetylacetone chelate complexes were synthesized by an electrochemical method. It was shown that anodic dissolution of samarium metal in acetylacetone leads to formation of the Sm (II, III) chelate complexes: xSm(acac)2 · ySm(acac)3 · zH(acac). The factors x, y and z depend on the quantity of electricity that flowed through the electrolysis cell. The compositions of the obtained substances were confirmed by physicochemical analysis (ultimate analysis, IR and mass spectroscopy, and thermal analysis (thermogravimetry, isothermal warming-up and differential scanning calorimetry)). (Authors)

  19. Attitude towards, and likelihood of, complaining in the banking ...

    African Journals Online (AJOL)

    aims to determine customers' attitudes towards complaining as well as their likelihood of voicing a .... is particularly powerful and impacts greatly on customer satisfaction and retention. ...... 'Cross-national analysis of hotel customers' attitudes ...

  20. Likelihood devices in spatial statistics

    NARCIS (Netherlands)

    Zwet, E.W. van

    1999-01-01

    One of the main themes of this thesis is the application to spatial data of modern semi- and nonparametric methods. Another, closely related theme is maximum likelihood estimation from spatial data. Maximum likelihood estimation is not common practice in spatial statistics. The method of moments

  1. Massive optimal data compression and density estimation for scalable, likelihood-free inference in cosmology

    Science.gov (United States)

    Alsing, Justin; Wandelt, Benjamin; Feeney, Stephen

    2018-03-01

    Many statistical models in cosmology can be simulated forwards but have intractable likelihood functions. Likelihood-free inference methods allow us to perform Bayesian inference from these models using only forward simulations, free from any likelihood assumptions or approximations. Likelihood-free inference generically involves simulating mock data and comparing to the observed data; this comparison in data-space suffers from the curse of dimensionality and requires compression of the data to a small number of summary statistics to be tractable. In this paper we use massive asymptotically-optimal data compression to reduce the dimensionality of the data-space to just one number per parameter, providing a natural and optimal framework for summary statistic choice for likelihood-free inference. Secondly, we present the first cosmological application of Density Estimation Likelihood-Free Inference (DELFI), which learns a parameterized model for the joint distribution of data and parameters, yielding both the parameter posterior and the model evidence. This approach is conceptually simple, requires less tuning than traditional Approximate Bayesian Computation approaches to likelihood-free inference and can give high-fidelity posteriors from orders of magnitude fewer forward simulations. As an additional bonus, it enables parameter inference and Bayesian model comparison simultaneously. We demonstrate Density Estimation Likelihood-Free Inference with massive data compression on an analysis of the joint light-curve analysis supernova data, as a simple validation case study. We show that high-fidelity posterior inference is possible for full-scale cosmological data analyses with as few as ∼10⁴ simulations, with substantial scope for further improvement, demonstrating the scalability of likelihood-free inference to large and complex cosmological datasets.
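
    The compression step can be illustrated on a linear-Gaussian toy model, where a score-like summary t = (dmu/dtheta)^T C^{-1} (d - mu) reduces the data vector to one number per parameter and, in this special case, is equivalent to the maximum likelihood estimate. The model and numbers below are invented, not the supernova analysis of the paper.

        import numpy as np

        # Score compression of a data vector to one summary per parameter:
        # for a Gaussian model with mean mu(theta) and fixed covariance C,
        # t = dmu/dtheta^T C^{-1} (d - mu) is a (locally) sufficient summary.
        # Toy model: mu_i(theta) = theta * x_i.
        rng = np.random.default_rng(5)
        x = np.linspace(0.0, 1.0, 50)
        C = 0.1 * np.eye(50)                    # known noise covariance
        theta_fid = 1.0                         # fiducial expansion point
        d = theta_fid * x + rng.multivariate_normal(np.zeros(50), C)

        dmu = x                                 # dmu/dtheta for the linear model
        Cinv = np.linalg.inv(C)
        t = dmu @ Cinv @ (d - theta_fid * x)    # one number summarizes 50 points

        # For this linear model the compression is lossless: t is an affine
        # function of the maximum likelihood estimate of theta.
        theta_mle = (dmu @ Cinv @ d) / (dmu @ Cinv @ dmu)
        print("summary t:", round(t, 3), " MLE:", round(theta_mle, 3))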

  2. Parameter estimation in astronomy through application of the likelihood ratio. [satellite data analysis techniques

    Science.gov (United States)

    Cash, W.

    1979-01-01

    Many problems in the experimental estimation of parameters for models can be solved through use of the likelihood ratio test. Applications of the likelihood ratio, with particular attention to photon counting experiments, are discussed. The procedures presented solve a greater range of problems than those currently in use, yet are no more difficult to apply. The procedures are proved analytically, and examples from current problems in astronomy are discussed.

  3. The fine-tuning cost of the likelihood in SUSY models

    International Nuclear Information System (INIS)

    Ghilencea, D.M.; Ross, G.G.

    2013-01-01

    In SUSY models, the fine-tuning of the electroweak (EW) scale with respect to their parameters γ{sub i} = {m{sub 0}, m{sub 1/2}, μ{sub 0}, A{sub 0}, B{sub 0}, ...} and the maximal likelihood L to fit the experimental data are usually regarded as two different problems. We show that, if one regards the EW minimum conditions as constraints that fix the EW scale, this commonly held view is not correct and that the likelihood contains all the information about fine-tuning. In this case we show that the corrected likelihood is equal to the ratio L/Δ of the usual likelihood L and the traditional fine-tuning measure Δ of the EW scale. A similar result is obtained for the integrated likelihood over the set {γ{sub i}}, which can be written as a surface integral of the ratio L/Δ, with the surface in γ{sub i} space determined by the EW minimum constraints. As a result, a large likelihood actually demands a large ratio L/Δ or, equivalently, a small χ{sup 2}{sub new} = χ{sup 2}{sub old} + 2lnΔ. This shows the fine-tuning cost to the likelihood (χ{sup 2}{sub new}) of the EW scale stability enforced by SUSY, which is ignored in data fits. A good χ{sup 2}{sub new}/d.o.f. ≈ 1 thus demands SUSY models have a fine-tuning amount Δ ≪ exp(d.o.f./2), which provides a model-independent criterion for acceptable fine-tuning. If this criterion is not met, one can thus rule out SUSY models without a further χ{sup 2}/d.o.f. analysis. Numerical methods to fit the data can easily be adapted to account for this effect.
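
    A back-of-envelope illustration of the corrected statistic and the acceptability criterion, with invented numbers:

        import math

        # Illustrative numbers only: the fine-tuning correction adds 2 ln(Delta)
        # to the usual chi-square, and acceptability compares Delta with
        # exp(d.o.f. / 2).
        chi2_old, delta, dof = 90.0, 500.0, 100
        chi2_new = chi2_old + 2.0 * math.log(delta)
        print("chi2_new/d.o.f.:", round(chi2_new / dof, 2))
        print("fine-tuning acceptable:", delta < math.exp(dof / 2.0))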

  4. Biomechanical analysis of effects of neuromusculoskeletal training for older adults on the likelihood of slip-induced falls.

    OpenAIRE

    Kim, Sukwon

    2006-01-01

    Overview of the Study. Title: Biomechanical Analysis for Effects of Neuromusculoskeletal Training for Older Adults on Outcomes of Slip-induced Falls. Research Objectives: The objective of this study was to evaluate whether neuromusculoskeletal training (i.e., weight and balance training) for older adults could reduce the likelihood of slip-induced fall accidents. The study focused on evaluating biomechanics among the elderly at pre- and post-training stages during processes associated w...

  5. Modeling gene expression measurement error: a quasi-likelihood approach

    Directory of Open Access Journals (Sweden)

    Strimmer Korbinian

    2003-03-01

    Full Text Available Abstract Background Using suitable error models for gene expression measurements is essential in the statistical analysis of microarray data. However, the true probabilistic model underlying gene expression intensity readings is generally not known. Instead, in currently used approaches some simple parametric model is assumed (usually a transformed normal distribution) or the empirical distribution is estimated. However, both these strategies may not be optimal for gene expression data, as the non-parametric approach ignores known structural information whereas the fully parametric models run the risk of misspecification. A further related problem is the choice of a suitable scale for the model (e.g. observed vs. log-scale). Results Here a simple semi-parametric model for gene expression measurement error is presented. In this approach inference is based on an approximate likelihood function (the extended quasi-likelihood). Only partial knowledge about the unknown true distribution is required to construct this function. In case of gene expression this information is available in the form of the postulated (e.g. quadratic) variance structure of the data. As the quasi-likelihood behaves (almost) like a proper likelihood, it allows for the estimation of calibration and variance parameters, and it is also straightforward to obtain corresponding approximate confidence intervals. Unlike most other frameworks, it also allows analysis on any preferred scale, i.e. both on the original linear scale as well as on a transformed scale. It can also be employed in regression approaches to model systematic (e.g. array or dye) effects. Conclusions The quasi-likelihood framework provides a simple and versatile approach to analyze gene expression data that does not make any strong distributional assumptions about the underlying error model. For several simulated as well as real data sets it provides a better fit to the data than competing models. In an example it also

  6. Comparison of least-squares vs. maximum likelihood estimation for standard spectrum technique of β−γ coincidence spectrum analysis

    International Nuclear Information System (INIS)

    Lowrey, Justin D.; Biegalski, Steven R.F.

    2012-01-01

    The spectrum deconvolution analysis tool (SDAT) software code was written and tested at The University of Texas at Austin utilizing the standard spectrum technique to determine activity levels of Xe-131m, Xe-133m, Xe-133, and Xe-135 in β–γ coincidence spectra. SDAT was originally written to utilize the method of least-squares to calculate the activity of each radionuclide component in the spectrum. Recently, maximum likelihood estimation was also incorporated into the SDAT tool. This is a robust statistical technique to determine the parameters that maximize the Poisson distribution likelihood function of the sample data. In this case it is used to parameterize the activity level of each of the radioxenon components in the spectra. A new test dataset was constructed utilizing Xe-131m placed on a Xe-133 background to compare the robustness of the least-squares and maximum likelihood estimation methods for low counting statistics data. The Xe-131m spectra were collected independently from the Xe-133 spectra and added to generate the spectra in the test dataset. The true independent counts of Xe-131m and Xe-133 are known, as they were calculated before the spectra were added together. Spectra with both high and low counting statistics are analyzed. Studies are also performed by analyzing only the 30 keV X-ray region of the β–γ coincidence spectra. Results show that maximum likelihood estimation slightly outperforms least-squares for low counting statistics data.
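
    The two estimators compared in the paper can be sketched on a toy spectrum: given known component shapes, least squares minimizes the squared residual while the Poisson MLE maximizes the counting-statistics likelihood, and the difference matters most at low counts. The two synthetic 'standard spectra' and activities below are invented, not SDAT's radioxenon library.

        import numpy as np
        from scipy.optimize import nnls, minimize

        # Toy comparison of least squares vs Poisson maximum likelihood for
        # estimating component activities from a summed spectrum.
        rng = np.random.default_rng(11)
        bins = np.arange(64)
        comp = np.vstack([np.exp(-0.5 * ((bins - 20) / 4.0) ** 2),
                          np.exp(-0.5 * ((bins - 35) / 6.0) ** 2)])
        comp /= comp.sum(axis=1, keepdims=True)        # unit-area standard spectra
        true_act = np.array([40.0, 15.0])              # low counting statistics
        observed = rng.poisson(true_act @ comp)

        act_ls, _ = nnls(comp.T, observed.astype(float))   # least-squares fit

        def neg_poisson_ll(act):                       # Poisson MLE objective
            mu = np.clip(act @ comp, 1e-12, None)
            return np.sum(mu - observed * np.log(mu))

        act_ml = minimize(neg_poisson_ll, x0=act_ls + 1.0,
                          bounds=[(0, None)] * 2).x
        print("LS:", act_ls.round(1), " MLE:", act_ml.round(1),
              " true:", true_act)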

  7. Comparative genomic analysis of the WRKY III gene family in populus, grape, arabidopsis and rice.

    Science.gov (United States)

    Wang, Yiyi; Feng, Lin; Zhu, Yuxin; Li, Yuan; Yan, Hanwei; Xiang, Yan

    2015-09-08

    WRKY III genes have significant functions in regulating plant development and resistance. The WRKY gene family has been studied in many plant species; however, a comprehensive analysis of WRKY III genes is still lacking for the woody species poplar. Three representative lineages of flowering plants are incorporated in most analyses: Arabidopsis (a model plant for annual herbaceous dicots), grape (a model plant for perennial dicots) and Oryza sativa (a model plant for monocots). In this study, we identified 10, 6, 13 and 28 WRKY III genes in the genomes of Populus trichocarpa, grape (Vitis vinifera), Arabidopsis thaliana and rice (Oryza sativa), respectively. Phylogenetic analysis revealed that the WRKY III proteins could be divided into four clades. By microsynteny analysis, we found that the duplicated regions were more conserved between poplar and grape than between poplar and Arabidopsis or rice. We dated their duplications by Ks analysis of Populus WRKY III genes and demonstrated that all the blocks were formed after the divergence of monocots and dicots. Strong purifying selection has played a key role in the maintenance of WRKY III genes in Populus. Tissue expression analysis of the WRKY III genes in Populus revealed that five were most highly expressed in the xylem. We also performed quantitative real-time reverse transcription PCR analysis of WRKY III genes in Populus treated with salicylic acid, abscisic acid and polyethylene glycol to explore their stress-related expression patterns. This study highlighted the duplication and diversification of the WRKY III gene family in Populus and provided a comprehensive analysis of this gene family in the Populus genome. Our results indicated that the majority of WRKY III genes of Populus were expanded by large-scale gene duplication. The expression patterns of the PtrWRKYIII genes indicate that these genes play important roles in the xylem during poplar growth and development, and may play a crucial role in the defense against drought

  8. Confirmatory Factor Analysis of the WISC-III with Child Psychiatric Inpatients.

    Science.gov (United States)

    Tupa, David J.; Wright, Margaret O'Dougherty; Fristad, Mary A.

    1997-01-01

    Factor models of the Wechsler Intelligence Scale for Children-Third Edition (WISC-III) for one, two, three, and four factors were tested using confirmatory factor analysis with a sample of 177 child psychiatric inpatients. The four-factor model proposed in the WISC-III manual provided the best fit to the data. (SLD)

  9. Maximum likelihood versus likelihood-free quantum system identification in the atom maser

    International Nuclear Information System (INIS)

    Catana, Catalin; Kypraios, Theodore; Guţă, Mădălin

    2014-01-01

    We consider the problem of estimating a dynamical parameter of a Markovian quantum open system (the atom maser), by performing continuous time measurements in the system's output (outgoing atoms). Two estimation methods are investigated and compared. Firstly, the maximum likelihood estimator (MLE) takes into account the full measurement data and is asymptotically optimal in terms of its mean square error. Secondly, the ‘likelihood-free’ method of approximate Bayesian computation (ABC) produces an approximation of the posterior distribution for a given set of summary statistics, by sampling trajectories at different parameter values and comparing them with the measurement data via chosen statistics. Building on previous results which showed that atom counts are poor statistics for certain values of the Rabi angle, we apply MLE to the full measurement data and estimate its Fisher information. We then select several correlation statistics such as waiting times, distribution of successive identical detections, and use them as input of the ABC algorithm. The resulting posterior distribution follows closely the data likelihood, showing that the selected statistics capture ‘most’ statistical information about the Rabi angle. (paper)
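
    The 'likelihood-free' side of this comparison can be illustrated with the simplest rejection-ABC scheme, here applied to a generic Poisson counting model rather than the atom maser: draw parameters from the prior, simulate, and keep the draws whose summary statistic lands close to the observed one.

        import numpy as np

        # Minimal rejection-ABC sketch: infer the rate of a Poisson counting
        # process from a summary statistic alone. Prior range, tolerance and
        # sample sizes are illustrative.
        rng = np.random.default_rng(2)
        observed = rng.poisson(lam=4.2, size=100)      # "measurement record"
        s_obs = observed.mean()                        # chosen summary statistic

        draws = rng.uniform(0.1, 10.0, size=20000)     # prior samples for the rate
        sims = rng.poisson(lam=draws[:, None], size=(draws.size, 100))
        accepted = draws[np.abs(sims.mean(axis=1) - s_obs) < 0.1]
        print("ABC posterior mean:", accepted.mean().round(2),
              "| accepted draws:", accepted.size)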

  10. Obtaining reliable Likelihood Ratio tests from simulated likelihood functions

    DEFF Research Database (Denmark)

    Andersen, Laura Mørch

    It is standard practice by researchers and the default option in many statistical programs to base test statistics for mixed models on simulations using asymmetric draws (e.g. Halton draws). This paper shows that when the estimated likelihood functions depend on standard deviations of mixed param...

  11. Factors Associated with Young Adults’ Pregnancy Likelihood

    Science.gov (United States)

    Kitsantas, Panagiota; Lindley, Lisa L.; Wu, Huichuan

    2014-01-01

    OBJECTIVES While progress has been made to reduce adolescent pregnancies in the United States, rates of unplanned pregnancy among young adults (18–29 years) remain high. In this study, we assessed factors associated with perceived likelihood of pregnancy (likelihood of getting pregnant/getting partner pregnant in the next year) among sexually experienced young adults who were not trying to get pregnant and had ever used contraceptives. METHODS We conducted a secondary analysis of 660 young adults, 18–29 years old in the United States, from the cross-sectional National Survey of Reproductive and Contraceptive Knowledge. Logistic regression and classification tree analyses were conducted to generate profiles of young adults most likely to report anticipating a pregnancy in the next year. RESULTS Nearly one-third (32%) of young adults indicated they believed they had at least some likelihood of becoming pregnant in the next year. Young adults who believed that avoiding pregnancy was not very important were most likely to report pregnancy likelihood (odds ratio [OR], 5.21; 95% CI, 2.80–9.69), as were young adults for whom avoiding a pregnancy was important but not satisfied with their current contraceptive method (OR, 3.93; 95% CI, 1.67–9.24), attended religious services frequently (OR, 3.0; 95% CI, 1.52–5.94), were uninsured (OR, 2.63; 95% CI, 1.31–5.26), and were likely to have unprotected sex in the next three months (OR, 1.77; 95% CI, 1.04–3.01). DISCUSSION These results may help guide future research and the development of pregnancy prevention interventions targeting sexually experienced young adults. PMID:25782849

  12. Maximum likelihood fitting of FROC curves under an initial-detection-and-candidate-analysis model

    International Nuclear Information System (INIS)

    Edwards, Darrin C.; Kupinski, Matthew A.; Metz, Charles E.; Nishikawa, Robert M.

    2002-01-01

    We have developed a model for FROC curve fitting that relates the observer's FROC performance not to the ROC performance that would be obtained if the observer's responses were scored on a per image basis, but rather to a hypothesized ROC performance that the observer would obtain in the task of classifying a set of 'candidate detections' as positive or negative. We adopt the assumptions of the Bunch FROC model, namely that the observer's detections are all mutually independent, as well as assumptions qualitatively similar to, but different in nature from, those made by Chakraborty in his AFROC scoring methodology. Under the assumptions of our model, we show that the observer's FROC performance is a linearly scaled version of the candidate analysis ROC curve, where the scaling factors are just given by the FROC operating point coordinates for detecting initial candidates. Further, we show that the likelihood function of the model parameters given observational data takes on a simple form, and we develop a maximum likelihood method for fitting a FROC curve to this data. FROC and AFROC curves are produced for computer vision observer datasets and compared with the results of the AFROC scoring method. Although developed primarily with computer vision schemes in mind, we hope that the methodology presented here will prove worthy of further study in other applications as well

  13. Complexes of lanthanum(III), cerium(III), samarium(III) and dysprosium(III) with substituted piperidines

    Energy Technology Data Exchange (ETDEWEB)

    Manhas, B S; Trikha, A K; Singh, H; Chander, M

    1983-11-01

    Complexes of the general formulae M{sub 2}Cl{sub 6}(L){sub 3}·C{sub 2}H{sub 5}OH and M{sub 2}(NO{sub 3}){sub 6}(L){sub 2}·CH{sub 3}OH have been synthesised by the reactions of chlorides and nitrates of La(III), Ce(III), Sm(III) and Dy(III) with 2-methylpiperidine, 3-methylpiperidine and 4-methylpiperidine. These complexes have been characterised on the basis of their elemental analysis, and IR and electronic reflectance spectra. IR spectral data indicate the presence of coordinated ethanol and methanol molecules and bidentate nitrate groups. Coordination numbers of the metal ions vary from 5 to 8. 19 refs.

  14. Climate reconstruction analysis using coexistence likelihood estimation (CRACLE): a method for the estimation of climate using vegetation.

    Science.gov (United States)

    Harbert, Robert S; Nixon, Kevin C

    2015-08-01

    • Plant distributions have long been understood to be correlated with the environmental conditions to which species are adapted. Climate is one of the major components driving species distributions. Therefore, it is expected that the plants coexisting in a community are reflective of the local environment, particularly climate.• Presented here is a method for the estimation of climate from local plant species coexistence data. The method, Climate Reconstruction Analysis using Coexistence Likelihood Estimation (CRACLE), is a likelihood-based method that employs specimen collection data at a global scale for the inference of species climate tolerance. CRACLE calculates the maximum joint likelihood of coexistence given individual species climate tolerance characterization to estimate the expected climate.• Plant distribution data for more than 4000 species were used to show that this method accurately infers expected climate profiles for 165 sites with diverse climatic conditions. Estimates differ from the WorldClim global climate model by less than 1.5°C on average for mean annual temperature and less than ∼250 mm for mean annual precipitation. This is a significant improvement upon other plant-based climate-proxy methods.• CRACLE validates long hypothesized interactions between climate and local associations of plant species. Furthermore, CRACLE successfully estimates climate that is consistent with the widely used WorldClim model and therefore may be applied to the quantitative estimation of paleoclimate in future studies. © 2015 Botanical Society of America, Inc.
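
    A minimal sketch of the CRACLE idea, with simulated occurrence data: characterize each species' climate tolerance with a kernel density estimate over the climates at its occurrences, then take the climate value that maximizes the joint (product) likelihood across the coexisting species. The species tolerances below are invented, not real herbarium records.

        import numpy as np
        from scipy.stats import gaussian_kde

        # CRACLE-style estimate: joint likelihood of coexistence over a
        # climate grid, from per-species kernel densities.
        rng = np.random.default_rng(4)
        # Mean-annual-temperature values (deg C) at occurrences of 3 species:
        occurrences = [rng.normal(12, 4, 300),
                       rng.normal(15, 3, 300),
                       rng.normal(10, 5, 300)]
        tolerances = [gaussian_kde(o) for o in occurrences]

        grid = np.linspace(-5, 35, 801)
        joint_loglik = sum(np.log(kde(grid)) for kde in tolerances)
        print("estimated site climate (deg C):",
              grid[np.argmax(joint_loglik)].round(1))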

  15. Pain anticipation: an activation likelihood estimation meta-analysis of brain imaging studies.

    Science.gov (United States)

    Palermo, Sara; Benedetti, Fabrizio; Costa, Tommaso; Amanzio, Martina

    2015-05-01

    The anticipation of pain has been investigated in a variety of brain imaging studies. Importantly, today there is no clear overall picture of the areas that are involved in different studies, and the exact role of these regions in pain expectation remains largely unexplored. To address this issue, we used activation likelihood estimation meta-analysis to analyze pain anticipation in several neuroimaging studies. A total of 19 functional magnetic resonance imaging studies were included in the analysis to search for the cortical areas involved in pain anticipation in human experimental models. During anticipation, activated foci were found in the dorsolateral prefrontal, midcingulate and anterior insula cortices, medial and inferior frontal gyri, inferior parietal lobule, middle and superior temporal gyrus, thalamus, and caudate. Deactivated foci were found in the anterior cingulate, superior frontal gyrus, parahippocampal gyrus and in the claustrum. The results of the meta-analytic connectivity analysis provide an overall view of the brain responses triggered by the anticipation of a noxious stimulus. Such a highly distributed perceptual set of self-regulation may prime brain regions to process information where emotion, action and perception as well as their related subcategories play a central role. Not only do these findings provide important information on the neural events when anticipating pain, but also they may give a perspective into nocebo responses, whereby negative expectations may lead to pain worsening. © 2014 Wiley Periodicals, Inc.

  16. Maximum Likelihood Blind Channel Estimation for Space-Time Coding Systems

    Directory of Open Access Journals (Sweden)

    Hakan A. Çırpan

    2002-05-01

    Full Text Available Sophisticated signal processing techniques have to be developed for capacity enhancement of future wireless communication systems. In recent years, space-time coding is proposed to provide significant capacity gains over the traditional communication systems in fading wireless channels. Space-time codes are obtained by combining channel coding, modulation, transmit diversity, and optional receive diversity in order to provide diversity at the receiver and coding gain without sacrificing the bandwidth. In this paper, we consider the problem of blind estimation of space-time coded signals along with the channel parameters. Both conditional and unconditional maximum likelihood approaches are developed and iterative solutions are proposed. The conditional maximum likelihood algorithm is based on iterative least squares with projection whereas the unconditional maximum likelihood approach is developed by means of finite state Markov process modelling. The performance analysis issues of the proposed methods are studied. Finally, some simulation results are presented.

  17. Design of simplified maximum-likelihood receivers for multiuser CPM systems.

    Science.gov (United States)

    Bing, Li; Bai, Baoming

    2014-01-01

    A class of simplified maximum-likelihood receivers designed for continuous phase modulation based multiuser systems is proposed. The presented receiver is built upon a front end employing mismatched filters and a maximum-likelihood detector defined in a low-dimensional signal space. The performance of the proposed receivers is analyzed and compared to some existing receivers. Some schemes are designed to implement the proposed receivers and to reveal the roles of different system parameters. Analysis and numerical results show that the proposed receivers can approach the optimum multiuser receivers with significantly (even exponentially in some cases) reduced complexity and marginal performance degradation.

  18. Complexes of 4-chlorophenoxyacetates of Nd(III), Gd(III) and Ho(III)

    International Nuclear Information System (INIS)

    Ferenc, W.; Bernat, M; Gluchowska, H.W.; Sarzynski, J.

    2010-01-01

    The complexes of 4-chlorophenoxyacetates of Nd(III), Gd(III) and Ho(III) have been synthesized as polycrystalline hydrated solids, and characterized by elemental analysis, spectroscopy, magnetic studies and also by X-ray diffraction and thermogravimetric measurements. The analysed complexes have the following colours: violet for the Nd(III), white for the Gd(III) and cream for the Ho(III) compounds. The carboxylate groups bind as bidentate chelating (Ho) or bridging ligands (Nd, Gd). On heating to 1173 K in air the complexes decompose in several steps. At first, they dehydrate in one step to form anhydrous salts, which next decompose to the oxides of the respective metals. The gaseous products of their thermal decomposition in nitrogen were also determined, the magnetic susceptibilities were measured over the temperature range of 76-303 K, and the magnetic moments were calculated. The results show that 4-chlorophenoxyacetates of Nd(III), Gd(III) and Ho(III) are high-spin complexes with weak ligand fields. The solubility in water at 293 K for the analysed 4-chlorophenoxyacetates is of the order of 10{sup -4} mol/dm{sup 3}. (author)

  19. Essays on empirical likelihood in economics

    NARCIS (Netherlands)

    Gao, Z.

    2012-01-01

    This thesis intends to exploit the roots of empirical likelihood and its related methods in mathematical programming and computation. The roots will be connected and the connections will induce new solutions for the problems of estimation, computation, and generalization of empirical likelihood.

  20. Using DNA fingerprints to infer familial relationships within NHANES III households.

    Science.gov (United States)

    Katki, Hormuzd A; Sanders, Christopher L; Graubard, Barry I; Bergen, Andrew W

    2010-06-01

    Developing, targeting, and evaluating genomic strategies for population-based disease prevention require population-based data. In response to this urgent need, genotyping has been conducted within the Third National Health and Nutrition Examination Survey (NHANES III), the nationally-representative household-interview health survey in the U.S. However, before these genetic analyses can occur, family relationships within households must be accurately ascertained. Unfortunately, reported family relationships within NHANES III households based on questionnaire data are incomplete and inconclusive with regards to actual biological relatedness of family members. We inferred family relationships within households using DNA fingerprints (Identifiler(R)) that contain the DNA loci used by law enforcement agencies for forensic identification of individuals. However, performance of these loci for relationship inference is not well understood. We evaluated two competing statistical methods for relationship inference on pairs of household members: an exact likelihood ratio relying on allele frequencies versus an Identical By State (IBS) likelihood ratio that only requires matching alleles. We modified these methods to account for genotyping errors and population substructure. The two methods usually agree on the rankings of the most likely relationships. However, the IBS method underestimates the likelihood ratio by not accounting for the informativeness of matching rare alleles. The likelihood ratio is sensitive to estimates of population substructure, and parent-child relationships are sensitive to the specified genotyping error rate. These loci were unable to distinguish second-degree relationships and cousins from being unrelated. The genetic data is also useful for verifying reported relationships and identifying data quality issues. An important by-product is the first explicitly nationally-representative estimates of allele frequencies at these ubiquitous forensic loci.

  1. Design of Simplified Maximum-Likelihood Receivers for Multiuser CPM Systems

    Directory of Open Access Journals (Sweden)

    Li Bing

    2014-01-01

    Full Text Available A class of simplified maximum-likelihood receivers designed for continuous phase modulation based multiuser systems is proposed. The presented receiver is built upon a front end employing mismatched filters and a maximum-likelihood detector defined in a low-dimensional signal space. The performance of the proposed receivers is analyzed and compared to some existing receivers. Some schemes are designed to implement the proposed receivers and to reveal the roles of different system parameters. Analysis and numerical results show that the proposed receivers can approach the optimum multiuser receivers with significantly (even exponentially in some cases reduced complexity and marginal performance degradation.

  2. Improvement and comparison of likelihood functions for model calibration and parameter uncertainty analysis within a Markov chain Monte Carlo scheme

    Science.gov (United States)

    Cheng, Qin-Bo; Chen, Xi; Xu, Chong-Yu; Reinhardt-Imjela, Christian; Schulte, Achim

    2014-11-01

    In this study, the likelihood functions for uncertainty analysis of hydrological models are compared and improved through the following steps: (1) the equivalent relationship between the Nash-Sutcliffe Efficiency coefficient (NSE) and the likelihood function with Gaussian independent and identically distributed residuals is proved; (2) a new estimation method of the Box-Cox transformation (BC) parameter is developed to improve the effective elimination of the heteroscedasticity of model residuals; and (3) three likelihood functions, NSE, the Generalized Error Distribution with BC (BC-GED) and the Skew Generalized Error Distribution with BC (BC-SGED), are applied for SWAT-WB-VSA (Soil and Water Assessment Tool - Water Balance - Variable Source Area) model calibration in the Baocun watershed, Eastern China. Performances of the calibrated models are compared using the observed river discharges and groundwater levels. The results show that the minimum variance constraint can effectively estimate the BC parameter. The form of the likelihood function significantly impacts the calibrated parameters and the simulated results of high and low flow components. SWAT-WB-VSA with the NSE approach simulates floods well, but baseflow badly, owing to the assumption of a Gaussian error distribution, in which the probability of a large error is low, but small errors around zero are nearly equiprobable. By contrast, SWAT-WB-VSA with the BC-GED or BC-SGED approach mimics baseflow well, as is borne out by the groundwater level simulation. The assumption of skewness of the error distribution may be unnecessary, because all the results of the BC-SGED approach are nearly the same as those of the BC-GED approach.
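
    The equivalence proved in step (1) is easy to verify numerically: with Gaussian i.i.d. residuals and fixed variance, the log-likelihood is an affine, decreasing function of the sum of squared errors, exactly as NSE is, so both criteria rank candidate simulations identically. A sketch with simulated discharge series (all numbers invented):

        import numpy as np

        # log L = -n/2 log(2 pi sigma^2) - SSE / (2 sigma^2), while
        # NSE = 1 - SSE / SS_obs; both decrease in SSE, so maximizing one
        # maximizes the other.
        rng = np.random.default_rng(8)
        obs = np.abs(rng.normal(10, 3, 200))            # observed discharge
        runs = [obs + rng.normal(0, s, 200) for s in (0.5, 1.0, 2.0)]

        def nse(sim):
            return 1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

        def loglik(sim, sigma=1.0):
            sse = np.sum((obs - sim) ** 2)
            return (-0.5 * len(obs) * np.log(2 * np.pi * sigma ** 2)
                    - sse / (2 * sigma ** 2))

        for sim in runs:                                # same ordering for both
            print(round(nse(sim), 4), round(loglik(sim), 1))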

  3. A Walk on the Wild Side: The Impact of Music on Risk-Taking Likelihood.

    Science.gov (United States)

    Enström, Rickard; Schmaltz, Rodney

    2017-01-01

    From a marketing perspective, there has been substantial interest in the role of risk perception in consumer behavior. Specific 'problem music' like rap and heavy metal has long been associated with delinquent behavior, including violence, drug use, and promiscuous sex. Although individuals' risk preferences have been investigated across a range of decision-making situations, there has been little empirical work demonstrating the direct role music may have on the likelihood of engaging in risky activities. In the exploratory study reported here, we assessed the impact of listening to different styles of music while assessing risk-taking likelihood through a psychometric scale. Risk-taking likelihood was measured across ethical, financial, health and safety, recreational and social domains. By means of a canonical correlation analysis, the multivariate relationship between different music styles and individual risk-taking likelihood across the different domains is discussed. Our results indicate that listening to different types of music does influence risk-taking likelihood, though not in the areas of health and safety.

  4. An automated land-use mapping comparison of the Bayesian maximum likelihood and linear discriminant analysis algorithms

    Science.gov (United States)

    Tom, C. H.; Miller, L. D.

    1984-01-01

    The Bayesian maximum likelihood parametric classifier has been tested against the data-based formulation designated 'linear discriminant analysis', using the 'GLIKE' decision and 'CLASSIFY' classification algorithms in the Landsat Mapping System. Identical supervised training sets, USGS land use/land cover classes, and various combinations of Landsat image and ancillary geodata variables were used to compare the algorithms' thematic mapping accuracy on a single-date summer subscene, with a cellularized USGS land use map of the same time frame furnishing the ground truth reference. CLASSIFY, which accepts a priori class probabilities, is found to be more accurate than GLIKE, which assumes equal class occurrences, for all three mapping variable sets and both levels of detail. These results may be generalized to direct accuracy, time, cost, and flexibility advantages of linear discriminant analysis over Bayesian methods.
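
    A sketch of the Gaussian maximum-likelihood discriminant on synthetic two-class data with unequal prevalence; here "equal priors" and "with priors" merely stand in for the GLIKE and CLASSIFY behaviors described above, and all data are invented. The prior term alone can flip a classification:

```python
import numpy as np

rng = np.random.default_rng(1)
# Two synthetic "land-use classes" with very unequal prevalence (90% / 10%).
X0 = rng.normal([0.0, 0.0], 1.0, (900, 2))
X1 = rng.normal([1.5, 1.5], 1.0, (100, 2))
X, y = np.vstack([X0, X1]), np.r_[np.zeros(900), np.ones(100)]

def fit(X, y):
    return [(X[y == c].mean(0), np.cov(X[y == c].T), np.mean(y == c))
            for c in np.unique(y)]

def discriminant(x, mu, cov, prior):
    # Gaussian ML discriminant; the log-prior term is what an equal-priors
    # classifier omits and a priori class probabilities restore.
    d = x - mu
    return (np.log(prior) - 0.5 * np.log(np.linalg.det(cov))
            - 0.5 * d @ np.linalg.solve(cov, d))

params = fit(X, y)
x = np.array([0.9, 0.9])
for use_prior in (False, True):
    scores = [discriminant(x, mu, cov, p if use_prior else 0.5)
              for mu, cov, p in params]
    print("with priors " if use_prior else "equal priors", "->", int(np.argmax(scores)))
```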

  5. Analysis of excess reactivity of JOYO MK-III performance test core

    International Nuclear Information System (INIS)

    Maeda, Shigetaka; Yokoyama, Kenji

    2003-10-01

    JOYO is currently being upgraded to the high-performance irradiation bed 'JOYO MK-III core'. The MK-III core is divided into two fuel regions with different plutonium contents. To obtain a higher neutron flux, the active core height was reduced from 55 cm to 50 cm. The reflector subassemblies were replaced by shielding subassemblies in the outer two rows. Twenty of the MK-III outer core fuel subassemblies in the performance test core were partially burned in the transition core. Four irradiation test rigs, which do not contain any fuel material, were loaded in the center of the performance test core. In order to evaluate the excess reactivity of the MK-III performance test core accurately, we evaluated it by applying not only the JOYO MK-II core management code system MAGI, but also the MK-III core management code system HESTIA, the JUPITER standard analysis method and the Monte Carlo method with the JFS-3-J3.2R constant set. The excess reactivity evaluations obtained by the JUPITER standard analysis method were corrected to results based on transport theory with zero mesh size in space and angle. A bias factor based on the MK-II 35th core, whose sensitivity was similar to that of the MK-III performance test core, was also applied, except in the case where an adjusted nuclear cross-section library was used. Exact three-dimensional, pin-by-pin geometry and continuous-energy cross sections were used in the Monte Carlo calculation. The estimated error components associated with cross sections, method correction factors and the bias factor were combined based on Takeda's theory. The independently calculated values agree well and range from 2.8 to 3.4 %Δk/kk'. The calculation result of the MK-III core management code system HESTIA was 3.13 %Δk/kk'. The estimated errors for the bias method range from 0.1 to 0.2 %Δk/kk'. The error in the case using the adjusted cross-section library was 0.3 %Δk/kk'. (author)

  6. The Laplace Likelihood Ratio Test for Heteroscedasticity

    Directory of Open Access Journals (Sweden)

    J. Martin van Zyl

    2011-01-01

    Full Text Available It is shown that the likelihood ratio test for heteroscedasticity, assuming the Laplace distribution, gives good results for Gaussian and fat-tailed data. The likelihood ratio test, assuming normality, is very sensitive to any deviation from normality, especially when the observations are from a distribution with fat tails. Such a likelihood test can also be used as a robust test for a constant variance in residuals or a time series if the data is partitioned into groups.
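
    A compact sketch (synthetic fat-tailed data, not from the paper) of the test: each group's Laplace scale is its mean absolute deviation from the group median, the null hypothesis pools a common scale, and twice the log-likelihood difference is referred to a chi-square with G - 1 degrees of freedom.

```python
import numpy as np
from scipy import stats

def laplace_lr_test(groups):
    """LR test of constant Laplace scale across groups (locations left free)."""
    resid = [g - np.median(g) for g in groups]           # per-group MLE location
    n = np.array([len(r) for r in resid])
    b_alt = np.array([np.mean(np.abs(r)) for r in resid])        # group scales
    b_null = sum(np.sum(np.abs(r)) for r in resid) / n.sum()     # common scale
    stat = 2 * np.sum(n * np.log(b_null / b_alt))
    return stat, stats.chi2.sf(stat, df=len(groups) - 1)

rng = np.random.default_rng(2)
groups = [rng.standard_t(3, 300), 3.0 * rng.standard_t(3, 300)]  # fat tails, unequal scale
print(laplace_lr_test(groups))
```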

  7. Post-test analysis of ROSA-III experiment Run 702

    International Nuclear Information System (INIS)

    Koizumi, Yasuo; Kikuchi, Osamu; Soda, Kunihisa

    1980-01-01

    The purpose of the ROSA-III experiments with a scaled BWR test facility is to examine primary coolant thermal-hydraulic behavior and the performance of the ECCS during a postulated loss-of-coolant accident of a BWR. The results provide information for the verification and improvement of reactor safety analysis codes. Run 702 assumed a 200% split break at the recirculation pump suction line under an average core power without ECCS activation. Post-test analysis of the Run 702 experiment was made with the computer code RELAP4J. Agreement of the calculated system pressure with the experimental one was good. However, the calculated heater surface temperatures were higher than the measured ones, and the axial temperature distribution differed in tendency from the experimental one. These results indicated the need to improve the analytical model of void distribution in the core and the nodalization of the pressure vessel in order to make the analysis more realistic. They also indicated the need for characterization tests of ROSA-III facility components, such as the jet pump and piping form loss coefficients, and for more numerous and refined flow rate measurements. (author)

  8. ldr: An R Software Package for Likelihood-Based Sufficient Dimension Reduction

    Directory of Open Access Journals (Sweden)

    Kofi Placid Adragni

    2014-11-01

    Full Text Available In regression settings, a sufficient dimension reduction (SDR) method seeks the core information in a p-vector predictor that completely captures its relationship with a response. The reduced predictor may reside in a lower dimension d < p, improving the ability to visualize data and predict future observations, and mitigating dimensionality issues when carrying out further analysis. We introduce ldr, a new R software package that implements three recently proposed likelihood-based methods for SDR: covariance reduction, likelihood acquired directions, and principal fitted components. All three methods reduce the dimensionality of the data by projection into lower-dimensional subspaces. The package also implements a variable screening method built upon principal fitted components which makes use of flexible basis functions to capture the dependencies between the predictors and the response. Examples are given to demonstrate likelihood-based SDR analyses using ldr, including estimation of the dimension of reduction subspaces and selection of basis functions. The ldr package provides a framework that we hope to grow into a comprehensive library of likelihood-based SDR methodologies.

  9. Physical constraints on the likelihood of life on exoplanets

    Science.gov (United States)

    Lingam, Manasvi; Loeb, Abraham

    2018-04-01

    One of the most fundamental questions in exoplanetology is to determine whether a given planet is habitable. We estimate the relative likelihood of a planet's propensity towards habitability by considering key physical characteristics such as the role of temperature on ecological and evolutionary processes, and atmospheric losses via hydrodynamic escape and stellar wind erosion. From our analysis, we demonstrate that Earth-sized exoplanets in the habitable zone around M-dwarfs seemingly display much lower prospects of being habitable relative to Earth, owing to the higher incident ultraviolet fluxes and closer distances to the host star. We illustrate our results by specifically computing the likelihood (of supporting life) for the recently discovered exoplanets, Proxima b and TRAPPIST-1e, which we find to be several orders of magnitude smaller than that of Earth.

  10. Statistical analysis of the BOIL program in RSYST-III

    International Nuclear Information System (INIS)

    Beck, W.; Hausch, H.J.

    1978-11-01

    The paper describes a statistical analysis in the RSYST-III program system. Using the example of the BOIL program, it is shown how the effects of inaccurate input data on the output data can be discovered. The existing possibilities of data generation, data handling, and data evaluation are outlined. (orig.) [de

  11. A Walk on the Wild Side: The Impact of Music on Risk-Taking Likelihood

    Science.gov (United States)

    Enström, Rickard; Schmaltz, Rodney

    2017-01-01

    From a marketing perspective, there has been substantial interest in the role of risk perception in consumer behavior. Specific ‘problem music’ like rap and heavy metal has long been associated with delinquent behavior, including violence, drug use, and promiscuous sex. Although individuals’ risk preferences have been investigated across a range of decision-making situations, there has been little empirical work demonstrating the direct role music may have on the likelihood of engaging in risky activities. In the exploratory study reported here, we assessed the impact of listening to different styles of music while assessing risk-taking likelihood through a psychometric scale. Risk-taking likelihood was measured across ethical, financial, health and safety, recreational and social domains. By means of a canonical correlation analysis, the multivariate relationship between different music styles and individual risk-taking likelihood across the different domains is discussed. Our results indicate that listening to different types of music does influence risk-taking likelihood, though not in the areas of health and safety. PMID:28539908

  12. A Walk on the Wild Side: The Impact of Music on Risk-Taking Likelihood

    Directory of Open Access Journals (Sweden)

    Rickard Enström

    2017-05-01

    Full Text Available From a marketing perspective, there has been substantial interest in the role of risk perception in consumer behavior. Specific ‘problem music’ like rap and heavy metal has long been associated with delinquent behavior, including violence, drug use, and promiscuous sex. Although individuals’ risk preferences have been investigated across a range of decision-making situations, there has been little empirical work demonstrating the direct role music may have on the likelihood of engaging in risky activities. In the exploratory study reported here, we assessed the impact of listening to different styles of music while assessing risk-taking likelihood through a psychometric scale. Risk-taking likelihood was measured across ethical, financial, health and safety, recreational and social domains. By means of a canonical correlation analysis, the multivariate relationship between different music styles and individual risk-taking likelihood across the different domains is discussed. Our results indicate that listening to different types of music does influence risk-taking likelihood, though not in the areas of health and safety.

  13. Sulphate analysis in uranium leach iron(III) chloride solutions by inductively coupled argon plasma spectrometry

    International Nuclear Information System (INIS)

    Nirdosh, I.; Lakhani, S.; Yunus, M.Z.M.

    1993-01-01

    Inductively coupled argon plasma spectrometry is used for the indirect determination of sulphate in iron(III) chloride leach solutions of Elliot Lake uranium ores, via the addition of a known amount of barium ions and analysis for the excess barium. The ore contains ∼7 wt% pyrite, FeS2, as the major mineral, which oxidizes to generate sulphate during leaching with Fe(III). The effects of pH, the concentrations of Fe(III) and chloride ions, and the presence of ethanol in the test samples on the accuracy of the analysis are studied. It is found that, unlike in the rhodizonate method, neither removal of iron(III) from, nor addition of ethanol to, the test sample is required prior to analysis. Linear calibration curves are obtained. (author)

  14. Crash test rating and likelihood of major thoracoabdominal injury in motor vehicle crashes: the new car assessment program side-impact crash test, 1998-2010.

    Science.gov (United States)

    Figler, Bradley D; Mack, Christopher D; Kaufman, Robert; Wessells, Hunter; Bulger, Eileen; Smith, Thomas G; Voelzke, Bryan

    2014-03-01

    The National Highway Traffic Safety Administration's New Car Assessment Program (NCAP) has implemented side-impact crash testing on all new vehicles since 1998 to assess the likelihood of major thoracoabdominal injuries during a side-impact crash. A higher crash test rating is intended to indicate a safer car, but the real-world applicability of these ratings is unknown. Our objective was to determine the relationship between a vehicle's NCAP side-impact crash test rating and the risk of major thoracoabdominal injury among the vehicle's occupants in real-world side-impact motor vehicle crashes. The National Automotive Sampling System Crashworthiness Data System contains detailed crash and injury data on a sample of major crashes in the United States. For model years 1998 to 2010 and crash years 1999 to 2010, 68,124 occupants were identified in the Crashworthiness Data System database. Because 47% of cases were missing crash severity (ΔV), multiple imputation was used to estimate the missing values. The primary predictor of interest was the occupant vehicle's NCAP side-impact crash test rating, and the outcome of interest was the presence of major (Abbreviated Injury Scale [AIS] score ≥ 3) thoracoabdominal injury. In multivariate analysis, an increasing NCAP crash test rating was associated with a lower likelihood of major thoracoabdominal injury at high crash severity (odds ratio [OR], 0.8; 95% confidence interval [CI], 0.7-0.9). A higher NCAP side-impact crash test rating is associated with a lower likelihood of major thoracoabdominal trauma. Epidemiologic study, level III.

  15. Time to Angiographic Reperfusion and Clinical Outcome after Acute Ischemic Stroke in the Interventional Management of Stroke Phase III (IMS III) Trial: A Validation Study

    Science.gov (United States)

    Khatri, Pooja; Yeatts, Sharon D.; Mazighi, Mikael; Broderick, Joseph P.; Liebeskind, David S.; Demchuk, Andrew M.; Amarenco, Pierre; Carrozzella, Janice; Spilker, Judith; Foster, Lydia D.; Goyal, Mayank; Hill, Michael D.; Palesch, Yuko Y.; Jauch, Edward C.; Haley, E. Clarke; Vagal, Achala; Tomsick, Thomas A.

    2014-01-01

    BACKGROUND The IMS III Trial did not demonstrate clinical benefit of the endovascular approach compared to IV rt-PA alone for moderate or severe ischemic strokes (NIHSS≥8) enrolled within three hours of stroke onset. Late reperfusion of tissue that is no longer salvageable may be one explanation, as suggested by prior exploratory studies showing an association between time to reperfusion and good clinical outcome. We sought to validate this relationship in the large-scale IMS III trial, and consider its implications for future endovascular trials. METHODS The analysis consisted of the endovascular cohort with proximal arterial occlusions in the anterior circulation that achieved angiographic reperfusion (TICI 2–3) during the endovascular procedure (within 7 hours from the onset of symptoms). Logistic regression was used to model good clinical outcome (90-day modified Rankin 0–2) as a function of the time to reperfusion, and prespecified variables were considered for adjustment. FINDINGS Among 240 proximal vessel occlusions, angiographic reperfusion (TICI 2–3) was achieved in 182 (76%). Mean time to reperfusion was 325 minutes (range 180–418 minutes). Longer time for reperfusion was associated with a decreased likelihood of good clinical outcome (RR [95% CI] for every 30 minute delay: unadjusted 0·85 [0·77–0·94]; adjusted 0·88 [0·80–0·98]). INTERPRETATION We confirm that delay in time to angiographic reperfusion leads to a decreased likelihood of good clinical outcome. Achieving rapid reperfusion may be critical for the successes of future acute endovascular trials. FUNDING: NIH/NINDS (study sponsor), Genentech Inc. (study drug - intra-arterial t-PA), EKOS Corp. (device), Concentric Inc. (device), Cordis Neurovascular, Inc. (device), and Boehringer Ingelheim (European Investigator Meeting support). PMID:24784550
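
    As a hedged illustration of the statistical model only (not the IMS III data or its prespecified covariates; the coefficients and cohort below are invented), good outcome can be modeled as a logistic function of time to reperfusion and the effect summarized per 30-minute delay. Note the trial reports adjusted relative risks, whereas the unadjusted sketch below yields an odds ratio.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
# Hypothetical cohort: probability of a good 90-day outcome declining
# with time to reperfusion (minutes), 182 reperfused patients.
time_min = rng.uniform(180, 420, 182)
p_good = 1 / (1 + np.exp(-(4.0 - 0.012 * time_min)))
good = rng.binomial(1, p_good)

X = sm.add_constant(time_min)
res = sm.Logit(good, X).fit(disp=False)
or_per_30min = np.exp(30 * res.params[1])    # odds ratio per 30-minute delay
print(f"OR per 30-min delay: {or_per_30min:.2f}")
```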

  16. Multiple Improvements of Multiple Imputation Likelihood Ratio Tests

    OpenAIRE

    Chan, Kin Wai; Meng, Xiao-Li

    2017-01-01

    Multiple imputation (MI) inference handles missing data by first properly imputing the missing values $m$ times, and then combining the $m$ analysis results from applying a complete-data procedure to each of the completed datasets. However, the existing method for combining likelihood ratio tests has multiple defects: (i) the combined test statistic can be negative in practice when the reference null distribution is a standard $F$ distribution; (ii) it is not invariant to re-parametrization; ...

  17. Should I Text or Call Here? A Situation-Based Analysis of Drivers' Perceived Likelihood of Engaging in Mobile Phone Multitasking.

    Science.gov (United States)

    Oviedo-Trespalacios, Oscar; Haque, Md Mazharul; King, Mark; Washington, Simon

    2018-05-29

    This study investigated how situational characteristics typically encountered in the transport system influence drivers' perceived likelihood of engaging in mobile phone multitasking. The impacts of mobile phone tasks, perceived environmental complexity/risk, and drivers' individual differences were evaluated as relevant individual predictors within the behavioral adaptation framework. An innovative questionnaire, which includes randomized textual and visual scenarios, was administered to collect data from a sample of 447 drivers in South East Queensland, Australia (66% female; n = 296). The likelihood of engaging in a mobile phone task across various scenarios was modeled by a random parameters ordered probit model. Results indicated that drivers who are female, are frequent users of phones for texting/answering calls, have less favorable attitudes towards safety, and are highly disinhibited were more likely to report stronger intentions of engaging in mobile phone multitasking. However, more years with a valid driving license, self-efficacy toward self-regulation in demanding traffic conditions and police enforcement, texting tasks, and demanding traffic conditions were negatively related to the self-reported likelihood of mobile phone multitasking. The unobserved heterogeneity warned of riskier groups among female drivers and participants who need a lot of convincing to believe that multitasking while driving is dangerous. This research concludes that behavioral adaptation theory is a robust framework for explaining self-regulation of distracted drivers. © 2018 Society for Risk Analysis.

  18. Fast maximum likelihood estimation of mutation rates using a birth-death process.

    Science.gov (United States)

    Wu, Xiaowei; Zhu, Hongxiao

    2015-02-07

    Since fluctuation analysis was first introduced by Luria and Delbrück in 1943, it has been widely used to make inferences about spontaneous mutation rates in cultured cells. Under certain model assumptions, the probability distribution of the number of mutants that appear in a fluctuation experiment can be derived explicitly, which provides the basis of mutation rate estimation. It has been shown that, among various existing estimators, the maximum likelihood estimator usually demonstrates desirable properties such as consistency and lower mean squared error. However, its application to real experimental data is often hindered by the slow computation of the likelihood due to the recursive form of the mutant-count distribution. We propose a fast maximum likelihood estimator of mutation rates, MLE-BD, based on a birth-death process model with a non-differential growth assumption. Simulation studies demonstrate that, compared with the conventional maximum likelihood estimator derived from the Luria-Delbrück distribution, MLE-BD achieves a substantial improvement in computational speed and is applicable to arbitrarily large numbers of mutants. In addition, it retains good accuracy in point estimation. Published by Elsevier Ltd.
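
    A sketch of the conventional approach that MLE-BD accelerates: the Luria-Delbrück mutant-count distribution computed by the classic recursion, with a crude grid-search MLE for the expected number of mutations m. The mutant counts below are placeholders; in a real experiment they would be the observed counts per culture.

```python
import numpy as np

def ld_pmf(m, n_max):
    """Luria-Delbruck mutant-count pmf p_0..p_nmax via the recursive
    (Ma-Sandri-Sarkar) algorithm -- the slow step the paper works around."""
    p = np.empty(n_max + 1)
    p[0] = np.exp(-m)
    for n in range(1, n_max + 1):
        k = np.arange(n)
        p[n] = (m / n) * np.sum(p[k] / ((n - k) * (n - k + 1)))
    return p

def loglik(m, counts):
    p = ld_pmf(m, max(counts))
    return np.sum(np.log(p[counts]))

rng = np.random.default_rng(4)
counts = rng.poisson(3, 50)                  # placeholder mutant counts
grid = np.linspace(0.5, 10, 200)             # crude grid-search MLE of m
m_hat = grid[np.argmax([loglik(m, counts) for m in grid])]
print(f"MLE of expected mutations m: {m_hat:.2f}")
```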

  19. Likelihood analysis of the chalcone synthase genes suggests the role of positive selection in morning glories (Ipomoea).

    Science.gov (United States)

    Yang, Ji; Gu, Hongya; Yang, Ziheng

    2004-01-01

    Chalcone synthase (CHS) is a key enzyme in the biosynthesis of flavonoids, which are important for the pigmentation of flowers and act as attractants to pollinators. Genes encoding CHS constitute a multigene family in which the copy number varies among plant species and functional divergence appears to have occurred repeatedly. In morning glories (Ipomoea), five functional CHS genes (A-E) have been described. Phylogenetic analysis of the Ipomoea CHS gene family revealed that CHS A, B, and C experienced accelerated rates of amino acid substitution relative to CHS D and E. To examine whether the CHS genes of the morning glories underwent adaptive evolution, maximum-likelihood models of codon substitution were used to analyze the functional sequences in the Ipomoea CHS gene family. These models used the nonsynonymous/synonymous rate ratio (ω = dN/dS) as an indicator of selective pressure and allowed the ratio to vary among lineages or sites. Likelihood ratio tests suggested significant variation in selection pressure among amino acid sites, with a small proportion of them detected to be under positive selection along the branches ancestral to CHS A, B, and C. Positive Darwinian selection appears to have promoted the divergence of subfamily ABC and subfamily DE and is at least partially responsible for a rate increase following gene duplication.
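
    The hypothesis test behind such analyses is a standard likelihood ratio test between nested codon models, the richer one adding a class of sites with ω > 1. A minimal sketch with invented log-likelihood values (the real numbers come from fitting the models, e.g. in PAML):

```python
from scipy.stats import chi2

# Hypothetical log-likelihoods from nested codon models; the alternative
# adds a site class with omega = dN/dS > 1 (two extra free parameters).
lnL_null, lnL_alt, extra_params = -23456.7, -23448.2, 2

stat = 2 * (lnL_alt - lnL_null)          # likelihood ratio statistic
p = chi2.sf(stat, df=extra_params)       # compare to chi-square, df = 2
print(f"2*deltaLnL = {stat:.2f}, p = {p:.4f}")
```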

  20. Likelihood inference for unions of interacting discs

    DEFF Research Database (Denmark)

    Møller, Jesper; Helisová, Katarina

    To the best of our knowledge, this is the first paper which discusses likelihood inference for a random set using a germ-grain model, where the individual grains are unobservable, edge effects occur, and other complications appear. We consider the case where the grains form a disc process modelled ... is specified with respect to a given marked Poisson model (i.e. a Boolean model). We show how edge effects and other complications can be handled by considering a certain conditional likelihood. Our methodology is illustrated by analyzing Peter Diggle's heather dataset, where we discuss the results ... of simulation-based maximum likelihood inference and the effect of specifying different reference Poisson models ...

  1. Further Analysis on the Mystery of the Surveyor III Dust Deposits

    Science.gov (United States)

    Metzger, Philip; Hintze, Paul; Trigwell, Steven; Lane, John

    2012-01-01

    The Apollo 12 lunar module (LM) landing near the Surveyor III spacecraft at the end of 1969 has remained the primary experimental verification of the predicted physics of plume ejecta effects from a rocket engine interacting with the surface of the moon. This was made possible by the return of the Surveyor III camera housing by the Apollo 12 astronauts, allowing detailed analysis of the composition of dust deposited by the LM plume. It was soon realized after the initial analysis of the camera housing that the LM plume tended to remove more dust than it had deposited. In the present study, coupons from the camera housing have been reexamined. In addition, plume effects recorded in landing videos from each Apollo mission have been studied for possible clues.

  2. Performance and sensitivity analysis of the generalized likelihood ratio method for failure detection. M.S. Thesis

    Science.gov (United States)

    Bueno, R. A.

    1977-01-01

    Results of the generalized likelihood ratio (GLR) technique for the detection of failures in aircraft applications are presented, and its relationship to the properties of the Kalman-Bucy filter is examined. Under the assumption that the system is perfectly modeled, the detectability and distinguishability of four failure types are investigated by means of analysis and simulations. Detection of failures is found to be satisfactory, but problems may arise in correctly identifying the mode of a failure. These issues are closely examined, as is the sensitivity of GLR to modeling errors. The advantages and disadvantages of the technique are discussed, and various modifications are suggested to reduce its limitations in performance and computational complexity.

  3. Democracy, Autocracy and the Likelihood of International Conflict

    OpenAIRE

    Tangerås, Thomas

    2008-01-01

    This is a game-theoretic analysis of the link between regime type and international conflict. The democratic electorate can credibly punish the leader for bad conflict outcomes, whereas the autocratic selectorate cannot. For the fear of being thrown out of office, democratic leaders are (i) more selective about the wars they initiate and (ii) on average win more of the wars they start. Foreign policy behaviour is found to display strategic complementarities. The likelihood of interstate war, ...

  4. Penalized Maximum Likelihood Estimation for univariate normal mixture distributions

    International Nuclear Information System (INIS)

    Ridolfi, A.; Idier, J.

    2001-01-01

    Due to singularities of the likelihood function, the maximum likelihood approach for the estimation of the parameters of normal mixture models is an acknowledged ill-posed optimization problem. Ill-posedness is solved by penalizing the likelihood function. In the Bayesian framework, this amounts to incorporating an inverted gamma prior in the likelihood function. A penalized version of the EM algorithm is derived, which is still explicit and which intrinsically assures that the estimates are not singular. Numerical evidence of the latter property is put forward with a test
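
    A sketch of such a penalized EM for a univariate normal mixture, under the assumption of an inverted-gamma IG(a, b) prior on each component variance (hyperparameters a, b and the data are illustrative). The extra 2b in the variance update's numerator keeps variances away from the singular sigma^2 -> 0 solutions:

```python
import numpy as np

def penalized_em(x, K=2, a=2.0, b=0.5, iters=200):
    """EM for a K-component normal mixture, MAP variance updates under IG(a, b)."""
    n = len(x)
    w = np.full(K, 1 / K)
    mu = np.quantile(x, np.linspace(0.2, 0.8, K))
    var = np.full(K, x.var())
    for _ in range(iters):
        # E-step: responsibilities (computed stably on the log scale)
        logr = (np.log(w) - 0.5 * np.log(2 * np.pi * var)
                - 0.5 * (x[:, None] - mu) ** 2 / var)
        r = np.exp(logr - logr.max(1, keepdims=True))
        r /= r.sum(1, keepdims=True)
        nk = r.sum(0)
        # M-step: penalized (MAP) updates; note the +2b and +2(a+1) terms
        w = nk / n
        mu = (r * x[:, None]).sum(0) / nk
        var = ((r * (x[:, None] - mu) ** 2).sum(0) + 2 * b) / (nk + 2 * (a + 1))
    return w, mu, var

rng = np.random.default_rng(5)
x = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 0.5, 200)])
print(penalized_em(x))
```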

  5. Maximum-Likelihood Detection Of Noncoherent CPM

    Science.gov (United States)

    Divsalar, Dariush; Simon, Marvin K.

    1993-01-01

    Simplified detectors are proposed for use in maximum-likelihood-sequence detection of symbols in an alphabet of size M transmitted by uncoded, full-response continuous phase modulation over a radio channel with additive white Gaussian noise. The structures of the receivers are derived from a particular interpretation of the maximum-likelihood metrics. The receivers include front ends, whose structures depend only on M, analogous to those in receivers of coherent CPM. The parts of the receivers following the front ends have structures whose complexity depends on N.

  6. On Bayesian Testing of Additive Conjoint Measurement Axioms Using Synthetic Likelihood.

    Science.gov (United States)

    Karabatsos, George

    2018-06-01

    This article introduces a Bayesian method for testing the axioms of additive conjoint measurement. The method is based on an importance sampling algorithm that performs likelihood-free, approximate Bayesian inference using a synthetic likelihood to overcome the analytical intractability of this testing problem. This new method improves upon previous methods because it provides an omnibus test of the entire hierarchy of cancellation axioms, beyond double cancellation. It does so while accounting for the posterior uncertainty that is inherent in the empirical orderings that are implied by these axioms, together. The new method is illustrated through a test of the cancellation axioms on a classic survey data set, and through the analysis of simulated data.
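
    The synthetic-likelihood ingredient can be sketched on its own: simulate many datasets at a candidate parameter, fit a Gaussian to their summary statistics, and evaluate the observed summaries under that Gaussian (Wood's formulation). The toy below is an unrelated simple example, not the conjoint-measurement test itself; Karabatsos embeds this inside an importance sampling algorithm.

```python
import numpy as np
from scipy.stats import multivariate_normal

def synthetic_loglik(theta, s_obs, simulate, summaries, R=500, rng=None):
    """Synthetic log-likelihood: Gaussian fit to R simulated summary vectors."""
    rng = rng or np.random.default_rng()
    S = np.array([summaries(simulate(theta, rng)) for _ in range(R)])
    mu, cov = S.mean(0), np.cov(S.T) + 1e-8 * np.eye(S.shape[1])
    return multivariate_normal(mu, cov).logpdf(s_obs)

# Toy problem: infer the scale of a Gaussian sample from two summaries.
simulate = lambda th, rng: rng.normal(0, th, 100)
summaries = lambda x: np.array([x.mean(), np.log(x.var())])

rng = np.random.default_rng(6)
s_obs = summaries(rng.normal(0, 2.0, 100))
for th in (1.0, 2.0, 3.0):
    print(th, round(synthetic_loglik(th, s_obs, simulate, summaries, rng=rng), 2))
```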

  7. Application of the subchannel analysis code COBRA III C for liquid sodium

    International Nuclear Information System (INIS)

    Nissen, K.L.

    1981-01-01

    The subchannel-analysis code COBRA III C was developed to gain knowledge of the mass flow and temperature distribution in rod bundles of light water reactors. A comparison of experimental results for the temperature distribution in a 19-rod bundle with calculations done by the computer program shows the capability of COBRA III C to handle liquid sodium cooling. The code requires sodium properties as well as modified correlations for turbulent mixing and heat transfer at the rod. (orig.) [de

  8. Altered sensorimotor activation patterns in idiopathic dystonia-an activation likelihood estimation meta-analysis of functional brain imaging studies

    DEFF Research Database (Denmark)

    Løkkegaard, Annemette; Herz, Damian M; Haagensen, Brian Numelin

    2016-01-01

    Dystonia is characterized by sustained or intermittent muscle contractions causing abnormal, often repetitive, movements or postures. Functional neuroimaging studies have yielded abnormal task-related sensorimotor activation in dystonia, but the results appear to be rather variable across studies. Further, study sizes were usually small and included different types of dystonia. Here we performed an activation likelihood estimation (ALE) meta-analysis of functional neuroimaging studies in patients with primary dystonia to test for convergence of dystonia-related alterations in task-related activity ... postcentral gyrus, right superior temporal gyrus and dorsal midbrain. Apart from the midbrain cluster, all between-group differences in task-related activity were retrieved in a sub-analysis including only the 14 studies on patients with focal dystonia. For focal dystonia, an additional cluster of increased ...

  9. Sustainability likelihood of remediation options for metal-contaminated soil/sediment.

    Science.gov (United States)

    Chen, Season S; Taylor, Jessica S; Baek, Kitae; Khan, Eakalak; Tsang, Daniel C W; Ok, Yong Sik

    2017-05-01

    Multi-criteria analysis and detailed impact analysis were carried out to assess the sustainability of four remedial alternatives for metal-contaminated soil/sediment at former timber treatment sites and harbour sediment sites of different scales. Sustainability was evaluated in the aspects of human health and safety, environment, stakeholder concern, and land use, under four different scenarios with varying weighting factors. A Monte Carlo simulation was performed to reveal the likelihood of accomplishing sustainable remediation with different treatment options at different sites. The results showed that in-situ remedial technologies were more sustainable than ex-situ ones; in-situ containment demonstrated both the most sustainable result and the highest probability of achieving sustainability amongst the four remedial alternatives in this study, reflecting the lesser extent of off-site and on-site impacts. Concerns associated with ex-situ options were adverse impacts tied to all four aspects and caused by excavation, extraction, and off-site disposal. The results of this study suggest the importance of considering the uncertainties resulting from the remedial options (i.e., stochastic analysis) in addition to the overall sustainability scores (i.e., deterministic analysis). The developed framework and model simulation could serve as an assessment of the sustainability likelihood of remedial options to ensure sustainable remediation of contaminated sites. Copyright © 2017 Elsevier Ltd. All rights reserved.
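
    A minimal sketch of the stochastic-analysis idea, with entirely hypothetical option scores and Dirichlet-distributed weighting factors (the paper's actual criteria, weights, and scoring are not reproduced here): the "sustainability likelihood" of each option is the fraction of weight draws under which it attains the best weighted score.

```python
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical scores (rows: 4 remedial options; columns: the four aspects --
# health/safety, environment, stakeholder concern, land use).
scores = np.array([[4, 4, 3, 4],    # in-situ containment
                   [3, 4, 3, 3],    # in-situ stabilization
                   [2, 2, 3, 2],    # excavation + off-site disposal
                   [2, 2, 2, 3]])   # soil washing

# Stochastic analysis: random weighting factors, count how often each
# option ranks first on the weighted score.
w = rng.dirichlet(np.ones(4), size=100_000)
best = np.argmax(w @ scores.T, axis=1)
print(np.bincount(best, minlength=4) / len(best))   # likelihood of ranking first
```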

  10. COSMIC MICROWAVE BACKGROUND LIKELIHOOD APPROXIMATION BY A GAUSSIANIZED BLACKWELL-RAO ESTIMATOR

    International Nuclear Information System (INIS)

    Rudjord, Oe.; Groeneboom, N. E.; Eriksen, H. K.; Huey, Greg; Gorski, K. M.; Jewell, J. B.

    2009-01-01

    We introduce a new cosmic microwave background (CMB) temperature likelihood approximation called the Gaussianized Blackwell-Rao estimator. This estimator is derived by transforming the observed marginal power spectrum distributions obtained by the CMB Gibbs sampler into standard univariate Gaussians, and then approximating their joint transformed distribution by a multivariate Gaussian. The method is exact for full-sky coverage and uniform noise, and an excellent approximation for sky cuts and scanning patterns relevant for modern satellite experiments such as the Wilkinson Microwave Anisotropy Probe (WMAP) and Planck. The result is a stable, accurate, and computationally very efficient CMB temperature likelihood representation that allows the user to exploit the unique error propagation capabilities of the Gibbs sampler to high l. A single evaluation of this estimator between l = 2 and 200 takes ∼0.2 CPU milliseconds, while for comparison, a single pixel-space likelihood evaluation between l = 2 and 30 for a map with ∼2500 pixels requires ∼20 s. We apply this tool to the five-year WMAP temperature data, and re-estimate the angular temperature power spectrum, C_l, and likelihood, L(C_l), for l ≤ 200, and derive new cosmological parameters for the standard six-parameter ΛCDM model. Our spectrum is in excellent agreement with the official WMAP spectrum, but we find slight differences in the derived cosmological parameters. Most importantly, the spectral index of scalar perturbations is n_s = 0.973 ± 0.014, 1.9σ away from unity and 0.6σ higher than the official WMAP result, n_s = 0.965 ± 0.014. This suggests that an exact likelihood treatment is required to higher l than previously believed, reinforcing and extending our conclusions from the three-year WMAP analysis. In that case, we found that the suboptimal likelihood approximation adopted between l = 12 and 30 by the WMAP team biased n_s low by 0.4σ, while here we find that the same approximation ...
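
    A schematic of the Gaussianization step, assuming Gibbs samples of the spectrum are available as an (n samples) x (multipoles) array; a rank-based marginal transform stands in here for the paper's smooth fit of the marginal distributions, after which the joint is summarized by a multivariate Gaussian.

```python
import numpy as np
from scipy.stats import norm

def gaussianize(samples):
    """Map each marginal (here: samples of C_l per multipole) to a standard
    Gaussian via its empirical CDF, then fit a joint multivariate Gaussian."""
    n, d = samples.shape
    z = np.empty_like(samples, dtype=float)
    for j in range(d):
        ranks = samples[:, j].argsort().argsort() + 1    # ranks 1..n
        z[:, j] = norm.ppf(ranks / (n + 1))              # empirical CDF -> Gaussian
    return z, z.mean(0), np.cov(z.T)

rng = np.random.default_rng(8)
samples = rng.gamma([2, 3, 5], 1.0, size=(10_000, 3))  # skewed stand-ins for C_l marginals
z, mu, cov = gaussianize(samples)
print(np.round(cov, 3))
```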

  11. Analysis of gas-liquid metal two-phase flows using a reactor safety analysis code SIMMER-III

    International Nuclear Information System (INIS)

    Suzuki, Tohru; Tobita, Yoshiharu; Kondo, Satoru; Saito, Yasushi; Mishima, Kaichiro

    2003-01-01

    SIMMER-III, a safety analysis code for liquid-metal fast reactors (LMFRs), includes a momentum exchange model based on conventional correlations for ordinary gas-liquid flows, such as an air-water system. From the viewpoint of the safety evaluation of core disruptive accidents (CDAs) in LMFRs, we need to confirm that the code can predict the two-phase flow behaviors with high liquid-to-gas density ratios formed during a CDA. In the present study, the momentum exchange model of SIMMER-III was assessed and improved using experimental data on two-phase flows containing liquid metal, for which fundamental information, such as bubble shapes, void fractions and velocity fields, has been lacking. It was found that the original SIMMER-III can suitably represent high liquid-to-gas density ratio flows including the ellipsoidal bubbles seen at lower gas fluxes. In addition, the use of Kataoka-Ishii's correlation has improved the accuracy of SIMMER-III for gas-liquid metal flows with the cap-shaped bubbles identified at higher gas fluxes. Moreover, a new procedure, in which an appropriate drag coefficient can be automatically selected according to bubble shape, was developed. Through this work, the reliability and precision of SIMMER-III have been considerably improved with regard to bubbly flows over a range of liquid-to-gas density ratios.

  12. Efficient Detection of Repeating Sites to Accelerate Phylogenetic Likelihood Calculations.

    Science.gov (United States)

    Kobert, K; Stamatakis, A; Flouri, T

    2017-03-01

    The phylogenetic likelihood function (PLF) is the major computational bottleneck in several applications of evolutionary biology such as phylogenetic inference, species delimitation, model selection, and divergence times estimation. Given the alignment, a tree and the evolutionary model parameters, the likelihood function computes the conditional likelihood vectors for every node of the tree. Vector entries for which all input data are identical result in redundant likelihood operations which, in turn, yield identical conditional values. Such operations can be omitted for improving run-time and, using appropriate data structures, reducing memory usage. We present a fast, novel method for identifying and omitting such redundant operations in phylogenetic likelihood calculations, and assess the performance improvement and memory savings attained by our method. Using empirical and simulated data sets, we show that a prototype implementation of our method yields up to 12-fold speedups and uses up to 78% less memory than one of the fastest and most highly tuned implementations of the PLF currently available. Our method is generic and can seamlessly be integrated into any phylogenetic likelihood implementation. [Algorithms; maximum likelihood; phylogenetic likelihood function; phylogenetics]. © The Author(s) 2016. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.
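
    The coarsest version of the idea can be sketched simply: collapse identical alignment columns into unique patterns with weights, so each conditional likelihood is computed once and reused (Kobert et al. go further, detecting repeats within subtrees rather than only whole columns). Toy columns below are hypothetical.

```python
from collections import Counter

def compress_alignment(columns):
    """Collapse identical alignment columns into (pattern, weight) pairs."""
    weights = Counter(columns)
    return list(weights.keys()), list(weights.values())

# Toy alignment of 3 taxa x 8 sites, written as per-site columns.
columns = ["AAA", "AAT", "AAA", "GGC", "AAT", "AAA", "GGC", "AAA"]
patterns, weights = compress_alignment(columns)
print(patterns, weights)   # 3 unique patterns instead of 8 evaluations

# Total log-likelihood = sum over unique patterns of weight * per-site value:
# loglik = sum(w * site_loglik(p) for p, w in zip(patterns, weights))
```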

  13. Maintaining symmetry of simulated likelihood functions

    DEFF Research Database (Denmark)

    Andersen, Laura Mørch

    This paper suggests solutions to two different types of simulation errors related to Quasi-Monte Carlo integration. Likelihood functions which depend on the standard deviations of mixed parameters are symmetric in nature. This paper shows that antithetic draws preserve this symmetry and thereby ... improve precision substantially. Another source of error is that models testing away mixing dimensions must replicate the relevant dimensions of the quasi-random draws in the simulation of the restricted likelihood. These simulation errors are ignored in the standard estimation procedures used today ...
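
    A toy demonstration (not the mixed-model setting of the paper) of the mechanism: antithetic uniform pairs u and 1-u are symmetric around 0.5, so the induced normal draws are symmetric around zero, and a simulated probability computed from them has visibly lower variance than one from plain draws.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(9)

def sim_prob(u):
    # Toy simulated-likelihood ingredient: P(beta > 0.8) for beta ~ N(0,1),
    # approximated from uniform draws u via the inverse CDF.
    return np.mean(norm.ppf(u) > 0.8)

reps, n = 2000, 500
plain = [sim_prob(rng.uniform(size=n)) for _ in range(reps)]
anti = []
for _ in range(reps):
    u = rng.uniform(size=n // 2)
    anti.append(sim_prob(np.concatenate([u, 1.0 - u])))  # antithetic pairs (u, 1-u)
print(f"std plain: {np.std(plain):.4f}  std antithetic: {np.std(anti):.4f}")
```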

  14. Uncertainty about the true source. A note on the likelihood ratio at the activity level.

    Science.gov (United States)

    Taroni, Franco; Biedermann, Alex; Bozza, Silvia; Comte, Jennifer; Garbolino, Paolo

    2012-07-10

    This paper focuses on likelihood ratio based evaluations of fibre evidence in cases in which there is uncertainty about whether or not the reference item available for analysis - that is, an item typically taken from the suspect or seized at his home - is the item actually worn at the time of the offence. A likelihood ratio approach is proposed that, for situations in which certain categorical assumptions can be made about additionally introduced parameters, converges to formulae described in the existing literature. The properties of the proposed likelihood ratio approach are analysed through sensitivity analyses and discussed with respect to possible argumentative implications that arise in practice. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  15. Thin-plate spline analysis of the cranial base in subjects with Class III malocclusion.

    Science.gov (United States)

    Singh, G D; McNamara, J A; Lozanoff, S

    1997-08-01

    The role of the cranial base in the emergence of Class III malocclusion is not fully understood. This study determines the deformations that contribute to Class III cranial base morphology, employing thin-plate spline analysis on lateral cephalographs. A total of 73 children of European-American descent aged between 5 and 11 years with Class III malocclusion were compared with an equivalent group of subjects with a normal, untreated, Class I molar occlusion. The cephalographs were traced, checked and subdivided into seven age- and sex-matched groups. Thirteen points on the cranial base were identified and digitized. The datasets were scaled to an equivalent size, and statistical analysis indicated significant differences between average Class I and Class III cranial base morphologies for each group. Thin-plate spline analysis indicated that both affine (uniform) and non-affine transformations contribute toward the total spline for each average cranial base morphology at each age group analysed. For non-affine transformations, Partial warps 10, 8 and 7 had high magnitudes, indicating large-scale deformations affecting Bolton point, basion, pterygo-maxillare, Ricketts' point and articulare. In contrast, high eigenvalues associated with Partial warps 1-3, indicating localized shape changes, were found at tuberculum sellae, sella, and the frontonasomaxillary suture. It is concluded that large spatial-scale deformations affect the occipital complex of the cranial base and sphenoidal region, in combination with localized distortions at the frontonasal suture. These deformations may contribute to reduced orthocephalization or deficient antero-posterior flattening of the cranial base that, in turn, leads to the formation of a Class III malocclusion.

  16. Event-related fMRI studies of false memory: An Activation Likelihood Estimation meta-analysis.

    Science.gov (United States)

    Kurkela, Kyle A; Dennis, Nancy A

    2016-01-29

    Over the last two decades, a wealth of research in the domain of episodic memory has focused on understanding the neural correlates mediating false memories, or memories for events that never happened. While several recent qualitative reviews have attempted to synthesize this literature, methodological differences amongst the empirical studies and a focus on only a sub-set of the findings has limited broader conclusions regarding the neural mechanisms underlying false memories. The current study performed a voxel-wise quantitative meta-analysis using activation likelihood estimation to investigate commonalities within the functional magnetic resonance imaging (fMRI) literature studying false memory. The results were broken down by memory phase (encoding, retrieval), as well as sub-analyses looking at differences in baseline (hit, correct rejection), memoranda (verbal, semantic), and experimental paradigm (e.g., semantic relatedness and perceptual relatedness) within retrieval. Concordance maps identified significant overlap across studies for each analysis. Several regions were identified in the general false retrieval analysis as well as multiple sub-analyses, indicating their ubiquitous, yet critical role in false retrieval (medial superior frontal gyrus, left precentral gyrus, left inferior parietal cortex). Additionally, several regions showed baseline- and paradigm-specific effects (hit/perceptual relatedness: inferior and middle occipital gyrus; CRs: bilateral inferior parietal cortex, precuneus, left caudate). With respect to encoding, analyses showed common activity in the left middle temporal gyrus and anterior cingulate cortex. No analysis identified a common cluster of activation in the medial temporal lobe. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. Elaboration Likelihood Model and an Analysis of the Contexts of Its Application

    OpenAIRE

    Aslıhan Kıymalıoğlu

    2014-01-01

    The Elaboration Likelihood Model (ELM), which supports the existence of two routes to persuasion, the central and peripheral routes, has been one of the major models of persuasion. As the number of studies of ELM in the Turkish literature is limited, a detailed explanation of the model together with a comprehensive literature review was considered a contribution towards filling this gap. The findings of the review reveal that the model was mostly used in marketing and advertising research, that the concept...

  18. The Use of the Elaboration Likelihood Model in Analyzing Information Technology Acceptance [Penggunaan Elaboration Likelihood Model dalam Menganalisis Penerimaan Teknologi Informasi]

    OpenAIRE

    vitrian, vitrian2

    2010-01-01

    This article discusses several technology acceptance models in organizations. A thorough analysis of how a technology gains acceptance helps managers plan the implementation of new technology and ensure that the new technology enhances the organization's performance. The Elaboration Likelihood Model (ELM) is one that sheds light on behavioral factors in the acceptance of information technology. The basic tenet of ELM states that human behavior in principle can be influenced through central r...

  19. The fine-tuning cost of the likelihood in SUSY models

    CERN Document Server

    Ghilencea, D M

    2013-01-01

    In SUSY models, the fine tuning of the electroweak (EW) scale with respect to their parameters gamma_i={m_0, m_{1/2}, mu_0, A_0, B_0,...} and the maximal likelihood L to fit the experimental data are usually regarded as two different problems. We show that, if one regards the EW minimum conditions as constraints that fix the EW scale, this commonly held view is not correct and that the likelihood contains all the information about fine-tuning. In this case we show that the corrected likelihood is equal to the ratio L/Delta of the usual likelihood L and the traditional fine tuning measure Delta of the EW scale. A similar result is obtained for the integrated likelihood over the set {gamma_i}, that can be written as a surface integral of the ratio L/Delta, with the surface in gamma_i space determined by the EW minimum constraints. As a result, a large likelihood actually demands a large ratio L/Delta or equivalently, a small chi^2_{new}=chi^2_{old}+2*ln(Delta). This shows the fine-tuning cost to the likelihood ...

  20. Maximum-likelihood estimation of the hyperbolic parameters from grouped observations

    DEFF Research Database (Denmark)

    Jensen, Jens Ledet

    1988-01-01

    ... a least-squares problem. The second procedure, Hypesti, first approaches the maximum-likelihood estimate by iterating in the profile log-likelihood function for the scale parameter. Close to the maximum of the likelihood function, the estimation is brought to an end by iteration, using all four parameters ...

  1. Thermodynamic data for predicting concentrations of Pu(III), Am(III), and Cm(III) in geologic environments

    Energy Technology Data Exchange (ETDEWEB)

    Rai, Dhanpat; Rao, Linfeng; Weger, H.T.; Felmy, A.R. [Pacific Northwest National Laboratory, WA (United States); Choppin, G.R. [Florida State University, Florida (United States); Yui, Mikazu [Japan Nuclear Cycle Development Inst., Tokai Works, Tokai, Ibaraki (Japan)

    1999-01-01

    This report provides thermodynamic data for predicting concentrations of Pu(III), Am(III), and Cm(III) in geologic environments, and contributes to the integration of the JNC chemical thermodynamic database, JNC-TDB (previously PNC-TDB), for the performance analysis of geological isolation systems for high-level radioactive wastes. Thermodynamic data for the formation of complexes or compounds with hydroxide, chloride, fluoride, carbonate, nitrate, sulfate and phosphate are discussed in this report. Where data for specific actinide(III) species are lacking, values were selected based on chemical analogy to other trivalent actinides. In this study, the Pitzer ion-interaction model is mainly used to extrapolate thermodynamic constants to zero ionic strength at 25°C. (author)

  2. Stability of Tl(III) in the context of speciation analysis of thallium in plants.

    Science.gov (United States)

    Sadowska, Monika; Biaduń, Ewa; Krasnodębska-Ostręga, Beata

    2016-02-01

    The paper presents both "good" and "bad" results obtained during speciation analysis of thallium in plant tissues of a hyperaccumulator of this metal. The subject was white mustard, Sinapis alba L., in which traces of trivalent thallium were found. The crucial point of this study (especially in the case of so unstable a thallium form as Tl(III)) was to prove that the presence of Tl(III) was not caused by the procedure of sample preparation itself, and that the whole analytical method provides reliable results. The choice of the method for conserving the initial speciation, extraction with the highest efficiency, and proving the correctness of the obtained data were the most difficult parts of the presented study. It was found that: both freezing and drying cause significant changes in the speciation of thallium; quantitative analysis could be performed only with fresh tissues of mustard plants; only short-term storage of an extract from fresh plant tissues is possible; the methodology is not the source of thallium(III); only the presence of DTPA can greatly limit the reduction of Tl(III) to Tl(I) (up to 1-3%); and UV irradiation results in disintegration of Tl(III)-DTPA in the presence of the plant matrix (reduction up to 90%). Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Likelihood ratio sequential sampling models of recognition memory.

    Science.gov (United States)

    Osth, Adam F; Dennis, Simon; Heathcote, Andrew

    2017-02-01

    The mirror effect - a phenomenon whereby a manipulation produces opposite effects on hit and false alarm rates - is a benchmark regularity of recognition memory. A likelihood ratio decision process, basing recognition on the relative likelihood that a stimulus is a target or a lure, naturally predicts the mirror effect, and so has been widely adopted in quantitative models of recognition memory. Glanzer, Hilford, and Maloney (2009) demonstrated that likelihood ratio models, assuming Gaussian memory strength, are also capable of explaining regularities observed in receiver-operating characteristics (ROCs), such as greater target than lure variance. Despite its central place in theorising about recognition memory, however, this class of models has not been tested using response time (RT) distributions. In this article, we develop a linear approximation to the likelihood ratio transformation, which we show predicts the same regularities as the exact transformation. This development enabled us to develop a tractable model of recognition-memory RT based on the diffusion decision model (DDM), with inputs (drift rates) provided by an approximate likelihood ratio transformation. We compared this "LR-DDM" to a standard DDM where all targets and lures receive their own drift rate parameters. Both were implemented as hierarchical Bayesian models and applied to four datasets. Model selection taking into account parsimony favored the LR-DDM, which requires fewer parameters than the standard DDM but still fits the data well. These results support log-likelihood based models as providing an elegant explanation of the regularities of recognition memory, not only in terms of the choices made but also in terms of the times taken to make them. Copyright © 2016 Elsevier Inc. All rights reserved.

  4. Determination of point of maximum likelihood in failure domain using genetic algorithms

    International Nuclear Information System (INIS)

    Obadage, A.S.; Harnpornchai, N.

    2006-01-01

    The point of maximum likelihood in a failure domain yields the highest value of the probability density function in the failure domain. The maximum-likelihood point thus represents the worst combination of random variables that contribute to the failure event. In this work, Genetic Algorithms (GAs) with an adaptive penalty scheme have been proposed as a tool for the determination of the maximum-likelihood point. The use of only numerical values in the GA operations makes the algorithms applicable to cases of non-linear and implicit single and multiple limit state function(s). The algorithmic simplicity readily extends its application to higher-dimensional problems. When combined with Monte Carlo simulation, the proposed methodology will reduce the computational complexity and at the same time enhance the possibility of rare-event analysis under limited computational resources. Since no approximation is made in the procedure, the solution obtained is considered accurate. Consequently, GAs can be used as a tool for increasing the computational efficiency of element and system reliability analyses
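
    A toy GA with an adaptive penalty (operators and constants are arbitrary choices for illustration, not the authors' scheme), applied to a linear limit state where the maximum-likelihood failure point is known analytically to be (2.5, 2.5) for standard normal variables:

```python
import numpy as np

def ga_max_likelihood_point(g, dim=2, pop=200, gens=300, seed=10):
    """Maximize the standard-normal log-density inside {x : g(x) <= 0},
    penalizing constraint violations with a weight that grows over time."""
    rng = np.random.default_rng(seed)
    X = rng.normal(0, 3, (pop, dim))

    def fitness(X, gen):
        logpdf = -0.5 * np.sum(X ** 2, axis=1)                  # up to a constant
        penalty = (1 + 10 * gen / gens) * np.maximum(g(X), 0)   # adaptive penalty
        return logpdf - 10 * penalty

    for gen in range(gens):
        fit = fitness(X, gen)
        best = X[np.argmax(fit)].copy()                    # elitism
        i, j = rng.integers(0, pop, (2, pop))              # tournament selection
        parents = np.where((fit[i] >= fit[j])[:, None], X[i], X[j])
        alpha = rng.uniform(size=(pop, 1))                 # arithmetic crossover
        X = alpha * parents + (1 - alpha) * parents[rng.permutation(pop)]
        X += rng.normal(0, 0.05, X.shape)                  # Gaussian mutation
        X[0] = best
    return X[np.argmax(fitness(X, gens))]

# Linear limit state g(x) = 5 - x1 - x2: failure when x1 + x2 >= 5.
g = lambda X: 5.0 - X[:, 0] - X[:, 1]
print(ga_max_likelihood_point(g))   # converges near (2.5, 2.5)
```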

  5. MATHEMATICAL RISK ANALYSIS: VIA NICHOLAS RISK MODEL AND BAYESIAN ANALYSIS

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-07-01

    Full Text Available The objective of this second part of a two-phased study was to explore the predictive power of quantitative risk analysis (QRA) methods and processes within a Higher Education Institution (HEI). The method and process investigated impact analysis via the Nicholas risk model and Bayesian analysis, with a sample of one hundred (100) risk analysts in a historically black South African university in the greater Eastern Cape Province. The first finding supported and confirmed the previous literature (King III report, 2009; Nicholas and Steyn, 2008; Stoney, 2007; COSA, 2004) that there is a direct relationship between a risk factor, its likelihood and its impact, ceteris paribus. The second finding, in relation to controlling either the likelihood or the impact of occurrence of a risk (Nicholas risk model), was that to obtain a better risk reward it is more important to control the likelihood of occurrence of risks than their impact, so as to have a direct effect on the entire university. On the Bayesian analysis, the third finding was that the impact of risk should be predicted along three aspects: the human impact (decisions made), the property impact (student- and infrastructure-based), and the business impact. Lastly, the study revealed that although business cycles vary considerably depending on the industry and/or institution, most impacts in an HEI (university) fall within the period of one academic year. The recommendation was that the application of quantitative risk analysis should be related to the current legislative framework that affects HEIs.

  6. Extraction and separation studies of Ga(III), In(III) and Tl(III) using the neutral organophosphorous extractant, Cyanex-923

    Directory of Open Access Journals (Sweden)

    P. M. DHADKE

    2003-07-01

    Full Text Available The neutral extractant Cyanex-923 has been used for the extraction and separation of gallium(III), indium(III) and thallium(III) from acidic solution. These metal ions were found to be quantitatively extracted with Cyanex-923 in toluene in the pH ranges 4.5-5.5, 5.0-6.5 and 1.5-3.0, respectively, and from the organic phase they can be stripped with 2.0 mol dm-3 HNO3, 3.0 mol dm-3 HNO3 and 3.0 mol dm-3 HCl, respectively. The effects of pH, equilibration period, diluents, diverse ions and stripping agents on the extraction of Ga(III), In(III) and Tl(III) have been studied. The stoichiometry of the extracted species of these metal ions was determined on the basis of the slope analysis method. The reaction proceeds by solvation and the probable extracted species found were [MCl3.3Cyanex-923] [where M = Ga(III) or In(III)] and [HTlCl4.3Cyanex-923]. Based on these results, a sequential procedure for the separation of Ga(III), In(III) and Tl(III) from each other was developed.

  7. Hypnosis and pain perception: An Activation Likelihood Estimation (ALE) meta-analysis of functional neuroimaging studies.

    Science.gov (United States)

    Del Casale, Antonio; Ferracuti, Stefano; Rapinesi, Chiara; De Rossi, Pietro; Angeletti, Gloria; Sani, Gabriele; Kotzalidis, Georgios D; Girardi, Paolo

    2015-12-01

    Several studies reported that hypnosis can modulate pain perception and tolerance by affecting cortical and subcortical activity in brain regions involved in these processes. We conducted an Activation Likelihood Estimation (ALE) meta-analysis on functional neuroimaging studies of pain perception under hypnosis to identify brain activation-deactivation patterns occurring during hypnotic suggestions aiming at pain reduction, including hypnotic analgesic, pleasant, or depersonalization suggestions (HASs). We searched the PubMed, Embase and PsycInfo databases; we included papers published in peer-reviewed journals dealing with functional neuroimaging and hypnosis-modulated pain perception. The ALE meta-analysis encompassed data from 75 healthy volunteers reported in 8 functional neuroimaging studies. HASs during experimentally-induced pain compared to control conditions correlated with significant activations of the right anterior cingulate cortex (Brodmann's Area [BA] 32), left superior frontal gyrus (BA 6), and right insula, and deactivation of right midline nuclei of the thalamus. HASs during experimental pain impact both cortical and subcortical brain activity. The anterior cingulate, left superior frontal, and right insular cortices activation increases could induce a thalamic deactivation (top-down inhibition), which may correlate with reductions in pain intensity. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Maximum likelihood estimation for integrated diffusion processes

    DEFF Research Database (Denmark)

    Baltazar-Larios, Fernando; Sørensen, Michael

    We propose a method for obtaining maximum likelihood estimates of parameters in diffusion models when the data is a discrete time sample of the integral of the process, while no direct observations of the process itself are available. The data are, moreover, assumed to be contaminated by measurement errors. Integrated volatility is an example of this type of observations. Another example is ice-core data on oxygen isotopes used to investigate paleo-temperatures. The data can be viewed as incomplete observations of a model with a tractable likelihood function. Therefore we propose a simulated EM-algorithm to obtain maximum likelihood estimates of the parameters in the diffusion model. As part of the algorithm, we use a recent simple method for approximate simulation of diffusion bridges. In simulation studies for the Ornstein-Uhlenbeck process and the CIR process the proposed method works well.
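
As background for the abstract (which treats the much harder integrated, error-contaminated case via simulated EM), the sketch below handles only the far simpler directly observed Ornstein-Uhlenbeck process, whose exact discrete-time likelihood is Gaussian AR(1) and can be maximized numerically. All parameter values are arbitrary.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Exact maximum likelihood for a *directly observed* Ornstein-Uhlenbeck
# process: the discrete-time transition is Gaussian AR(1).
rng = np.random.default_rng(0)
theta, mu, sigma, dt, n = 1.5, 0.0, 0.5, 0.1, 2000

ar = np.exp(-theta * dt)                       # exact AR(1) coefficient
sd = sigma * np.sqrt((1 - ar**2) / (2 * theta))
x = np.empty(n)
x[0] = mu
for t in range(1, n):                          # simulate the path exactly
    x[t] = mu + ar * (x[t-1] - mu) + sd * rng.standard_normal()

def neg_loglik(p):
    th, m, sg = p
    if th <= 0 or sg <= 0:
        return np.inf
    a = np.exp(-th * dt)
    s = sg * np.sqrt((1 - a**2) / (2 * th))
    resid = x[1:] - (m + a * (x[:-1] - m))
    return -norm.logpdf(resid, scale=s).sum()

fit = minimize(neg_loglik, x0=[1.0, 0.1, 1.0], method="Nelder-Mead")
print(fit.x)                                   # close to (1.5, 0.0, 0.5)
```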

  9. High-order Composite Likelihood Inference for Max-Stable Distributions and Processes

    KAUST Repository

    Castruccio, Stefano; Huser, Raphaël; Genton, Marc G.

    2015-01-01

    In multivariate or spatial extremes, inference for max-stable processes observed at a large collection of locations is a very challenging problem in computational statistics, and current approaches typically rely on less expensive composite likelihoods constructed from small subsets of data. In this work, we explore the limits of modern state-of-the-art computational facilities to perform full likelihood inference and to efficiently evaluate high-order composite likelihoods. With extensive simulations, we assess the loss of information of composite likelihood estimators with respect to a full likelihood approach for some widely-used multivariate or spatial extreme models, we discuss how to choose composite likelihood truncation to improve the efficiency, and we also provide recommendations for practitioners. This article has supplementary material online.
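
A minimal illustration of the pairwise composite-likelihood mechanics discussed above, with a distance-based truncation rule. A stationary Gaussian process serves as a stand-in: its bivariate log-densities take the place of the max-stable pair densities, which are model-specific. Sites, covariance, and the truncation radius are invented.

```python
import numpy as np
from scipy.stats import multivariate_normal

# Pairwise composite log-likelihood with distance truncation, using a
# Gaussian process as a stand-in for a max-stable model (whose bivariate
# densities would replace mvn.logpdf below).
rng = np.random.default_rng(1)
sites = rng.uniform(0, 10, size=(30, 2))            # hypothetical locations
d = np.linalg.norm(sites[:, None] - sites[None], axis=-1)
cov = np.exp(-d / 2.0)                              # exponential covariance
y = rng.multivariate_normal(np.zeros(30), cov)      # one simulated field

def pairwise_cl(range_par, dmax=3.0):
    """Sum of bivariate log-densities over pairs closer than dmax."""
    cl = 0.0
    for i in range(len(y)):
        for j in range(i + 1, len(y)):
            if d[i, j] < dmax:                      # truncation rule
                rho = np.exp(-d[i, j] / range_par)
                cl += multivariate_normal.logpdf(
                    [y[i], y[j]], cov=[[1, rho], [rho, 1]])
    return cl

# Evaluate on a grid; the maximizer is the composite-likelihood estimate.
grid = np.linspace(0.5, 5, 19)
print(grid[np.argmax([pairwise_cl(r) for r in grid])])
```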

  10. High-order Composite Likelihood Inference for Max-Stable Distributions and Processes

    KAUST Repository

    Castruccio, Stefano

    2015-09-29

    In multivariate or spatial extremes, inference for max-stable processes observed at a large collection of locations is a very challenging problem in computational statistics, and current approaches typically rely on less expensive composite likelihoods constructed from small subsets of data. In this work, we explore the limits of modern state-of-the-art computational facilities to perform full likelihood inference and to efficiently evaluate high-order composite likelihoods. With extensive simulations, we assess the loss of information of composite likelihood estimators with respect to a full likelihood approach for some widely-used multivariate or spatial extreme models, we discuss how to choose composite likelihood truncation to improve the efficiency, and we also provide recommendations for practitioners. This article has supplementary material online.

  11. Efficient estimators for likelihood ratio sensitivity indices of complex stochastic dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Arampatzis, Georgios; Katsoulakis, Markos A.; Rey-Bellet, Luc [Department of Mathematics and Statistics, University of Massachusetts, Amherst, Massachusetts 01003 (United States)

    2016-03-14

    We demonstrate that centered likelihood ratio estimators for the sensitivity indices of complex stochastic dynamics are highly efficient with low, constant in time variance and consequently they are suitable for sensitivity analysis in long-time and steady-state regimes. These estimators rely on a new covariance formulation of the likelihood ratio that includes as a submatrix a Fisher information matrix for stochastic dynamics and can also be used for fast screening of insensitive parameters and parameter combinations. The proposed methods are applicable to broad classes of stochastic dynamics such as chemical reaction networks, Langevin-type equations and stochastic models in finance, including systems with a high dimensional parameter space and/or disparate decorrelation times between different observables. Furthermore, they are simple to implement as a standard observable in any existing simulation algorithm without additional modifications.
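
A toy version of the likelihood-ratio (score-function) sensitivity estimator, on a static Poisson model rather than stochastic dynamics. Centering the observable mirrors the centered formulation mentioned in the abstract: it leaves the estimator's mean unchanged (the score has zero mean) while reducing its variance.

```python
import numpy as np

# Likelihood-ratio sensitivity estimation on a toy model: X ~ Poisson(lam),
# estimate d/dlam E[f(X)] without differentiating f, via the score
# S(x) = x/lam - 1 (the derivative of the log-likelihood).
rng = np.random.default_rng(2)
lam, n = 4.0, 200_000
x = rng.poisson(lam, n)
f = (x >= 6).astype(float)                   # observable: P(X >= 6)

score = x / lam - 1.0
plain = np.mean(f * score)
centered = np.mean((f - f.mean()) * score)   # same mean, lower variance
print(plain, centered)

# Finite-difference sanity check of the same derivative.
eps = 1e-3
fd = (np.mean(rng.poisson(lam + eps, n) >= 6)
      - np.mean(rng.poisson(lam - eps, n) >= 6)) / (2 * eps)
print(fd)
```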

  12. Sensitivity analysis of the terrestrial food chain model FOOD III

    International Nuclear Information System (INIS)

    Zach, Reto.

    1980-10-01

    As a first step in constructing a terrestrial food chain model suitable for long-term waste management situations, a numerical sensitivity analysis of FOOD III was carried out to identify important model parameters. The analysis involved 42 radionuclides, four pathways, 14 food types, 93 parameters and three percentages of parameter variation. We also investigated the importance of radionuclides, pathways and food types. The analysis involved a simple contamination model to render results from individual pathways comparable. The analysis showed that radionuclides vary greatly in their dose contribution to each of the four pathways, but relative contributions to each pathway are very similar. Man's and animals' drinking water pathways are much more important than the leaf and root pathways. However, this result depends on the contamination model used. All the pathways contain unimportant food types. Considering the number of parameters involved, FOOD III has too many different food types. Many of the parameters of the leaf and root pathway are important. However, this is true for only a few of the parameters of the animals' drinking water pathway, and for neither of the two parameters of man's drinking water pathway. The radiological decay constant increases the variability of these results. The dose factor is consistently the most important variable, and it explains most of the variability of radionuclide doses within pathways. Consideration of the variability of dose factors is important in contemporary as well as long-term waste management assessment models, if realistic estimates are to be made. (auth)
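
A one-at-a-time perturbation screening of the kind described above can be sketched in a few lines: vary each parameter by fixed percentages and rank parameters by the induced change in the output. The "dose model" and its parameters below are invented stand-ins, not the FOOD III pathway equations.

```python
import numpy as np

# One-at-a-time sensitivity screening: perturb each parameter up and down
# by a fixed percentage and record the relative spread of the output.
def dose(params):
    intake, conc, dose_factor, decay = params   # toy stand-in model
    return intake * conc * dose_factor * np.exp(-decay)

nominal = np.array([2.0, 0.5, 1e-7, 0.1])
names = ["intake rate", "concentration", "dose factor", "decay constant"]

for pct in (0.1, 0.25, 0.5):                    # three perturbation levels
    for i, name in enumerate(names):
        hi, lo = nominal.copy(), nominal.copy()
        hi[i] *= 1 + pct
        lo[i] *= 1 - pct
        spread = (dose(hi) - dose(lo)) / dose(nominal)
        print(f"{pct:.0%} {name}: relative spread = {spread:+.3f}")
```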

  13. Earthquake likelihood model testing

    Science.gov (United States)

    Schorlemmer, D.; Gerstenberger, M.C.; Wiemer, S.; Jackson, D.D.; Rhoades, D.A.

    2007-01-01

    INTRODUCTION: The Regional Earthquake Likelihood Models (RELM) project aims to produce and evaluate alternate models of earthquake potential (probability per unit volume, magnitude, and time) for California. Based on differing assumptions, these models are produced to test the validity of their assumptions and to explore which models should be incorporated in seismic hazard and risk evaluation. Tests based on physical and geological criteria are useful but we focus on statistical methods using future earthquake catalog data only. We envision two evaluations: a test of consistency with observed data and a comparison of all pairs of models for relative consistency. Both tests are based on the likelihood method, and both are fully prospective (i.e., the models are not adjusted to fit the test data). To be tested, each model must assign a probability to any possible event within a specified region of space, time, and magnitude. For our tests the models must use a common format: earthquake rates in specified “bins” with location, magnitude, time, and focal mechanism limits. Seismology cannot yet deterministically predict individual earthquakes; however, it should seek the best possible models for forecasting earthquake occurrence. This paper describes the statistical rules of an experiment to examine and test earthquake forecasts. The primary purposes of the tests described below are to evaluate physical models for earthquakes, assure that source models used in seismic hazard and risk studies are consistent with earthquake data, and provide quantitative measures by which models can be assigned weights in a consensus model or be judged as suitable for particular regions. In this paper we develop a statistical method for testing earthquake likelihood models. A companion paper (Schorlemmer and Gerstenberger 2007, this issue) discusses the actual implementation of these tests in the framework of the RELM initiative. Statistical testing of hypotheses is a common task and a ...
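
A compact sketch of the kind of bin-based Poisson consistency check the RELM framework builds on: score the observed bin counts by their joint log-likelihood under the forecast, then calibrate that score against catalogs simulated from the forecast itself. Rates and counts here are synthetic, not a real forecast.

```python
import numpy as np
from scipy.stats import poisson

# Consistency (L-test style) check of a binned earthquake forecast: compare
# the joint Poisson log-likelihood of the observed counts with its
# distribution under catalogs simulated from the forecast itself.
rng = np.random.default_rng(3)
rates = rng.gamma(0.5, 0.2, size=500)        # forecast: expected events per bin
obs = rng.poisson(rates)                     # observed catalog (simulated here)

def loglik(counts):
    return poisson.logpmf(counts, rates).sum()

L_obs = loglik(obs)
L_sim = np.array([loglik(rng.poisson(rates)) for _ in range(2000)])
gamma = np.mean(L_sim <= L_obs)              # quantile score of the observation
print(L_obs, gamma)                          # tiny gamma -> forecast rejected
```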

  14. EPR spectrum deconvolution and dose assessment of fossil tooth enamel using maximum likelihood common factor analysis

    International Nuclear Information System (INIS)

    Vanhaelewyn, G.; Callens, F.; Gruen, R.

    2000-01-01

    In order to determine the components which give rise to the EPR spectrum around g = 2, we have applied Maximum Likelihood Common Factor Analysis (MLCFA) to the EPR spectra of enamel sample 1126, which has previously been analysed by continuous-wave and pulsed EPR as well as EPR microscopy. MLCFA yielded consistent results on three sets of X-band spectra and the following components were identified: an orthorhombic component attributed to CO₂⁻, an axial component CO₃³⁻, as well as four isotropic components, three of which could be attributed to SO₂⁻, a tumbling CO₂⁻ and the central line of a dimethyl radical. The X-band results were confirmed by analysis of Q-band spectra, where three additional isotropic lines were found; however, these three components could not be attributed to known radicals. The orthorhombic component was used to establish dose response curves for the assessment of the past radiation dose, D_E. The results appear to be more reliable than those based on conventional peak-to-peak EPR intensity measurements or simple Gaussian deconvolution methods.

  15. CFHTLenS: a Gaussian likelihood is a sufficient approximation for a cosmological analysis of third-order cosmic shear statistics

    Science.gov (United States)

    Simon, P.; Semboloni, E.; van Waerbeke, L.; Hoekstra, H.; Erben, T.; Fu, L.; Harnois-Déraps, J.; Heymans, C.; Hildebrandt, H.; Kilbinger, M.; Kitching, T. D.; Miller, L.; Schrabback, T.

    2015-05-01

    We study the correlations of the shear signal between triplets of sources in the Canada-France-Hawaii Telescope Lensing Survey (CFHTLenS) to probe cosmological parameters via the matter bispectrum. In contrast to previous studies, we adopt a non-Gaussian model of the data likelihood which is supported by our simulations of the survey. We find that for state-of-the-art surveys, similar to CFHTLenS, a Gaussian likelihood analysis is a reasonable approximation, albeit small differences in the parameter constraints are already visible. For future surveys we expect that a Gaussian model becomes inaccurate. Our algorithm for a refined non-Gaussian analysis and data compression is then of great utility, especially because it is not much more elaborate if simulated data are available. Applying this algorithm to the third-order correlations of shear alone in a blind analysis, we find a good agreement with the standard cosmological model: Σ₈ = σ₈(Ω_m/0.27)^0.64 = 0.79 (+0.08, −0.11) for a flat Λ cold dark matter cosmology with h = 0.7 ± 0.04 (68 per cent credible interval). Nevertheless our models provide only moderately good fits, as indicated by χ²/dof = 2.9, including a 20 per cent rms uncertainty in the predicted signal amplitude. The models cannot explain a signal drop on scales around 15 arcmin, which may be caused by systematics. It is unclear whether the discrepancy can be fully explained by residual point spread function systematics, of which we find evidence at least on scales of a few arcmin. Therefore we need a better understanding of higher order correlations of cosmic shear and their systematics to confidently apply them as cosmological probes.

  16. Factors Affecting Adjuvant Therapy in Stage III Pancreatic Cancer—Analysis of the National Cancer Database

    Directory of Open Access Journals (Sweden)

    Mridula Krishnan

    2017-08-01

    Full Text Available Background: Adjuvant therapy after curative resection is associated with a survival benefit in stage III pancreatic cancer. We analyzed the factors affecting the outcome of adjuvant therapy in stage III pancreatic cancer and compared overall survival with different modalities of adjuvant treatment. Methods: This is a retrospective study of patients with stage III pancreatic cancer listed in the National Cancer Database (NCDB) who were diagnosed between 2004 and 2012. Patients were stratified based on the adjuvant therapy they received. Unadjusted Kaplan-Meier and multivariable Cox regression analyses were performed. Results: We analyzed a cohort of 1731 patients who were recipients of adjuvant therapy for stage III pancreatic cancer within the limits of our database. Patients who received adjuvant chemoradiation had the longest postdiagnosis survival time, followed by patients who received adjuvant chemotherapy, and finally patients who received no adjuvant therapy. On multivariate analysis, advancing age and Medicaid coverage were associated with worse survival, whereas Spanish origin and a lower Charlson comorbidity score were associated with better survival. Conclusions: Our study is the largest analysis using the NCDB addressing the effects of adjuvant therapy specifically in stage III pancreatic cancer. Within the limits of our study, the survival benefit with adjuvant therapy was more apparent with longer duration from the date of diagnosis.

  17. Approximate likelihood approaches for detecting the influence of primordial gravitational waves in cosmic microwave background polarization

    Science.gov (United States)

    Pan, Zhen; Anderes, Ethan; Knox, Lloyd

    2018-05-01

    One of the major targets for next-generation cosmic microwave background (CMB) experiments is the detection of the primordial B-mode signal. Planning is under way for Stage-IV experiments that are projected to have instrumental noise small enough to make lensing and foregrounds the dominant source of uncertainty for estimating the tensor-to-scalar ratio r from polarization maps. This makes delensing a crucial part of future CMB polarization science. In this paper we present a likelihood method for estimating the tensor-to-scalar ratio r from CMB polarization observations, which combines the benefits of a full-scale likelihood approach with the tractability of the quadratic delensing technique. This method is a pixel-space, all-order likelihood analysis of the quadratically delensed B modes, and it essentially builds upon the quadratic delenser by taking into account all-order lensing and pixel-space anomalies. Its tractability relies on a crucial factorization of the pixel-space covariance matrix of the polarization observations which allows one to compute the full Gaussian approximate likelihood profile, as a function of r, at the same computational cost as a single likelihood evaluation.

  18. Estimating likelihood of future crashes for crash-prone drivers

    OpenAIRE

    Subasish Das; Xiaoduan Sun; Fan Wang; Charles Leboeuf

    2015-01-01

    At-fault crash-prone drivers are usually considered the high-risk group for possible future incidents or crashes. In Louisiana, 34% of crashes are repeatedly committed by at-fault crash-prone drivers, who represent only 5% of the total licensed drivers in the state. This research conducted an exploratory data analysis based on driver faultiness and proneness. The objective of this study is to develop a crash prediction model to estimate the likelihood of future crashes for the a...

  19. Application of a stratum-specific likelihood ratio analysis in a screen for depression among a community-dwelling population in Japan

    Directory of Open Access Journals (Sweden)

    Sugawara N

    2017-09-01

    Full Text Available Norio Sugawara,1,2 Ayako Kaneda,2 Ippei Takahashi,3 Shigeyuki Nakaji,3 Norio Yasui-Furukori2 1Department of Clinical Epidemiology, Translational Medical Center, National Center of Neurology and Psychiatry, Kodaira, Tokyo, 2Department of Neuropsychiatry, Hirosaki University School of Medicine, Hirosaki, 3Department of Social Medicine, Hirosaki University School of Medicine, Hirosaki, Japan Background: Efficient screening for depression is important in community mental health. In this study, we applied a stratum-specific likelihood ratio (SSLR) analysis, which is independent of the prevalence of the target disease, to screen for depression among community-dwelling individuals. Method: The Center for Epidemiologic Studies Depression Scale (CES-D) and the Mini International Neuropsychiatric Interview (MINI) were administered to 789 individuals (19–87 years of age) who participated in the Iwaki Health Promotion Project 2011. Major depressive disorder (MDD) was assessed using the MINI. Results: For MDD, the SSLRs were 0.13 (95% CI 0.04–0.40), 3.68 (95% CI 1.37–9.89), and 24.77 (95% CI 14.97–40.98) for CES-D scores of 0–16, 17–20, and above 21, respectively. Conclusion: The validity of the CES-D is confirmed, and SSLR analysis is recommended for its practical value for the detection of individuals at risk of MDD in the Japanese community. Keywords: screening, depression, Center for Epidemiologic Studies Depression Scale, stratum-specific likelihood ratio
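
The SSLR computation itself is a few lines: within each score stratum, divide the proportion of cases by the proportion of non-cases. The counts below are hypothetical, chosen only to produce ratios of the same flavor as those reported, and are not the Iwaki study data.

```python
import numpy as np

# Stratum-specific likelihood ratios for a screening scale:
# SSLR(s) = P(score in stratum s | case) / P(score in stratum s | non-case).
strata = ["CES-D 0-16", "CES-D 17-20", "CES-D >= 21"]
cases = np.array([3, 5, 22])        # MDD by MINI (hypothetical counts)
noncases = np.array([700, 40, 19])  # no MDD (hypothetical counts)

sslr = (cases / cases.sum()) / (noncases / noncases.sum())
for s, lr in zip(strata, sslr):
    print(f"{s}: SSLR = {lr:.2f}")
# Post-test odds = pre-test odds * SSLR of the observed stratum.
```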

  20. Cox regression with missing covariate data using a modified partial likelihood method

    DEFF Research Database (Denmark)

    Martinussen, Torben; Holst, Klaus K.; Scheike, Thomas H.

    2016-01-01

    Missing covariate values are a common problem in survival analysis. In this paper we propose a novel method for the Cox regression model that is close to maximum likelihood but avoids the use of the EM-algorithm. It exploits that the observed hazard function is multiplicative in the baseline hazard...

  1. Gaussian likelihood inference on data from trans-Gaussian random fields with Matérn covariance function

    KAUST Repository

    Yan, Yuan; Genton, Marc G.

    2017-01-01

    Gaussian likelihood inference has been studied and used extensively in both statistical theory and applications due to its simplicity. However, in practice, the assumption of Gaussianity is rarely met in the analysis of spatial data. In this paper, we study the effect of non-Gaussianity on Gaussian likelihood inference for the parameters of the Matérn covariance model. By using Monte Carlo simulations, we generate spatial data from a Tukey g-and-h random field, a flexible trans-Gaussian random field, with the Matérn covariance function, where g controls skewness and h controls tail heaviness. We use maximum likelihood based on the multivariate Gaussian distribution to estimate the parameters of the Matérn covariance function. We illustrate the effects of non-Gaussianity of the data on the estimated covariance function by means of functional boxplots. Thanks to our tailored simulation design, a comparison of the maximum likelihood estimator under both the increasing and fixed domain asymptotics for spatial data is performed. We find that the maximum likelihood estimator based on the Gaussian likelihood is overall satisfactory and preferable to the non-distribution-based weighted least squares estimator for data from the Tukey g-and-h random field. We also present the result for Gaussian kriging based on Matérn covariance estimates with data from the Tukey g-and-h random field and observe an overall satisfactory performance.

  2. Gaussian likelihood inference on data from trans-Gaussian random fields with Matérn covariance function

    KAUST Repository

    Yan, Yuan

    2017-07-13

    Gaussian likelihood inference has been studied and used extensively in both statistical theory and applications due to its simplicity. However, in practice, the assumption of Gaussianity is rarely met in the analysis of spatial data. In this paper, we study the effect of non-Gaussianity on Gaussian likelihood inference for the parameters of the Matérn covariance model. By using Monte Carlo simulations, we generate spatial data from a Tukey g-and-h random field, a flexible trans-Gaussian random field, with the Matérn covariance function, where g controls skewness and h controls tail heaviness. We use maximum likelihood based on the multivariate Gaussian distribution to estimate the parameters of the Matérn covariance function. We illustrate the effects of non-Gaussianity of the data on the estimated covariance function by means of functional boxplots. Thanks to our tailored simulation design, a comparison of the maximum likelihood estimator under both the increasing and fixed domain asymptotics for spatial data is performed. We find that the maximum likelihood estimator based on the Gaussian likelihood is overall satisfactory and preferable to the non-distribution-based weighted least squares estimator for data from the Tukey g-and-h random field. We also present the result for Gaussian kriging based on Matérn covariance estimates with data from the Tukey g-and-h random field and observe an overall satisfactory performance.
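
For concreteness, the Tukey g-and-h transform itself is a one-liner; applying it marginally to a Gaussian random field gives the kind of trans-Gaussian data studied in the two records above. Here iid draws stand in for the field, and the g and h values are arbitrary.

```python
import numpy as np
from scipy import stats

# Tukey g-and-h transform of a standard normal variable: g controls
# skewness, h controls tail heaviness (g = h = 0 recovers the Gaussian).
def tukey_gh(z, g=0.5, h=0.2):
    core = (np.exp(g * z) - 1.0) / g if g != 0 else z
    return core * np.exp(h * z**2 / 2.0)

rng = np.random.default_rng(4)
z = rng.standard_normal(100_000)
x = tukey_gh(z)
print("skewness:", stats.skew(x), "excess kurtosis:", stats.kurtosis(x))
```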

  3. A three-dimensional soft tissue analysis of Class III malocclusion: a case-controlled cross-sectional study.

    Science.gov (United States)

    Johal, Ama; Chaggar, Amrit; Zou, Li Fong

    2018-03-01

    The present study used the optical surface laser scanning technique to compare the facial features of patients aged 8-18 years presenting with Class I and Class III incisor relationships in a case-control design. Subjects with a Class III incisor relationship, aged 8-18 years, were age- and gender-matched with Class I controls and underwent a 3-dimensional (3-D) optical surface scan of the facial soft tissues. Landmark analysis revealed that Class III subjects displayed greater mean dimensions than the control group, most notably between the ages of 8-10 and 17-18 years in both males and females, in respect of antero-posterior (P = 0.01) and vertical (P = 0.006) facial dimensions. Surface-based analysis revealed the greatest difference in the lower facial region, followed by the mid-face, whilst the upper face remained fairly consistent. Significant detectable differences were found in the surface facial features of developing Class III subjects.

  4. Analysis of the maximum likelihood channel estimator for OFDM systems in the presence of unknown interference

    Science.gov (United States)

    Dermoune, Azzouz; Simon, Eric Pierre

    2017-12-01

    This paper is a theoretical analysis of the maximum likelihood (ML) channel estimator for orthogonal frequency-division multiplexing (OFDM) systems in the presence of unknown interference. The following theoretical results are presented. Firstly, the uniqueness of the ML solution for practical applications, i.e., when thermal noise is present, is analytically demonstrated when the number of transmitted OFDM symbols is strictly greater than one. The ML solution is then derived from the iterative conditional ML (CML) algorithm. Secondly, it is shown that the channel estimate can be described as an algebraic function whose inputs are the initial value and the means and variances of the received samples. Thirdly, it is theoretically demonstrated that the channel estimator is not biased. The second and the third results are obtained by employing oblique projection theory. Furthermore, these results are confirmed by numerical results.
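
As baseline context for the ML/CML analysis above (which additionally models unknown interference), the standard pilot-based least-squares channel estimate is the natural reference point; with known pilots and white Gaussian noise it coincides with ML per subcarrier. All signals below are synthetic.

```python
import numpy as np

# Pilot-based least-squares OFDM channel estimation: with known pilot
# symbols X on each subcarrier, H_hat = Y / X is the per-subcarrier LS
# (and, for white Gaussian noise, ML) estimate.
rng = np.random.default_rng(13)
n_sub = 64
h = (rng.standard_normal(n_sub) + 1j * rng.standard_normal(n_sub)) / np.sqrt(2)
pilots = np.exp(1j * np.pi / 2 * rng.integers(0, 4, n_sub))   # QPSK pilots
noise = 0.05 * (rng.standard_normal(n_sub) + 1j * rng.standard_normal(n_sub))
y = h * pilots + noise                # received pilot subcarriers

h_hat = y / pilots                    # least-squares estimate per subcarrier
print("mean squared error:", np.mean(np.abs(h_hat - h) ** 2))
```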

  5. The modified signed likelihood statistic and saddlepoint approximations

    DEFF Research Database (Denmark)

    Jensen, Jens Ledet

    1992-01-01

    SUMMARY: For a number of tests in exponential families we show that the use of a normal approximation to the modified signed likelihood ratio statistic r* is equivalent to the use of a saddlepoint approximation. This is also true in a large deviation region where the signed likelihood ratio statistic r is of order √n. © 1992 Biometrika Trust.

  6. Quantifying the Establishment Likelihood of Invasive Alien Species Introductions Through Ports with Application to Honeybees in Australia.

    Science.gov (United States)

    Heersink, Daniel K; Caley, Peter; Paini, Dean R; Barry, Simon C

    2016-05-01

    The cost of an uncontrolled incursion of invasive alien species (IAS) arising from undetected entry through ports can be substantial, and knowledge of port-specific risks is needed to help allocate limited surveillance resources. Quantifying the establishment likelihood of such an incursion requires quantifying the ability of a species to enter, establish, and spread. Estimation of the approach rate of IAS into ports provides a measure of likelihood of entry. Data on the approach rate of IAS are typically sparse, and the combinations of risk factors relating to country of origin and port of arrival diverse. This presents challenges to making formal statistical inference on establishment likelihood. Here we demonstrate how these challenges can be overcome with judicious use of mixed-effects models when estimating the incursion likelihood into Australia of the European (Apis mellifera) and Asian (A. cerana) honeybees, along with the invasive parasites of biosecurity concern they host (e.g., Varroa destructor). Our results demonstrate how skewed the establishment likelihood is, with one-tenth of the ports accounting for 80% or more of the likelihood for both species. These results have been utilized by biosecurity agencies in the allocation of resources to the surveillance of maritime ports. © 2015 Society for Risk Analysis.

  7. Addition compounds of lanthanide (III) and yttrium (III) hexafluorophosphates and N,N-dimethylformamide

    International Nuclear Information System (INIS)

    Braga, L.S.P.

    1983-01-01

    Addition compounds of lanthanide(III) and yttrium(III) hexafluorophosphates with N,N-dimethylformamide are described. To characterize the complexes, elemental analysis, melting ranges, molar conductance measurements, X-ray powder patterns, infrared and Raman spectra, and TG and DTA curves were studied. Information concerning the decomposition of the adducts is obtained through the thermogravimetric curves and the differential thermal analysis curves. (M.J.C.) [pt

  8. Planck intermediate results: XVI. Profile likelihoods for cosmological parameters

    DEFF Research Database (Denmark)

    Bartlett, J.G.; Cardoso, J.-F.; Delabrouille, J.

    2014-01-01

    We explore the 2013 Planck likelihood function with a high-precision multi-dimensional minimizer (Minuit). This allows a refinement of the ΛCDM best-fit solution with respect to previously-released results, and the construction of frequentist confidence intervals using profile likelihoods. The agr...
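
Profile likelihood in miniature: fix the parameter of interest on a grid, minimize the negative log-likelihood over the nuisance parameter at each grid point, and read confidence intervals off the chi-square cutoff. This sketch uses a toy Gaussian model, not the Planck likelihood or Minuit.

```python
import numpy as np
from scipy.optimize import minimize

# Profile likelihood for the mean of a Gaussian sample, with the standard
# deviation treated as a nuisance parameter and profiled out.
rng = np.random.default_rng(5)
data = rng.normal(2.0, 1.5, size=200)

def nll(mu, log_sigma):
    s = np.exp(log_sigma)
    return 0.5 * np.sum(((data - mu) / s) ** 2) + len(data) * log_sigma

mus = np.linspace(1.5, 2.5, 41)
profile = np.array([minimize(lambda p: nll(m, p[0]), x0=[0.0]).fun
                    for m in mus])
delta = 2 * (profile - profile.min())
ci = mus[delta <= 1.0]                 # ~68% interval: 2*Delta(nll) <= 1
print(ci.min(), ci.max())
```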

  9. Planck 2013 results. XV. CMB power spectra and likelihood

    DEFF Research Database (Denmark)

    Tauber, Jan; Bartlett, J.G.; Bucher, M.

    2014-01-01

    This paper presents the Planck 2013 likelihood, a complete statistical description of the two-point correlation function of the CMB temperature fluctuations that accounts for all known relevant uncertainties, both instrumental and astrophysical in nature. We use this likelihood to derive our best...

  10. Maximum likelihood convolutional decoding (MCD) performance due to system losses

    Science.gov (United States)

    Webster, L.

    1976-01-01

    A model for predicting the computational performance of a maximum likelihood convolutional decoder (MCD) operating in a noisy carrier reference environment is described. This model is used to develop a subroutine that will be utilized by the Telemetry Analysis Program to compute the MCD bit error rate. When this computational model is averaged over noisy reference phase errors using a high-rate interpolation scheme, the results are found to agree quite favorably with experimental measurements.

  11. Ego involvement increases doping likelihood.

    Science.gov (United States)

    Ring, Christopher; Kavussanu, Maria

    2018-08-01

    Achievement goal theory provides a framework to help understand how individuals behave in achievement contexts, such as sport. Evidence concerning the role of motivation in the decision to use banned performance-enhancing substances (i.e., doping) is equivocal. The extant literature shows that dispositional goal orientation has been weakly and inconsistently associated with doping intention and use. It is possible that goal involvement, which describes the situational motivational state, is a stronger determinant of doping intention. Accordingly, the current study used an experimental design to examine the effects of goal involvement, manipulated using direct instructions and reflective writing, on doping likelihood in hypothetical situations in college athletes. The ego-involving goal increased doping likelihood compared to no goal and a task-involving goal. The present findings provide the first evidence that ego involvement can sway the decision to use doping to improve athletic performance.

  12. Components of soft tissue deformations in subjects with untreated Angle's Class III malocclusions: thin-plate spline analysis.

    Science.gov (United States)

    Singh, G D; McNamara, J A; Lozanoff, S

    1998-01-01

    While the dynamics of maxillo-mandibular allometry associated with the treatment modalities available for the management of Class III malocclusions are currently under investigation, developmental aberration of the soft tissues in untreated Class III malocclusions requires specification. In this study, lateral cephalographs of 124 prepubertal European-American children (71 with untreated Class III malocclusion; 53 with Class I occlusion) were traced, and 12 soft-tissue landmarks digitized. Resultant geometries were scaled to an equivalent size and mean Class III and Class I configurations compared. Procrustes analysis established a statistical difference between the mean configurations, and thin-plate spline (TPS) analysis indicated that both affine and non-affine transformations contribute towards the deformation (total spline) of the averaged Class III soft-tissue configuration. For non-affine transformations, partial warp 8 had the highest magnitude, indicating large-scale deformations visualized as a combination of columellar retrusion and lower labial protrusion. In addition, partial warp 5 also had a high magnitude, demonstrating upper labial vertical compression with antero-inferior elongation of the lower labio-mental soft tissue complex. Thus, children with Class III malocclusions demonstrate antero-posterior and vertical deformations of the maxillary soft tissue complex in combination with antero-inferior mandibular soft tissue elongation. This pattern of deformations may represent gene-environment interactions, resulting in Class III malocclusions with characteristic phenotypes that are amenable to orthodontic and dentofacial orthopedic manipulations.
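
The Procrustes step used above (scaling configurations to equivalent size before comparison) is available directly in SciPy; the landmark configurations below are invented. The TPS decomposition into affine and non-affine partial warps needs specialized morphometrics software and is not shown.

```python
import numpy as np
from scipy.spatial import procrustes

# Procrustes superimposition of two landmark configurations: both are
# translated, scaled, and rotated to best agreement; the residual
# "disparity" measures the remaining shape difference.
rng = np.random.default_rng(6)
class1 = rng.normal(size=(12, 2))              # 12 soft-tissue landmarks

ang = np.deg2rad(10)                           # rotate, scale, shift, perturb
rot = np.array([[np.cos(ang), -np.sin(ang)], [np.sin(ang), np.cos(ang)]])
class3 = 1.3 * class1 @ rot + rng.normal(0, 0.05, size=(12, 2)) + 5.0

m1, m2, disparity = procrustes(class1, class3)
print("Procrustes disparity:", disparity)
```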

  13. Computational Analysis of the G-III Laminar Flow Glove

    Science.gov (United States)

    Malik, Mujeeb R.; Liao, Wei; Lee-Rausch, Elizabeth M.; Li, Fei; Choudhari, Meelan M.; Chang, Chau-Lyan

    2011-01-01

    Under NASA's Environmentally Responsible Aviation Project, flight experiments are planned with the primary objective of demonstrating the Discrete Roughness Elements (DRE) technology for passive laminar flow control at chord Reynolds numbers relevant to transport aircraft. In this paper, we present a preliminary computational assessment of the Gulfstream-III (G-III) aircraft wing-glove designed to attain natural laminar flow for the leading-edge sweep angle of 34.6°. Analysis for a flight Mach number of 0.75 shows that it should be possible to achieve natural laminar flow for twice the transition Reynolds number ever achieved at this sweep angle. However, the wing-glove needs to be redesigned to effectively demonstrate passive laminar flow control using DREs. As a by-product of the computational assessment, effect of surface curvature on stationary crossflow disturbances is found to be strongly stabilizing for the current design, and it is suggested that convex surface curvature could be used as a control parameter for natural laminar flow design, provided transition occurs via stationary crossflow disturbances.

  14. Maximum likelihood estimation and EM algorithm of Copas-like selection model for publication bias correction.

    Science.gov (United States)

    Ning, Jing; Chen, Yong; Piao, Jin

    2017-07-01

    Publication bias occurs when the published research results are systematically unrepresentative of the population of studies that have been conducted, and is a potential threat to meaningful meta-analysis. The Copas selection model provides a flexible framework for correcting estimates and offers considerable insight into the publication bias. However, maximizing the observed likelihood under the Copas selection model is challenging because the observed data contain very little information on the latent variable. In this article, we study a Copas-like selection model and propose an expectation-maximization (EM) algorithm for estimation based on the full likelihood. Empirical simulation studies show that the EM algorithm and its associated inferential procedure perform well and avoid the non-convergence problem when maximizing the observed likelihood. © The Author 2017. Published by Oxford University Press. All rights reserved.
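
To make the EM mechanics at issue concrete, here is the algorithm on a deliberately simple two-component Gaussian mixture, not the Copas selection model: the E-step computes posterior responsibilities for the latent labels, and the M-step re-estimates parameters in closed form.

```python
import numpy as np
from scipy.stats import norm

# EM for a toy two-component Gaussian mixture (illustration of the E/M
# steps only; the Copas-like model requires its own E- and M-steps).
rng = np.random.default_rng(7)
x = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 1, 700)])

w, mu1, mu2, s1, s2 = 0.5, -1.0, 1.0, 1.0, 1.0
for _ in range(100):
    # E-step: posterior probability that each point came from component 1
    p1 = w * norm.pdf(x, mu1, s1)
    p2 = (1 - w) * norm.pdf(x, mu2, s2)
    r = p1 / (p1 + p2)
    # M-step: closed-form weighted updates
    w = r.mean()
    mu1, mu2 = np.average(x, weights=r), np.average(x, weights=1 - r)
    s1 = np.sqrt(np.average((x - mu1) ** 2, weights=r))
    s2 = np.sqrt(np.average((x - mu2) ** 2, weights=1 - r))

print(w, mu1, mu2, s1, s2)
```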

  15. Likelihood-ratio-based biometric verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    2002-01-01

    This paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that for single-user verification the likelihood ratio is optimal.

  16. Likelihood Ratio-Based Biometric Verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    The paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that, for single-user verification, the likelihood ratio is optimal.
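
A minimal sketch of the likelihood-ratio verification described in the two records above, assuming Gaussian user and background models with made-up parameters: accept the claimed identity when the log-likelihood ratio exceeds a threshold.

```python
import numpy as np
from scipy.stats import multivariate_normal as mvn

# Likelihood-ratio verification for fixed-length feature vectors:
# accept when p(x | user) / p(x | background) exceeds a threshold.
dim = 8
rng = np.random.default_rng(8)
user_mean = rng.normal(size=dim)
user = mvn(mean=user_mean, cov=0.2 * np.eye(dim))   # enrolled user model
world = mvn(mean=np.zeros(dim), cov=np.eye(dim))    # background model

def verify(x, threshold=0.0):
    llr = user.logpdf(x) - world.logpdf(x)          # log-likelihood ratio
    return llr, llr > threshold

genuine = user.rvs(random_state=9)                  # sample from each model
impostor = world.rvs(random_state=10)
print(verify(genuine), verify(impostor))
```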

  17. Planck 2013 results. XV. CMB power spectra and likelihood

    CERN Document Server

    Ade, P.A.R.; Armitage-Caplan, C.; Arnaud, M.; Ashdown, M.; Atrio-Barandela, F.; Aumont, J.; Baccigalupi, C.; Banday, A.J.; Barreiro, R.B.; Bartlett, J.G.; Battaner, E.; Benabed, K.; Benoit, A.; Benoit-Levy, A.; Bernard, J.P.; Bersanelli, M.; Bielewicz, P.; Bobin, J.; Bock, J.J.; Bonaldi, A.; Bonavera, L.; Bond, J.R.; Borrill, J.; Bouchet, F.R.; Boulanger, F.; Bridges, M.; Bucher, M.; Burigana, C.; Butler, R.C.; Calabrese, E.; Cardoso, J.F.; Catalano, A.; Challinor, A.; Chamballu, A.; Chiang, L.Y.; Chiang, H.C.; Christensen, P.R.; Church, S.; Clements, D.L.; Colombi, S.; Colombo, L.P.L.; Combet, C.; Couchot, F.; Coulais, A.; Crill, B.P.; Curto, A.; Cuttaia, F.; Danese, L.; Davies, R.D.; Davis, R.J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Delouis, J.M.; Desert, F.X.; Dickinson, C.; Diego, J.M.; Dole, H.; Donzelli, S.; Dore, O.; Douspis, M.; Dunkley, J.; Dupac, X.; Efstathiou, G.; Elsner, F.; Ensslin, T.A.; Eriksen, H.K.; Finelli, F.; Forni, O.; Frailis, M.; Fraisse, A.A.; Franceschi, E.; Gaier, T.C.; Galeotta, S.; Galli, S.; Ganga, K.; Giard, M.; Giardino, G.; Giraud-Heraud, Y.; Gjerlow, E.; Gonzalez-Nuevo, J.; Gorski, K.M.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Gudmundsson, J.E.; Hansen, F.K.; Hanson, D.; Harrison, D.; Helou, G.; Henrot-Versille, S.; Hernandez-Monteagudo, C.; Herranz, D.; Hildebrandt, S.R.; Hivon, E.; Hobson, M.; Holmes, W.A.; Hornstrup, A.; Hovest, W.; Huffenberger, K.M.; Hurier, G.; Jaffe, T.R.; Jaffe, A.H.; Jewell, J.; Jones, W.C.; Juvela, M.; Keihanen, E.; Keskitalo, R.; Kiiveri, K.; Kisner, T.S.; Kneissl, R.; Knoche, J.; Knox, L.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lahteenmaki, A.; Lamarre, J.M.; Lasenby, A.; Lattanzi, M.; Laureijs, R.J.; Lawrence, C.R.; Le Jeune, M.; Leach, S.; Leahy, J.P.; Leonardi, R.; Leon-Tavares, J.; Lesgourgues, J.; Liguori, M.; Lilje, P.B.; Lindholm, V.; Linden-Vornle, M.; Lopez-Caniego, M.; Lubin, P.M.; Macias-Perez, J.F.; Maffei, B.; Maino, D.; Mandolesi, N.; Marinucci, D.; Maris, M.; Marshall, D.J.; Martin, P.G.; Martinez-Gonzalez, E.; Masi, S.; Matarrese, S.; Matthai, F.; Mazzotta, P.; Meinhold, P.R.; Melchiorri, A.; Mendes, L.; Menegoni, E.; Mennella, A.; Migliaccio, M.; Millea, M.; Mitra, S.; Miville-Deschenes, M.A.; Molinari, D.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Moss, A.; Munshi, D.; Naselsky, P.; Nati, F.; Natoli, P.; Netterfield, C.B.; Norgaard-Nielsen, H.U.; Noviello, F.; Novikov, D.; Novikov, I.; O'Dwyer, I.J.; Orieux, F.; Osborne, S.; Oxborrow, C.A.; Paci, F.; Pagano, L.; Pajot, F.; Paladini, R.; Paoletti, D.; Partridge, B.; Pasian, F.; Patanchon, G.; Paykari, P.; Perdereau, O.; Perotto, L.; Perrotta, F.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Ponthieu, N.; Popa, L.; Poutanen, T.; Pratt, G.W.; Prezeau, G.; Prunet, S.; Puget, J.L.; Rachen, J.P.; Rahlin, A.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Ricciardi, S.; Riller, T.; Ringeval, C.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Roudier, G.; Rowan-Robinson, M.; Rubino-Martin, J.A.; Rusholme, B.; Sandri, M.; Sanselme, L.; Santos, D.; Savini, G.; Scott, D.; Seiffert, M.D.; Shellard, E.P.S.; Spencer, L.D.; Starck, J.L.; Stolyarov, V.; Stompor, R.; Sudiwala, R.; Sureau, F.; Sutton, D.; Suur-Uski, A.S.; Sygnet, J.F.; Tauber, J.A.; Tavagnacco, D.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Tucci, M.; Tuovinen, J.; Turler, M.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Varis, J.; Vielva, P.; Villa, F.; Vittorio, N.; Wade, L.A.; Wandelt, B.D.; 
Wehus, I.K.; White, M.; White, S.D.M.; Yvon, D.; Zacchei, A.; Zonca, A.

    2014-01-01

    We present the Planck likelihood, a complete statistical description of the two-point correlation function of the CMB temperature fluctuations. We use this likelihood to derive the Planck CMB power spectrum over three decades in multipole l, covering 2 ≤ l ≤ 2500. At l ≥ 50, we employ a correlated Gaussian likelihood approximation based on angular cross-spectra derived from the 100, 143 and 217 GHz channels. We validate our likelihood through an extensive suite of consistency tests, and assess the impact of residual foreground and instrumental uncertainties on cosmological parameters. We find good internal agreement among the high-l cross-spectra with residuals of a few μK² at l ≤ 1000. We compare our results with foreground-cleaned CMB maps, and with cross-spectra derived from the 70 GHz Planck map, and find broad agreement in terms of spectrum residuals and cosmological parameters. The best-fit ΛCDM cosmology is in excellent agreement with preliminary Planck polarisation spectra. The standard ΛCDM cosmology is well constrained b...

  18. Gaussian copula as a likelihood function for environmental models

    Science.gov (United States)

    Wani, O.; Espadas, G.; Cecinati, F.; Rieckermann, J.

    2017-12-01

    Parameter estimation of environmental models always comes with uncertainty. To formally quantify this parametric uncertainty, a likelihood function needs to be formulated, which is defined as the probability of observations given fixed values of the parameter set. A likelihood function allows us to infer parameter values from observations using Bayes' theorem. The challenge is to formulate a likelihood function that reliably describes the error-generating processes which lead to the observed monitoring data, such as rainfall and runoff. If the likelihood function is not representative of the error statistics, the parameter inference will give biased parameter values. Several uncertainty estimation methods that are currently being used employ Gaussian processes as a likelihood function, because of their favourable analytical properties. A Box-Cox transformation is suggested to deal with non-symmetric and heteroscedastic errors, e.g. for flow data, which are typically more uncertain in high flows than in periods with low flows. The problem with transformations is that the results are conditional on hyper-parameters, for which it is difficult to formulate the analyst's belief a priori. In an attempt to address this problem, in this research work we suggest learning the nature of the error distribution from the errors made by the model in "past" forecasts. We use a Gaussian copula to generate semiparametric error distributions. (1) We show that this copula can then be used as a likelihood function to infer parameters, breaking away from the practice of using multivariate normal distributions. (2) Based on the results from a didactical example of predicting rainfall runoff, we demonstrate that the copula captures the predictive uncertainty of the model. (3) Finally, we find that the properties of autocorrelation and heteroscedasticity of errors are captured well by the copula, eliminating the need to use transforms. In summary, our findings suggest that copulas are an
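
A minimal sketch of the proposed construction: map past model errors to pseudo-observations by their ranks (the semiparametric marginals), estimate a correlation matrix on the normal scores, and evaluate the Gaussian copula log-density. The synthetic "errors" below stand in for real forecast residuals.

```python
import numpy as np
from scipy.stats import norm, rankdata

# Gaussian copula fitted to past model errors: empirical-rank marginals
# plus a correlation matrix R estimated on the normal scores.
rng = np.random.default_rng(11)
errors = rng.multivariate_normal([0, 0], [[1, 0.7], [0.7, 1]], size=500) ** 3

u = rankdata(errors, axis=0) / (len(errors) + 1)   # pseudo-observations
z = norm.ppf(u)                                    # normal scores
R = np.corrcoef(z, rowvar=False)                   # copula correlation

def copula_logdensity(unew):
    """Gaussian copula log-density at a vector of uniforms."""
    znew = norm.ppf(unew)
    Rinv = np.linalg.inv(R)
    _, logdet = np.linalg.slogdet(R)
    return -0.5 * logdet - 0.5 * znew @ (Rinv - np.eye(len(R))) @ znew

print(copula_logdensity(np.array([0.9, 0.85])))
```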

  19. Effects of stimulus type and strategy on mental rotation network: an Activation Likelihood Estimation meta-analysis

    Directory of Open Access Journals (Sweden)

    Barbara Tomasino

    2016-01-01

    Full Text Available We can predict what an object would look like if we were to see it from a different viewpoint. The brain network governing mental rotation (MR) has been studied using a variety of stimuli and task instructions. By using activation likelihood estimation (ALE) meta-analysis we tested whether different MR networks can be modulated by the type of stimulus (body vs. non-body parts) or by the type of task instructions (motor imagery-based vs. non-motor imagery-based MR instructions). Testing along the bodily vs. non-bodily stimulus axis revealed a bilateral sensorimotor activation for bodily-related as compared to non-bodily-related stimuli, and a posterior right-lateralized activation for non-bodily-related as compared to bodily-related stimuli. A top-down modulation of the network was exerted by the MR task-instruction frame, with a bilateral (preferentially left) sensorimotor network for motor imagery-based vs. non-motor imagery-based MR instructions, the latter activating a preferentially posterior right occipito-temporal-parietal network. The present quantitative meta-analysis summarizes and amends previous descriptions of the brain network related to MR and shows how it is modulated by top-down and bottom-up experimental factors.

  20. Comparative analysis of SLB for OPR1000 by using MEDUSA and CESEC-III codes

    International Nuclear Information System (INIS)

    Park, Jong Cheol; Park, Chan Eok; Kim, Shin Whan

    2005-01-01

    MEDUSA is a system thermal-hydraulics code developed by Korea Power Engineering Company (KOPEC) for Non-LOCA and LOCA analysis, using two-fluid, three-field governing equations for two-phase flow. Detailed descriptions of the MEDUSA code are given in the Reference. A lot of effort is now being made to investigate the applicability of the MEDUSA code, especially to Non-LOCA analysis, by comparing the analysis results with those from the current licensing code, CESEC-III. The comparative simulations of Pressurizer Level Control System (PLCS) Malfunction and Feedwater Line Break (FLB), which were carried out by C.E. Park and M.T. Oh, respectively, already showed that the MEDUSA code is applicable to the analysis of Non-LOCA events. In this paper, detailed thermal-hydraulic analyses for a Steam Line Break (SLB) without loss of off-site power were performed using the MEDUSA code. The calculation results for the Optimized Power Reactor 1000 (OPR1000) were also compared with those from CESEC-III for the purpose of code verification.

  1. Incorporating Nuisance Parameters in Likelihoods for Multisource Spectra

    CERN Document Server

    Conway, J.S.

    2011-01-01

    We describe here the general mathematical approach to constructing likelihoods for fitting observed spectra in one or more dimensions with multiple sources, including the effects of systematic uncertainties represented as nuisance parameters, when the likelihood is to be maximized with respect to these parameters. We consider three types of nuisance parameters: simple multiplicative factors, source spectra "morphing" parameters, and parameters representing statistical uncertainties in the predicted source spectra.
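
The simplest instance of the construction described above, assuming a binned Poisson likelihood with one signal-strength parameter and one multiplicative background-normalization nuisance carrying a 10% Gaussian constraint, maximized over both parameters; the spectra are invented.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import poisson, norm

# Binned likelihood with a signal strength mu and a multiplicative
# nuisance theta (background normalization, nominal 1, 10% constraint),
# fitted by maximizing the likelihood over both.
sig = np.array([1.0, 3.0, 8.0, 3.0, 1.0])       # predicted signal per bin
bkg = np.array([20.0, 18.0, 15.0, 12.0, 10.0])  # predicted background per bin
obs = np.array([22, 24, 26, 15, 11])            # observed counts

def nll(params):
    mu, theta = params
    expected = mu * sig + theta * bkg
    if np.any(expected <= 0):
        return np.inf
    return (-poisson.logpmf(obs, expected).sum()
            - norm.logpdf(theta, loc=1.0, scale=0.1))  # constraint term

fit = minimize(nll, x0=[1.0, 1.0], method="Nelder-Mead")
print("mu_hat, theta_hat =", fit.x)
```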

  2. Nitrato-complexes of Y(III), La(III), Ce(III), Pr(III), Nd(III), Sm(III), Gd(III), Tb(III), Dy(III) and Ho(III) with 2-(2'-pyridyl) benzimidazole

    Energy Technology Data Exchange (ETDEWEB)

    Mishra, A; Singh, M P; Singh, V K

    1982-05-01

    The nitrato-complexes [Y(PyBzH)₂(NO₃)₂]NO₃·H₂O and the corresponding Nd, Sm, Gd, Tb, Dy and Ho complexes (n = 1–3, m = 0–0.5; PyBzH = 2-(2′-pyridyl)benzimidazole) are formed on interaction of the ligand with the metal nitrates in ethanol. The electrical conductance values (116–129 Ω⁻¹ cm² mol⁻¹) suggest the 1:1 electrolyte nature of the complexes. Magnetic moment values of Ce (2.53 B.M.), Pr (3.62 B.M.), Nd (3.52 B.M.), Sm (1.70 B.M.), Gd (8.06 B.M.), Tb (9.44 B.M.), Dy (10.56 B.M.) and Ho (10.51 B.M.) in the complexes confirm the terpositive state of the metals. Infrared evidence is obtained for the existence of both coordinated (C₂ᵥ) and uncoordinated (D₃ₕ) nitrate groups. Electronic absorption spectra of the Pr(III)-, Nd(III)-, Sm(III)-, Tb(III)-, Dy(III)- and Ho(III)-complexes have been analysed in the light of LSJ terms.

  3. Nitrato-complexes of Y(III), La(III), Ce(III), Pr(III), Nd(III), Sm(III), Gd(III), Tb(III), Dy(III) and Ho(III) with 2-(2'-pyridyl) benzimidazole

    International Nuclear Information System (INIS)

    Mishra, A.; Singh, M.P.; Singh, V.K.

    1982-01-01

    The nitrato-complexes [Y(PyBzH)₂(NO₃)₂]NO₃·H₂O and the corresponding Nd, Sm, Gd, Tb, Dy and Ho complexes (n = 1–3, m = 0–0.5; PyBzH = 2-(2′-pyridyl)benzimidazole) are formed on interaction of the ligand with the metal nitrates in ethanol. The electrical conductance values (116–129 Ω⁻¹ cm² mol⁻¹) suggest the 1:1 electrolyte nature of the complexes. Magnetic moment values of Ce (2.53 B.M.), Pr (3.62 B.M.), Nd (3.52 B.M.), Sm (1.70 B.M.), Gd (8.06 B.M.), Tb (9.44 B.M.), Dy (10.56 B.M.) and Ho (10.51 B.M.) in the complexes confirm the terpositive state of the metals. Infrared evidence is obtained for the existence of both coordinated (C₂ᵥ) and uncoordinated (D₃ₕ) nitrate groups. Electronic absorption spectra of the Pr(III)-, Nd(III)-, Sm(III)-, Tb(III)-, Dy(III)- and Ho(III)-complexes have been analysed in the light of LSJ terms. (author)

  4. Speech perception in autism spectrum disorder: An activation likelihood estimation meta-analysis.

    Science.gov (United States)

    Tryfon, Ana; Foster, Nicholas E V; Sharda, Megha; Hyde, Krista L

    2018-02-15

    Autism spectrum disorder (ASD) is often characterized by atypical language profiles and auditory and speech processing. These can contribute to aberrant language and social communication skills in ASD. The study of the neural basis of speech perception in ASD can serve as a potential neurobiological marker of ASD early on, but mixed results across studies render it difficult to find a reliable neural characterization of speech processing in ASD. To this aim, the present study examined the functional neural basis of speech perception in ASD versus typical development (TD) using an activation likelihood estimation (ALE) meta-analysis of 18 qualifying studies. The present study included separate analyses for TD and ASD, which allowed us to examine patterns of within-group brain activation as well as both common and distinct patterns of brain activation across the ASD and TD groups. Overall, ASD and TD showed mostly common brain activation of speech processing in bilateral superior temporal gyrus (STG) and left inferior frontal gyrus (IFG). However, the results revealed trends for some distinct activation in the TD group showing additional activation in higher-order brain areas including left superior frontal gyrus (SFG), left medial frontal gyrus (MFG), and right IFG. These results provide a more reliable neural characterization of speech processing in ASD relative to previous single neuroimaging studies and motivate future work to investigate how these brain signatures relate to behavioral measures of speech processing in ASD. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Generalized Likelihood Uncertainty Estimation (GLUE) Using Multi-Optimization Algorithm as Sampling Method

    Science.gov (United States)

    Wang, Z.

    2015-12-01

    For decades, distributed and lumped hydrological models have furthered our understanding of hydrological systems. The development of large-scale, high-precision hydrological simulation has refined spatial descriptions of hydrological behavior. This trend has been accompanied by growth in model complexity and in the number of parameters, which brings new challenges for uncertainty quantification. Generalized Likelihood Uncertainty Estimation (GLUE) has been widely used in uncertainty analysis for hydrological models, combining Monte Carlo sampling with Bayesian estimation. However, the stochastic sampling of prior parameters adopted by GLUE appears inefficient, especially in high-dimensional parameter spaces. Heuristic optimization algorithms based on iterative evolution show better convergence speed and search performance. In light of these features, this study adopted the genetic algorithm, differential evolution, and the shuffled complex evolution algorithm to search the parameter space and obtain parameter sets of high likelihood. Based on this multi-algorithm sampling, hydrological model uncertainty analysis is conducted within the typical GLUE framework. To demonstrate the superiority of the new method, two hydrological models of different complexity are examined. The results show that the adaptive method tends to be efficient in sampling and effective in uncertainty analysis, providing an alternative path for uncertainty quantification.
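
Bare-bones GLUE, for orientation: Monte Carlo sampling of a parameter, an informal likelihood (here based on Nash-Sutcliffe efficiency), a behavioral cutoff, and likelihood-weighted summaries. The toy linear-reservoir "model" and all settings are invented; the paper's contribution is replacing the plain Monte Carlo sampling with heuristic optimizers.

```python
import numpy as np

# Minimal GLUE: sample parameters, score each run with an informal
# likelihood, keep "behavioral" sets, and weight predictions by likelihood.
rng = np.random.default_rng(12)
rain = rng.gamma(1.0, 2.0, size=100)

def model(k):                                  # toy: exponential smoothing
    q = np.zeros_like(rain)
    for t in range(1, len(rain)):
        q[t] = (1 - k) * q[t-1] + k * rain[t]
    return q

q_obs = model(0.3) + rng.normal(0, 0.2, size=100)   # synthetic "observations"

ks = rng.uniform(0.05, 0.95, size=5000)             # prior samples
nse = np.array([1 - np.sum((model(k) - q_obs) ** 2)
                / np.sum((q_obs - q_obs.mean()) ** 2) for k in ks])
behavioral = nse > 0.6                              # GLUE acceptance threshold
weights = nse[behavioral] / nse[behavioral].sum()   # informal likelihood weights
print(f"{behavioral.sum()} behavioral sets; "
      f"weighted mean k = {np.sum(ks[behavioral] * weights):.3f}")
```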

  6. Comparisons of likelihood and machine learning methods of individual classification

    Science.gov (United States)

    Guinand, B.; Topchy, A.; Page, K.S.; Burnham-Curtis, M. K.; Punch, W.F.; Scribner, K.T.

    2002-01-01

    Classification methods used in machine learning (e.g., artificial neural networks, decision trees, and k-nearest neighbor clustering) are rarely used with population genetic data. We compare different nonparametric machine learning techniques with parametric likelihood estimations commonly employed in population genetics for purposes of assigning individuals to their population of origin (“assignment tests”). Classifier accuracy was compared across simulated data sets representing different levels of population differentiation (low and high FST), number of loci surveyed (5 and 10), and allelic diversity (average of three or eight alleles per locus). Empirical data for the lake trout (Salvelinus namaycush) exhibiting levels of population differentiation comparable to those used in simulations were examined to further evaluate and compare classification methods. Classification error rates associated with artificial neural networks and likelihood estimators were lower for simulated data sets compared to k-nearest neighbor and decision tree classifiers over the entire range of parameters considered. Artificial neural networks only marginally outperformed the likelihood method for simulated data (0–2.8% lower error rates). The relative performance of each machine learning classifier improved relative to likelihood estimators for empirical data sets, suggesting an ability to “learn” and utilize properties of empirical genotypic arrays intrinsic to each population. Likelihood-based estimation methods provide a more accessible option for reliable assignment of individuals to the population of origin due to the intricacies in development and evaluation of artificial neural networks. In recent years, characterization of highly polymorphic molecular markers such as mini- and microsatellites and development of novel methods of analysis have enabled researchers to extend investigations of ecological and evolutionary processes below the population level to the level of
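
The likelihood side of an assignment test is short: under Hardy-Weinberg assumptions, the log-likelihood of a multilocus genotype in each candidate population is the sum of log genotype probabilities computed from that population's allele frequencies. The frequencies and genotype below are invented.

```python
import numpy as np

# Likelihood-based population assignment: score a multilocus genotype
# against each population's allele frequencies (Hardy-Weinberg assumed)
# and assign to the population with the highest log-likelihood.
freqs = {                          # per-locus allele frequencies, per population
    "popA": [np.array([0.7, 0.2, 0.1]), np.array([0.5, 0.5])],
    "popB": [np.array([0.2, 0.3, 0.5]), np.array([0.9, 0.1])],
}
genotype = [(0, 2), (0, 0)]        # allele indices at each of two loci

def loglik(pop):
    ll = 0.0
    for (a1, a2), p in zip(genotype, freqs[pop]):
        prob = p[a1] ** 2 if a1 == a2 else 2 * p[a1] * p[a2]
        ll += np.log(prob)
    return ll

print({pop: round(loglik(pop), 3) for pop in freqs})  # assign to the max
```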

  7. Proteomic and properties analysis of botanical insecticide rhodojaponin III-induced response of the diamondback moth, Plutella xylostella (L.).

    Directory of Open Access Journals (Sweden)

    Xiaolin Dong

    Full Text Available BACKGROUND: Rhodojaponin III, as a botanical insecticide, affects a wide variety of biological processes in insects, including reduction of feeding, suspension of development, and oviposition deterrence in adults, in a dose-dependent manner. However, the mode of these actions remains obscure. PRINCIPAL FINDINGS: In this study, a comparative proteomic approach was adopted to examine the effect of rhodojaponin III on Plutella xylostella (L.). After 48 hours of treatment, newly emerged moths were collected and protein samples were prepared. The proteins were separated by 2-DE, and a total of 31 proteins, identified by MALDI-TOF/TOF-MS/MS, were significantly affected by rhodojaponin III compared to the control. These differentially expressed proteins act in the nervous transduction, odorant degradation and metabolic change pathways. Further, gene expression patterns in treated and untreated moths were confirmed by qRT-PCR and western blot analysis. RNAi of the chemosensory protein (PxCSP) gene resulted in significantly increased oviposition on cabbage plants treated with rhodojaponin III. CONCLUSIONS: This analysis of rhodojaponin III-induced proteins and gene properties should support a better understanding of the potential molecular mechanism of the response to rhodojaponin III in moths of P. xylostella.

  8. The likelihood principle and its proof – a never-ending story…

    DEFF Research Database (Denmark)

    Jørgensen, Thomas Martini

    2015-01-01

    An ongoing controversy in the philosophy of statistics is the so-called “likelihood principle”, essentially stating that all evidence which is obtained from an experiment about an unknown quantity θ is contained in the likelihood function of θ. Common classical statistical methodology, such as the use of significance tests and confidence intervals, depends on the experimental procedure and unrealized events and thus violates the likelihood principle. The likelihood principle was identified by that name and proved in a famous paper by Allan Birnbaum in 1962. However, ever since, both the principle itself and its proof have been highly debated. This presentation will illustrate the debate over both the principle and its proof, from 1962 up to today. An often-used experiment to illustrate the controversy between the classical interpretation and evidential confirmation based on the likelihood principle

  9. Development of likelihood estimation method for criticality accidents of mixed oxide fuel fabrication facilities

    International Nuclear Information System (INIS)

    Tamaki, Hitoshi; Yoshida, Kazuo; Kimoto, Tatsuya; Hamaguchi, Yoshikane

    2010-01-01

    A criticality accident in a MOX fuel fabrication facility may occur depending on several parameters, such as mass inventory and plutonium enrichment. MOX handling units in the facility are designed and operated on the basis of the double contingency principle to prevent criticality accidents. Control failures of at least two parameters are needed for a criticality accident to occur. To evaluate the probability of such control failures, the criticality conditions of each parameter for a specific handling unit need to be clarified quantitatively with a criticality analysis computer code for accident scenario analysis. In addition, a computer-based control system for mass inventory is planned to be installed in MOX handling equipment in a commercial MOX fuel fabrication plant. Reliability analysis is another important issue in evaluating the likelihood of control failure caused by software malfunction. A likelihood estimation method for criticality accidents has been developed with these issues taken into consideration. In this paper, an example of analysis with the proposed method and the applicability of the method are shown through a trial application to a model MOX fabrication facility. (author)

  10. Likelihood of Suicidality at Varying Levels of Depression Severity: A Re-Analysis of NESARC Data

    Science.gov (United States)

    Uebelacker, Lisa A.; Strong, David; Weinstock, Lauren M.; Miller, Ivan W.

    2010-01-01

    Although it is clear that increasing depression severity is associated with more risk for suicidality, less is known about at what levels of depression severity the risk for different suicide symptoms increases. We used item response theory to estimate the likelihood of endorsing suicide symptoms across levels of depression severity in an…

  11. Synthesis, characterization and single crystal X-ray analysis of chlorobis(N,N-dimethyldithiocarbamato-S,S′)antimony(III)

    Directory of Open Access Journals (Sweden)

    H.P.S. Chauhan

    2015-07-01

    Full Text Available The title compound chlorobis(N,N-dimethyldithiocarbamato-S,S′)antimony(III) has been prepared in distilled acetonitrile and characterized by physicochemical [melting point and molecular weight determination, elemental analysis (C, H, N, S & Sb)] and spectral [FT–IR, far IR, NMR (1H & 13C)] studies. The crystal and molecular structure was further confirmed by single crystal X-ray diffraction analysis, which reveals a five-coordinate geometry for antimony(III) within a ClS4 donor set. The distortion in the co-planarity of ClSbS3 evidences the stereochemical influence exerted by the lone pair of electrons on antimony(III). Two centrosymmetrically related molecules held together via C–H···Cl secondary interactions result in molecular aggregation of the compound.

  12. Development and Implementation of Efficiency-Improving Analysis Methods for the SAGE III on ISS Thermal Model

    Science.gov (United States)

    Liles, Kaitlin; Amundsen, Ruth; Davis, Warren; Scola, Salvatore; Tobin, Steven; McLeod, Shawn; Mannu, Sergio; Guglielmo, Corrado; Moeller, Timothy

    2013-01-01

    The Stratospheric Aerosol and Gas Experiment III (SAGE III) instrument is the fifth in a series of instruments developed for monitoring aerosols and gaseous constituents in the stratosphere and troposphere. SAGE III will be delivered to the International Space Station (ISS) via the SpaceX Dragon vehicle in 2015. A detailed thermal model of the SAGE III payload has been developed in Thermal Desktop (TD). Several novel methods have been implemented to facilitate efficient payload-level thermal analysis, including: a design of experiments (DOE) methodology to determine the worst-case orbits for SAGE III while on ISS; use of TD assemblies to move the payload from the Dragon trunk to the Enhanced Operational Transfer Platform (EOTP) and on to its final home on the Expedite the Processing of Experiments to Space Station (ExPRESS) Logistics Carrier (ELC)-4; incorporation of older models in varying unit sets, with the ability to change units easily (including hardcoded logic blocks); case-based logic to activate heaters and active elements for varying scenarios within a single model; incorporation of several coordinate frames to easily map to structural models with differing geometries and locations; and streamlined results processing using an Excel-based text file plotter developed in-house at LaRC. This document presents an overview of the SAGE III thermal model and describes the development and implementation of these efficiency-improving analysis methods.

  13. Dissociating response conflict and error likelihood in anterior cingulate cortex.

    Science.gov (United States)

    Yeung, Nick; Nieuwenhuis, Sander

    2009-11-18

    Neuroimaging studies consistently report activity in anterior cingulate cortex (ACC) in conditions of high cognitive demand, leading to the view that ACC plays a crucial role in the control of cognitive processes. According to one prominent theory, the sensitivity of ACC to task difficulty reflects its role in monitoring for the occurrence of competition, or "conflict," between responses to signal the need for increased cognitive control. However, a contrasting theory proposes that ACC is the recipient rather than source of monitoring signals, and that ACC activity observed in relation to task demand reflects the role of this region in learning about the likelihood of errors. Response conflict and error likelihood are typically confounded, making the theories difficult to distinguish empirically. The present research therefore used detailed computational simulations to derive contrasting predictions regarding ACC activity and error rate as a function of response speed. The simulations demonstrated a clear dissociation between conflict and error likelihood: fast response trials are associated with low conflict but high error likelihood, whereas slow response trials show the opposite pattern. Using the N2 component as an index of ACC activity, an EEG study demonstrated that when conflict and error likelihood are dissociated in this way, ACC activity tracks conflict and is negatively correlated with error likelihood. These findings support the conflict-monitoring theory and suggest that, in speeded decision tasks, ACC activity reflects current task demands rather than the retrospective coding of past performance.

  14. VIPR III VADR SPIDER Structural Design and Analysis

    Science.gov (United States)

    Li, Wesley; Chen, Tony

    2016-01-01

    In support of the National Aeronautics and Space Administration (NASA) Vehicle Integrated Propulsion Research (VIPR) Phase III team to evaluate the volcanic ash environment effects on the Pratt & Whitney F117-PW-100 turbofan engine, NASA Armstrong Flight Research Center has successfully performed structural design and analysis on the Volcanic Ash Distribution Rig (VADR) and the Structural Particulate Integration Device for Engine Research (SPIDER) for the ash ingestion test. Static and dynamic load analyses were performed to ensure no structural failure would occur during the test. Modal analysis was conducted, and the results were used to develop engine power setting avoidance zones. These engine power setting avoidance zones were defined to minimize the dwell time when the natural frequencies of the VADR/SPIDER system coincided with the excitation frequencies of the engine which was operating at various revolutions per minute. Vortex-induced vibration due to engine suction air flow during the ingestion test was also evaluated, but was not a concern.

  15. A simulation study of likelihood inference procedures in rayleigh distribution with censored data

    International Nuclear Information System (INIS)

    Baklizi, S. A.; Baker, H. M.

    2001-01-01

    Inference procedures based on the likelihood function are considered for the one-parameter Rayleigh distribution with type 1 and type 2 censored data. Using simulation techniques, the finite sample performances of the maximum likelihood estimator and the large sample likelihood interval estimation procedures based on the Wald, the Rao, and the likelihood ratio statistics are investigated. It appears that the maximum likelihood estimator is unbiased. The approximate variance estimates obtained from the asymptotic normal distribution of the maximum likelihood estimator are accurate under type 2 censored data, while they tend to be smaller than the actual variances for type 1 censored data of small size. It also appears that interval estimation based on the Wald and Rao statistics requires much larger sample sizes than interval estimation based on the likelihood ratio statistic to attain reasonable accuracy. (authors). 15 refs., 4 tabs
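
    As an illustration of the setting above, the type 2 censored Rayleigh likelihood admits a closed-form MLE: with the r smallest of n observations observed and the rest censored at the largest observed value x_(r), setting the score to zero gives σ̂² = [Σ x_i² + (n−r)x_(r)²]/(2r). The sketch below is not from the paper; the parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
sigma_true, n, r = 2.0, 100, 70            # observe the r smallest of n (type 2 censoring)

x = np.sort(rng.rayleigh(scale=sigma_true, size=n))
obs, x_r = x[:r], x[r - 1]                 # observed order statistics and censoring point

# Closed-form MLE: sigma^2 = [sum of observed x_i^2 + (n - r) * x_(r)^2] / (2r)
T = np.sum(obs**2) + (n - r) * x_r**2
sigma_hat = np.sqrt(T / (2 * r))

# Wald 95% interval; at the MLE the observed information equals 4r / sigma_hat^2
se = sigma_hat / (2 * np.sqrt(r))
print(f"MLE {sigma_hat:.3f}, 95% Wald CI [{sigma_hat - 1.96*se:.3f}, {sigma_hat + 1.96*se:.3f}]")
```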

  16. From reads to genes to pathways: differential expression analysis of RNA-Seq experiments using Rsubread and the edgeR quasi-likelihood pipeline.

    Science.gov (United States)

    Chen, Yunshun; Lun, Aaron T L; Smyth, Gordon K

    2016-01-01

    In recent years, RNA sequencing (RNA-seq) has become a very widely used technology for profiling gene expression. One of the most common aims of RNA-seq profiling is to identify genes or molecular pathways that are differentially expressed (DE) between two or more biological conditions. This article demonstrates a computational workflow for the detection of DE genes and pathways from RNA-seq data by providing a complete analysis of an RNA-seq experiment profiling epithelial cell subsets in the mouse mammary gland. The workflow uses R software packages from the open-source Bioconductor project and covers all steps of the analysis pipeline, including alignment of read sequences, data exploration, differential expression analysis, visualization and pathway analysis. Read alignment and count quantification is conducted using the Rsubread package and the statistical analyses are performed using the edgeR package. The differential expression analysis uses the quasi-likelihood functionality of edgeR.

  17. MEGA5: Molecular Evolutionary Genetics Analysis Using Maximum Likelihood, Evolutionary Distance, and Maximum Parsimony Methods

    Science.gov (United States)

    Tamura, Koichiro; Peterson, Daniel; Peterson, Nicholas; Stecher, Glen; Nei, Masatoshi; Kumar, Sudhir

    2011-01-01

    Comparative analysis of molecular sequence data is essential for reconstructing the evolutionary histories of species and inferring the nature and extent of selective forces shaping the evolution of genes and species. Here, we announce the release of Molecular Evolutionary Genetics Analysis version 5 (MEGA5), user-friendly software for mining online databases, building sequence alignments and phylogenetic trees, and using methods of evolutionary bioinformatics in basic biology, biomedicine, and evolution. The newest addition in MEGA5 is a collection of maximum likelihood (ML) analyses for inferring evolutionary trees, selecting best-fit substitution models (nucleotide or amino acid), inferring ancestral states and sequences (along with probabilities), and estimating evolutionary rates site-by-site. In computer simulation analyses, ML tree inference algorithms in MEGA5 compared favorably with other software packages in terms of computational efficiency and the accuracy of the estimates of phylogenetic trees, substitution parameters, and rate variation among sites. The MEGA user interface has now been enhanced to be activity driven, making it easier to use for both beginners and experienced scientists. This version of MEGA is intended for the Windows platform, and it has been configured for effective use on Mac OS X and Linux desktops. It is available free of charge from http://www.megasoftware.net. PMID:21546353
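
    To make the "maximum likelihood" and "evolutionary distance" notions in the title concrete, the sketch below computes the ML evolutionary distance between two aligned sequences under the Jukes-Cantor (JC69) substitution model, d̂ = −(3/4)ln(1 − 4p̂/3), where p̂ is the observed mismatch fraction. This is a textbook illustration, not MEGA5 code, and the sequences are made up.

```python
import math

def jc69_ml_distance(seq1: str, seq2: str) -> float:
    """ML evolutionary distance under the Jukes-Cantor (JC69) model."""
    assert len(seq1) == len(seq2), "sequences must be aligned"
    p = sum(a != b for a, b in zip(seq1, seq2)) / len(seq1)   # mismatch fraction
    if p >= 0.75:
        return float("inf")                                   # saturated: distance undefined
    return -0.75 * math.log(1.0 - 4.0 * p / 3.0)

print(jc69_ml_distance("ACGTACGTAC", "ACGTACGAAC"))           # ~0.107 substitutions/site
```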

  18. Likelihood inference for unions of interacting discs

    DEFF Research Database (Denmark)

    Møller, Jesper; Helisova, K.

    2010-01-01

    This is probably the first paper which discusses likelihood inference for a random set using a germ-grain model, where the individual grains are unobservable, edge effects occur and other complications appear. We consider the case where the grains form a disc process modelled by a marked point process, where the germs are the centres and the marks are the associated radii of the discs. We propose to use a recent parametric class of interacting disc process models, where the minimal sufficient statistic depends on various geometric properties of the random set, and the density is specified......-based maximum likelihood inference and the effect of specifying different reference Poisson models....

  19. Extended maximum likelihood analysis of apparent flattenings of S0 and spiral galaxies

    International Nuclear Information System (INIS)

    Okamura, Sadanori; Takase, Bunshiro; Hamabe, Masaru; Nakada, Yoshikazu; Kodaira, Keiichi.

    1981-01-01

    Apparent flattenings of S0 and spiral galaxies compiled by Sandage et al. (1970) and van den Bergh (1977), and those listed in the Second Reference Catalogue (RC2), are analyzed by means of the extended maximum likelihood method recently developed in information theory for statistical model identification. Emphasis is put on the possible difference in the distribution of intrinsic flattenings between S0's and spirals as a group, and on the apparent disagreements present in previous results. The present analysis shows that: (1) One cannot conclude on the basis of the data in the Reference Catalogue of Bright Galaxies (RCBG) that the distribution of intrinsic flattenings of spirals is almost identical to that of S0's; spirals have wider dispersion than S0's, and there are more round systems among spirals than among S0's. (2) The distributions of intrinsic flattenings of S0's and spirals derived from the data in RC2 again indicate a significant difference between the two. (3) The distribution of intrinsic flattenings of S0's exhibits different characteristics depending upon the surface-brightness level; a distribution with one component is obtained from the data at the RCBG level (23.5 mag arcsec^-2) and one with two components at the RC2 level (25 mag arcsec^-2). (author)

  20. Final Safety Analysis Report (FSAR) for Building 332, Increment III

    Energy Technology Data Exchange (ETDEWEB)

    Odell, B. N.; Toy, Jr., A. J.

    1977-08-31

    This Final Safety Analysis Report (FSAR) supplements the Preliminary Safety Analysis Report (PSAR), dated January 18, 1974, for Building 332, Increment III of the Plutonium Materials Engineering Facility located at the Lawrence Livermore Laboratory (LLL). The FSAR, in conjunction with the PSAR, shows that the completed increment provides facilities for safely conducting the operations as described. These documents satisfy the requirements of ERDA Manual Appendix 6101, Annex C, dated April 8, 1971. The format and content of this FSAR complies with the basic requirements of the letter of request from ERDA San to LLL, dated March 10, 1972. Included as appendices in support of the FSAR are the Building 332 Operational Safety Procedure and the LLL Disaster Control Plan.

  1. Maximum Likelihood Estimation and Inference With Examples in R, SAS and ADMB

    CERN Document Server

    Millar, Russell B

    2011-01-01

    This book takes a fresh look at the popular and well-established method of maximum likelihood for statistical estimation and inference. It begins with an intuitive introduction to the concepts and background of likelihood, and moves through to the latest developments in maximum likelihood methodology, including general latent variable models and new material for the practical implementation of integrated likelihood using the free ADMB software. Fundamental issues of statistical inference are also examined, with a presentation of some of the philosophical debates underlying the choice of statis

  2. Generalized empirical likelihood methods for analyzing longitudinal data

    KAUST Repository

    Wang, S.

    2010-02-16

    Efficient estimation of parameters is a major objective in analyzing longitudinal data. We propose two generalized empirical likelihood based methods that take into consideration within-subject correlations. A nonparametric version of the Wilks theorem for the limiting distributions of the empirical likelihood ratios is derived. It is shown that one of the proposed methods is locally efficient among a class of within-subject variance-covariance matrices. A simulation study is conducted to investigate the finite sample properties of the proposed methods and compare them with the block empirical likelihood method by You et al. (2006) and the normal approximation with a correctly estimated variance-covariance. The results suggest that the proposed methods are generally more efficient than existing methods which ignore the correlation structure, and better in coverage compared to the normal approximation with correctly specified within-subject correlation. An application illustrating our methods and supporting the simulation study results is also presented.

  3. Maximum likelihood estimation of phase-type distributions

    DEFF Research Database (Denmark)

    Esparza, Luz Judith R

    This work is concerned with the statistical inference of phase-type distributions and the analysis of distributions with rational Laplace transform, known as matrix-exponential distributions. The thesis is focused on the estimation of the maximum likelihood parameters of phase-type distributions for both univariate and multivariate cases. Methods like the EM algorithm and Markov chain Monte Carlo are applied for this purpose. Furthermore, this thesis provides explicit formulae for computing the Fisher information matrix for discrete and continuous phase-type distributions, which is needed to find confidence regions for their estimated parameters. Finally, a new general class of distributions, called bilateral matrix-exponential distributions, is defined. These distributions have the entire real line as domain and can be used, for instance, for modelling. In addition, this class of distributions...

  4. Three-dimensional heat transfer analysis of the Doublet III beamline calorimeter

    International Nuclear Information System (INIS)

    Kamperschroer, J.H.; Pipkins, J.F.

    1979-10-01

    A general three-dimensional analysis has been formulated to study the flow of heat in a neutral beam calorimeter. The boundary value problem with an arbitrary incident heat flux has been solved using Fourier analysis and Laplace transform techniques. A general solution has been obtained and subsequently studied using numerical techniques as applied to the particular geometry and incident heat flux conditions of the Doublet III injection system. Negligible errors result in unfolding the incident heat flux through the use of thermocouples located near the rear surface, if data taking is initiated at the proper time and proceeds at a sufficiently rapid rate

  5. Comparison of standard maximum likelihood classification and polytomous logistic regression used in remote sensing

    Science.gov (United States)

    John Hogland; Nedret Billor; Nathaniel Anderson

    2013-01-01

    Discriminant analysis, referred to as maximum likelihood classification within popular remote sensing software packages, is a common supervised technique used by analysts. Polytomous logistic regression (PLR), also referred to as multinomial logistic regression, is an alternative classification approach that is less restrictive, more flexible, and easy to interpret. To...
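
    Maximum likelihood classification of the kind referred to above assigns each observation (e.g., pixel) to the class whose fitted Gaussian density gives it the highest likelihood. A minimal sketch on synthetic two-band data, assuming equal class priors (not code from the paper):

```python
import numpy as np

def fit_ml_classifier(X, y):
    """Per-class mean and covariance for Gaussian maximum likelihood classification."""
    return {c: (X[y == c].mean(axis=0), np.cov(X[y == c], rowvar=False))
            for c in np.unique(y)}

def classify(model, x):
    """Assign the class maximizing the Gaussian log-likelihood (equal priors assumed)."""
    def log_density(mu, cov):
        d = x - mu
        return -0.5 * (np.log(np.linalg.det(cov)) + d @ np.linalg.solve(cov, d))
    return max(model, key=lambda c: log_density(*model[c]))

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
print(classify(fit_ml_classifier(X, y), np.array([2.5, 3.1])))   # -> 1
```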

  6. Evidence of seasonal variation in longitudinal growth of height in a sample of boys from Stuttgart Carlsschule, 1771-1793, using combined principal component analysis and maximum likelihood principle.

    Science.gov (United States)

    Lehmann, A; Scheffler, Ch; Hermanussen, M

    2010-02-01

    Recent progress in modelling individual growth has been achieved by combining the principal component analysis and the maximum likelihood principle. This combination models growth even in incomplete sets of data and in data obtained at irregular intervals. We re-analysed late 18th century longitudinal growth of German boys from the boarding school Carlsschule in Stuttgart. The boys, aged 6-23 years, were measured at irregular 3-12 monthly intervals during the period 1771-1793. At the age of 18 years, mean height was 1652 mm, but height variation was large. The shortest boy reached 1474 mm, the tallest 1826 mm. Measured height closely paralleled modelled height, with mean difference of 4 mm, SD 7 mm. Seasonal height variation was found. Low growth rates occurred in spring and high growth rates in summer and autumn. The present study demonstrates that combining the principal component analysis and the maximum likelihood principle enables growth modelling in historic height data also. Copyright (c) 2009 Elsevier GmbH. All rights reserved.

  7. On the likelihood function of Gaussian max-stable processes

    KAUST Repository

    Genton, M. G.; Ma, Y.; Sang, H.

    2011-01-01

    We derive a closed form expression for the likelihood function of a Gaussian max-stable process indexed by ℝ^d at p ≤ d+1 sites, d ≥ 1. We demonstrate the gain in efficiency in the maximum composite likelihood estimators of the covariance matrix from p = 2 to p = 3 sites in ℝ^2 by means of a Monte Carlo simulation study. © 2011 Biometrika Trust.

  9. Sampling variability in forensic likelihood-ratio computation: A simulation study

    NARCIS (Netherlands)

    Ali, Tauseef; Spreeuwers, Lieuwe Jan; Veldhuis, Raymond N.J.; Meuwly, Didier

    2015-01-01

    Recently, in the forensic biometric community, there is a growing interest in computing a metric called the “likelihood ratio” when a pair of biometric specimens is compared using a biometric recognition system. Generally, a biometric recognition system outputs a score and therefore a likelihood-ratio

  10. Maximum likelihood estimation of finite mixture model for economic data

    Science.gov (United States)

    Phoong, Seuk-Yen; Ismail, Mohd Tahir

    2014-06-01

    A finite mixture model is a mixture model with a finite number of components. These models provide a natural representation of heterogeneity across a finite number of latent classes, and are also known as latent class models or unsupervised learning models. Recently, fitting finite mixture models by maximum likelihood estimation has drawn considerable attention from statisticians, mainly because maximum likelihood estimation is a powerful statistical method that provides consistent estimates as the sample size increases to infinity. Maximum likelihood estimation is therefore used in the present paper to fit a finite mixture model in order to explore the relationship between nonlinear economic data. Specifically, a two-component normal mixture model is fitted by maximum likelihood estimation to investigate the relationship between stock market prices and rubber prices for the sampled countries. The results show a negative effect between rubber prices and stock market prices for Malaysia, Thailand, the Philippines and Indonesia.
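
    Two-component normal mixtures of the kind described above are usually fitted by maximum likelihood via the EM algorithm. A self-contained sketch on synthetic data (not the economic data used in the paper):

```python
import numpy as np
from scipy.stats import norm

def em_two_normal(x, iters=200):
    """Fit a two-component normal mixture by maximum likelihood via EM."""
    mu = np.percentile(x, [25, 75])                  # crude initial means
    sd = np.array([x.std(), x.std()])
    w = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point
        dens = w * norm.pdf(x[:, None], mu, sd)      # shape (n, 2)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: weighted maximum likelihood updates
        nk = resp.sum(axis=0)
        w = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        sd = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    return w, mu, sd

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2, 1.0, 300), rng.normal(3, 0.5, 200)])
print(em_two_normal(x))   # weights near (0.6, 0.4), means near (-2, 3)
```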

  11. Incorporating real-time traffic and weather data to explore road accident likelihood and severity in urban arterials.

    Science.gov (United States)

    Theofilatos, Athanasios

    2017-06-01

    The effective treatment of road accidents, and thus the enhancement of road safety, is a major concern to societies due to the losses in human lives and the economic and social costs. The investigation of road accident likelihood and severity utilizing real-time traffic and weather data has recently received significant attention from researchers. However, collected data mainly stem from freeways and expressways. Consequently, the aim of the present paper is to add to the current knowledge by investigating accident likelihood and severity using real-time traffic and weather data collected from urban arterials in Athens, Greece. Random Forests (RF) are first applied for preliminary analysis: they rank candidate variables according to their relative importance and provide a first insight into the potentially significant variables. Then, Bayesian logistic regression as well as finite mixture and mixed-effects logit models are applied to further explore factors associated with accident likelihood and severity, respectively. Regarding accident likelihood, the Bayesian logistic regression showed that variations in traffic significantly influence accident occurrence. On the other hand, the accident severity analysis revealed a generally mixed influence of traffic variations on accident severity, although the international literature states that traffic variations increase severity. Lastly, weather parameters were not found to have a direct influence on accident likelihood or severity. The study adds to the current knowledge by incorporating real-time traffic and weather data from urban arterials to investigate accident occurrence and accident severity mechanisms. The identification of risk factors can lead to the development of effective traffic management strategies to reduce accident occurrence and the severity of injuries in urban arterials. Copyright © 2017 Elsevier Ltd and National Safety Council. All rights reserved.

  12. Analysis of gold(I/III)-complexes by HPLC-ICP-MS demonstrates gold(III) stability in surface waters.

    Science.gov (United States)

    Ta, Christine; Reith, Frank; Brugger, Joël; Pring, Allan; Lenehan, Claire E

    2014-05-20

    Understanding the form in which gold is transported in surface- and groundwaters underpins our understanding of gold dispersion and (bio)geochemical cycling. Yet, to date, there are no direct techniques capable of identifying the oxidation state and complexation of gold in natural waters. We present a reversed phase ion-pairing HPLC-ICP-MS method for the separation and determination of aqueous gold(III)-chloro-hydroxyl, gold(III)-bromo-hydroxyl, gold(I)-thiosulfate, and gold(I)-cyanide complexes. Detection limits for the gold species range from 0.05 to 0.30 μg L(-1). The [Au(CN)2](-) gold cyanide complex was detected in five of six waters from tailings and adjacent monitoring bores of working gold mines. Contrary to thermodynamic predictions, evidence was obtained for the existence of Au(III)-complexes in circumneutral, hypersaline waters of a natural lake overlying a gold deposit in Western Australia. This first direct evidence for the existence and stability of Au(III)-complexes in natural surface waters suggests that Au(III)-complexes may be important for the transport and biogeochemical cycling of gold in surface environments. Overall, these results show that near-μg L(-1) enrichments of Au in environmental waters result from metastable ligands (e.g., CN(-)) as well as kinetically controlled redox processes leading to the stability of highly soluble Au(III)-complexes.

  13. New BFA Method Based on Attractor Neural Network and Likelihood Maximization

    Czech Academy of Sciences Publication Activity Database

    Frolov, A. A.; Húsek, Dušan; Polyakov, P.Y.; Snášel, V.

    2014-01-01

    Roč. 132, 20 May (2014), s. 14-29 ISSN 0925-2312 Grant - others:GA MŠk(CZ) ED1.1.00/02.0070; GA MŠk(CZ) EE.2.3.20.0073 Program:ED Institutional support: RVO:67985807 Keywords : recurrent neural network * associative memory * Hebbian learning rule * neural network application * data mining * statistics * Boolean factor analysis * information gain * dimension reduction * likelihood-maximization * bars problem Subject RIV: IN - Informatics, Computer Science Impact factor: 2.083, year: 2014

  14. WISC-III e WAIS-III na avaliação da inteligência de cegos WISC-III/WAIS-III en ciegos WISC-III and WAIS-III in intellectual assessment of blind people

    Directory of Open Access Journals (Sweden)

    Elizabeth do Nascimento

    2007-12-01

    verbal scales. After adaptations in stimuli and instructions, the scales were applied to 120 children and 52 adults in Belo Horizonte, MG, Brazil. Results show that the modified verbal scales had good internal consistency (alpha > 0.80) and the factorial analysis clearly indicated the presence of a single principal component, which explains 81% and 64% of the total variance for the WISC-III and WAIS-III, respectively. Since the adaptations do not affect the factorial structure of the above-mentioned scales, professionals may use the modified scales to measure the intelligence of blind people.

  15. Assessing Compatibility of Direct Detection Data: Halo-Independent Global Likelihood Analyses

    CERN Document Server

    Gelmini, Graciela B.

    2016-10-18

    We present two different halo-independent methods utilizing a global maximum likelihood that can assess the compatibility of dark matter direct detection data given a particular dark matter model. The global likelihood we use is comprised of at least one extended likelihood and an arbitrary number of Poisson or Gaussian likelihoods. In the first method we find the global best fit halo function and construct a two sided pointwise confidence band, which can then be compared with those derived from the extended likelihood alone to assess the joint compatibility of the data. In the second method we define a "constrained parameter goodness-of-fit" test statistic, whose p-value we then use to define a "plausibility region" (e.g. where p ≥ 10%). For any halo function not entirely contained within the plausibility region, the level of compatibility of the data is very low (e.g. p < 10%). As an example we apply these methods to CDMS-II-Si and SuperCDMS data, assuming dark matter particles with elastic s...

  16. A short proof that phylogenetic tree reconstruction by maximum likelihood is hard.

    Science.gov (United States)

    Roch, Sebastien

    2006-01-01

    Maximum likelihood is one of the most widely used techniques to infer evolutionary histories. Although it is thought to be intractable, a proof of its hardness has been lacking. Here, we give a short proof that computing the maximum likelihood tree is NP-hard by exploiting a connection between likelihood and parsimony observed by Tuffley and Steel.

  18. Analysis of hourly crash likelihood using unbalanced panel data mixed logit model and real-time driving environmental big data.

    Science.gov (United States)

    Chen, Feng; Chen, Suren; Ma, Xiaoxiang

    2018-06-01

    Driving environment, including road surface conditions and traffic states, often changes over time and influences crash probability considerably. Traditional crash frequency models developed at large temporal scales struggle to capture the time-varying characteristics of these factors, which may cause a substantial loss of critical driving-environment information for crash prediction. Crash prediction models with refined temporal data (hourly records) are therefore developed to characterize the time-varying nature of these contributing factors. Unbalanced panel data mixed logit models are developed to analyze the hourly crash likelihood of highway segments. The refined temporal driving-environment data, including road surface and traffic conditions, obtained from the Road Weather Information System (RWIS), are incorporated into the models. Model estimation results indicate that traffic speed, traffic volume, curvature and the chemically-wet road surface indicator are better modeled as random parameters. The estimation results of the mixed logit models based on unbalanced panel data show that a number of factors are related to crash likelihood on I-25. Specifically, the weekend indicator, November indicator, low speed limit and long remaining service life of rutting indicator are found to increase crash likelihood, while the 5-am indicator and the number of merging ramps per lane per mile are found to decrease it. The study underscores and confirms the unique and significant impacts on crashes imposed by real-time weather, road surface, and traffic conditions. With the unbalanced panel data structure, the rich information from real-time driving-environment big data can be well incorporated. Copyright © 2018 National Safety Council and Elsevier Ltd. All rights reserved.

  19. Inference for the Sharpe Ratio Using a Likelihood-Based Approach

    Directory of Open Access Journals (Sweden)

    Ying Liu

    2012-01-01

    Full Text Available The Sharpe ratio is the prominent risk-adjusted performance measure used by practitioners. Statistical testing of this ratio using its asymptotic distribution has lagged behind its use. In this paper, highly accurate likelihood analysis is applied for inference on the Sharpe ratio. Both the one- and two-sample problems are considered. The methodology has O(n−3/2) distributional accuracy and can be implemented using any parametric return distribution structure. Simulations are provided to demonstrate the method's superior accuracy over existing methods used for testing in the literature.
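
    For contrast with the higher-order likelihood analysis described above, the standard first-order approach tests the Sharpe ratio against its asymptotic normal distribution; under iid normal returns the standard error is approximately sqrt((1 + SR²/2)/n). A sketch with made-up return data; this is the benchmark such papers improve upon, not the paper's own method:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)
returns = rng.normal(0.01, 0.05, 120)         # 120 periods of hypothetical returns

sr = returns.mean() / returns.std(ddof=1)     # sample Sharpe ratio
se = np.sqrt((1 + sr**2 / 2) / len(returns))  # first-order s.e. for iid normal returns
z = sr / se                                   # test of H0: Sharpe ratio = 0
print(f"SR = {sr:.3f}, z = {z:.2f}, p = {2 * norm.sf(abs(z)):.4f}")
```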

  20. Simplified likelihood for the re-interpretation of public CMS results

    CERN Document Server

    The CMS Collaboration

    2017-01-01

    In this note, a procedure for the construction of simplified likelihoods for the re-interpretation of the results of CMS searches for new physics is presented. The procedure relies on the use of a reduced set of information on the background models used in these searches which can readily be provided by the CMS collaboration. A toy example is used to demonstrate the procedure and its accuracy in reproducing the full likelihood for setting limits in models for physics beyond the standard model. Finally, two representative searches from the CMS collaboration are used to demonstrate the validity of the simplified likelihood approach under realistic conditions.
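
    A simplified likelihood of this general kind combines Poisson counts per search bin with correlated, Gaussian-constrained background nuisances. The toy sketch below follows that structure with made-up counts, yields, and covariance; it is illustrative only, and the actual construction is defined in the CMS note itself:

```python
import numpy as np
from scipy.stats import poisson, multivariate_normal
from scipy.optimize import minimize

# Hypothetical published inputs: observed counts, signal and background expectations,
# and the (reduced) background covariance matrix across three bins.
n_obs = np.array([12, 9, 4])
signal = np.array([3.0, 2.0, 1.0])
bkg = np.array([10.0, 8.0, 3.5])
cov = np.array([[4.0, 1.0, 0.5], [1.0, 2.5, 0.3], [0.5, 0.3, 1.0]])

def neg_log_likelihood(theta, mu):
    lam = np.clip(mu * signal + bkg + theta, 1e-9, None)   # expected counts per bin
    return -(poisson.logpmf(n_obs, lam).sum()
             + multivariate_normal.logpdf(theta, np.zeros(3), cov))

def profile_nll(mu):
    """Minimize over the correlated background nuisances at fixed signal strength."""
    return minimize(neg_log_likelihood, np.zeros(3), args=(mu,)).fun

for mu in (0.0, 0.5, 1.0, 2.0):                            # profile likelihood scan
    print(mu, round(profile_nll(mu), 3))
```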

  1. Semi-Parametric Maximum Likelihood Method for Interaction in Case-Mother Control-Mother Designs: Package SPmlficmcm

    Directory of Open Access Journals (Sweden)

    Moliere Nguile-Makao

    2015-12-01

    Full Text Available The analysis of interaction effects involving genetic variants and environmental exposures on the risk of adverse obstetric and early-life outcomes is generally performed using standard logistic regression in the case-mother and control-mother design. However, such an analysis is inefficient because it does not take into account the natural family-based constraints present in the parent-child relationship. Recently, a new approach based on semi-parametric maximum likelihood estimation was proposed. Its advantage is that the parental relationship between the mother and her child is taken into account in estimation, but a package implementing this method has not been widely available. In this paper, we present SPmlficmcm, an R package implementing this new method, and we propose an extension of the method to handle missing offspring genotype data by maximum likelihood estimation. Our choice to treat missing offspring genotype data was motivated by the fact that in genetic association studies where the genetic data of mother and child are available, there are usually more missing data on the genotype of the offspring than on that of the mother. The package builds a non-linear system from the data, solves it, and computes the estimates from the gradient and the Hessian matrix of the semi-parametric profile log-likelihood function. Finally, we analyze a simulated dataset to show the usefulness of the package.

  2. Moral Identity Predicts Doping Likelihood via Moral Disengagement and Anticipated Guilt.

    Science.gov (United States)

    Kavussanu, Maria; Ring, Christopher

    2017-08-01

    In this study, we integrated elements of social cognitive theory of moral thought and action and the social cognitive model of moral identity to better understand doping likelihood in athletes. Participants (N = 398) recruited from a variety of team sports completed measures of moral identity, moral disengagement, anticipated guilt, and doping likelihood. Moral identity predicted doping likelihood indirectly via moral disengagement and anticipated guilt. Anticipated guilt about potential doping mediated the relationship between moral disengagement and doping likelihood. Our findings provide novel evidence to suggest that athletes, who feel that being a moral person is central to their self-concept, are less likely to use banned substances due to their lower tendency to morally disengage and the more intense feelings of guilt they expect to experience for using banned substances.

  3. Maximum likelihood of phylogenetic networks.

    Science.gov (United States)

    Jin, Guohua; Nakhleh, Luay; Snir, Sagi; Tuller, Tamir

    2006-11-01

    Horizontal gene transfer (HGT) is believed to be ubiquitous among bacteria, and plays a major role in their genome diversification as well as their ability to develop resistance to antibiotics. In light of its evolutionary significance and implications for human health, developing accurate and efficient methods for detecting and reconstructing HGT is imperative. In this article we provide a new HGT-oriented likelihood framework for many problems that involve phylogeny-based HGT detection and reconstruction. Besides the formulation of various likelihood criteria, we show that most of these problems are NP-hard, and offer heuristics for efficient and accurate reconstruction of HGT under these criteria. We implemented our heuristics and used them to analyze biological as well as synthetic data. In both cases, our criteria and heuristics exhibited very good performance with respect to identifying the correct number of HGT events as well as inferring their correct location on the species tree. Implementations of the criteria as well as heuristics and hardness proofs are available from the authors upon request. Hardness proofs can also be downloaded at http://www.cs.tau.ac.il/~tamirtul/MLNET/Supp-ML.pdf

  4. Risk Presentation Using the Three Dimensions of Likelihood, Severity, and Level of Control

    Science.gov (United States)

    Watson, Clifford

    2010-01-01

    Traditional hazard analysis techniques utilize a two-dimensional representation of the results determined by the relative likelihood and severity of the residual risk. These matrices present a quick look at the Likelihood (Y-axis) and Severity (X-axis) of the probable outcome of a hazardous event. A three-dimensional method, described herein, utilizes the traditional X and Y axes while adding a new third dimension, shown as the Z-axis and referred to as the Level of Control. The elements of the Z-axis are modifications of the Hazard Elimination and Control steps (also known as the Hazard Reduction Precedence Sequence). These steps are: 1. Eliminate risk through design. 2. Substitute less risky materials for more hazardous materials. 3. Install safety devices. 4. Install caution and warning devices. 5. Develop administrative controls (to include special procedures and training). 6. Provide protective clothing and equipment. When added to the two-dimensional models, the Level of Control adds a visual representation of the risk associated with the hazardous condition, creating a "tall pole" for the least-well-controlled failure while establishing the relative likelihood and severity of all causes and effects for an identified hazard. Computer modeling of the analytical results, using spreadsheets and three-dimensional charting, gives a visual confirmation of the relationship between causes and their controls.

  5. MAXIMUM LIKELIHOOD CLASSIFICATION OF HIGH-RESOLUTION SAR IMAGES IN URBAN AREA

    Directory of Open Access Journals (Sweden)

    M. Soheili Majd

    2012-09-01

    Full Text Available In this work, we present a state-of-the-art statistical analysis of polarimetric synthetic aperture radar (SAR) data, through the modeling of several indices. We concentrate on eight ground classes, characterized by amplitudes, the co-polarisation ratio, depolarization ratios, and other polarimetric descriptors. To study their different statistical behaviours, we consider Gauss, log-normal, Beta I, Weibull, Gamma, and Fisher statistical models and estimate their parameters using three methods: the method of moments (MoM), the maximum-likelihood (ML) methodology, and the method of log-cumulants (MoML). Then, we study the opportunity of introducing this information into an adapted supervised classification scheme based on maximum likelihood and the Fisher pdf. Our work relies on an image of a suburban area, acquired by the airborne RAMSES SAR sensor of ONERA. The results prove the potential of such data to discriminate urban surfaces and show the usefulness of adapting any classical classification algorithm; however, the classification maps present a persistent class confusion between flat gravelled or concrete roofs and trees.

  6. Heterogeneity in the Likelihood of Market Advisory Service Use by U.S. Crop Producers

    NARCIS (Netherlands)

    Pennings, J.M.E.; Irwin, S.; Good, D.; Isengildina, O.

    2005-01-01

    Abstract Analysis of a unique data set of 1,400 U.S. crop producers using a mixture-modeling framework shows that the likelihood of Marketing Advisory Services (MAS) use is driven by, among other factors, the perceived performance of MAS in terms of return and risk reduction, the match between the MAS and

  7. Transfer Entropy as a Log-Likelihood Ratio

    Science.gov (United States)

    Barnett, Lionel; Bossomaier, Terry

    2012-09-01

    Transfer entropy, an information-theoretic measure of time-directed information transfer between joint processes, has steadily gained popularity in the analysis of complex stochastic dynamics in diverse fields, including the neurosciences, ecology, climatology, and econometrics. We show that for a broad class of predictive models, the log-likelihood ratio test statistic for the null hypothesis of zero transfer entropy is a consistent estimator for the transfer entropy itself. For finite Markov chains, furthermore, no explicit model is required. In the general case, an asymptotic χ2 distribution is established for the transfer entropy estimator. The result generalizes the equivalence in the Gaussian case of transfer entropy and Granger causality, a statistical notion of causal influence based on prediction via vector autoregression, and establishes a fundamental connection between directed information transfer and causality in the Wiener-Granger sense.
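
    The Gaussian case mentioned above can be made concrete: for jointly Gaussian autoregressive processes, the transfer entropy from Y to X is half the log ratio of restricted to full residual variances, and 2n times it is the log-likelihood ratio statistic, asymptotically χ². A minimal sketch on a synthetic lag-1 bivariate system (no intercepts, zero-mean data; illustrative only):

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(0)
n = 2000
y = rng.normal(size=n)
x = np.zeros(n)
for t in range(1, n):                       # X driven by its own past and by Y's past
    x[t] = 0.5 * x[t - 1] + 0.4 * y[t - 1] + rng.normal()

X_t, X_p, Y_p = x[1:], x[:-1], y[:-1]

def rss(design, target):
    """Residual sum of squares of a least-squares regression."""
    beta, *_ = np.linalg.lstsq(design, target, rcond=None)
    return np.sum((target - design @ beta) ** 2)

rss_full = rss(np.column_stack([X_p, Y_p]), X_t)    # past of X and Y
rss_restr = rss(X_p[:, None], X_t)                  # past of X only

te = 0.5 * np.log(rss_restr / rss_full)             # Gaussian transfer entropy estimate
lr = 2 * len(X_t) * te                              # log-likelihood ratio statistic
print(f"TE = {te:.4f}, LR = {lr:.1f}, p = {chi2.sf(lr, df=1):.3g}")
```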

  8. Workplace air measurements and likelihood of exposure to manufactured nano-objects, agglomerates, and aggregates

    International Nuclear Information System (INIS)

    Brouwer, Derk H.; Duuren-Stuurman, Birgit van; Berges, Markus; Bard, Delphine; Jankowska, Elzbieta; Moehlmann, Carsten; Pelzer, Johannes; Mark, Dave

    2013-01-01

    Manufactured nano-objects, agglomerates, and aggregates (NOAA) may have adverse effects on human health, but little is known about occupational risks since actual estimates of exposure are lacking. In a large-scale workplace air-monitoring campaign, 19 enterprises were visited and 120 potential exposure scenarios were measured. A multi-metric exposure assessment approach was followed, and a decision logic was developed to allow all results to be analyzed in concert. The overall evaluation was classified by categories of likelihood of exposure. At the task level, about 53 % of scenarios showed increased particle number or surface area concentration compared to the "background" level, whereas 72 % of the TEM samples gave an indication that NOAA were present in the workplace. For 54 of the 120 task-based exposure scenarios, an overall evaluation could be made based on all parameters of the decision logic. For only 1 exposure scenario (approximately 2 %) was the highest level of likelihood assigned, whereas in 56 % of the exposure scenarios the overall evaluation revealed the lowest level of likelihood. For the remaining 42 %, however, exposure to NOAA could not be excluded.

  9. Updated logistic regression equations for the calculation of post-fire debris-flow likelihood in the western United States

    Science.gov (United States)

    Staley, Dennis M.; Negri, Jacquelyn A.; Kean, Jason W.; Laber, Jayme L.; Tillery, Anne C.; Youberg, Ann M.

    2016-06-30

    Wildfire can significantly alter the hydrologic response of a watershed to the extent that even modest rainstorms can generate dangerous flash floods and debris flows. To reduce public exposure to hazard, the U.S. Geological Survey produces post-fire debris-flow hazard assessments for select fires in the western United States. We use publicly available geospatial data describing basin morphology, burn severity, soil properties, and rainfall characteristics to estimate the statistical likelihood that debris flows will occur in response to a storm of a given rainfall intensity. Using an empirical database and refined geospatial analysis methods, we defined new equations for the prediction of debris-flow likelihood using logistic regression methods. We showed that the new logistic regression model outperformed previous models used to predict debris-flow likelihood.
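
    The statistical form behind such an assessment is a logistic regression: the debris-flow likelihood for a basin and storm is p = 1/(1 + e^(−x)), with x a linear combination of morphology, burn severity, soil, and rainfall terms. A sketch with hypothetical coefficients and predictor values; the fitted USGS coefficients are not given in this abstract:

```python
import numpy as np

def debris_flow_likelihood(intercept, coefs, predictors):
    """Logistic model for debris-flow likelihood: p = 1 / (1 + exp(-x))."""
    x = intercept + np.dot(coefs, predictors)
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical coefficients and basin predictors (e.g., burn severity, soil,
# and rainfall intensity terms); real values come from the fitted equations.
p = debris_flow_likelihood(-3.6, np.array([0.4, 0.7, 0.2]), np.array([2.5, 1.8, 3.0]))
print(f"Estimated debris-flow probability: {p:.2f}")
```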

  10. Predictors of Self-Reported Likelihood of Working with Older Adults

    Science.gov (United States)

    Eshbaugh, Elaine M.; Gross, Patricia E.; Satrom, Tatum

    2010-01-01

    This study examined the self-reported likelihood of working with older adults in a future career among 237 college undergraduates at a midsized Midwestern university. Although aging anxiety was not significantly related to likelihood of working with older adults, those students who had a greater level of death anxiety were less likely than other…

  11. Performance and Complexity Analysis of Blind FIR Channel Identification Algorithms Based on Deterministic Maximum Likelihood in SIMO Systems

    DEFF Research Database (Denmark)

    De Carvalho, Elisabeth; Omar, Samir; Slock, Dirk

    2013-01-01

    We analyze two algorithms that have been introduced previously for Deterministic Maximum Likelihood (DML) blind estimation of multiple FIR channels. The first one is a modification of the Iterative Quadratic ML (IQML) algorithm. IQML gives biased estimates of the channel and performs poorly at low...... to the initialization. Its asymptotic performance does not reach the DML performance though. The second strategy, called Pseudo-Quadratic ML (PQML), is naturally denoised. The denoising in PQML is furthermore more efficient than in DIQML: PQML yields the same asymptotic performance as DML, as opposed to DIQML......, but requires a consistent initialization. We furthermore compare DIQML and PQML to the strategy of alternating minimization w.r.t. symbols and channel for solving DML (AQML). An asymptotic performance analysis, a complexity evaluation and simulation results are also presented. The proposed DIQML and PQML...

  12. Anticipating cognitive effort: roles of perceived error-likelihood and time demands.

    Science.gov (United States)

    Dunn, Timothy L; Inzlicht, Michael; Risko, Evan F

    2017-11-13

    Why are some actions evaluated as effortful? In the present set of experiments we address this question by examining individuals' perception of effort when faced with a trade-off between two putative cognitive costs: how much time a task takes vs. how error-prone it is. Specifically, we were interested in whether individuals anticipate engaging in a small amount of hard work (i.e., low time requirement but high error-likelihood) vs. a large amount of easy work (i.e., high time requirement but low error-likelihood) as being more effortful. In between-subject designs, Experiments 1 through 3 demonstrated that individuals anticipate options that are high in perceived error-likelihood (yet less time consuming) as more effortful than options that are perceived to be more time consuming (yet low in error-likelihood). Further, when asked to evaluate which of the two tasks was (a) more effortful, (b) more error-prone, and (c) more time consuming, effort-based and error-based choices closely tracked one another, but this was not the case for time-based choices. Utilizing a within-subject design, Experiment 4 demonstrated an overall similar pattern of judgments as Experiments 1 through 3; however, both judgments of error-likelihood and time demand similarly predicted effort judgments. Results are discussed within the context of extant accounts of cognitive control, with consideration of how error-likelihood and time demands may independently and conjunctively factor into judgments of cognitive effort.

  13. Calibration of two complex ecosystem models with different likelihood functions

    Science.gov (United States)

    Hidy, Dóra; Haszpra, László; Pintér, Krisztina; Nagy, Zoltán; Barcza, Zoltán

    2014-05-01

    goodness metric on calibration. The different likelihoods are different functions of RMSE (root mean squared error) weighted by measurement uncertainty: exponential / linear / quadratic / linear normalized by correlation. As a first calibration step, sensitivity analysis was performed in order to select the influential parameters which have a strong effect on the output data. In the second calibration step, only the sensitive parameters were calibrated (optimal values and confidence intervals were calculated). In the case of PaSim, more parameters were found responsible for 95% of the output data variance than in the case of BBGC MuSo. Analysis of the results of the optimized models revealed that the exponential likelihood estimation proved to be the most robust (best model simulation with optimized parameters, highest confidence interval increase). Cross-validation of the model simulations can help in constraining the highly uncertain greenhouse gas budget of grasslands.
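
    The exact functional forms are not given in the abstract, but RMSE-based likelihood weights of the kind listed above might look like the following sketch; the shapes and the uncertainty scaling are assumptions for illustration:

```python
import numpy as np

def rmse(sim, obs):
    return np.sqrt(np.mean((sim - obs) ** 2))

# Hypothetical likelihoods as decreasing functions of RMSE scaled by the
# measurement uncertainty sigma (not the study's exact definitions).
def exponential_likelihood(sim, obs, sigma):
    return np.exp(-rmse(sim, obs) / sigma)

def linear_likelihood(sim, obs, sigma, r_max=10.0):
    return max(0.0, 1.0 - rmse(sim, obs) / (sigma * r_max))

obs = np.array([1.0, 2.0, 3.0])
sim = np.array([1.1, 1.8, 3.3])
print(exponential_likelihood(sim, obs, 0.5), linear_likelihood(sim, obs, 0.5))
```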

  14. CLONING, EXPRESSION, AND MUTATIONAL ANALYSIS OF RAT S-ADENOSYL-1-METHIONINE: ARSENIC (III) METHYLTRANSFERASE

    Science.gov (United States)

    CLONING, EXPRESSION, AND MUTATIONAL ANALYSIS OF RAT S-ADENOSYL-L-METHIONINE: ARSENIC(III) METHYLTRANSFERASEStephen B. Waters, Ph.D., Miroslav Styblo, Ph.D., Melinda A. Beck, Ph.D., University of North Carolina at Chapel Hill; David J. Thomas, Ph.D., U.S. Environmental...

  15. Theoretical Analysis of Penalized Maximum-Likelihood Patlak Parametric Image Reconstruction in Dynamic PET for Lesion Detection.

    Science.gov (United States)

    Yang, Li; Wang, Guobao; Qi, Jinyi

    2016-04-01

    Detecting cancerous lesions is a major clinical application of emission tomography. In a previous work, we studied penalized maximum-likelihood (PML) image reconstruction for lesion detection in static PET. Here we extend our theoretical analysis of static PET reconstruction to dynamic PET. We study both the conventional indirect reconstruction and direct reconstruction for Patlak parametric image estimation. In indirect reconstruction, Patlak parametric images are generated by first reconstructing a sequence of dynamic PET images, and then performing Patlak analysis on the time activity curves (TACs) pixel-by-pixel. In direct reconstruction, Patlak parametric images are estimated directly from raw sinogram data by incorporating the Patlak model into the image reconstruction procedure. PML reconstruction is used in both the indirect and direct reconstruction methods. We use a channelized Hotelling observer (CHO) to assess lesion detectability in Patlak parametric images. Simplified expressions for evaluating the lesion detectability have been derived and applied to the selection of the regularization parameter value to maximize detection performance. The proposed method is validated using computer-based Monte Carlo simulations. Good agreements between the theoretical predictions and the Monte Carlo results are observed. Both theoretical predictions and Monte Carlo simulation results show the benefit of the indirect and direct methods under optimized regularization parameters in dynamic PET reconstruction for lesion detection, when compared with the conventional static PET reconstruction.
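
    The indirect route described above reduces, per voxel, to a linear fit in Patlak coordinates: C_T(t)/C_p(t) = K_i · [∫₀ᵗ C_p dτ]/C_p(t) + V over late frames. A self-contained sketch on a synthetic time-activity curve (standard Patlak graphical analysis, not the authors' reconstruction code):

```python
import numpy as np

def patlak_fit(ct, cp, t, t_star_idx):
    """Patlak analysis of one voxel's time-activity curve (TAC).

    ct: tissue TAC, cp: plasma input function, t: frame mid-times.
    Returns the slope Ki (net influx rate) and intercept V."""
    integral = np.concatenate([[0.0], np.cumsum(0.5 * (cp[1:] + cp[:-1]) * np.diff(t))])
    x = integral / cp                     # "stretched time"
    y = ct / cp
    slope, intercept = np.polyfit(x[t_star_idx:], y[t_star_idx:], 1)
    return slope, intercept

# Synthetic example: Ki = 0.05, V = 0.3, exponential plasma input
t = np.linspace(1, 60, 30)
cp = 10 * np.exp(-0.05 * t)
integral = np.concatenate([[0.0], np.cumsum(0.5 * (cp[1:] + cp[:-1]) * np.diff(t))])
ct = 0.05 * integral + 0.3 * cp
print(patlak_fit(ct, cp, t, t_star_idx=10))   # ~ (0.05, 0.3)
```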

  16. Profile-likelihood Confidence Intervals in Item Response Theory Models.

    Science.gov (United States)

    Chalmers, R Philip; Pek, Jolynn; Liu, Yang

    2017-01-01

    Confidence intervals (CIs) are fundamental inferential devices which quantify the sampling variability of parameter estimates. In item response theory, CIs have been primarily obtained from large-sample Wald-type approaches based on standard error estimates, derived from the observed or expected information matrix, after parameters have been estimated via maximum likelihood. An alternative approach to constructing CIs is to quantify sampling variability directly from the likelihood function with a technique known as profile-likelihood confidence intervals (PL CIs). In this article, we introduce PL CIs for item response theory models, compare PL CIs to classical large-sample Wald-type CIs, and demonstrate important distinctions among these CIs. CIs are then constructed for parameters directly estimated in the specified model and for transformed parameters which are often obtained post-estimation. Monte Carlo simulation results suggest that PL CIs perform consistently better than Wald-type CIs for both non-transformed and transformed parameters.
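
    The construction generalizes beyond item response theory: a 95% profile-likelihood CI collects all parameter values whose likelihood-ratio statistic stays below the χ²(1) cutoff of 3.84. A one-parameter sketch (exponential rate, so the "profile" is the likelihood itself; with nuisance parameters one would maximize over them at each fixed value):

```python
import numpy as np
from scipy.optimize import brentq

rng = np.random.default_rng(3)
x = rng.exponential(scale=2.0, size=50)     # true rate = 0.5

def loglik(lam):                            # exponential log-likelihood
    return len(x) * np.log(lam) - lam * x.sum()

lam_hat = len(x) / x.sum()                  # MLE of the rate

def g(lam):                                 # LR statistic minus the chi2(1) cutoff
    return 2 * (loglik(lam_hat) - loglik(lam)) - 3.84

lo = brentq(g, 1e-6, lam_hat)               # g crosses zero below the MLE ...
hi = brentq(g, lam_hat, 10 * lam_hat)       # ... and above it
print(f"MLE = {lam_hat:.3f}, 95% profile-likelihood CI = [{lo:.3f}, {hi:.3f}]")
```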

  17. Supplementary Material for: High-Order Composite Likelihood Inference for Max-Stable Distributions and Processes

    KAUST Repository

    Castruccio, Stefano; Huser, Raphaë l; Genton, Marc G.

    2016-01-01

    In multivariate or spatial extremes, inference for max-stable processes observed at a large collection of points is a very challenging problem and current approaches typically rely on less expensive composite likelihoods constructed from small subsets of data. In this work, we explore the limits of modern state-of-the-art computational facilities to perform full likelihood inference and to efficiently evaluate high-order composite likelihoods. With extensive simulations, we assess the loss of information of composite likelihood estimators with respect to a full likelihood approach for some widely used multivariate or spatial extreme models, we discuss how to choose composite likelihood truncation to improve the efficiency, and we also provide recommendations for practitioners. This article has supplementary material online.

  18. Exclusion probabilities and likelihood ratios with applications to mixtures.

    Science.gov (United States)

    Slooten, Klaas-Jan; Egeland, Thore

    2016-01-01

    The statistical evidence obtained from mixed DNA profiles can be summarised in several ways in forensic casework, including the likelihood ratio (LR) and the Random Man Not Excluded (RMNE) probability. The literature has seen a discussion of the advantages and disadvantages of likelihood ratios and exclusion probabilities, and part of our aim is to bring some clarification to this debate. In a previous paper, we proved that there is a general mathematical relationship between these statistics: RMNE can be expressed as a certain average of the LR, implying that the expected value of the LR, when applied to an actual contributor to the mixture, is at least equal to the inverse of the RMNE. While the mentioned paper presented applications for kinship problems, the current paper demonstrates the relevance for mixture cases, and for this purpose, we prove some new general properties. We also demonstrate how to use the distribution of the likelihood ratio for donors of a mixture to obtain estimates for exceedance probabilities of the LR for non-donors, of which the RMNE is a special case corresponding to LR > 0. In order to derive these results, we need to view the likelihood ratio as a random variable. In this paper, we describe how such a randomization can be achieved. The RMNE is usually invoked only for mixtures without dropout. In mixtures, artefacts like dropout and drop-in are commonly encountered and we address this situation too, illustrating our results with a basic but widely implemented model, a so-called binary model. The precise definitions, modelling and interpretation of the required concepts of dropout and drop-in are not entirely obvious, and we attempt to clarify them here in a general likelihood framework for a binary model.

  19. Elaboration Likelihood Model and an Analysis of the Contexts of Its Application

    Directory of Open Access Journals (Sweden)

    Aslıhan Kıymalıoğlu

    2014-12-01

    Full Text Available The Elaboration Likelihood Model (ELM), which posits two routes to persuasion, a central and a peripheral route, has been one of the major models of persuasion. As the number of studies of ELM in the Turkish literature is limited, a detailed explanation of the model together with a comprehensive literature review was considered a useful contribution toward filling this gap. The findings of the review reveal that the model was mostly used in marketing and advertising research, that the concept most frequently used in the elaboration process was involvement, and that argument quality and endorser credibility were the factors most often employed in measuring their effect on the dependent variables. The review provides valuable insights as it presents a holistic view of the model and the variables used with it.

  20. FEMAXI-III, a computer code for fuel rod performance analysis

    International Nuclear Information System (INIS)

    Ito, K.; Iwano, Y.; Ichikawa, M.; Okubo, T.

    1983-01-01

    This paper presents the method of fuel rod thermal-mechanical performance analysis used in the FEMAXI-III code. The code incorporates models describing thermal-mechanical processes such as pellet-cladding thermal expansion, pellet irradiation swelling, densification, relocation and fission gas release as they affect pellet-cladding gap thermal conductance. The code performs the thermal behavior analysis of a full-length fuel rod within the framework of one-dimensional multi-zone modeling. The mechanical effects, including ridge deformation, are rigorously analyzed by applying the axisymmetric finite element method. The finite element geometrical model is confined to a half-pellet-height region under the assumption that pellet-pellet interaction is symmetrical. Eight-node quadratic isoparametric ring elements are adopted to obtain accurate finite element solutions. Newton-Raphson iteration with an implicit algorithm is applied to analyze non-linear material behaviors accurately and stably. The pellet-cladding interaction mechanism is treated exactly using nodal continuity conditions. The code is applicable to the thermal-mechanical analysis of water reactor fuel rods experiencing variable power histories. (orig.)

  1. Sampling of systematic errors to estimate likelihood weights in nuclear data uncertainty propagation

    International Nuclear Information System (INIS)

    Helgesson, P.; Sjöstrand, H.; Koning, A.J.; Rydén, J.; Rochman, D.; Alhassan, E.; Pomp, S.

    2016-01-01

    In methodologies for nuclear data (ND) uncertainty assessment and propagation based on random sampling, likelihood weights can be used to incorporate experimental information into the distributions for the ND. As the included number of correlated experimental points grows large, the computational time for the matrix inversion involved in obtaining the likelihood can become a practical problem. There are also other problems related to the conventional computation of the likelihood, e.g., the assumption that all experimental uncertainties are Gaussian. In this study, a way to estimate the likelihood which avoids matrix inversion is investigated; instead, the experimental correlations are included by sampling of systematic errors. It is shown that the model underlying the sampling methodology (using univariate normal distributions for random and systematic errors) implies a multivariate Gaussian for the experimental points (i.e., the conventional model). It is also shown that the likelihood estimates obtained through sampling of systematic errors approach the likelihood obtained with matrix inversion as the sample size for the systematic errors grows large. In the practical cases studied, the estimates for the likelihood weights converge impractically slowly with the sample size, compared to matrix inversion. The computational time is estimated to be greater than for matrix inversion in cases with more experimental points, too. Hence, the sampling of systematic errors has little potential to compete with matrix inversion in cases where the latter is applicable. Nevertheless, the underlying model and the likelihood estimates can be easier to interpret intuitively than the conventional model and the likelihood function involving the inverted covariance matrix. Therefore, this work can both have pedagogical value and be used to help motivate the conventional assumption of a multivariate Gaussian for experimental data. The sampling of systematic errors could also
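
    The equivalence result can be checked numerically in a minimal setting with one fully correlated systematic error (all numbers invented): averaging independent-error likelihoods over sampled systematic shifts converges to the multivariate Gaussian likelihood computed with the inverted covariance matrix.

      import numpy as np
      from scipy.stats import multivariate_normal, norm

      rng = np.random.default_rng(2)
      n, sigma_r, sigma_s = 5, 0.10, 0.05          # random and systematic uncertainties
      model = np.ones(n)                           # model prediction for n points
      data = model + rng.normal(0, sigma_r, n) + rng.normal(0, sigma_s)

      # conventional likelihood: full covariance matrix, requires inversion
      cov = sigma_r**2 * np.eye(n) + sigma_s**2 * np.ones((n, n))
      L_exact = multivariate_normal(mean=model, cov=cov).pdf(data)

      # alternative: marginalize the systematic shift by Monte Carlo sampling
      shifts = rng.normal(0, sigma_s, 100_000)     # sampled systematic errors
      per_sample = norm.pdf(data, loc=model + shifts[:, None], scale=sigma_r).prod(axis=1)
      print(L_exact, per_sample.mean())            # sampled estimate -> exact value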

  2. Approximate maximum likelihood estimation for population genetic inference.

    Science.gov (United States)

    Bertl, Johanna; Ewing, Gregory; Kosiol, Carolin; Futschik, Andreas

    2017-11-27

    In many population genetic problems, parameter estimation is obstructed by an intractable likelihood function. Therefore, approximate estimation methods have been developed, and with growing computational power, sampling-based methods became popular. However, methods such as Approximate Bayesian Computation (ABC) can be inefficient in high-dimensional problems. This led to the development of more sophisticated iterative estimation methods like particle filters. Here, we propose an alternative approach that is based on stochastic approximation. By moving along a simulated gradient or ascent direction, the algorithm produces a sequence of estimates that eventually converges to the maximum likelihood estimate, given a set of observed summary statistics. This strategy does not sample much from low-likelihood regions of the parameter space, and is fast, even when many summary statistics are involved. We put considerable effort into providing tuning guidelines that improve the robustness and lead to good performance on problems with high-dimensional summary statistics and a low signal-to-noise ratio. We then investigate the performance of our resulting approach and study its properties in simulations. Finally, we re-estimate parameters describing the demographic history of Bornean and Sumatran orang-utans.
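
    The core idea of moving along a simulated gradient can be sketched with Kiefer-Wolfowitz stochastic approximation applied to a synthetic likelihood of a summary statistic (the model, gain sequences and sample sizes are invented; this is not the authors' tuned algorithm).

      import numpy as np

      rng = np.random.default_rng(3)
      obs = rng.normal(1.5, 1.0, size=100)         # data from an assumed N(theta, 1) model
      s_obs = obs.mean()                           # observed summary statistic

      def synth_loglik(theta, m=200):
          # simulate m datasets at theta, fit a Gaussian to the simulated summaries
          sims = rng.normal(theta, 1.0, size=(m, 100)).mean(axis=1)
          mu, sd = sims.mean(), sims.std() + 1e-9
          return -0.5 * ((s_obs - mu) / sd) ** 2 - np.log(sd)

      theta = 0.0
      for k in range(1, 201):
          a_k, c_k = 0.01 / k, 0.5 / k**0.25       # gains tuned to the gradient scale
          grad = (synth_loglik(theta + c_k) - synth_loglik(theta - c_k)) / (2 * c_k)
          theta += a_k * grad                      # ascend the simulated gradient
      print("stochastic-approximation estimate:", theta, " exact MLE:", s_obs)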

  3. Likelihood ratio decisions in memory: three implied regularities.

    Science.gov (United States)

    Glanzer, Murray; Hilford, Andrew; Maloney, Laurence T

    2009-06-01

    We analyze four general signal detection models for recognition memory that differ in their distributional assumptions. Our analyses show that a basic assumption of signal detection theory, the likelihood ratio decision axis, implies three regularities in recognition memory: (1) the mirror effect, (2) the variance effect, and (3) the z-ROC length effect. For each model, we present the equations that produce the three regularities and show, in computed examples, how they do so. We then show that the regularities appear in data from a range of recognition studies. The analyses and data in our study support the following generalization: Individuals make efficient recognition decisions on the basis of likelihood ratios.

  4. Zero-inflated Poisson model based likelihood ratio test for drug safety signal detection.

    Science.gov (United States)

    Huang, Lan; Zheng, Dan; Zalkikar, Jyoti; Tiwari, Ram

    2017-02-01

    In recent decades, numerous methods have been developed for data mining of large drug safety databases, such as the Food and Drug Administration's (FDA's) Adverse Event Reporting System, where data matrices are formed with drugs as columns and adverse events as rows. Often, a large number of cells in these data matrices have zero counts. Some of them are "true zeros", indicating drug-adverse event pairs that cannot occur; these are distinguished from the other zero counts, which are modeled zeros and simply indicate that the drug-adverse event pairs have not occurred yet or have not been reported yet. In this paper, a zero-inflated Poisson model based likelihood ratio test method is proposed to identify drug-adverse event pairs that have disproportionately high reporting rates, which are also called signals. The maximum likelihood estimates of the model parameters of the zero-inflated Poisson model based likelihood ratio test are obtained using the expectation-maximization (EM) algorithm. The zero-inflated Poisson model based likelihood ratio test is also modified to handle stratified analyses for binary and categorical covariates (e.g. gender and age) in the data. The proposed zero-inflated Poisson model based likelihood ratio test method is shown to asymptotically control the type I error and false discovery rate, and its finite sample performance for signal detection is evaluated through a simulation study. The simulation results show that the zero-inflated Poisson model based likelihood ratio test method performs similarly to the Poisson model based likelihood ratio test method when the estimated percentage of true zeros in the database is small. Both the zero-inflated Poisson model based likelihood ratio test and likelihood ratio test methods are applied to six selected drugs, from the 2006 to 2011 Adverse Event Reporting System database, with varying percentages of observed zero-count cells.
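
    A minimal sketch of the EM fit for the zero-inflated Poisson model that underlies the test (counts and starting values invented): the E-step attributes each observed zero to the structural-zero component, and the M-step reweights the mixing proportion and the Poisson mean; the fitted null and alternative versions of such a model then feed the likelihood ratio test.

      import numpy as np

      rng = np.random.default_rng(4)
      pi_true, lam_true, n = 0.3, 2.5, 5000
      y = np.where(rng.random(n) < pi_true, 0, rng.poisson(lam_true, n))

      pi, lam = 0.5, 1.0                           # crude starting values
      for _ in range(200):
          # E-step: probability each observed zero is a structural ("true") zero
          z = np.where(y == 0, pi / (pi + (1 - pi) * np.exp(-lam)), 0.0)
          # M-step: update mixing weight and Poisson mean with these weights
          pi = z.mean()
          lam = y.sum() / (n - z.sum())
      print(f"pi={pi:.3f} (true {pi_true}), lambda={lam:.3f} (true {lam_true})")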

  5. QUALIS PERIODIC EVALUATION: ANALYSIS OF QUALIS UPGRADE IN MEDICINE III.

    Science.gov (United States)

    Jukemura, José; Diniz, Márcio Augusto

    2015-01-01

    To evaluate the preliminary results of the journal upgrade used by Medicine III, an opportunity offered by Capes to the programs of all agency areas. The area documents of Medicine I, II and III were used, together with other relevant material available online on the Capes site, between 2009 and 2013. The research focused on answering two questions: (1) is the Qualis stratification similar across the three areas of medicine? and (2) was the evolution of Qualis in Medicine III greater? Medicine III showed an increase in its Qualis classification and is publishing in journals with higher impact factors, virtually the same as Medicine I and II. The area showed the strongest growth over the recent three-year evaluation periods.

  6. Asymptotic Likelihood Distribution for Correlated & Constrained Systems

    CERN Document Server

    Agarwal, Ujjwal

    2016-01-01

    This report describes my work as a summer student at CERN. It discusses the asymptotic distribution of the likelihood ratio for a total of h parameters, two of which are constrained and correlated.
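
    For orientation, the unconstrained case that such a constrained analysis generalizes is easy to verify numerically via Wilks' theorem (the model and sample sizes are invented; parameters constrained at a boundary would instead give a mixture of chi-square distributions).

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(5)
      # H0: normal mean mu = 0 with free variance; -2 ln(LR) -> chi2(1)
      n, reps = 50, 20000
      x = rng.normal(0.0, 1.0, size=(reps, n))
      s2_hat = x.var(axis=1)                       # variance MLE, mean free
      s2_0 = (x**2).mean(axis=1)                   # variance MLE with mu fixed at 0
      lr = n * (np.log(s2_0) - np.log(s2_hat))     # -2 log likelihood ratio
      print("rejection rate at chi2 95% cutoff:",
            (lr > stats.chi2.ppf(0.95, df=1)).mean())   # ~0.05 under H0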

  7. Anatomical likelihood estimation meta-analysis of grey and white matter anomalies in autism spectrum disorders

    Directory of Open Access Journals (Sweden)

    Thomas P. DeRamus

    2015-01-01

    Autism spectrum disorders (ASD) are characterized by impairments in social communication and restrictive, repetitive behaviors. While behavioral symptoms are well-documented, investigations into the neurobiological underpinnings of ASD have not resulted in firm biomarkers. Variability in findings across structural neuroimaging studies has contributed to difficulty in reliably characterizing the brain morphology of individuals with ASD. These inconsistencies may also arise from the heterogeneity of ASD and the wider age range of participants included in MRI studies and in previous meta-analyses. To address this, the current study used coordinate-based anatomical likelihood estimation (ALE) analysis of 21 voxel-based morphometry (VBM) studies examining high-functioning individuals with ASD, resulting in a meta-analysis of 1055 participants (506 ASD and 549 typically developing individuals). Results consisted of grey, white, and global differences in cortical matter between the groups. Modeled anatomical maps consisting of concentration, thickness, and volume metrics of grey and white matter revealed clusters suggesting age-related decreases in grey and white matter in parietal and inferior temporal regions of the brain in ASD, and age-related increases in grey matter in frontal and anterior-temporal regions. White matter alterations included fiber tracts thought to play key roles in information processing and sensory integration. Many current theories of the pathobiology of ASD suggest that the brains of individuals with ASD may have less-functional long-range (anterior-to-posterior) connections. Our findings of decreased cortical matter in parietal-temporal and occipital regions, and thickening in frontal cortices in older adults with ASD, may entail altered cortical anatomy and neurodevelopmental adaptations.

  8. Structural Properties of the Cr(III)-Fe(III) (Oxy)Hydroxide Compositional Series: Insights for a Nanomaterial 'Solid Solution'

    International Nuclear Information System (INIS)

    Tang, Y.; Zhang, L.; Michel, F.M.; Harrington, R.; Parise, J.B.; Reeder, R.J.

    2010-01-01

    Chromium(III) (oxy)hydroxide and mixed Cr(III)-Fe(III) (oxy)hydroxides are environmentally important compounds for controlling chromium speciation and bioaccessibility in soils and aquatic systems and are also industrially important as precursors for materials and catalyst synthesis. However, direct characterization of the atomic arrangements of these materials is complicated because they are X-ray amorphous. This study involves synthesis of the complete Cr(III)-Fe(III) (oxy)hydroxide compositional series, and the use of complementary thermal, microscopic, spectroscopic, and scattering techniques for the evaluation of their structural properties. Thermal analysis results show that the Cr end member has a higher hydration state than the Fe end member, likely associated with the difference in water exchange rates in the first hydration spheres of Cr(III) and Fe(III). Three stages of weight loss are observed and are likely related to the loss of surface/structural water and hydroxyl groups. As compared to the Cr end member, the intermediate composition sample shows lower dehydration temperatures and a higher exothermic transition temperature. XANES analysis shows Cr(III) and Fe(III) to be the dominant oxidation states. XANES spectra also show progressive changes in the local structure around Cr and Fe atoms over the series. Pair distribution function (PDF) analysis of synchrotron X-ray total scattering data shows that the Fe end member is nanocrystalline ferrihydrite with intermediate-range order and an average coherent domain size of ~27 Å. The Cr end member, with a coherent domain size of ~10 Å, has only short-range order. The PDFs show progressive structural changes across the compositional series. High-resolution transmission electron microscopy (HRTEM) results also show the loss of structural order with increasing Cr content. These observations provide strong structural evidence of chemical substitution and progressive structural

  9. Comparison of likelihood testing procedures for parallel systems with covariances

    International Nuclear Information System (INIS)

    Ayman Baklizi; Isa Daud; Noor Akma Ibrahim

    1998-01-01

    In this paper we investigated and compared the behaviour of the likelihood ratio, Rao's and Wald's statistics for testing hypotheses on the parameters of the simple linear regression model based on parallel systems with covariances. These statistics are asymptotically equivalent (Barndorff-Nielsen and Cox, 1994); however, their relative performances in finite samples are generally unknown. A Monte Carlo experiment is conducted to simulate the sizes and the powers of these statistics for complete samples and in the presence of time censoring. Comparisons of the statistics are made according to the attainment of the assumed size of the test and their powers at various points in the parameter space. The results show that the likelihood ratio statistic appears to have the best performance in terms of attaining the assumed size of the test. Power comparisons show that the Rao statistic has some advantage over the Wald statistic in almost all of the space of alternatives, while the likelihood ratio statistic occupies either the first or the last position in terms of power. Overall, the likelihood ratio statistic appears to be more appropriate for the model under study, especially for small sample sizes.
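
    The size comparison can be reproduced in miniature with a simpler stand-in model (an exponential rate test rather than the parallel-systems regression; all numbers invented), since the Wald, Rao score and likelihood ratio statistics are available in closed form there.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(6)
      n, reps, lam0 = 20, 20000, 1.0               # H0: exponential rate = lam0
      x = rng.exponential(1 / lam0, size=(reps, n))
      lam_hat = 1 / x.mean(axis=1)                 # MLE of the rate

      wald = (lam_hat - lam0) ** 2 * n / lam_hat**2          # (MLE - lam0)^2 * I(MLE)
      score = (n / lam0 - x.sum(axis=1)) ** 2 * lam0**2 / n  # U(lam0)^2 / I(lam0)
      lr = 2 * n * (np.log(lam_hat / lam0) - 1 + lam0 / lam_hat)

      crit = stats.chi2.ppf(0.95, df=1)
      for name, t in [("Wald", wald), ("Rao", score), ("LR", lr)]:
          print(name, "attained size:", (t > crit).mean())   # nominal size is 0.05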

  10. Isothiocyanato complexes of Gd(III), Tb(III), Dy(III) and Ho(III) with 2-(2'-pyridyl)benzimidazole

    Energy Technology Data Exchange (ETDEWEB)

    Mishra, A; Singh, V K

    1982-01-01

    Six-coordinated complexes of the type (Ln(PyBzH)2NCS·H2O)(NCS)2·nH2O/mC2H5OH (Ln = Gd(III), Tb(III), Dy(III) and Ho(III); n = 1-2; m = 1) have been prepared from Ln(NCS)6(3-). The room temperature magnetic moment values confirm the terpositive state of the lanthanide ions. Infrared spectra suggest the N-coordination of the thiocyanate group. Electronic spectral studies of the Tb(III), Dy(III) and Ho(III) complexes have been made in terms of LSJ term energies. 13 refs.

  11. Likelihood ratio model for classification of forensic evidence

    Energy Technology Data Exchange (ETDEWEB)

    Zadora, G., E-mail: gzadora@ies.krakow.pl [Institute of Forensic Research, Westerplatte 9, 31-033 Krakow (Poland); Neocleous, T., E-mail: tereza@stats.gla.ac.uk [University of Glasgow, Department of Statistics, 15 University Gardens, Glasgow G12 8QW (United Kingdom)

    2009-05-29

    One of the problems in the analysis of forensic evidence such as glass fragments is the determination of their use-type category, e.g. does a glass fragment originate from an unknown window or container? Very small glass fragments arise during various accidents and criminal offences, and could be carried on the clothes, shoes and hair of participants. It is therefore necessary to obtain information on their physicochemical composition in order to solve the classification problem. Scanning Electron Microscopy coupled with an Energy Dispersive X-ray Spectrometer and the Glass Refractive Index Measurement method are routinely used in many forensic institutes for the investigation of glass. A natural form of glass evidence evaluation for forensic purposes is the likelihood ratio, LR = p(E|H1)/p(E|H2). The main aim of this paper was to study the performance of LR models for glass object classification which considered one or two sources of data variability, i.e. between-glass-object variability and/or within-glass-object variability. Within the proposed model a multivariate kernel density approach was adopted for modelling the between-object distribution and a multivariate normal distribution was adopted for modelling within-object distributions. Moreover, a graphical method of estimating the dependence structure was employed to reduce the highly multivariate problem to several lower-dimensional problems. The performed analysis showed that the best likelihood model was the one which allows the inclusion of information about between- and within-object variability, with variables derived from elemental compositions measured by SEM-EDX, and refractive index values determined before (RIb) and after (RIa) the annealing process, in the form dRI = log10|RIa - RIb|. This model gave better results than the model with only between-object variability considered. In addition, when dRI and variables derived from elemental compositions were used, this

  12. Likelihood ratio model for classification of forensic evidence

    International Nuclear Information System (INIS)

    Zadora, G.; Neocleous, T.

    2009-01-01

    One of the problems in the analysis of forensic evidence such as glass fragments is the determination of their use-type category, e.g. does a glass fragment originate from an unknown window or container? Very small glass fragments arise during various accidents and criminal offences, and could be carried on the clothes, shoes and hair of participants. It is therefore necessary to obtain information on their physicochemical composition in order to solve the classification problem. Scanning Electron Microscopy coupled with an Energy Dispersive X-ray Spectrometer and the Glass Refractive Index Measurement method are routinely used in many forensic institutes for the investigation of glass. A natural form of glass evidence evaluation for forensic purposes is the likelihood ratio, LR = p(E|H1)/p(E|H2). The main aim of this paper was to study the performance of LR models for glass object classification which considered one or two sources of data variability, i.e. between-glass-object variability and/or within-glass-object variability. Within the proposed model a multivariate kernel density approach was adopted for modelling the between-object distribution and a multivariate normal distribution was adopted for modelling within-object distributions. Moreover, a graphical method of estimating the dependence structure was employed to reduce the highly multivariate problem to several lower-dimensional problems. The performed analysis showed that the best likelihood model was the one which allows the inclusion of information about between- and within-object variability, with variables derived from elemental compositions measured by SEM-EDX, and refractive index values determined before (RIb) and after (RIa) the annealing process, in the form dRI = log10|RIa - RIb|. This model gave better results than the model with only between-object variability considered. In addition, when dRI and variables derived from elemental compositions were used, this model outperformed two other
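
    The two-level structure described in both records can be sketched in one dimension (all values invented; the paper uses multivariate SEM-EDX and dRI features): the between-object distribution is represented nonparametrically by a reference database and the within-object variability by a normal kernel, so each hypothesis density is a kernel average.

      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(7)
      # hypothetical reference database of one feature for two use-type categories
      windows = rng.normal(-4.0, 0.5, 200)         # between-object variation, windows
      containers = rng.normal(-3.0, 0.6, 200)      # between-object variation, containers
      sigma_within = 0.1                           # within-object measurement spread

      def lr_classification(y, pop1, pop2):
          # p(y|H) = average of within-object normal densities over database objects,
          # i.e. a kernel density estimate with bandwidth sigma_within
          p1 = norm.pdf(y, loc=pop1, scale=sigma_within).mean()
          p2 = norm.pdf(y, loc=pop2, scale=sigma_within).mean()
          return p1 / p2

      y_recovered = -3.9                           # measurement on a questioned fragment
      print("LR (window vs container):", lr_classification(y_recovered, windows, containers))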

  13. Finite mixture model: A maximum likelihood estimation approach on time series data

    Science.gov (United States)

    Yen, Phoong Seuk; Ismail, Mohd Tahir; Hamzah, Firdaus Mohamad

    2014-09-01

    Recently, statisticians have emphasized fitting finite mixture models by maximum likelihood estimation because of its desirable asymptotic properties: the estimator is consistent as the sample size increases to infinity, and thus asymptotically unbiased, and the estimated parameters attain the smallest asymptotic variance among comparable methods. Thus, maximum likelihood estimation is adopted in this paper to fit a two-component mixture model in order to explore the relationship between rubber price and exchange rate for Malaysia, Thailand, the Philippines and Indonesia. The results show a negative effect between rubber price and exchange rate for all selected countries.
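
    A minimal sketch of the fitting step (synthetic univariate data; the paper's rubber-price/exchange-rate application is not reproduced here): the two-component normal mixture likelihood is maximized numerically.

      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import norm

      rng = np.random.default_rng(8)
      x = np.concatenate([rng.normal(-1.0, 0.5, 300), rng.normal(1.5, 0.8, 700)])

      def nll(params):                             # negative mixture log-likelihood
          w, m1, s1, m2, s2 = params
          pdf = w * norm.pdf(x, m1, s1) + (1 - w) * norm.pdf(x, m2, s2)
          return -np.log(pdf + 1e-300).sum()

      res = minimize(nll, x0=[0.5, -0.5, 1.0, 1.0, 1.0], method="L-BFGS-B",
                     bounds=[(0.01, 0.99), (None, None), (0.01, None),
                             (None, None), (0.01, None)])
      print("weight, mu1, sigma1, mu2, sigma2:", np.round(res.x, 3))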

  14. Expert elicitation on ultrafine particles: likelihood of health effects and causal pathways

    Directory of Open Access Journals (Sweden)

    Brunekreef Bert

    2009-07-01

    Background: Exposure to fine ambient particulate matter (PM) has consistently been associated with increased morbidity and mortality. The relationship between exposure to ultrafine particles (UFP) and health effects is less firmly established. If UFP cause health effects independently of coarser fractions, this could affect health impact assessment of air pollution, possibly leading to alternative policy options being considered to reduce the disease burden of PM. Therefore, we organized an expert elicitation workshop to assess the evidence for a causal relationship between exposure to UFP and health endpoints. Methods: An expert elicitation on the health effects of ambient ultrafine particle exposure was carried out, focusing on: (1) the likelihood of causal relationships with key health endpoints, and (2) the likelihood of potential causal pathways for cardiac events. Based on a systematic peer-nomination procedure, fourteen European experts (epidemiologists, toxicologists and clinicians) were selected, of whom twelve attended. They were provided with a briefing book containing key literature. After a group discussion, individual expert judgments in the form of ratings of the likelihood of causal relationships and pathways were obtained using a confidence scheme adapted from the one used by the Intergovernmental Panel on Climate Change. Results: The likelihood of an independent causal relationship between increased short-term UFP exposure and increased all-cause mortality, hospital admissions for cardiovascular and respiratory diseases, aggravation of asthma symptoms and lung function decrements was rated medium to high by most experts. The likelihood for long-term UFP exposure to be causally related to all-cause mortality, cardiovascular and respiratory morbidity and lung cancer was rated slightly lower, mostly medium. The experts rated the likelihood of each of the six identified possible causal pathways separately. Out of these

  15. Maximum likelihood estimation of dose-response parameters for therapeutic operating characteristic (TOC) analysis of carcinoma of the nasopharynx

    International Nuclear Information System (INIS)

    Metz, C.E.; Tokars, R.P.; Kronman, H.B.; Griem, M.L.

    1982-01-01

    A Therapeutic Operating Characteristic (TOC) curve for radiation therapy plots, for all possible treatment doses, the probability of tumor ablation as a function of the probability of radiation-induced complication. Application of this analysis to actual therapeutic situations requires that dose-response curves for ablation and for complication be estimated from clinical data. We describe an approach in which "maximum likelihood estimates" of these dose-response curves are made, and we apply this approach to data collected on responses to radiotherapy for carcinoma of the nasopharynx. TOC curves constructed from the estimated dose-response curves are subject to moderately large uncertainties because of the limitations of available data. These TOC curves suggest, however, that treatment doses greater than 1800 rem may substantially increase the probability of tumor ablation with little increase in the risk of radiation-induced cervical myelopathy, especially for T1 and T2 tumors.
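
    A sketch of the estimation and curve construction (the dose-outcome counts below are invented, not the nasopharynx data): logistic dose-response curves for ablation and complication are fitted by maximum likelihood, and the TOC curve is traced by sweeping the dose.

      import numpy as np
      from scipy.optimize import minimize
      from scipy.special import expit

      dose = np.array([1400., 1600., 1800., 2000., 2200.])   # rem, hypothetical
      n_pat = np.array([20, 25, 30, 25, 15])
      n_ablate = np.array([6, 12, 20, 20, 13])               # ablations observed
      n_compl = np.array([0, 1, 2, 4, 5])                    # complications observed

      def fit_logistic(successes):
          def nll(p):                                        # binomial log-likelihood
              prob = np.clip(expit(p[0] + p[1] * dose), 1e-10, 1 - 1e-10)
              return -(successes * np.log(prob)
                       + (n_pat - successes) * np.log(1 - prob)).sum()
          return minimize(nll, x0=[-10.0, 0.005], method="Nelder-Mead").x

      pa, pc = fit_logistic(n_ablate), fit_logistic(n_compl)
      for d in np.arange(1400, 2401, 200):                   # one TOC point per dose
          print(f"dose {d:4.0f}: P(ablation)={expit(pa[0] + pa[1] * d):.2f}, "
                f"P(complication)={expit(pc[0] + pc[1] * d):.2f}")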

  16. Analysis of Prognostic Factors and Patterns of Recurrence in Patients With Pathologic Stage III Endometrial Cancer

    International Nuclear Information System (INIS)

    Patel, Samir; Portelance, Lorraine; Gilbert, Lucy; Tan, Leonard; Stanimir, Gerald; Duclos, Marie; Souhami, Luis

    2007-01-01

    Purpose: To retrospectively assess prognostic factors and patterns of recurrence in patients with pathologic Stage III endometrial cancer. Methods and Materials: Between 1989 and 2003, 107 patients with pathologic International Federation of Gynecology and Obstetrics Stage III endometrial adenocarcinoma confined to the pelvis were treated at our institution. Adjuvant radiotherapy (RT) was delivered to 68 patients (64%). The influence of multiple patient- and treatment-related factors on pelvic and distant control and overall survival (OS) was evaluated. Results: Median follow-up for patients at risk was 41 months. Five-year actuarial OS was significantly improved in patients treated with adjuvant RT (68%) compared with those with resection alone (50%; p = 0.029). Age, histology, grade, uterine serosal invasion, adnexal involvement, number of extrauterine sites, and treatment with adjuvant RT predicted for improved survival in univariate analysis. Multivariate analysis revealed that grade, uterine serosal invasion, and treatment with adjuvant RT were independent predictors of survival. Five-year actuarial pelvic control was improved significantly with the delivery of adjuvant RT (74% vs. 49%; p = 0.011). Depth of myometrial invasion and treatment with adjuvant RT were independent predictors of pelvic control in multivariate analysis. Conclusions: Multiple prognostic factors predicting the outcome of pathologic Stage III endometrial cancer patients were identified in this analysis. In particular, delivery of adjuvant RT seems to be a significant independent predictor of improved survival and pelvic control, suggesting that pelvic RT should be routinely considered in the management of these patients.

  17. A note on estimating errors from the likelihood function

    International Nuclear Information System (INIS)

    Barlow, Roger

    2005-01-01

    The points at which the log likelihood falls by 1/2 from its maximum value are often used to give the 'errors' on a result, i.e. the 68% central confidence interval. The validity of this is examined for two simple cases: a lifetime measurement and a Poisson measurement. Results are compared with the exact Neyman construction and with the simple Bartlett approximation. It is shown that the accuracy of the log likelihood method is poor, and the Bartlett construction explains why it is flawed.
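
    The Poisson case is easy to reproduce (the observed count is invented): the interval where ln L has fallen by 1/2 is compared against the exact central interval from the Neyman construction (Garwood's chi-square form), and the two differ noticeably for small counts.

      import numpy as np
      from scipy.optimize import brentq
      from scipy.stats import chi2

      n_obs = 5                                      # observed Poisson count
      loglik = lambda mu: n_obs * np.log(mu) - mu    # log likelihood up to a constant
      mu_hat = float(n_obs)                          # maximum at mu = n

      # likelihood interval: points where ln L has fallen by 1/2 from its maximum
      f = lambda mu: loglik(mu) - (loglik(mu_hat) - 0.5)
      lo, hi = brentq(f, 1e-6, mu_hat), brentq(f, mu_hat, 10 * mu_hat + 10)

      # exact central 68.27% interval (Garwood / Neyman construction)
      alpha = 1 - 0.6827
      exact_lo = 0.5 * chi2.ppf(alpha / 2, 2 * n_obs)
      exact_hi = 0.5 * chi2.ppf(1 - alpha / 2, 2 * (n_obs + 1))
      print(f"Delta lnL = 1/2 interval: [{lo:.2f}, {hi:.2f}]")
      print(f"exact central interval:   [{exact_lo:.2f}, {exact_hi:.2f}]")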

  18. Dark Energy Survey Year 1 Results: Multi-Probe Methodology and Simulated Likelihood Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Krause, E.; et al.

    2017-06-28

    We present the methodology for and detail the implementation of the Dark Energy Survey (DES) 3x2pt DES Year 1 (Y1) analysis, which combines configuration-space two-point statistics from three different cosmological probes: cosmic shear, galaxy-galaxy lensing, and galaxy clustering, using data from the first year of DES observations. We have developed two independent modeling pipelines and describe the code validation process. We derive expressions for analytical real-space multi-probe covariances, and describe their validation with numerical simulations. We stress-test the inference pipelines in simulated likelihood analyses that vary 6-7 cosmology parameters plus 20 nuisance parameters and precisely resemble the analysis to be presented in the DES 3x2pt analysis paper, using a variety of simulated input data vectors with varying assumptions. We find that any disagreement between pipelines leads to changes in assigned likelihood Δχ² ≤ 0.045 with respect to the statistical error of the DES Y1 data vector. We also find that angular binning and survey mask do not impact our analytic covariance at a significant level. We determine lower bounds on scales used for analysis of galaxy clustering (8 Mpc/h) and galaxy-galaxy lensing (12 Mpc/h) such that the impact of modeling uncertainties in the non-linear regime is well below statistical errors, and show that our analysis choices are robust against a variety of systematics. These tests demonstrate that we have a robust analysis pipeline that yields unbiased cosmological parameter inferences for the flagship 3x2pt DES Y1 analysis. We emphasize that the level of independent code development and subsequent code comparison as demonstrated in this paper is necessary to produce credible constraints from increasingly complex multi-probe analyses of current data.

  19. Determination of As(III) and As(V) in waters by chronopotentiometric stripping analysis

    Directory of Open Access Journals (Sweden)

    Švarc-Gajić Jaroslava V.

    2006-01-01

    Arsenic is a naturally occurring toxic and carcinogenic element. The degree of toxicity depends on its chemical form and concentration. Application of a sensitive, selective, simple and rapid method for detection and monitoring of the different oxidation states of arsenic in waters is of great importance because the main route of population exposure is through drinking water. In this work chronopotentiometric stripping analysis (CSA) was used for the determination of As(III) and As(V) in tap, well, river and rain waters from Vojvodina (Serbia). A gold film electrode on a glassy carbon support was used as the working electrode. The experimental parameters of the technique were investigated and optimized. The detection limit of the method for an electrolysis time of 600 s was 2 μg/dm³ of As(III).

  20. A Fast Algorithm for Maximum Likelihood Estimation of Harmonic Chirp Parameters

    DEFF Research Database (Denmark)

    Jensen, Tobias Lindstrøm; Nielsen, Jesper Kjær; Jensen, Jesper Rindom

    2017-01-01

    The analysis of (approximately) periodic signals is an important element in numerous applications. One generalization of standard periodic signals often occurring in practice is harmonic chirp signals, where the instantaneous frequency increases/decreases linearly as a function of time. A statistically efficient estimator for extracting the parameters of the harmonic chirp model in additive white Gaussian noise is the maximum likelihood (ML) estimator, which recently has been demonstrated to be robust to noise and accurate, even when the model order is unknown. The main drawback of the ML...

  1. New algorithms and methods to estimate maximum-likelihood phylogenies: assessing the performance of PhyML 3.0.

    Science.gov (United States)

    Guindon, Stéphane; Dufayard, Jean-François; Lefort, Vincent; Anisimova, Maria; Hordijk, Wim; Gascuel, Olivier

    2010-05-01

    PhyML is a phylogeny software based on the maximum-likelihood principle. Early PhyML versions used a fast algorithm performing nearest neighbor interchanges to improve a reasonable starting tree topology. Since the original publication (Guindon S., Gascuel O. 2003. A simple, fast and accurate algorithm to estimate large phylogenies by maximum likelihood. Syst. Biol. 52:696-704), PhyML has been widely used (>2500 citations in ISI Web of Science) because of its simplicity and a fair compromise between accuracy and speed. In the meantime, research around PhyML has continued, and this article describes the new algorithms and methods implemented in the program. First, we introduce a new algorithm to search the tree space with user-defined intensity using subtree pruning and regrafting topological moves. The parsimony criterion is used here to filter out the least promising topology modifications with respect to the likelihood function. The analysis of a large collection of real nucleotide and amino acid data sets of various sizes demonstrates the good performance of this method. Second, we describe a new test to assess the support of the data for internal branches of a phylogeny. This approach extends the recently proposed approximate likelihood-ratio test and relies on a nonparametric, Shimodaira-Hasegawa-like procedure. A detailed analysis of real alignments sheds light on the links between this new approach and the more classical nonparametric bootstrap method. Overall, our tests show that the last version (3.0) of PhyML is fast, accurate, stable, and ready to use. A Web server and binary files are available from http://www.atgc-montpellier.fr/phyml/.

  2. Predicting likelihood of seeking help through the employee assistance program among salaried and union hourly employees.

    Science.gov (United States)

    Delaney, W; Grube, J W; Ames, G M

    1998-03-01

    This research investigated belief, social support and background predictors of employee likelihood to use an Employee Assistance Program (EAP) for a drinking problem. An anonymous cross-sectional survey was administered in the home. Bivariate analyses and simultaneous equations path analysis were used to explore a model of EAP use. Survey and ethnographic research were conducted in a unionized heavy machinery manufacturing plant in the central states of the United States. A random sample of 852 hourly and salaried employees was selected. In addition to background variables, measures included: likelihood of going to an EAP for a drinking problem, belief the EAP can help, social support for the EAP from co-workers/others, belief that EAP use will harm employment, and supervisor encourages the EAP for potential drinking problems. Belief in EAP efficacy directly increased the likelihood of going to an EAP. Greater perceived social support and supervisor encouragement increased the likelihood of going to an EAP both directly and indirectly through perceived EAP efficacy. Black and union hourly employees were more likely to say they would use an EAP. Males and those who reported drinking during working hours were less likely to say they would use an EAP for a drinking problem. EAP beliefs and social support have significant effects on likelihood to go to an EAP for a drinking problem. EAPs may wish to focus their efforts on creating an environment where there is social support from coworkers and encouragement from supervisors for using EAP services. Union networks and team members have an important role to play in addition to conventional supervisor intervention.

  3. Efficient Bit-to-Symbol Likelihood Mappings

    Science.gov (United States)

    Moision, Bruce E.; Nakashima, Michael A.

    2010-01-01

    This innovation is an efficient algorithm designed to perform bit-to-symbol and symbol-to-bit likelihood mappings that represent a significant portion of the complexity of an error-correction code decoder for high-order constellations. Recent implementation of the algorithm in hardware has yielded an 8-percent reduction in overall area relative to the prior design.
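
    The flavor of such a mapping can be shown with a generic max-log LLR computation for 16-QAM (the constellation, labeling and noise figures below are invented and unrelated to the patented design): the exact bit LLR needs a log-sum-exp over half the constellation, while the max-log shortcut keeps only the nearest point on each side.

      import numpy as np

      levels = np.array([-3., -1., 1., 3.])
      const = np.array([i + 1j * q for i in levels for q in levels]) / np.sqrt(10)
      labels = np.arange(16)            # plain binary labels (a real design uses Gray)

      def bit_llrs(y, noise_var):
          d2 = np.abs(y - const) ** 2 / noise_var            # symbol metrics
          llrs = []
          for b in range(4):                                 # 4 bits per 16-QAM symbol
              bit = (labels >> b) & 1
              # max-log approximation: LLR ~ difference of minimum metrics
              llrs.append(d2[bit == 1].min() - d2[bit == 0].min())
          return np.array(llrs)                              # positive favors bit = 0

      rng = np.random.default_rng(10)
      y = const[rng.integers(16)] + rng.normal(0, 0.1) + 1j * rng.normal(0, 0.1)
      print("bit LLRs:", np.round(bit_llrs(y, noise_var=0.02), 2))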

  4. LDR: A Package for Likelihood-Based Sufficient Dimension Reduction

    Directory of Open Access Journals (Sweden)

    R. Dennis Cook

    2011-03-01

    We introduce a new Matlab software package that implements several recently proposed likelihood-based methods for sufficient dimension reduction. Current capabilities include estimation of reduced subspaces with a fixed dimension d, as well as estimation of d by use of likelihood-ratio testing, permutation testing and information criteria. The methods are suitable for preprocessing data for both regression and classification. Implementations of related estimators are also available. Although the software is more oriented to command-line operation, a graphical user interface is also provided for prototype computations.

  5. Image properties of list mode likelihood reconstruction for a rectangular positron emission mammography with DOI measurements

    International Nuclear Information System (INIS)

    Qi, Jinyi; Klein, Gregory J.; Huesman, Ronald H.

    2000-01-01

    A positron emission mammography scanner is under development at our Laboratory. The tomograph has a rectangular geometry consisting of four banks of detector modules. For each detector, the system can measure the depth of interaction information inside the crystal. The rectangular geometry leads to irregular radial and angular sampling and spatially variant sensitivity that are different from conventional PET systems. Therefore, it is of importance to study the image properties of the reconstructions. We adapted the theoretical analysis that we had developed for conventional PET systems to the list mode likelihood reconstruction for this tomograph. The local impulse response and covariance of the reconstruction can be easily computed using FFT. These theoretical results are also used with computer observer models to compute the signal-to-noise ratio for lesion detection. The analysis reveals the spatially variant resolution and noise properties of the list mode likelihood reconstruction. The theoretical predictions are in good agreement with Monte Carlo results

  6. Complexation of trivalent actinides and lanthanides with hydrophilic N-donor ligands for Am(III)/Cm(III) and An(III)/Ln(III) separation

    Energy Technology Data Exchange (ETDEWEB)

    Wagner, Christoph

    2017-07-24

    The implementation of actinide recycling processes is considered in several countries, aiming at the reduction of long-term radiotoxicity and heat load of used nuclear fuel. This requires the separation of the actinides from the fission and corrosion products. The separation of the trivalent actinides (An(III)) Am(III) and Cm(III), however, is complicated by the presence of the chemically similar fission lanthanides (Ln(III)). Hydrophilic N-donor ligands are employed as An(III)- or Am(III)-selective complexing agents in solvent extraction to strip An(III) or Am(III) from an organic phase loaded with An(III) and Ln(III). Though they exhibit excellent selectivity, the complexation chemistry of these ligands and the complexes formed during solvent extraction are not sufficiently characterized. In the present thesis the complexation of An(III) and Ln(III) with hydrophilic N-donor ligands is studied by time resolved laser fluorescence spectroscopy (TRLFS), UV/Vis, vibronic sideband spectroscopy and solvent extraction. TRLFS studies on the complexation of Cm(III) and Eu(III) with the Am(III)-selective complexing agent SO3-Ph-BTBP (tetrasodium 3,3',3'',3'''-([2,2'-bipyridine]-6,6'-diylbis(1,2,4-triazine-3,5,6-triyl))tetrabenzenesulfonate) revealed the formation of [M(SO3-Ph-BTBP)n](4n-3)- complexes (M = Cm(III), Eu(III); n = 1, 2). The conditional stability constants were determined in different media, yielding two orders of magnitude larger β2-values for the Cm(III) complexes, independently of the applied medium. A strong impact of ionic strength on the stability and stoichiometry of the formed complexes was identified, resulting from the stabilization of the pentaanionic [M(SO3-Ph-BTBP)2]5- complex with increasing ionic strength. Thermodynamic studies of Cm(III)-SO3-Ph-BTBP complexation showed that the proton concentration of the applied medium impacts

  7. Diagonal Likelihood Ratio Test for Equality of Mean Vectors in High-Dimensional Data

    KAUST Repository

    Hu, Zongliang

    2017-10-27

    We propose a likelihood ratio test framework for testing normal mean vectors in high-dimensional data under two common scenarios: the one-sample test and the two-sample test with equal covariance matrices. We derive the test statistics under the assumption that the covariance matrices follow a diagonal matrix structure. In comparison with the diagonal Hotelling's tests, our proposed test statistics display some interesting characteristics. In particular, they are a summation of the log-transformed squared t-statistics rather than a direct summation of those components. More importantly, to derive the asymptotic normality of our test statistics under the null and local alternative hypotheses, we do not require the assumption that the covariance matrix follows a diagonal matrix structure. As a consequence, our proposed test methods are very flexible and can be widely applied in practice. Finally, simulation studies and a real data analysis are also conducted to demonstrate the advantages of our likelihood ratio test method.

  8. Diagonal Likelihood Ratio Test for Equality of Mean Vectors in High-Dimensional Data

    KAUST Repository

    Hu, Zongliang; Tong, Tiejun; Genton, Marc G.

    2017-01-01

    We propose a likelihood ratio test framework for testing normal mean vectors in high-dimensional data under two common scenarios: the one-sample test and the two-sample test with equal covariance matrices. We derive the test statistics under the assumption that the covariance matrices follow a diagonal matrix structure. In comparison with the diagonal Hotelling's tests, our proposed test statistics display some interesting characteristics. In particular, they are a summation of the log-transformed squared t-statistics rather than a direct summation of those components. More importantly, to derive the asymptotic normality of our test statistics under the null and local alternative hypotheses, we do not require the assumption that the covariance matrix follows a diagonal matrix structure. As a consequence, our proposed test methods are very flexible and can be widely applied in practice. Finally, simulation studies and a real data analysis are also conducted to demonstrate the advantages of our likelihood ratio test method.
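
    The statistic's structure can be sketched directly (dimensions invented; the paper derives an analytic standardization, whereas the null cutoff here is simulated): each coordinate contributes the univariate normal LRT term n*log(1 + t_i^2/(n-1)), and the diagonal statistic is their summation.

      import numpy as np

      rng = np.random.default_rng(11)
      n, p = 30, 200                                    # samples, dimensions
      X = rng.normal(size=(n, p))                       # one-sample null data

      def diag_lrt(M):
          t = M.mean(axis=0) / (M.std(axis=0, ddof=1) / np.sqrt(len(M)))
          return (len(M) * np.log1p(t**2 / (len(M) - 1))).sum()

      T = diag_lrt(X)
      null = [diag_lrt(rng.normal(size=(n, p))) for _ in range(2000)]
      print("observed:", round(T, 1), " simulated null 95% cutoff:",
            round(float(np.quantile(null, 0.95)), 1))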

  9. Detecting the contagion effect in mass killings; a constructive example of the statistical advantages of unbinned likelihood methods.

    Science.gov (United States)

    Towers, Sherry; Mubayi, Anuj; Castillo-Chavez, Carlos

    2018-01-01

    When attempting to statistically distinguish between a null and an alternative hypothesis, many researchers in the life and social sciences turn to binned statistical analysis methods, or methods that are simply based on the moments of a distribution (such as the mean and variance). These methods have the advantage of simplicity of implementation and simplicity of explanation. However, when null and alternative hypotheses manifest themselves in subtle differences in patterns in the data, binned analysis methods may be insensitive to these differences, and researchers may erroneously fail to reject the null hypothesis when in fact more sensitive statistical analysis methods might produce a different result when the null hypothesis is actually false. Here, with a focus on two recent conflicting studies of contagion in mass killings as instructive examples, we discuss how the use of unbinned likelihood methods makes optimal use of the information in the data, a fact that has long been known in statistical theory, but perhaps is not as widely appreciated amongst general researchers in the life and social sciences. In 2015, Towers et al. published a paper that quantified the long-suspected contagion effect in mass killings. However, in 2017, Lankford & Tomek subsequently published a paper, based upon the same data, that claimed to contradict the results of the earlier study. The former used unbinned likelihood methods, and the latter used binned methods and comparison of distribution moments. Using these analyses, we also discuss how visualization of the data can aid in determination of the most appropriate statistical analysis methods to distinguish between a null and alternate hypothesis. We also discuss the importance of assessment of the robustness of analysis results to methodological assumptions made (for example, arbitrary choices of number of bins and bin widths when using binned methods); an issue that is widely overlooked in the literature, but is critical.
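
    The general point about information loss can be demonstrated on an invented stand-in (exponential waiting-time data, not the mass-killings dataset): the unbinned MLE uses every observation, while a coarse histogram fit discards within-bin information; repeating the comparison over many synthetic datasets shows the binned estimator's larger spread.

      import numpy as np
      from scipy.optimize import minimize_scalar

      rng = np.random.default_rng(12)
      true_rate, n = 1.0, 100
      x = rng.exponential(1 / true_rate, n)

      unbinned = 1 / x.mean()                        # unbinned MLE (closed form)

      edges = np.linspace(0, 8, 5)                   # deliberately coarse bins
      counts, _ = np.histogram(x, edges)
      def binned_nll(rate):                          # Poisson likelihood on counts
          p = np.exp(-rate * edges[:-1]) - np.exp(-rate * edges[1:])
          mu = n * p
          return -(counts * np.log(mu + 1e-12) - mu).sum()
      binned = minimize_scalar(binned_nll, bounds=(0.01, 10), method="bounded").x

      print(f"unbinned MLE: {unbinned:.3f}, coarse binned fit: {binned:.3f}")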

  10. Communicating likelihoods and probabilities in forecasts of volcanic eruptions

    Science.gov (United States)

    Doyle, Emma E. H.; McClure, John; Johnston, David M.; Paton, Douglas

    2014-02-01

    The issuing of forecasts and warnings of natural hazard events, such as volcanic eruptions, earthquake aftershock sequences and extreme weather often involves the use of probabilistic terms, particularly when communicated by scientific advisory groups to key decision-makers, who can differ greatly in relative expertise and function in the decision making process. Recipients may also differ in their perception of the relative importance of political and economic influences on interpretation. Consequently, the interpretation of these probabilistic terms can vary greatly due to the framing of the statements, and whether verbal or numerical terms are used. We present a review from the psychology literature on how the framing of information influences communication of these probability terms. It is also unclear as to how people rate their perception of an event's likelihood throughout a time frame when a forecast time window is stated. Previous research has identified that, when presented with a 10-year time window forecast, participants viewed the likelihood of an event occurring ‘today’ as less than that in year 10. Here we show that this skew in perception also occurs for short-term time windows (under one week) that are of most relevance for emergency warnings. In addition, unlike the long-time window statements, the use of the phrasing “within the next…” instead of “in the next…” does not mitigate this skew, nor do we observe significant differences between the perceived likelihoods of scientists and non-scientists. This finding suggests that effects occurring due to the shorter time window may be ‘masking’ any differences in perception due to wording or career background observed for long-time window forecasts. These results have implications for scientific advice, warning forecasts, emergency management decision-making, and public information as any skew in perceived event likelihood towards the end of a forecast time window may result in

  11. Soft tissue thin-plate spline analysis of pre-pubertal Korean and European-Americans with untreated Angle's Class III malocclusions.

    Science.gov (United States)

    Singh, G D; McNamara, J A; Lozanoff, S

    1999-01-01

    The purpose of this study was to assess soft tissue facial matrices in subjects of diverse ethnic origins with underlying dentoskeletal malocclusions. Pre-treatment lateral cephalographs of 71 Korean and 70 European-American children aged between 5 and 11 years with Angle's Class III malocclusions were traced, and 12 homologous soft tissue landmarks digitized. Comparing mean Korean and European-American Class III soft tissue profiles, Procrustes analysis established a statistically significant difference, and thin-plate spline analysis indicated that both affine and non-affine transformations contribute towards the total spline (deformation) of the averaged Class III soft tissue configurations. For non-affine transformations, partial warp (PW) 8 had the highest magnitude, indicating large-scale deformations visualized predominantly as labio-mental protrusion. In addition, PW9, PW4, and PW5 also had high magnitudes, demonstrating labio-mental vertical compression and antero-posterior compression of the lower labio-mental soft tissues. Thus, Korean children with Class III malocclusions demonstrate antero-posterior and vertical deformations of the labio-mental soft tissue complex with respect to their European-American counterparts. Morphological heterogeneity of the soft tissue integument in subjects of diverse ethnic origin may obscure the underlying skeletal morphology, but the soft tissue integument appears to have minimal ontogenetic association with Class III malocclusions.
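
    For readers unfamiliar with the method, a generic 2D thin-plate spline fit (textbook form, not the study's software; the landmark coordinates below are invented) makes the affine/non-affine split concrete: the affine part is the linear term of the solution, while the non-affine (partial warp) content lives in the bending weights.

      import numpy as np

      def tps_fit(src, dst):
          n = len(src)
          d2 = ((src[:, None, :] - src[None, :, :]) ** 2).sum(-1)
          K = np.where(d2 > 0, d2 * np.log(d2 + 1e-300), 0.0)   # U(r) = r^2 log r^2
          P = np.hstack([np.ones((n, 1)), src])
          A = np.zeros((n + 3, n + 3))
          A[:n, :n], A[:n, n:], A[n:, :n] = K, P, P.T
          params = np.linalg.solve(A, np.vstack([dst, np.zeros((3, 2))]))
          return params[:n], params[n:]              # bending weights, affine part

      # hypothetical landmark sets standing in for the digitized soft tissue points
      src = np.array([[0, 0], [1, 0], [1, 1], [0, 1], [0.5, 0.5]], float)
      dst = src + np.array([[0, 0], [0, 0], [0.1, 0.05], [0, 0.1], [0.15, 0]])
      w, a = tps_fit(src, dst)
      print("affine part:\n", np.round(a, 3))        # uniform component of the warp
      print("bending (non-affine) weights:\n", np.round(w, 3))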

  12. Clinical Paresthesia Atlas Illustrates Likelihood of Coverage Based on Spinal Cord Stimulator Electrode Location.

    Science.gov (United States)

    Taghva, Alexander; Karst, Edward; Underwood, Paul

    2017-08-01

    Concordant paresthesia coverage is an independent predictor of pain relief following spinal cord stimulation (SCS). Using aggregate data, our objective is to produce a map of paresthesia coverage as a function of electrode location in SCS. This retrospective analysis used x-rays, SCS programming data, and paresthesia coverage maps from the EMPOWER registry of SCS implants for chronic neuropathic pain. Spinal level of dorsal column stimulation was determined by x-ray adjudication and active cathodes in patient programs. Likelihood of paresthesia coverage was determined as a function of stimulating electrode location. Segments of paresthesia coverage were grouped anatomically. Fisher's exact test was used to identify significant differences in likelihood of paresthesia coverage as a function of spinal stimulation level. In the 178 patients analyzed, the most prevalent areas of paresthesia coverage were buttocks, anterior and posterior thigh (each 98%), and low back (94%). Unwanted paresthesia at the ribs occurred in 8% of patients. There were significant differences in the likelihood of achieving paresthesia, with higher thoracic levels (T5, T6, and T7) more likely to achieve low back coverage but also more likely to introduce paresthesia felt at the ribs. Higher levels in the thoracic spine were associated with greater coverage of the buttocks, back, and thigh, and with lesser coverage of the leg and foot. This paresthesia atlas uses real-world, aggregate data to determine likelihood of paresthesia coverage as a function of stimulating electrode location. It represents an application of "big data" techniques, and a step toward achieving personalized SCS therapy tailored to the individual's chronic pain. © 2017 International Neuromodulation Society.

  13. Outlier Detection in Nonlinear Regression with the Likelihood Displacement Statistical Method

    Directory of Open Access Journals (Sweden)

    Siti Tabi'atul Hasanah

    2012-11-01

    An outlier is an observation that is very different (extreme) from the other observations, i.e., data that do not follow the general pattern of the model. Sometimes outliers provide information that cannot be provided by other data; that is why outliers should not simply be eliminated. Outliers can also be influential observations. There are many methods that can be used to detect outliers. Previous studies addressed outlier detection in linear regression; here, outlier detection is developed for nonlinear regression, specifically multiplicative nonlinear regression. Detection uses the statistical method of likelihood displacement (LD), which detects outliers by removing the suspected outlier data and measuring the resulting change in the maximized likelihood; parameters are estimated by the maximum likelihood method. Observations with large likelihood displacement are thought to be outliers. The accuracy of the LD method in detecting outliers is then shown by comparing the MSE of LD with the MSE of the regression in general. The test statistic used is Λ; when the null hypothesis is rejected, the observation is declared an outlier.
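
    A sketch of the case-deletion idea with an invented multiplicative-style model: the likelihood displacement LD_i = 2[l(theta_hat) - l(theta_hat_(i))] evaluates the full-data likelihood at the estimate refit without observation i, so a planted outlier produces the largest displacement.

      import numpy as np
      from scipy.optimize import curve_fit
      from scipy.stats import norm

      rng = np.random.default_rng(13)
      model = lambda x, a, b: a * np.exp(b * x)      # hypothetical nonlinear model
      x = np.linspace(0, 2, 40)
      y = model(x, 2.0, 0.8) + rng.normal(0, 0.3, 40)
      y[25] += 4.0                                   # planted outlier

      def loglik(params):                            # full-data Gaussian log-likelihood
          resid = y - model(x, *params)
          return norm.logpdf(resid, scale=resid.std()).sum()

      theta_hat, _ = curve_fit(model, x, y, p0=[1, 1])
      ld = np.empty(len(x))
      for i in range(len(x)):
          keep = np.arange(len(x)) != i
          theta_i, _ = curve_fit(model, x[keep], y[keep], p0=[1, 1])
          ld[i] = 2 * (loglik(theta_hat) - loglik(theta_i))
      print("most suspicious index:", ld.argmax())   # should flag index 25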

  14. A Predictive Likelihood Approach to Bayesian Averaging

    Directory of Open Access Journals (Sweden)

    Tomáš Jeřábek

    2015-01-01

    Multivariate time series forecasting is applied in a wide range of economic activities related to regional competitiveness and is the basis of almost all macroeconomic analysis. In this paper we combine multivariate density forecasts of GDP growth, inflation and real interest rates from four different models: two types of Bayesian vector autoregression (BVAR) models, a New Keynesian dynamic stochastic general equilibrium (DSGE) model of a small open economy, and a DSGE-VAR model. The performance of the models is assessed using historical data covering the domestic economy and the foreign economy, which is represented by the countries of the Eurozone. Because the forecast accuracy of the models differs, weighting schemes based on the predictive likelihood, the trace of the past MSE matrix, and model ranks are used to combine the models. The equal-weight scheme is used as a simple combination scheme. The results show that optimally combined densities are comparable to the best individual models.
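
    The weighting scheme itself is compact enough to sketch (two invented Gaussian forecasters stand in for the BVAR/DSGE densities): weights proportional to the exponentiated log predictive likelihood over a training window define the combined mixture density.

      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(14)
      y = rng.normal(0.5, 1.0, 120)                  # realized series, invented
      fc = [(0.4, 1.0), (0.0, 2.0)]                  # each model's (mean, sd) forecast

      # log predictive likelihood of each model over a training window
      logpl = np.array([norm.logpdf(y[:100], m, s).sum() for m, s in fc])
      w = np.exp(logpl - logpl.max())
      w /= w.sum()                                   # predictive-likelihood weights

      y_new = 0.7                                    # combined density is the mixture
      dens = sum(wi * norm.pdf(y_new, m, s) for wi, (m, s) in zip(w, fc))
      print("weights:", np.round(w, 3), " combined density at y_new:", round(dens, 4))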

  15. Extraction behaviour of Am(III) and Eu(III) from nitric acid medium in TEHDGA-HDEHP impregnated resins

    Energy Technology Data Exchange (ETDEWEB)

    Saipriya, G.; Kumar, T. [Bhabha Atomic Research Centre Facilities, Kalpakkam (India). Kalpakkam Reprocessing Plant; Kumaresan, R.; Nayak, P.K.; Venkatesan, K.A.; Antony, M.P. [Indira Gandhi Center for Atomic Research, Kalpakkam (India). Fuel Chemistry Div.

    2016-07-01

    The extraction behaviour of Am(III) and Eu(III) from nitric acid medium was studied in the solvent impregnated resins containing extractants such as tetra-bis(2-ethylhexyl)diglycolamide (TEHDGA) or bis-(2-ethylhexyl)phosphoric acid (HDEHP) or mixture of TEHDGA+HDEHP. The rate of extraction of Am(III) and Eu(III) from 1 M nitric acid and the effect of various parameters, such as the concentration of nitric acid in aqueous phase and concentration of TEHDGA and HDEHP in resin phase, on the distribution coefficient of Am(III) and Eu(III) was studied. The distribution coefficient of Am(III) and Eu(III) in HDEHP-impregnated resin decreased and that in TEHDGA-impregnated resin increased, with increase in the concentration of nitric acid. However, in (TEHDGA+HDEHP) - impregnated resin, synergic extraction was observed at lower nitric acid concentration and antagonism at higher nitric acid concentration. The mechanism of Am(III) and Eu(III) extraction in the combined resin was investigated by slope analysis method. The extraction of various metal ions present in the fast reactor simulated high-level liquid waste was studied. The separation factor of Am(III) over Eu(III) was studied using citrate-buffered diethylenetriaminepentaacetic acid (DTPA) solution.

  16. Susceptibility, likelihood to be diagnosed, worry and fear for contracting Lyme disease.

    Science.gov (United States)

    Fogel, Joshua; Chawla, Gurasees S

    Risk perception and psychological concerns are relevant for understanding how people view Lyme disease. This study investigates the four separate outcomes of susceptibility, likelihood to be diagnosed, worry, and fear for contracting Lyme disease. University students (n=713) were surveyed about demographics, perceived health, Lyme disease knowledge, Lyme disease preventive behaviors, Lyme disease history, and Lyme disease miscellaneous variables. We found that women were associated with increased susceptibility and fear. Asian/Asian-American race/ethnicity was associated with increased worry and fear. Perceived good health was associated with increased likelihood to be diagnosed, worry, and fear. Correct knowledge was associated with increased susceptibility and likelihood to be diagnosed. Those who typically spend a lot of time outdoors were associated with increased susceptibility, likelihood to be diagnosed, worry, and fear. In conclusion, healthcare providers and public health campaigns should address susceptibility, likelihood to be diagnosed, worry, and fear about Lyme disease, and should particularly target women and Asians/Asian-Americans to address any possible misconceptions and/or offer effective coping strategies. Copyright © 2016 King Saud Bin Abdulaziz University for Health Sciences. Published by Elsevier Ltd. All rights reserved.

  17. Clinicopathological analysis of 91 cases of uterine cervical cancer (including 38 cases of CIN III)

    International Nuclear Information System (INIS)

    Obata, Naoko; Kamiya, Norio; Goto, Setsuko; Takahashi, Satoru

    2000-01-01

    A total of 91 cases of uterine cervical cancer, consisting of 38 cases of carcinoma in situ (CIN III) and 53 cases of stage I-IV cervical cancer, were retrospectively and clinicopathologically analyzed. The standard treatment given to these patients consisted of hysterectomy or conization for CIN III; observation for cases of mild to moderate dysplasia; radical hysterectomy plus pelvic lymph node dissection for stage I and II cervical cancer; and radiotherapy for stage III and IV cervical cancer. Postoperative irradiation consisted of irradiation of the whole pelvis with 40-50 Gy. The patients who were not treated surgically underwent 40 Gy external irradiation of the whole pelvis, followed by an additional 20 Gy with shielding and internal irradiation with a remote afterloading system (RALS). When lymph node metastasis was present, the nodes were irradiated with 40-50 Gy. The mean age of the 38 patients with CIN III was 45.2 years, and they were para 0-4. In 24 (63.2%) of them the cancer was detected by cytodiagnosis as part of screening. Radical hysterectomy, simple hysterectomy, and conization were performed in 25, 7, and 6 patients, respectively. No recurrences have been detected, and the survival rate is 100%. The mean age of the 53 patients with stage I-IV cervical cancer was 62.4 years, and they were para 0-10. There were 25 patients with stage I disease, 15 with stage II, 6 with stage III, and 7 with stage IV; their 5-year survival rates were 82.4%, 68.8%, 66.7%, and 42.9%, respectively. Radioenteritis and radiocystitis occurred as adverse radiation effects. Pathologic factors influencing lymph node metastasis were examined by a multivariate analysis based on the data from 25 patients with stage I and II disease who underwent hysterectomy. The results of the analysis indicated the importance of screening and the choice of appropriate surgical method/technique, as well as the need for further investigation to determine the effective

  18. Parallel implementation of D-Phylo algorithm for maximum likelihood clusters.

    Science.gov (United States)

    Malik, Shamita; Sharma, Dolly; Khatri, Sunil Kumar

    2017-03-01

    This study explains a newly developed parallel algorithm for phylogenetic analysis of DNA sequences. The newly designed D-Phylo is a more advanced algorithm for phylogenetic analysis using the maximum likelihood approach. D-Phylo exploits the search capacity of k-means while avoiding its main limitation of getting stuck at locally conserved motifs. The authors have tested the behaviour of D-Phylo on an Amazon Linux Amazon Machine Image (Hardware Virtual Machine) i2.4xlarge instance (six central processing units, 122 GiB memory, 8 × 800 solid-state drive Elastic Block Store volumes, high network performance) with up to 15 processors for several real-life datasets. Distributing the clusters evenly over all the processors provides the capacity to achieve a near-linear speed-up for large numbers of processors.

  19. BOLD VENTURE COMPUTATION SYSTEM for nuclear reactor core analysis, Version III

    International Nuclear Information System (INIS)

    Vondy, D.R.; Fowler, T.B.; Cunningham, G.W. III.

    1981-06-01

    This report is a condensed documentation for VERSION III of the BOLD VENTURE COMPUTATION SYSTEM for nuclear reactor core analysis. An experienced analyst should be able to use this system routinely for solving problems by referring to this document. Individual reports must be referenced for details. This report covers basic input instructions and describes recent extensions to the modules as well as to the interface data file specifications. Some application considerations are discussed and an elaborate sample problem is used as an instruction aid. Instructions for creating the system on IBM computers are also given.

  20. BOLD VENTURE COMPUTATION SYSTEM for nuclear reactor core analysis, Version III

    Energy Technology Data Exchange (ETDEWEB)

    Vondy, D.R.; Fowler, T.B.; Cunningham, G.W. III.

    1981-06-01

    This report is a condensed documentation for VERSION III of the BOLD VENTURE COMPUTATION SYSTEM for nuclear reactor core analysis. An experienced analyst should be able to use this system routinely for solving problems by referring to this document. Individual reports must be referenced for details. This report covers basic input instructions and describes recent extensions to the modules as well as to the interface data file specifications. Some application considerations are discussed and an elaborate sample problem is used as an instruction aid. Instructions for creating the system on IBM computers are also given.

  1. Genomic organization, sequence characterization and expression analysis of Tenebrio molitor apolipophorin-III in response to an intracellular pathogen, Listeria monocytogenes.

    Science.gov (United States)

    Noh, Ju Young; Patnaik, Bharat Bhusan; Tindwa, Hamisi; Seo, Gi Won; Kim, Dong Hyun; Patnaik, Hongray Howrelia; Jo, Yong Hun; Lee, Yong Seok; Lee, Bok Luel; Kim, Nam Jung; Han, Yeon Soo

    2014-01-25

    Apolipophorin III (apoLp-III) is a well-known hemolymph protein with a functional role in lipid transport and the immune response of insects. We cloned the full-length cDNA encoding putative apoLp-III from larvae of the coleopteran beetle Tenebrio molitor (TmapoLp-III) by identifying clones corresponding to the partial sequence of TmapoLp-III, followed by full-length sequencing using a clone-by-clone primer walking method. The complete cDNA consists of 890 nucleotides, including an ORF encoding 196 amino acid residues. Excluding a putative signal peptide of the first 20 amino acid residues, the 176-residue mature apoLp-III has a calculated molecular mass of 19,146 Da. Genomic sequence analysis with respect to its cDNA showed that TmapoLp-III is organized into four exons interrupted by three introns. Several immune-related transcription factor binding sites were discovered in the putative 5'-flanking region. BLAST and phylogenetic analyses reveal that TmapoLp-III has high sequence identity (88%) with Tribolium castaneum apoLp-III but shares little sequence homology with apoLp-III sequences of other insects. Copyright © 2013 Elsevier B.V. All rights reserved.

  2. Complexes between lanthanide (III) and yttrium (III) picrates and tetra methylene sulfoxide as ligand

    International Nuclear Information System (INIS)

    Silva, M.A.A. da.

    1991-01-01

    The preparation and characterization of addition compounds between lanthanide (III) and yttrium (III) picrates and tetramethylene sulfoxide as ligand are described. The adducts were prepared in ethanol at a molar ratio of 1 (salt) : 3 (ligand). They are microcrystalline, with a more intense color than their respective hydrated salts. Under room temperature conditions they are non-hygroscopic and show no perceptible alteration. They become slightly opalescent when heated between 363 and 423 K. At higher temperatures, under several heating rates, the behavior is the same: melting between 439 and 472 K. The compounds were characterized by elemental analysis, electrolytic conductance measurements, X-ray powder patterns, infrared spectroscopy, and the visible electronic absorption and emission spectra of neodymium (III) and europium (III), respectively. (author). 116 refs., 17 tabs., 11 figs

  3. Deformation of log-likelihood loss function for multiclass boosting.

    Science.gov (United States)

    Kanamori, Takafumi

    2010-09-01

    The purpose of this paper is to study loss functions in multiclass classification. In classification problems, the decision function is estimated by minimizing an empirical loss function, and then, the output label is predicted by using the estimated decision function. We propose a class of loss functions which is obtained by a deformation of the log-likelihood loss function. There are four main reasons why we focus on the deformed log-likelihood loss function: (1) this is a class of loss functions which has not been deeply investigated so far, (2) in terms of computation, a boosting algorithm with a pseudo-loss is available to minimize the proposed loss function, (3) the proposed loss functions provide a clear correspondence between the decision functions and conditional probabilities of output labels, (4) the proposed loss functions satisfy the statistical consistency of the classification error rate which is a desirable property in classification problems. Based on (3), we show that the deformed log-likelihood loss provides a model of mislabeling which is useful as a statistical model of medical diagnostics. We also propose a robust loss function against outliers in multiclass classification based on our approach. The robust loss function is a natural extension of the existing robust loss function for binary classification. A model of mislabeling and a robust loss function are useful to cope with noisy data. Some numerical studies are presented to show the robustness of the proposed loss function. A mathematical characterization of the deformed log-likelihood loss function is also presented. Copyright 2010 Elsevier Ltd. All rights reserved.
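
    The record does not reproduce the paper's exact deformation, so the sketch below uses one standard deformation family, the t-logarithm log_t(x) = (x^(1-t) - 1)/(1 - t), which recovers the ordinary log-likelihood loss as t -> 1. The function names and the choice of this particular family are assumptions made for illustration.

```python
import numpy as np

def log_t(x, t):
    """Deformed (t-)logarithm; reduces to np.log(x) as t -> 1."""
    if abs(t - 1.0) < 1e-12:
        return np.log(x)
    return (x ** (1.0 - t) - 1.0) / (1.0 - t)

def deformed_nll(scores, label, t=0.7):
    """Deformed log-likelihood loss for one multiclass example.

    scores : unnormalized decision-function outputs, shape (K,)
    label  : index of the true class
    """
    p = np.exp(scores - scores.max())
    p /= p.sum()                   # conditional class probabilities (softmax)
    return -log_t(p[label], t)     # t = 1 gives the usual -log p[label]

scores = np.array([2.0, 0.5, -1.0])
print(deformed_nll(scores, label=0, t=1.0))   # ordinary log-likelihood loss
print(deformed_nll(scores, label=0, t=0.7))   # a deformed variant
```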

  4. Control Analysis of Hazards Potential in Crude Distiller Unit III PT. Pertamina (Persero) Refinery Unit III Plaju Tahun 2011

    OpenAIRE

    Matariani, Ade; Hasyim, Hamzah; Faisya, Achmad Fickry

    2012-01-01

    Background: Activities in CDU III carry a high risk of potential hazards; controlling these hazard potentials is therefore essential to reduce accidents and occupational diseases. The aim of this study was to analyze the control of hazard potentials in CDU III PT. Pertamina (Persero) RU III Plaju in 2011. Method: This study was a qualitative study. Data were collected by in-depth interview and observation. The total of informants in this...

  5. Thin-plate spline analysis of mandibular morphological changes induced by early class III treatment: a long-term evaluation.

    Science.gov (United States)

    Franchi, Lorenzo; Pavoni, Chiara; Cerroni, Silvia; Cozza, Paola

    2014-08-01

    To evaluate the long-term mandibular morphological changes induced by early treatment of class III malocclusion with rapid maxillary expansion (RME) and facial mask (FM). Twenty-five subjects [10 boys, 15 girls; mean age at T1 (start of treatment) 9.3±1.6 years] with class III disharmony were treated with RME and FM therapy followed by fixed appliances. The patients were re-evaluated at the end of growth (T2), about 8.5 years after the end of the treatment (mean age, 18.6±2.0 years). Sixteen subjects with untreated class III malocclusion comprised the control group. Mandibular shape changes were analysed on the lateral cephalograms of the subjects of both groups by means of thin-plate spline (TPS) analysis. Procrustes average mandibular configurations were subjected to TPS analysis by means of both cross-sectional between-group comparisons at T1 and at T2 and longitudinal within-group comparisons. Statistical analysis of shape differences was performed using a generalized Goodall F test. In the long term, the treated group exhibited a significant upward and forward direction of condylar growth. On the contrary, untreated class III subjects showed an upward and backward direction of condylar growth associated with a downward and forward deformation of the mandibular symphysis. Limitations are related to the small sample size of both treated and control groups and to the retrospective nature of the study. Early treatment of class III malocclusion with RME and FM is able to produce significant and favourable long-term mandibular shape changes characterized by an anterior morphogenetic rotation. © The Author 2013. Published by Oxford University Press on behalf of the European Orthodontic Society. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  6. Separation studies of La(III) and Ce(III)/Nd(III)/Pr(III)/Sm(III) from chloride solution using DEHPA/PC88A in petrofin

    International Nuclear Information System (INIS)

    Acharya, Sagarika; Mishra, Sujata; Bhatta, B.C.

    2017-01-01

    The separation of La(III) from four other lanthanides, Ce, Nd, Pr and Sm, in chloride solution has been studied using two acidic organophosphorus extractants, DEHPA and PC88A, in petrofin at pH 4.3. The metal content analysis was done using an ICP-OES spectrometer. The separation factors (β) were calculated; the highest value, 9.7, was obtained for the La-Sm pair. (author)

  7. Nearly Efficient Likelihood Ratio Tests for Seasonal Unit Roots

    DEFF Research Database (Denmark)

    Jansson, Michael; Nielsen, Morten Ørregaard

    In an important generalization of zero frequency autoregressive unit root tests, Hylleberg, Engle, Granger, and Yoo (1990) developed regression-based tests for unit roots at the seasonal frequencies in quarterly time series. We develop likelihood ratio tests for seasonal unit roots and show that these tests are "nearly efficient" in the sense of Elliott, Rothenberg, and Stock (1996), i.e. that their local asymptotic power functions are indistinguishable from the Gaussian power envelope. Currently available nearly efficient testing procedures for seasonal unit roots are regression-based and require the choice of a GLS detrending parameter, which our likelihood ratio tests do not.

  8. Safety systems and safety analysis of the Qinshan phase III CANDU nuclear power plant

    International Nuclear Information System (INIS)

    Cai Jianping; Shen Sen; Barkman, N.

    1999-01-01

    The author introduces the Canadian nuclear reactor safety philosophy and the Qinshan Phase III CANDU NPP safety systems and safety analysis, which are designed and performed according to this philosophy. The concept of 'defence-in-depth' is a key element of the Canadian nuclear reactor safety philosophy. The design concepts of redundancy, diversity, separation, equipment qualification, quality assurance, and use of appropriate design codes and standards are adopted in the design. Four special safety systems as well as a set of reliable safety support systems are incorporated in the design of Qinshan Phase III CANDU for accident mitigation. The assessment results for safety systems performance show that the fundamental safety criteria for public dose, and integrity of fuel, channels and the reactor building, are satisfied.

  9. Source and Message Factors in Persuasion: A Reply to Stiff's Critique of the Elaboration Likelihood Model.

    Science.gov (United States)

    Petty, Richard E.; And Others

    1987-01-01

    Answers James Stiff's criticism of the Elaboration Likelihood Model (ELM) of persuasion. Corrects certain misperceptions of the ELM and criticizes Stiff's meta-analysis that compares ELM predictions with those derived from Kahneman's elastic capacity model. Argues that Stiff's presentation of the ELM and the conclusions he draws based on the data…

  10. MADmap: A Massively Parallel Maximum-Likelihood Cosmic Microwave Background Map-Maker

    Energy Technology Data Exchange (ETDEWEB)

    Cantalupo, Christopher; Borrill, Julian; Jaffe, Andrew; Kisner, Theodore; Stompor, Radoslaw

    2009-06-09

    MADmap is a software application used to produce maximum-likelihood images of the sky from time-ordered data which include correlated noise, such as those gathered by Cosmic Microwave Background (CMB) experiments. It works efficiently on platforms ranging from small workstations to the most massively parallel supercomputers. Map-making is a critical step in the analysis of all CMB data sets, and the maximum-likelihood approach is the most accurate and widely applicable algorithm; however, it is a computationally challenging task. This challenge will only increase with the next generation of ground-based, balloon-borne and satellite CMB polarization experiments. The faintness of the B-mode signal that these experiments seek to measure requires them to gather enormous data sets. MADmap is already being run on up to O(10{sup 11}) time samples, O(10{sup 8}) pixels and O(10{sup 4}) cores, with ongoing work to scale to the next generation of data sets and supercomputers. We describe MADmap's algorithm based around a preconditioned conjugate gradient solver, fast Fourier transforms and sparse matrix operations. We highlight MADmap's ability to address problems typically encountered in the analysis of realistic CMB data sets and describe its application to simulations of the Planck and EBEX experiments. The massively parallel and distributed implementation is detailed and scaling complexities are given for the resources required. MADmap is capable of analysing the largest data sets now being collected on computing resources currently available, and we argue that, given Moore's Law, MADmap will be capable of reducing the most massive projected data sets.
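
    As a rough illustration of the maximum-likelihood map-making step described above (not MADmap's actual massively parallel implementation), the sketch below solves the normal equations (P^T N^-1 P) m = P^T N^-1 d with a preconditioned conjugate gradient solver for a toy one-dimensional sky; the diagonal noise model and all names are assumptions.

```python
import numpy as np
from scipy.sparse import csr_matrix, diags
from scipy.sparse.linalg import cg, LinearOperator

rng = np.random.default_rng(0)
npix, nsamp = 50, 5000

# Pointing matrix P: each time sample hits exactly one pixel.
hits = rng.integers(0, npix, size=nsamp)
P = csr_matrix((np.ones(nsamp), (np.arange(nsamp), hits)), shape=(nsamp, npix))

sky = np.sin(np.linspace(0, 4 * np.pi, npix))      # toy input map
noise_var = 0.1 * np.ones(nsamp)                   # diagonal noise model
d = P @ sky + rng.normal(0, np.sqrt(noise_var))    # time-ordered data

Ninv = diags(1.0 / noise_var)
A = P.T @ Ninv @ P                                 # normal matrix
b = P.T @ Ninv @ d

# Jacobi (inverse hit-count) preconditioner, a common map-making choice.
M = LinearOperator(A.shape, matvec=lambda x: x / A.diagonal())

m, info = cg(A, b, M=M)
print(info, np.abs(m - sky).max())                 # info == 0 means converged
```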

  11. Formation constants of Sm(III), Dy(III), Gd(III), Pr(III) and Nd(III) complexes of tridentate schiff base, 2-[(1H-benzimidazol-2-yl-methylene) amino] phenol

    International Nuclear Information System (INIS)

    Omprakash, K.L.; Chandra Pal, A.V.; Reddy, M.L.N.

    1982-01-01

    A new tridentate schiff base, 2-[(1H-benzimidazol-2-yl-methylene)amino]phenol, derived from benzimidazole-2-carboxaldehyde and 2-aminophenol, has been synthesised and characterised by spectral and analytical data. Proton-ligand formation constants of the schiff base and metal-ligand formation constants of its complexes with Sm(III), Dy(III), Gd(III), Nd(III) and Pr(III) have been determined potentiometrically in 50% (v/v) aqueous dioxane at an ionic strength of 0.1 M (NaClO{sub 4}) and at 25 °C using the Irving-Rossotti titration technique. The order of stability constants (logβ{sub 2}) is found to be Sm(III)>Dy(III)>Gd(III)>Pr(III)>Nd(III). (author)

  12. Formation constants of Sm(III), Dy(III), Gd(III), Pr(III) and Nd(III) complexes of tridentate schiff base, 2-((1H-benzimidazol-2-yl-methylene) amino) phenol

    Energy Technology Data Exchange (ETDEWEB)

    Omprakash, K L; Chandra Pal, A V; Reddy, M L.N. [Osmania Univ., Hyderabad (India). Dept. of Chemistry

    1982-03-01

    A new tridentate schiff base, 2-[(1H-benzimidazol-2-yl-methylene)amino]phenol, derived from benzimidazole-2-carboxaldehyde and 2-aminophenol, has been synthesised and characterised by spectral and analytical data. Proton-ligand formation constants of the schiff base and metal-ligand formation constants of its complexes with Sm(III), Dy(III), Gd(III), Nd(III) and Pr(III) have been determined potentiometrically in 50% (v/v) aqueous dioxane at an ionic strength of 0.1 M (NaClO{sub 4}) and at 25 °C using the Irving-Rossotti titration technique. The order of stability constants (logβ{sub 2}) is found to be Sm(III)>Dy(III)>Gd(III)>Pr(III)>Nd(III).

  13. The Davros III supervisory control system

    International Nuclear Information System (INIS)

    Rice, P.

    1996-01-01

    Magnox Electric's Remote Operations Branch deploy a wide variety of remote inspection and maintenance tools into nuclear plant in order to perform a variety of tasks. In recent years much progress has been made on low-level control of individual manipulator axes, and a parallel need has emerged for a supervisory system to assist the operator in the control of the whole system. Some requirements are: 1) to improve operator control of systems; 2) to simplify software maintenance and version control; 3) to reduce the likelihood of damage to manipulators; 4) to assist with rehearsals and simulations. Davros III is a PC software system which has been developed over a number of years to address these requirements. In a single program capable of being configured for a wide variety of applications, it provides a technique for following pretaught routes, a comprehensive and fully configurable interlock system and several different facilities for simulation. (author)

  14. The Davros III supervisory control system

    International Nuclear Information System (INIS)

    Rice, P.

    1996-01-01

    Magnox Electric's Remote Operations Branch deploy a wide variety of remote inspection and maintenance tools into nuclear plant in order to perform a variety of tasks. In recent years much progress has been made on low-level control of individual manipulator axes, and a parallel need has emerged for a supervisory system to assist the operator in the control of the whole system. Some requirements are: 1) to improve operator control of systems; 2) to simplify software maintenance and version control; 3) to reduce the likelihood of damage to manipulators; 4) to assist with rehearsals and simulations. Davros III is a PC software system which has been developed over a number of years to address these requirements. In a single program capable of being configured for a wide variety of applications, it provides a technique for following pretaught routes, a comprehensive and fully configurable interlock system and several different facilities for simulation. (author)

  15. Risk Assessment Using the Three Dimensions of Probability (Likelihood), Severity, and Level of Control

    Science.gov (United States)

    Watson, Clifford C.

    2011-01-01

    Traditional hazard analysis techniques utilize a two-dimensional representation of the results determined by the relative likelihood and severity of the residual risk. These matrices present a quick look at the Likelihood (Y-axis) and Severity (X-axis) of the probable outcome of a hazardous event. A three-dimensional method, described herein, utilizes the traditional X and Y axes while adding a new, third dimension, shown as the Z-axis and referred to as the Level of Control. The elements of the Z-axis are modifications of the Hazard Elimination and Control steps (also known as the Hazard Reduction Precedence Sequence). These steps are: 1. Eliminate risk through design. 2. Substitute less risky materials for more hazardous materials. 3. Install safety devices. 4. Install caution and warning devices. 5. Develop administrative controls (to include special procedures and training). 6. Provide protective clothing and equipment. When added to the two-dimensional models, the level of control adds a visual representation of the risk associated with the hazardous condition, creating a tall pole for the least-well-controlled failure while establishing the relative likelihood and severity of all causes and effects for an identified hazard. Computer modeling of the analytical results, using spreadsheets and three-dimensional charting, gives a visual confirmation of the relationship between causes and their controls.
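
    A minimal sketch of the three-dimensional idea, assuming simple ordinal scales (likelihood 1-5, severity 1-5, level of control 1-6 following the six steps above) and a made-up multiplicative aggregation rule, since the report's exact scoring scheme is not given in this record:

```python
from dataclasses import dataclass

CONTROL_STEPS = [
    "eliminate by design", "substitute materials", "safety devices",
    "caution and warning devices", "administrative controls",
    "protective clothing and equipment",
]

@dataclass
class Hazard:
    name: str
    likelihood: int   # 1 (improbable) .. 5 (frequent)
    severity: int     # 1 (negligible) .. 5 (catastrophic)
    control: int      # 1 (best: eliminated by design) .. 6 (weakest: PPE only)

def risk_score(h: Hazard) -> int:
    # Hypothetical aggregation: the classic 2D product scaled by the level
    # of control, so poorly controlled causes stand out as "tall poles".
    return h.likelihood * h.severity * h.control

hazards = [
    Hazard("valve rupture", likelihood=2, severity=5, control=5),
    Hazard("hose leak", likelihood=4, severity=2, control=2),
]
for h in sorted(hazards, key=risk_score, reverse=True):
    print(f"{h.name}: score {risk_score(h)} ({CONTROL_STEPS[h.control - 1]})")
```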

  16. Generalized empirical likelihood methods for analyzing longitudinal data

    KAUST Repository

    Wang, S.; Qian, L.; Carroll, R. J.

    2010-01-01

    Efficient estimation of parameters is a major objective in analyzing longitudinal data. We propose two generalized empirical likelihood based methods that take into consideration within-subject correlations. A nonparametric version of the Wilks

  17. The Neural Bases of Difficult Speech Comprehension and Speech Production: Two Activation Likelihood Estimation (ALE) Meta-Analyses

    Science.gov (United States)

    Adank, Patti

    2012-01-01

    The role of speech production mechanisms in difficult speech comprehension is the subject of on-going debate in speech science. Two Activation Likelihood Estimation (ALE) analyses were conducted on neuroimaging studies investigating difficult speech comprehension or speech production. Meta-analysis 1 included 10 studies contrasting comprehension…

  18. Spectrophotometric and pH-Metric Studies of Ce(III), Dy(III), Gd(III), Yb(III) and Pr(III) Metal Complexes with Rifampicin

    Directory of Open Access Journals (Sweden)

    A. N. Sonar

    2011-01-01

    Full Text Available The metal-ligand and proton-ligand stability constants of Ce(III), Dy(III), Gd(III), Yb(III) and Pr(III) metals with a substituted heterocyclic drug (Rifampicin) were determined at various ionic strengths by pH-metric titration. NaClO4 was used to maintain the ionic strength of the solution. The results obtained were extrapolated to zero ionic strength using an equation with one individual parameter. The thermodynamic stability constants of the complexes were also calculated. The formation of the complexes was studied by Job's method. The stability constants obtained by the pH-metric method were confirmed by Job's method.

  19. Estimation Methods for Non-Homogeneous Regression - Minimum CRPS vs Maximum Likelihood

    Science.gov (United States)

    Gebetsberger, Manuel; Messner, Jakob W.; Mayr, Georg J.; Zeileis, Achim

    2017-04-01

    Non-homogeneous regression models are widely used to statistically post-process numerical weather prediction models. Such regression models correct for errors in mean and variance and are capable of forecasting a full probability distribution. In order to estimate the corresponding regression coefficients, CRPS minimization has been performed in many meteorological post-processing studies over the last decade. In contrast to maximum likelihood estimation, CRPS minimization is claimed to yield more calibrated forecasts. Theoretically, both scoring rules used as an optimization score should be able to locate a similar and unknown optimum. Discrepancies might result from a wrong distributional assumption about the observed quantity. To address this theoretical concept, this study compares maximum likelihood and minimum CRPS estimation for different distributional assumptions. First, a synthetic case study shows that, for an appropriate distributional assumption, both estimation methods yield similar regression coefficients. The log-likelihood estimator is slightly more efficient. A real-world case study for surface temperature forecasts at different sites in Europe confirms these results but shows that surface temperature does not always follow the classical assumption of a Gaussian distribution. KEYWORDS: ensemble post-processing, maximum likelihood estimation, CRPS minimization, probabilistic temperature forecasting, distributional regression models
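
    To make the comparison concrete, the sketch below fits a simple Gaussian non-homogeneous regression by both criteria, using the closed-form CRPS of a normal distribution. The toy data, link function and variable names are assumptions, not the study's setup.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(1)
x = rng.normal(size=500)                       # ensemble-mean predictor
y = 1.0 + 2.0 * x + rng.normal(0, 1.5, 500)    # observations

def params(theta):
    mu = theta[0] + theta[1] * x
    sigma = np.exp(theta[2])                   # log link keeps sigma > 0
    return mu, sigma

def nll(theta):
    mu, sigma = params(theta)
    return -norm.logpdf(y, mu, sigma).sum()

def crps(theta):
    # Closed-form CRPS of a Gaussian forecast N(mu, sigma^2).
    mu, sigma = params(theta)
    z = (y - mu) / sigma
    crps_i = sigma * (z * (2 * norm.cdf(z) - 1) + 2 * norm.pdf(z)
                      - 1 / np.sqrt(np.pi))
    return crps_i.sum()

theta0 = np.zeros(3)
print(minimize(nll, theta0).x)    # maximum likelihood estimate
print(minimize(crps, theta0).x)   # minimum CRPS estimate
```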

  20. Estimation of Model's Marginal likelihood Using Adaptive Sparse Grid Surrogates in Bayesian Model Averaging

    Science.gov (United States)

    Zeng, X.

    2015-12-01

    A large number of model executions are required to obtain alternative conceptual models' predictions and their posterior probabilities in Bayesian model averaging (BMA). The posterior model probability is estimated from the model's marginal likelihood and prior probability. The heavy computational burden hinders the implementation of BMA prediction, especially for elaborate marginal likelihood estimators. To overcome this burden, an adaptive sparse grid (SG) stochastic collocation method is used to build surrogates for alternative conceptual models through a numerical experiment on a synthetic groundwater model. BMA predictions depend on the model posterior weights (or marginal likelihoods), and this study also evaluated four marginal likelihood estimators: the arithmetic mean estimator (AME), harmonic mean estimator (HME), stabilized harmonic mean estimator (SHME), and thermodynamic integration estimator (TIE). The results demonstrate that TIE is accurate in estimating the conceptual models' marginal likelihoods, and BMA-TIE has better predictive performance than the other BMA predictions. TIE is also highly stable: marginal likelihoods repeatedly estimated by TIE show significantly less variability than those estimated by the other estimators. In addition, the SG surrogates efficiently facilitate BMA predictions, especially for BMA-TIE. The number of model executions needed for building surrogates is 4.13%, 6.89%, 3.44%, and 0.43% of the model executions required by BMA-AME, BMA-HME, BMA-SHME, and BMA-TIE, respectively.
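
    For orientation, the two simplest estimators named above can be written in a few lines: the arithmetic mean estimator averages likelihoods of draws from the prior, while the harmonic mean estimator uses draws from the posterior. This is a generic textbook sketch for a toy conjugate Gaussian model, not the study's groundwater code.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
data = rng.normal(0.5, 1.0, size=40)        # toy observations, known sigma = 1

def log_lik(theta):
    return norm.logpdf(data[:, None], theta, 1.0).sum(axis=0)

# AME: p(y) ~ mean of L(theta_i) over prior draws theta_i ~ N(0, 2^2).
prior_draws = rng.normal(0.0, 2.0, size=20000)
ame = np.exp(log_lik(prior_draws)).mean()

# HME: p(y) ~ harmonic mean of L(theta_i) over posterior draws
# (the conjugate posterior is available in closed form for this toy model).
post_var = 1.0 / (1.0 / 2.0**2 + len(data))
post_mean = post_var * data.sum()
post_draws = rng.normal(post_mean, np.sqrt(post_var), size=20000)
hme = 1.0 / np.mean(np.exp(-log_lik(post_draws)))

print(ame, hme)   # the HME is notoriously unstable compared with the AME
```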

  1. Insecticide resistance, control failure likelihood and the First Law of Geography.

    Science.gov (United States)

    Guedes, Raul Narciso C

    2017-03-01

    Insecticide resistance is a broadly recognized ecological backlash resulting from insecticide use and is widely reported among arthropod pest species, with well-recognized underlying mechanisms and consequences. Nonetheless, insecticide resistance is the subject of evolving conceptual views that introduce a different concept useful in its own right - the risk or likelihood of control failure. Here we suggest an experimental approach to assess the likelihood of control failure of an insecticide, allowing for consistent decision-making regarding the management of insecticide resistance. We also challenge the current emphasis on limited spatial sampling of arthropod populations for resistance diagnosis in favor of comprehensive spatial sampling. This necessarily requires larger population sampling - aiming to use spatial analysis in area-wide surveys - to recognize focal points of insecticide resistance and/or control failure that will better direct management efforts. The appropriate geographical scale of such surveys will depend on the arthropod pest species, the pattern of insecticide use and many other potential factors. Regardless, distance dependence among sampling sites should still hold, following the maxim that the closer two things are, the more they resemble each other, which is the basis of Tobler's First Law of Geography. © 2016 Society of Chemical Industry.

  2. Parallelization of maximum likelihood fits with OpenMP and CUDA

    CERN Document Server

    Jarp, S; Leduc, J; Nowak, A; Pantaleo, F

    2011-01-01

    Data analyses based on maximum likelihood fits are commonly used in the high energy physics community for fitting statistical models to data samples. This technique requires the numerical minimization of the negative log-likelihood function. MINUIT is the most common package used for this purpose in the high energy physics community. The main algorithm in this package, MIGRAD, searches for the minimum by using gradient information. The procedure requires several evaluations of the function, depending on the number of free parameters and their initial values. The whole procedure can be very CPU-time consuming in the case of complex functions, with several free parameters, many independent variables and large data samples. Therefore, it becomes particularly important to speed up the evaluation of the negative log-likelihood function. In this paper we present an algorithm and its implementation which benefits from data vectorization and parallelization (based on OpenMP) and which was also ported to Graphics Processi...
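
    The paper's implementation parallelizes compiled code with OpenMP and CUDA; purely to illustrate the data-parallel structure of the negative log-likelihood sum that makes this possible, here is a Python sketch that splits the per-event sum across worker processes. The toy Gaussian model and all names are illustrative assumptions.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def partial_nll(args):
    """Negative log-likelihood contribution of one chunk of events
    under a toy Gaussian model with parameters (mu, sigma)."""
    chunk, mu, sigma = args
    z = (chunk - mu) / sigma
    return 0.5 * np.sum(z * z) + len(chunk) * np.log(sigma * np.sqrt(2 * np.pi))

def nll(data, mu, sigma, workers=4):
    # The per-event sum is embarrassingly parallel: each worker reduces
    # its own chunk, and the partial sums are combined at the end.
    chunks = np.array_split(data, workers)
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_nll, [(c, mu, sigma) for c in chunks]))

if __name__ == "__main__":
    data = np.random.default_rng(3).normal(1.0, 2.0, size=1_000_000)
    print(nll(data, 1.0, 2.0))   # one such evaluation per MIGRAD request
```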

  3. Moment Conditions Selection Based on Adaptive Penalized Empirical Likelihood

    Directory of Open Access Journals (Sweden)

    Yunquan Song

    2014-01-01

    Full Text Available Empirical likelihood is a very popular method and has been widely used in the fields of artificial intelligence (AI) and data mining as tablets, mobile applications and social media come to dominate the technology landscape. This paper proposes an empirical likelihood shrinkage method to efficiently estimate unknown parameters and select correct moment conditions simultaneously when the model is defined by moment restrictions, some of which may be misspecified. We show that our method enjoys oracle-like properties; that is, it consistently selects the correct moment conditions, and at the same time its estimator is as efficient as the empirical likelihood estimator obtained from all correct moment conditions. Moreover, unlike the GMM, our proposed method allows us to construct confidence regions for the parameters included in the model without estimating the covariances of the estimators. For empirical implementation, we provide data-driven procedures for selecting the tuning parameter of the penalty function. The simulation results show that the method works remarkably well in terms of correct moment selection and the finite-sample properties of the estimators. A real-life example is also given to illustrate the new methodology.
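
    As background on the basic computation such methods build on, the sketch below evaluates the empirical likelihood ratio for a population mean via the usual Lagrange dual (Owen's construction); it is the textbook building block, not the paper's penalized shrinkage procedure.

```python
import numpy as np
from scipy.optimize import brentq

def el_log_ratio(x, mu):
    """-2 log empirical likelihood ratio for the mean mu.

    The weights solve max sum(log(n w_i)) s.t. sum w_i = 1 and
    sum w_i (x_i - mu) = 0; the dual reduces to a 1D root-find for the
    multiplier lam, with w_i = 1 / (n (1 + lam (x_i - mu))).
    """
    z = np.asarray(x, dtype=float) - mu
    if z.min() >= 0 or z.max() <= 0:
        return np.inf                  # mu lies outside the convex hull
    eps = 1e-10
    lo = -1.0 / z.max() + eps          # keep every 1 + lam*z_i positive
    hi = -1.0 / z.min() - eps
    g = lambda lam: np.sum(z / (1.0 + lam * z))
    lam = brentq(g, lo, hi)
    return 2.0 * np.sum(np.log1p(lam * z))

x = np.random.default_rng(4).normal(0.3, 1.0, size=100)
print(el_log_ratio(x, 0.3))   # small; ~chi-square(1) under the true mean
print(el_log_ratio(x, 1.5))   # large; 1.5 is implausible as the mean
```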

  4. Heterologous gene expression and functional analysis of a type III polyketide synthase from Aspergillus niger NRRL 328

    Energy Technology Data Exchange (ETDEWEB)

    Kirimura, Kohtaro, E-mail: kkohtaro@waseda.jp; Watanabe, Shotaro; Kobayashi, Keiichi

    2016-05-13

    Type III polyketide synthases (PKSs) catalyze the formation of pyrone- and resorcinol-type aromatic polyketides. The genomic analysis of the filamentous fungus Aspergillus niger NRRL 328 revealed that this strain has a putative gene (chr-8-2: 2978617–2979847) encoding a type III PKS, although its functions are unknown. In this study, for functional analysis of this putative type III PKS designated as An-CsyA, cloning and heterologous expression of the An-CsyA gene (An-csyA) in Escherichia coli were performed. Recombinant His-tagged An-CsyA was successfully expressed in E. coli BL21 (DE3), purified by Ni{sup 2+}-affinity chromatography, and used for in vitro assay. Tests on the substrate specificity of the His-tagged An-CsyA with myriad acyl-CoAs as starter substrates and malonyl-CoA as extender substrate showed that His-tagged An-CsyA accepted fatty acyl-CoAs (C2-C14) and produced triketide pyrones (C2-C14), tetraketide pyrones (C2-C10), and pentaketide resorcinols (C10-C14). Furthermore, acetoacetyl-CoA, malonyl-CoA, isobutyryl-CoA, and benzoyl-CoA were also accepted as starter substrates, and both triketide and tetraketide pyrones were produced. It is noteworthy that the His-tagged An-CsyA produced polyketides from malonyl-CoA as starter and extender substrates and produced tetraketide pyrones from short-chain fatty acyl-CoAs as starter substrates. Therefore, this is the first report showing the functional properties of An-CsyA different from those of other fungal type III PKSs. -- Highlights: •Type III PKS from Aspergillus niger NRRL 328, An-CsyA, was cloned and characterized. •An-CsyA produced triketide pyrones, tetraketide pyrones and pentaketide resorcinols. •Functional properties of An-CsyA differ from those of other fungal type III PKSs.

  5. Heterologous gene expression and functional analysis of a type III polyketide synthase from Aspergillus niger NRRL 328

    International Nuclear Information System (INIS)

    Kirimura, Kohtaro; Watanabe, Shotaro; Kobayashi, Keiichi

    2016-01-01

    Type III polyketide synthases (PKSs) catalyze the formation of pyrone- and resorcinol-type aromatic polyketides. The genomic analysis of the filamentous fungus Aspergillus niger NRRL 328 revealed that this strain has a putative gene (chr-8-2: 2978617–2979847) encoding a type III PKS, although its functions are unknown. In this study, for functional analysis of this putative type III PKS designated as An-CsyA, cloning and heterologous expression of the An-CsyA gene (An-csyA) in Escherichia coli were performed. Recombinant His-tagged An-CsyA was successfully expressed in E. coli BL21 (DE3), purified by Ni{sup 2+}-affinity chromatography, and used for in vitro assay. Tests on the substrate specificity of the His-tagged An-CsyA with myriad acyl-CoAs as starter substrates and malonyl-CoA as extender substrate showed that His-tagged An-CsyA accepted fatty acyl-CoAs (C2-C14) and produced triketide pyrones (C2-C14), tetraketide pyrones (C2-C10), and pentaketide resorcinols (C10-C14). Furthermore, acetoacetyl-CoA, malonyl-CoA, isobutyryl-CoA, and benzoyl-CoA were also accepted as starter substrates, and both triketide and tetraketide pyrones were produced. It is noteworthy that the His-tagged An-CsyA produced polyketides from malonyl-CoA as starter and extender substrates and produced tetraketide pyrones from short-chain fatty acyl-CoAs as starter substrates. Therefore, this is the first report showing the functional properties of An-CsyA different from those of other fungal type III PKSs. -- Highlights: •Type III PKS from Aspergillus niger NRRL 328, An-CsyA, was cloned and characterized. •An-CsyA produced triketide pyrones, tetraketide pyrones and pentaketide resorcinols. •Functional properties of An-CsyA differ from those of other fungal type III PKSs.

  6. Unbinned likelihood maximisation framework for neutrino clustering in Python

    Energy Technology Data Exchange (ETDEWEB)

    Coenders, Stefan [Technische Universitaet Muenchen, Boltzmannstr. 2, 85748 Garching (Germany)

    2016-07-01

    Although an astrophysical neutrino flux has been detected with IceCube, the sources of astrophysical neutrinos remain hidden up to now. A detection of a neutrino point source would be a smoking gun for hadronic processes and the acceleration of cosmic rays. The search for neutrino sources has many degrees of freedom, for example steady versus transient or point-like versus extended sources. Here, we introduce a Python framework designed for unbinned likelihood maximisations as used in searches for neutrino point sources by IceCube. By implementing source scenarios in a modular way, likelihood searches of various kinds can be set up in a user-friendly way, without sacrificing speed and memory management.
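
    The framework itself is not shown in this record, so here is a generic sketch of the kind of unbinned likelihood such point-source searches maximise: an extended likelihood mixing a signal and a background density over event coordinates. The Gaussian signal, uniform background and all names are assumptions.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm, uniform

rng = np.random.default_rng(5)
# Toy event sample: angular offsets (degrees) in a 10-degree window.
events = np.concatenate([
    norm.rvs(0.0, 0.5, size=30, random_state=rng),    # signal-like events
    uniform.rvs(-5, 10, size=970, random_state=rng),  # background events
])

def neg_log_likelihood(ns, n_total=len(events)):
    """Unbinned likelihood with ns signal events among n_total."""
    S = norm.pdf(events, 0.0, 0.5)   # signal PDF (assumed known here)
    B = uniform.pdf(events, -5, 10)  # background PDF
    nb = n_total - ns
    return -np.sum(np.log(ns * S + nb * B))  # up to an ns-independent constant

res = minimize_scalar(neg_log_likelihood, bounds=(0, 200), method="bounded")
print(res.x)   # fitted number of signal events, close to the injected 30
```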

  7. Dimension-Independent Likelihood-Informed MCMC

    KAUST Repository

    Cui, Tiangang; Law, Kody; Marzouk, Youssef

    2015-01-01

    Many Bayesian inference problems require exploring the posterior distribution of high-dimensional parameters, which in principle can be described as functions. By exploiting low-dimensional structure in the change from prior to posterior [distributions], we introduce a suite of MCMC samplers that can adapt to the complex structure of the posterior distribution, yet are well-defined on function space. Posterior sampling in nonlinear inverse problems arising from various partial differential equations and also a stochastic differential equation are used to demonstrate the efficiency of these dimension-independent likelihood-informed samplers.

  8. Dimension-Independent Likelihood-Informed MCMC

    KAUST Repository

    Cui, Tiangang

    2015-01-07

    Many Bayesian inference problems require exploring the posterior distribution of high-dimensional parameters, which in principle can be described as functions. By exploiting low-dimensional structure in the change from prior to posterior [distributions], we introduce a suite of MCMC samplers that can adapt to the complex structure of the posterior distribution, yet are well-defined on function space. Posterior sampling in nonlinear inverse problems arising from various partial differential equations and also a stochastic differential equation are used to demonstrate the efficiency of these dimension-independent likelihood-informed samplers.

  9. Inner-sphere and outer-sphere complexes of yttrium(III), lanthanum (III), neodymium(III), terbium(III) and thulium(III) with halide ions in N,N-dimethylformamide

    International Nuclear Information System (INIS)

    Takahashi, Ryouta; Ishiguro, Shin-ichi

    1991-01-01

    The formation of chloro, bromo and iodo complexes of yttrium(III), and bromo and iodo complexes of lanthanum(III), neodymium(III), terbium(III) and thulium(III), has been studied by precise titration calorimetry in N,N-dimethylformamide (DMF) at 25 °C. The formation of [YCl]{sup 2+}, [YCl{sub 2}]{sup +}, [YCl{sub 3}] and [YCl{sub 4}]{sup -}, and of [MBr]{sup 2+} and [MBr{sub 2}]{sup +} (M = Y, La, Nd, Tb, Tm), was revealed, and their formation constants, enthalpies and entropies were determined. It is found that the formation enthalpies change in the sequence ΔH°(Cl) > ΔH°(I), which is unusual for hard metal(III) ions. This implies that, unlike the chloride ion, the bromide ion forms outer-sphere complexes with the lanthanide(III) and yttrium(III) ions in DMF. Evidence for either an inner- or outer-sphere complex was obtained from {sup 89}Y NMR spectra of Y(ClO{sub 4}){sub 3}, YCl{sub 3} and YBr{sub 3} DMF solutions at room temperature. (author)

  10. Generalized Empirical Likelihood-Based Focused Information Criterion and Model Averaging

    Directory of Open Access Journals (Sweden)

    Naoya Sueishi

    2013-07-01

    Full Text Available This paper develops model selection and averaging methods for moment restriction models. We first propose a focused information criterion based on the generalized empirical likelihood estimator. We address the issue of selecting an optimal model, rather than a correct model, for estimating a specific parameter of interest. Then, this study investigates a generalized empirical likelihood-based model averaging estimator that minimizes the asymptotic mean squared error. A simulation study suggests that our averaging estimator can be a useful alternative to existing post-selection estimators.

  11. Nearly Efficient Likelihood Ratio Tests of the Unit Root Hypothesis

    DEFF Research Database (Denmark)

    Jansson, Michael; Nielsen, Morten Ørregaard

    Seemingly absent from the arsenal of currently available "nearly efficient" testing procedures for the unit root hypothesis, i.e. tests whose local asymptotic power functions are indistinguishable from the Gaussian power envelope, is a test admitting a (quasi-)likelihood ratio interpretation. We show that the likelihood ratio unit root test derived in a Gaussian AR(1) model with standard normal innovations is nearly efficient in that model. Moreover, these desirable properties carry over to more complicated models allowing for serially correlated and/or non-Gaussian innovations.

  12. Quantitative Risk Analysis: Method And Process

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-03-01

    Full Text Available Recent and past studies (King III report, 2009: 73-75; Stoney, 2007; Committee of Sponsoring Organisations-COSO, 2004; Bartell, 2003; Liebenberg and Hoyt, 2003; Reason, 2000; Markowitz, 1957) lament that although the introduction of risk quantification to enhance the degree of objectivity, in finance for instance, was quite parallel to its development in the manufacturing industry, the same is not true of Higher Education Institutions (HEIs). In this regard, the objective of the paper was to demonstrate the methods and process of Quantitative Risk Analysis (QRA) through the likelihood of occurrence of risk (phase I). This paper serves as the first of a two-phased study, which sampled one hundred (100) risk analysts in a University in the greater Eastern Cape Province of South Africa. The analysis of the likelihood of occurrence of risk by logistic regression and percentages was conducted to investigate whether or not there was a significant difference between groups (analysts) in respect of QRA. The Hosmer and Lemeshow test was non-significant, with a chi-square (χ² = 8.181; p = 0.300), which indicated a good model fit, since the data did not deviate significantly from the model. The study concluded that to derive an overall likelihood rating indicating the probability that a potential risk may be exercised within the construct of an associated threat environment, the following governing factors must be considered: (1) threat source motivation and capability, (2) nature of the vulnerability, and (3) existence and effectiveness of current controls (methods and process).

  13. Maximum likelihood window for time delay estimation

    International Nuclear Information System (INIS)

    Lee, Young Sup; Yoon, Dong Jin; Kim, Chi Yup

    2004-01-01

    Time delay estimation for the detection of leak location in underground pipelines is critically important. Because the exact leak location depends upon the precision of the time delay between sensor signals due to leak noise and on the speed of elastic waves, research on the estimation of time delay has been one of the key issues in leak locating with the time arrival difference method. In this study, an optimal maximum likelihood window is considered to obtain a better estimation of the time delay. The method has been validated in experiments, where it provided much clearer and more precise peaks in the cross-correlation functions of leak signals. The leak location error has been less than 1% of the distance between sensors; for example, the error was not greater than 3 m for 300 m long underground pipelines. Apart from the experiment, an intensive theoretical analysis in terms of signal processing is presented. The improved leak locating with the suggested method is due to the windowing effect in the frequency domain, which weights the significant frequencies.
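
    The abstract does not spell out the window itself, so the sketch below shows the generic cross-correlation time-delay estimator it builds on, with a frequency-domain weighting slot where a maximum-likelihood (coherence-based) window from the generalized cross-correlation literature would be applied; the flat default weighting is a placeholder assumption.

```python
import numpy as np

def estimate_delay(x, y, fs, weight=None):
    """Delay of y relative to x (seconds) via generalized cross-correlation.

    weight : optional frequency-domain window (an ML window would go here);
             None gives the plain cross-correlation.
    """
    n = 2 * len(x)                        # zero-pad against wrap-around
    X, Y = np.fft.rfft(x, n), np.fft.rfft(y, n)
    cross = Y * np.conj(X)
    if weight is not None:
        cross = cross * weight
    cc = np.fft.irfft(cross, n)
    cc = np.concatenate([cc[-len(x) + 1:], cc[:len(x)]])  # lags -(N-1)..N-1
    return (np.argmax(cc) - (len(x) - 1)) / fs

fs = 1000.0
rng = np.random.default_rng(6)
s = rng.normal(size=4096)                 # broadband "leak noise"
x = s + 0.1 * rng.normal(size=s.size)
y = np.roll(s, 37) + 0.1 * rng.normal(size=s.size)  # arrives 37 samples later
print(estimate_delay(x, y, fs))           # about 0.037 s
```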

  14. Purification of chicken carbonic anhydrase isozyme-III (CA-III) and its measurement in White Leghorn chickens.

    Science.gov (United States)

    Nishita, Toshiho; Tomita, Yuichiro; Yorifuji, Daisuke; Orito, Kensuke; Ochiai, Hideharu; Arishima, Kazuyosi

    2011-11-26

    The developmental profile of chicken carbonic anhydrase-III (CA-III) blood levels has not been previously determined or reported. We isolated CA-III from chicken muscle and investigated age-related changes in the levels of CA-III in blood. CA-III was purified from chicken muscle. The levels of CA-III in plasma and erythrocytes from 278 female chickens (aged 1-93 weeks) and 68 male chickens (aged 3-59 weeks) were determined by ELISA. The mean level of CA-III in female chicken erythrocytes (1 week old) was 4.6 μg/g of Hb, and the CA-III level did not change until 16 weeks of age. The level then increased until 63 weeks of age (11.8 μg/g of Hb), decreased to 4.7 μg/g of Hb at 73 weeks of age, and increased again until 93 weeks of age (8.6 μg/g of Hb). The mean level of CA-III in erythrocytes from male chickens (3 weeks old) was 2.4 μg/g of Hb, and this level remained steady until 59 weeks of age. The mean plasma level of CA-III in 1-week-old female chickens was 60 ng/mL, and this level was increased at 3 weeks of age (141 ng/mL) and then remained steady until 80 weeks of age (122 ng/mL). The mean plasma level of CA-III in 3-week-old male chickens was 58 ng/mL, and this level remained steady until 59 weeks of age. We observed both developmental changes and sex differences in CA-III concentrations in White Leghorn (WL) chicken erythrocytes and plasma. Simple linear regression analysis showed a significant association between the erythrocyte CA-III level and egg-laying rate in WL-chickens 16-63 weeks of age (p < 0.01).

  15. Assessing compatibility of direct detection data: halo-independent global likelihood analyses

    Energy Technology Data Exchange (ETDEWEB)

    Gelmini, Graciela B. [Department of Physics and Astronomy, UCLA,475 Portola Plaza, Los Angeles, CA 90095 (United States); Huh, Ji-Haeng [CERN Theory Division,CH-1211, Geneva 23 (Switzerland); Witte, Samuel J. [Department of Physics and Astronomy, UCLA,475 Portola Plaza, Los Angeles, CA 90095 (United States)

    2016-10-18

    We present two different halo-independent methods to assess the compatibility of several direct dark matter detection data sets for a given dark matter model using a global likelihood consisting of at least one extended likelihood and an arbitrary number of Gaussian or Poisson likelihoods. In the first method we find the global best fit halo function (we prove that it is a unique piecewise constant function with a number of down steps smaller than or equal to a maximum number that we compute) and construct a two-sided pointwise confidence band at any desired confidence level, which can then be compared with those derived from the extended likelihood alone to assess the joint compatibility of the data. In the second method we define a “constrained parameter goodness-of-fit” test statistic, whose p-value we then use to define a “plausibility region” (e.g. where p≥10%). For any halo function not entirely contained within the plausibility region, the level of compatibility of the data is very low (e.g. p<10%). We illustrate these methods by applying them to CDMS-II-Si and SuperCDMS data, assuming dark matter particles with elastic spin-independent isospin-conserving interactions or exothermic spin-independent isospin-violating interactions.

  16. Imagination perspective affects ratings of the likelihood of occurrence of autobiographical memories.

    Science.gov (United States)

    Marsh, Benjamin U; Pezdek, Kathy; Lam, Shirley T

    2014-07-01

    Two experiments tested and confirmed the hypothesis that when the phenomenological characteristics of imagined events are more similar to those of related autobiographical memories, the imagined event is more likely to be considered to have occurred. At Time 1 and 2 weeks later, individuals rated the likelihood of occurrence of 20 life events. In Experiment 1, 1 week after Time 1, individuals imagined 3 childhood events from a first-person or third-person perspective; there was also a no-imagination control. An increase in likelihood ratings from Time 1 to Time 2 resulted when imagination was from the third-person but not the first-person perspective. In Experiment 2, childhood and recent events were imagined from a third- or first-person perspective. A significant interaction resulted: for childhood events, likelihood change scores were greater for the third-person than the first-person perspective; for recent adult events, likelihood change scores were greater for the first-person than the third-person perspective, although this latter trend was not significant. Copyright © 2014 Elsevier B.V. All rights reserved.

  17. Maximum likelihood as a common computational framework in tomotherapy

    International Nuclear Information System (INIS)

    Olivera, G.H.; Shepard, D.M.; Reckwerdt, P.J.; Ruchala, K.; Zachman, J.; Fitchard, E.E.; Mackie, T.R.

    1998-01-01

    Tomotherapy is a dose delivery technique using helical or axial intensity modulated beams. One of the strengths of the tomotherapy concept is that it can incorporate a number of processes into a single piece of equipment. These processes include treatment optimization planning, dose reconstruction and kilovoltage/megavoltage image reconstruction. A common computational technique that could be used for all of these processes would be very appealing. The maximum likelihood estimator, originally developed for emission tomography, can serve as a useful tool in imaging and radiotherapy. We believe that this approach can play an important role in the processes of optimization planning, dose reconstruction and kilovoltage and/or megavoltage image reconstruction. These processes involve computations that require comparable physical methods. They are also based on equivalent assumptions, and they have similar mathematical solutions. As a result, the maximum likelihood approach is able to provide a common framework for all three of these computational problems. We will demonstrate how maximum likelihood methods can be applied to optimization planning, dose reconstruction and megavoltage image reconstruction in tomotherapy. Results for planning optimization, dose reconstruction and megavoltage image reconstruction will be presented. Strengths and weaknesses of the methodology are analysed. Future directions for this work are also suggested. (author)
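
    As a concrete instance of the shared computational core the authors point to, the sketch below implements the classic maximum-likelihood expectation-maximization (MLEM) update from emission tomography, the family of estimators the abstract refers to; the dense toy system matrix is an assumption for illustration.

```python
import numpy as np

rng = np.random.default_rng(8)
npix, ndet = 16, 64

# Toy system matrix: element (i, j) = contribution of pixel j to detector i.
P = rng.random((ndet, npix))
true_img = rng.random(npix)
counts = rng.poisson(P @ true_img)        # measured detector counts

img = np.ones(npix)                       # flat initial estimate
sens = P.sum(axis=0)                      # sensitivity of each pixel
for _ in range(200):
    proj = P @ img                        # forward-project the estimate
    ratio = np.divide(counts, proj, out=np.zeros_like(proj), where=proj > 0)
    img *= (P.T @ ratio) / sens           # multiplicative MLEM update

print(np.abs(img - true_img).mean())      # shrinks as iterations proceed
```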

  18. AMS-C14 analysis of graphite obtained with an Automated Graphitization Equipment (AGE III) from aerosol collected on quartz filters

    Energy Technology Data Exchange (ETDEWEB)

    Solís, C.; Chávez, E.; Ortiz, M.E.; Andrade, E. [Instituto de Física, Universidad Nacional Autónoma de México, 04510 México D.F. (Mexico); Ortíz, E. [Universidad Autónoma Metropolitana, Unidad Azcapotzalco, México D.F. (Mexico); Szidat, S. [Department of Chemistry and Biochemistry, University of Bern, Freiestrasse 3, CH-3012 Bern (Switzerland); Paul Scherrer Institut (PSI), CH-5232 Villigen (Switzerland); Wacker, L. [Laboratory of Ion Physics, ETH, Honggerberg, Zurich (Switzerland)

    2015-10-15

    AMS-{sup 14}C applications often require the analysis of small samples. Such is the case for atmospheric aerosols, where frequently only a small amount of sample is available. The ion beam physics group at the ETH, Zurich, has designed an Automated Graphitization Equipment (AGE III) for routine graphite production for AMS analysis from organic samples of approximately 1 mg. In this study, we explore the potential use of the AGE III for graphitization of particulate carbon collected on quartz filters. In order to test the methodology, samples of reference materials and blanks of different sizes were prepared in the AGE III and the graphite was analyzed in a MICADAS AMS (ETH) system. The graphite samples prepared in the AGE III showed recovery yields higher than 80% and reproducible {sup 14}C values for masses ranging from 50 to 300 μg. Reproducible radiocarbon values were also obtained for small aerosol filter samples graphitized in the AGE III. As a case study, the tested methodology was applied to PM{sub 10} samples collected in two cities in Mexico in order to compare the source apportionment of biomass and fossil fuel combustion. The obtained {sup 14}C data showed that carbonaceous aerosols from Mexico City have a much lower biogenic signature than those from the smaller city of Cuernavaca.

  19. TREAT MK III Loop Thermoelastoplastic Stress Analysis for the L03 Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Kennedy, James M.

    1981-03-01

    The STRAW code was used to analyze the static response of a TREAT MK III loop subjected to thermal and mechanical loadings arising from an accident situation, for the purpose of determining the deflections and stresses. This analysis provides safety support for the L03 reactivity accident study. The analysis was subdivided into two tasks: (1) an analysis of a flow blockage accident (Cases A and B), where all the energy is assumed deposited in the test leg, resulting in a temperature increase from 530°F to 1720°F, with a small internal pressure throughout the loop, and (2) an analysis of a second flow blockage accident (Cases C and D), where again all the energy is assumed to be deposited in the test leg, resulting in a temperature rise from 530°F to 1845°F, with a small internal pressure throughout the loop. The purpose of these two tasks was to determine whether loop failure can occur with the thermal differential across the pump and test legs. Also of interest is whether an undesirable amount of loop lateral deflection will be caused by the thermal differential. A two-dimensional analysis of the TREAT MK III loop was performed. The analysis accounted for material nonlinearities, both as a function of temperature and stress, and geometric nonlinearities arising from large deflections. Straight beam elements with annular cross sections were used to model the loop. The analyses show that the maximum strains are less than 21% of their failure strains for all subcases of Cases A and B. For all subcases of Cases C and D, the maximum strains are less than 53% of their failure strains. The failure strain is 27.9% for the material at 530°F, 38.1% at 1720°F and 17.8% at 1845°F. Large lateral deflections are observed when the loop is not constrained except at its clamped support--as much as 8.6 inches. However, by accounting for the constraint of the concrete biological shield, the maximum lateral deflection was reduced to less than 0.05 inches at the points of concern.

  20. DFT calculations, spectroscopic, thermal analysis and biological activity of Sm(III) and Tb(III) complexes with 2-aminobenzoic and 2-amino-5-chloro-benzoic acids

    Science.gov (United States)

    Essawy, Amr A.; Afifi, Manal A.; Moustafa, H.; El-Medani, S. M.

    2014-10-01

    The complexes of Sm(III) and Tb(III) with 2-aminobenzoic acid (anthranilic acid, AA) and 2-amino-5-chlorobenzoic acid (5-chloroanthranilic acid, AACl) were synthesized and characterized on the basis of elemental analysis, IR and mass spectroscopy. The data are in accordance with a 1:3 [Metal]:[Ligand] ratio. On the basis of the IR analysis, it was found that the metals were coordinated to bidentate anthranilic acid via the ionised oxygen of the carboxylate group and the nitrogen of the amino group, while in 5-chloroanthranilic acid the metals were coordinated to the bidentate carboxylate group without bonding to the amino group; accordingly, a chlorine-driven diversity in coordination and reactivity was emphasized. Thermal analyses (TGA) and the biological activity of the complexes were also investigated. Density Functional Theory (DFT) calculations at the B3LYP/6-311++G(d,p) level of theory have been carried out to investigate the equilibrium geometry of the ligand. The optimized geometry parameters of the complexes were evaluated using the SDDALL basis set. Moreover, the total energy, the energies of the HOMO and LUMO, and Mulliken atomic charges were calculated. In addition, the dipole moment and its orientation have been computed and discussed.

  1. Caching and interpolated likelihoods: accelerating cosmological Monte Carlo Markov chains

    Energy Technology Data Exchange (ETDEWEB)

    Bouland, Adam; Easther, Richard; Rosenfeld, Katherine, E-mail: adam.bouland@aya.yale.edu, E-mail: richard.easther@yale.edu, E-mail: krosenfeld@cfa.harvard.edu [Department of Physics, Yale University, New Haven CT 06520 (United States)

    2011-05-01

    We describe a novel approach to accelerating Monte Carlo Markov Chains. Our focus is cosmological parameter estimation, but the algorithm is applicable to any problem for which the likelihood surface is a smooth function of the free parameters and computationally expensive to evaluate. We generate a high-order interpolating polynomial for the log-likelihood using the first points gathered by the Markov chains as a training set. This polynomial then accurately computes the majority of the likelihoods needed in the latter parts of the chains. We implement a simple version of this algorithm as a patch (InterpMC) to CosmoMC and show that it accelerates parameter estimation by a factor of between two and four for well-converged chains. The current code is primarily intended as a "proof of concept", and we argue that there is considerable room for further performance gains. Unlike other approaches to accelerating parameter fits, we make no use of precomputed training sets or special choices of variables, and InterpMC is almost entirely transparent to the user.
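
    The interpolation idea is easy to prototype. Below is a minimal one-dimensional sketch of the caching-plus-interpolation scheme, not the actual InterpMC patch (which works in CosmoMC's multi-dimensional parameter space); the Gaussian expensive_loglike and all other names are illustrative stand-ins.

```python
import numpy as np

def expensive_loglike(theta):
    # Stand-in for a costly log-likelihood (e.g. one requiring a full
    # Boltzmann-code evaluation in the cosmological setting).
    return -0.5 * (theta - 1.0) ** 2 / 0.3 ** 2

# Training set: points gathered during the early part of the chain.
rng = np.random.default_rng(0)
train_theta = rng.uniform(-2.0, 4.0, 200)
train_logl = np.array([expensive_loglike(t) for t in train_theta])

# High-order interpolating polynomial for the log-likelihood.
coeffs = np.polynomial.chebyshev.chebfit(train_theta, train_logl, deg=8)

def fast_loglike(theta):
    # Cheap interpolant inside the training range; exact (expensive)
    # likelihood as a fallback outside it.
    if train_theta.min() <= theta <= train_theta.max():
        return np.polynomial.chebyshev.chebval(theta, coeffs)
    return expensive_loglike(theta)
```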

  2. Caching and interpolated likelihoods: accelerating cosmological Monte Carlo Markov chains

    International Nuclear Information System (INIS)

    Bouland, Adam; Easther, Richard; Rosenfeld, Katherine

    2011-01-01

    We describe a novel approach to accelerating Monte Carlo Markov Chains. Our focus is cosmological parameter estimation, but the algorithm is applicable to any problem for which the likelihood surface is a smooth function of the free parameters and computationally expensive to evaluate. We generate a high-order interpolating polynomial for the log-likelihood using the first points gathered by the Markov chains as a training set. This polynomial then accurately computes the majority of the likelihoods needed in the latter parts of the chains. We implement a simple version of this algorithm as a patch (InterpMC) to CosmoMC and show that it accelerates parameter estimation by a factor of between two and four for well-converged chains. The current code is primarily intended as a "proof of concept", and we argue that there is considerable room for further performance gains. Unlike other approaches to accelerating parameter fits, we make no use of precomputed training sets or special choices of variables, and InterpMC is almost entirely transparent to the user.

  3. Factors Associated With the Likelihood of Hospitalization Following Emergency Department Visits for Behavioral Health Conditions.

    Science.gov (United States)

    Hamilton, Jane E; Desai, Pratikkumar V; Hoot, Nathan R; Gearing, Robin E; Jeong, Shin; Meyer, Thomas D; Soares, Jair C; Begley, Charles E

    2016-11-01

    Behavioral health-related emergency department (ED) visits have been linked with ED overcrowding, an increased demand on limited resources, and a longer length of stay (LOS), due in part to patients being admitted to the hospital but waiting for an inpatient bed. This study examines factors associated with the likelihood of hospital admission for ED patients with behavioral health conditions at 16 hospital-based EDs in a large urban area in the southern United States. Using Andersen's Behavioral Model of Health Service Use for guidance, the study examined the relationship between predisposing (characteristics of the individual, i.e., age, sex, race/ethnicity), enabling (system or structural factors affecting healthcare access), and need (clinical) factors and the likelihood of hospitalization following ED visits for behavioral health conditions (n = 28,716 ED visits). In the adjusted analysis, a logistic fixed-effects model with blockwise entry was used to estimate the relative importance of predisposing, enabling, and need variables added separately as blocks, while controlling for variation in unobserved hospital-specific practices across hospitals and time in years. Significant predisposing factors associated with an increased likelihood of hospitalization following an ED visit included increasing age, while African American race was associated with a lower likelihood of hospitalization. Among enabling factors, arrival by emergency transport and a longer ED LOS were associated with a greater likelihood of hospitalization, while being uninsured and the availability of community-based behavioral health services within 5 miles of the ED were associated with lower odds. Among need factors, having a discharge diagnosis of schizophrenia/psychotic spectrum disorder, an affective disorder, a personality disorder, dementia, or an impulse control disorder, as well as secondary diagnoses of suicidal ideation and/or suicidal behavior, increased the likelihood of hospitalization.
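
    As a rough illustration of the blockwise-entry strategy described above, the sketch below fits nested logistic models with hospital fixed effects and compares blocks with a likelihood-ratio test; the data frame, variable names and coefficients are invented stand-ins, not the study's data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import chi2

# Synthetic stand-in data: a predisposing block (age), then an enabling block.
rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "age": rng.integers(18, 90, n),
    "ed_los_hours": rng.exponential(8.0, n),
    "hospital": rng.integers(0, 16, n),
})
p = 1.0 / (1.0 + np.exp(-(-4.0 + 0.03 * df.age + 0.05 * df.ed_los_hours)))
df["admitted"] = (rng.random(n) < p).astype(int)

# Blockwise entry with hospital fixed effects, compared via a LR test.
m1 = smf.logit("admitted ~ age + C(hospital)", df).fit(disp=False)
m2 = smf.logit("admitted ~ age + ed_los_hours + C(hospital)", df).fit(disp=False)
lr_stat = 2.0 * (m2.llf - m1.llf)
print("enabling-block p-value:", chi2.sf(lr_stat, df=1))
```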

  4. Vent clearing analysis of a Mark III pressure suppression containment

    International Nuclear Information System (INIS)

    Quintana, R.

    1979-01-01

    An analysis of the vent clearing transient in a Mark III pressure suppression containment after a hypothetical LOCA is carried out. A two-dimensional numerical model solving the transient fluid dynamic equations is used. The geometry of the pressure suppression pool is represented, and the pressure and velocity fields in the pool are obtained from the moment the LOCA occurs until the first vent in the drywell wall clears. The results are compared to those obtained with the one-dimensional model used for containment design, with special attention to two-dimensional effects. Some conclusions concerning the effect of the water discharged into the suppression pool through the vents on submerged structures are obtained. Future improvements to the model are suggested. (orig.)

  5. A maximum pseudo-likelihood approach for estimating species trees under the coalescent model

    Directory of Open Access Journals (Sweden)

    Edwards Scott V

    2010-10-01

    Full Text Available Abstract Background Several phylogenetic approaches have been developed to estimate species trees from collections of gene trees. However, maximum likelihood approaches for estimating species trees under the coalescent model are limited. Although the likelihood of a species tree under the multispecies coalescent model has already been derived by Rannala and Yang, it can be shown that the maximum likelihood estimate (MLE) of the species tree (topology, branch lengths, and population sizes) from gene trees under this formula does not exist. In this paper, we develop a pseudo-likelihood function of the species tree to obtain maximum pseudo-likelihood estimates (MPE) of species trees, with branch lengths of the species tree in coalescent units. Results We show that the MPE of the species tree is statistically consistent as the number M of genes goes to infinity. In addition, the probability that the MPE of the species tree matches the true species tree converges to 1 at rate O(M⁻¹). The simulation results confirm that the maximum pseudo-likelihood approach is statistically consistent even when the species tree is in the anomaly zone. We applied our method, Maximum Pseudo-likelihood for Estimating Species Trees (MP-EST), to a mammal dataset. The four major clades found in the MP-EST tree are consistent with those in the Bayesian concatenation tree. The bootstrap supports for the species tree estimated by the MP-EST method are more reasonable than the posterior probability supports given by the Bayesian concatenation method in reflecting the level of uncertainty in gene trees and controversies over the relationship of four major groups of placental mammals. Conclusions MP-EST can consistently estimate the topology and branch lengths (in coalescent units) of the species tree. Although the pseudo-likelihood is derived from coalescent theory, and assumes no gene flow or horizontal gene transfer (HGT), the MP-EST method is robust to a small amount of HGT in the

  6. Likelihood estimators for multivariate extremes

    KAUST Repository

    Huser, Raphaël; Davison, Anthony C.; Genton, Marc G.

    2015-01-01

    The main approach to inference for multivariate extremes consists in approximating the joint upper tail of the observations by a parametric family arising in the limit for extreme events. The latter may be expressed in terms of componentwise maxima, high threshold exceedances or point processes, yielding different but related asymptotic characterizations and estimators. The present paper clarifies the connections between the main likelihood estimators, and assesses their practical performance. We investigate their ability to estimate the extremal dependence structure and to predict future extremes, using exact calculations and simulation, in the case of the logistic model.
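
    For the bivariate case of the logistic model named at the end of the abstract, the likelihood has a closed form. The following sketch maximizes it for data already transformed to unit-Fréchet margins; it is an illustrative reading of the model, not the authors' code, and the simulated inputs are placeholders.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def neg_loglike_logistic(alpha, z1, z2):
    # Negative log-likelihood of the bivariate logistic max-stable model
    # with unit-Frechet margins; alpha in (0, 1], alpha -> 1 is independence.
    t1, t2 = z1 ** (-1.0 / alpha), z2 ** (-1.0 / alpha)
    s = t1 + t2
    V = s ** alpha  # exponent measure
    logf = (-V
            - (1.0 / alpha + 1.0) * (np.log(z1) + np.log(z2))
            + (alpha - 2.0) * np.log(s)
            + np.log(V + 1.0 / alpha - 1.0))
    return -np.sum(logf)

# Placeholder data standing in for componentwise maxima already
# transformed to unit-Frechet margins.
rng = np.random.default_rng(0)
z1, z2 = rng.pareto(1.0, 500) + 1.0, rng.pareto(1.0, 500) + 1.0
fit = minimize_scalar(neg_loglike_logistic, bounds=(0.05, 1.0),
                      args=(z1, z2), method="bounded")
print("estimated dependence parameter:", fit.x)
```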

  7. Likelihood estimators for multivariate extremes

    KAUST Repository

    Huser, Raphaël

    2015-11-17

    The main approach to inference for multivariate extremes consists in approximating the joint upper tail of the observations by a parametric family arising in the limit for extreme events. The latter may be expressed in terms of componentwise maxima, high threshold exceedances or point processes, yielding different but related asymptotic characterizations and estimators. The present paper clarifies the connections between the main likelihood estimators, and assesses their practical performance. We investigate their ability to estimate the extremal dependence structure and to predict future extremes, using exact calculations and simulation, in the case of the logistic model.

  8. The approach to analysis of significance of flaws in ASME section III and section XI

    International Nuclear Information System (INIS)

    Cowan, A.

    1979-01-01

    ASME III Appendix G and ASME XI Appendix A describe linear elastic fracture mechanics methods to assess the significance of defects in thick-walled pressure vessels for nuclear reactor systems. The assessment of fracture toughness, K_Ic, is based upon recommendations made by a Task Group of the USA Pressure Vessel Research Committee and depends upon correlations with drop-weight and Charpy V-notch data to give a lower bound of fracture toughness, K_IR. The methods used in the ASME Appendices are outlined, noting that, whereas ASME III Appendix G defines a procedure for obtaining allowable pressure vessel loadings for normal service in the presence of a defect, ASME XI Appendix A defines methods for assessing the significance of defects (found by volumetric inspection) under normal, emergency and faulted conditions. The methods of analysis are discussed with respect to material properties, flaw characterisation, stress analysis and recommended safety factors; a short discussion is given of the applicability of the data and methods to other materials and non-nuclear structures. (author)

  9. Synthesis, analysis and radiolysis of the cobalt(III) 8-hydroxyquinolinate complex

    International Nuclear Information System (INIS)

    Mestnik, S.A.C.; Silva, C.P.G. da.

    1981-11-01

    The cobalt(III) 8-hydroxyquinolinate complex was synthesized from a solution of cobalt(II). The compound was analysed by IR absorption spectroscopy, elemental analysis and determination of the number of ligands. The radiolytic degradation was followed by spectrophotometry after submitting samples of 10⁻³ M complex in ethanolic solution to different doses of gamma radiation from a ⁶⁰Co source. The change in the maximum absorbance of the complex with different doses of gamma radiation and its UV-VIS absorption spectra are presented. The complex in the solid state was also irradiated with 6.9 Mrad of gamma radiation but showed no degradation. (Author) [pt

  10. Solvent extraction of anionic chelate complexes of lanthanum(III), europium(III), lutetium(III), scandium(III), and indium(III) with 2-thenoyltrifluoroacetone as ion-pairs with tetrabutylammonium ions

    International Nuclear Information System (INIS)

    Noro, Junji; Sekine, Tatsuya.

    1992-01-01

    The solvent extraction of lanthanum(III), europium(III), lutetium(III), scandium(III), and indium(III) in 0.1 mol dm⁻³ sodium nitrate solutions with 2-thenoyltrifluoroacetone (Htta), in the absence and presence of tetrabutylammonium ions (tba⁺), into carbon tetrachloride was measured. The extraction of lanthanum(III), europium(III), and lutetium(III) was greatly enhanced by the addition of tba⁺; this could be explained in terms of the extraction of a ternary complex, M(tta)₄⁻·tba⁺. However, the extractions of scandium(III) and indium(III) were nearly the same when tba⁺ was added. The data were treated on the basis of the formation equilibrium of the ternary complex from the neutral chelate, M(tta)₃, with the extracted ion-pairs of the reagents, tta⁻·tba⁺, in the organic phase. It was concluded that the degree of association of M(tta)₃ with the ion-pair, tta⁻·tba⁺, is greater in the order La(tta)₃ ≅ Eu(tta)₃ > Lu(tta)₃, or that the stability of the ternary complex in the organic phase is higher in the order La(tta)₄⁻·tba⁺ ≅ Eu(tta)₄⁻·tba⁺ > Lu(tta)₄⁻·tba⁺. This is similar to the behaviour of adduct metal chelates of Htta with tributylphosphate (TBP) in synergistic extraction systems. (author)

  11. Cases in which ancestral maximum likelihood will be confusingly misleading.

    Science.gov (United States)

    Handelman, Tomer; Chor, Benny

    2017-05-07

    Ancestral maximum likelihood (AML) is a phylogenetic tree reconstruction criterion that "lies between" maximum parsimony (MP) and maximum likelihood (ML). ML has long been known to be statistically consistent. On the other hand, Felsenstein (1978) showed that MP is statistically inconsistent, and even positively misleading: there are cases where the parsimony criterion, applied to data generated according to one tree topology, will be optimized on a different tree topology. The question of whether AML is statistically consistent or not has been open for a long time. Mossel et al. (2009) have shown that AML can "shrink" short tree edges, resulting in a star tree with no internal resolution, which yields a better AML score than the original (resolved) model. This result implies that AML is statistically inconsistent, but not that it is positively misleading, because the star tree is compatible with any other topology. We show that AML is confusingly misleading: for some simple, four-taxon (resolved) tree, the ancestral likelihood optimization criterion is maximized on an incorrect (resolved) tree topology, as well as on a star tree (both with specific edge lengths), while the tree with the original, correct topology has strictly lower ancestral likelihood. Interestingly, the two short edges in the incorrect, resolved tree topology are of length zero and are not adjacent, so this resolved tree is in fact a simple path. Whereas for MP the underlying phenomenon can be described as long edge attraction, it turns out that here we have long edge repulsion. Copyright © 2017. Published by Elsevier Ltd.

  12. Market Analysis and Consumer Impacts Source Document. Part III. Consumer Behavior and Attitudes Toward Fuel Efficient Vehicles

    Science.gov (United States)

    1980-12-01

    This source document on motor vehicle market analysis and consumer impacts consists of three parts. Part III consists of studies and reviews on: consumer awareness of fuel efficiency issues; consumer acceptance of fuel efficient vehicles; car size ch...

  13. The Prior Can Often Only Be Understood in the Context of the Likelihood

    Directory of Open Access Journals (Sweden)

    Andrew Gelman

    2017-10-01

    Full Text Available A key sticking point of Bayesian analysis is the choice of prior distribution, and there is a vast literature on potential defaults including uniform priors, Jeffreys’ priors, reference priors, maximum entropy priors, and weakly informative priors. These methods, however, often manifest a key conceptual tension in prior modeling: a model encoding true prior information should be chosen without reference to the model of the measurement process, but almost all common prior modeling techniques are implicitly motivated by a reference likelihood. In this paper we resolve this apparent paradox by placing the choice of prior into the context of the entire Bayesian analysis, from inference to prediction to model evaluation.

  14. Analysis of Pairwise Interactions in a Maximum Likelihood Sense to Identify Leaders in a Group

    Directory of Open Access Journals (Sweden)

    Violet Mwaffo

    2017-07-01

    Full Text Available Collective motion in animal groups manifests itself in the form of highly coordinated maneuvers determined by local interactions among individuals. A particularly critical question in understanding the mechanisms behind such interactions is to detect and classify leader–follower relationships within the group. In the technical literature of coupled dynamical systems, several methods have been proposed to reconstruct interaction networks, including linear correlation analysis, transfer entropy, and event synchronization. While these analyses have been helpful in reconstructing network models from neuroscience to public health, rules on the most appropriate method to use for a specific dataset are lacking. Here, we demonstrate the possibility of detecting leaders in a group from raw positional data in a model-free approach that combines multiple methods in a maximum likelihood sense. We test our framework on synthetic data of groups of self-propelled Vicsek particles, where a single agent acts as a leader and both the size of the interaction region and the level of inherent noise are systematically varied. To assess the feasibility of detecting leaders in real-world applications, we study a synthetic dataset of fish shoaling, generated by using a recent data-driven model for social behavior, and an experimental dataset of pharmacologically treated zebrafish. Not only does our approach offer a robust strategy to detect leaders in synthetic data but it also allows for exploring the role of psychoactive compounds on leader–follower relationships.
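
    One of the pairwise ingredients mentioned above, lagged correlation analysis, is simple to sketch. The toy functions below score each individual by how many peers it appears to lead; this illustrates a single method under invented names, not the paper's maximum-likelihood combination of methods.

```python
import numpy as np

def lagged_corr_peak(x, y, max_lag):
    # Lag (in samples) at which corr(x[t], y[t + lag]) peaks; a positive
    # value suggests that x precedes, i.e. leads, y.
    lags = list(range(-max_lag, max_lag + 1))
    def corr_at(lag):
        a = x[:len(x) - lag] if lag >= 0 else x[-lag:]
        b = y[lag:] if lag >= 0 else y[:len(y) + lag]
        return np.corrcoef(a, b)[0, 1]
    return lags[int(np.argmax([corr_at(l) for l in lags]))]

def leader_scores(headings, max_lag=20):
    # Score each individual by how many peers it appears to lead,
    # given an (individuals x time) array of heading time series.
    n = headings.shape[0]
    score = np.zeros(n, dtype=int)
    for i in range(n):
        for j in range(n):
            if i != j and lagged_corr_peak(headings[i], headings[j], max_lag) > 0:
                score[i] += 1
    return score
```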

  15. T-scan III system diagnostic tool for digital occlusal analysis in orthodontics - a modern approach.

    Science.gov (United States)

    Trpevska, Vesna; Kovacevska, Gordana; Benedeti, Alberto; Jordanov, Bozidar

    2014-01-01

    This systematic literature review was performed to establish the mechanism, methodology, characteristics, clinical application and opportunities of the T-Scan III System as a diagnostic tool for digital occlusal analysis in different fields of dentistry, particularly in orthodontics. Electronic databases were searched using MEDLINE and PubMed, together with hand searching of relevant key journals and screening of the reference lists of included studies, with no language restriction. Publications providing statistically examined data were included in the systematic review. Twenty potentially relevant Randomized Controlled Trials (RCTs) were identified; only ten met the inclusion criteria. The literature demonstrates that digital occlusal analysis with the T-Scan III System in orthodontics has the significant advantage of measuring occlusal parameters both in static positions and during mandibular movement. Within the scope of this systematic review, there is evidence to support that the T-Scan system is rapid and accurate in identifying the distribution of tooth contacts, and it shows great promise as a clinical diagnostic screening device for occlusion and for improving the occlusion after various dental treatments. Additional clinical studies are required to broaden the indication field of this system. The importance of using digital occlusal T-Scan analysis in orthodontics deserves further investigation.

  16. Direct detection of dark matter with the EDELWEISS-III experiment: signals induced by charge trapping, data analysis and characterization of cryogenic detector sensitivity to low-mass WIMPs

    International Nuclear Information System (INIS)

    Arnaud, Quentin

    2015-01-01

    The EDELWEISS-III experiment is dedicated to direct dark matter searches aiming at detecting WIMPS. These massive particles should account for more than 80% of the mass of the Universe and be detectable through their elastic scattering on nuclei constituting the absorber of a detector. As the expected WIMP event rate is extremely low ( 20 GeV). Finally, a study dedicated to the optimization of solid cryogenic detectors to low mass WIMP searches is presented. This study is performed on simulated data using a statistical test based on a profiled likelihood ratio that allows for statistical background subtraction and spectral shape discrimination. This study combined with results from Run308, has lead the EDELWEISS experiment to favor low mass WIMP searches ( [fr

  17. Likelihood-based inference for clustered line transect data

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus Plenge; Schweder, Tore

    The uncertainty in estimation of spatial animal density from line transect surveys depends on the degree of spatial clustering in the animal population. To quantify the clustering we model line transect data as independent thinnings of spatial shot-noise Cox processes. Likelihood-based inference...

  18. Adaptive Unscented Kalman Filter using Maximum Likelihood Estimation

    DEFF Research Database (Denmark)

    Mahmoudi, Zeinab; Poulsen, Niels Kjølstad; Madsen, Henrik

    2017-01-01

    The purpose of this study is to develop an adaptive unscented Kalman filter (UKF) by tuning the measurement noise covariance. We use the maximum likelihood estimation (MLE) and the covariance matching (CM) method to estimate the noise covariance. The multi-step prediction errors generated...
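
    A feel for likelihood-based noise tuning can be had with a scalar linear Kalman filter as a stand-in (the paper itself tunes a UKF and also uses covariance matching). In this sketch the measurement-noise variance is chosen to maximize the innovation log-likelihood; the random-walk model and all numbers are illustrative.

```python
import numpy as np

def innovation_loglike(y, q, r):
    # Log-likelihood of measurements under a scalar random-walk model
    # x_k = x_{k-1} + w_k (var q), y_k = x_k + v_k (var r), evaluated
    # through the Kalman-filter innovation decomposition.
    x, p, ll = y[0], 1.0, 0.0
    for yk in y[1:]:
        p = p + q                      # time update
        s = p + r                      # innovation variance
        e = yk - x                     # innovation
        ll += -0.5 * (np.log(2.0 * np.pi * s) + e * e / s)
        k = p / s                      # measurement update
        x += k * e
        p *= 1.0 - k
    return ll

rng = np.random.default_rng(1)
truth = np.cumsum(rng.normal(0.0, 0.1, 500))
y = truth + rng.normal(0.0, 0.5, 500)

# Pick the measurement-noise variance that maximizes the likelihood.
r_grid = np.linspace(0.01, 2.0, 200)
r_hat = r_grid[np.argmax([innovation_loglike(y, 0.01, r) for r in r_grid])]
print("estimated measurement-noise variance:", r_hat)
```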

  19. Likelihood-based inference for clustered line transect data

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus; Schweder, Tore

    2006-01-01

    The uncertainty in estimation of spatial animal density from line transect surveys depends on the degree of spatial clustering in the animal population. To quantify the clustering we model line transect data as independent thinnings of spatial shot-noise Cox processes. Likelihood-based inference...

  20. Localisation of deformations of the midfacial complex in subjects with class III malocclusions employing thin-plate spline analysis.

    Science.gov (United States)

    Singh, G D; McNamara, J A; Lozanoff, S

    1997-11-01

    This study determines deformations of the midface that contribute to a class III appearance, employing thin-plate spline analysis. A total of 135 lateral cephalographs of prepubertal children of European-American descent with either class III malocclusions or a class I molar occlusion were compared. The cephalographs were traced and checked, and 7 homologous landmarks of the midface were identified and digitised. The data sets were scaled to an equivalent size and subjected to Procrustes analysis. These statistical tests indicated significant differences between the class I and class III configurations. Thin-plate spline analysis indicated that both affine and nonaffine transformations contribute towards the total spline for the averaged midfacial configuration. For nonaffine transformations, partial warp 3 had the highest magnitude, indicating large-scale deformations of the midfacial configuration. These deformations affected the palatal landmarks and were associated with compression of the midfacial complex, predominantly in the anteroposterior plane. Partial warp 4 produced some vertical compression of the posterior aspect of the midfacial complex, whereas partial warps 1 and 2 indicated localised shape changes of the maxillary alveolus region. Large spatial-scale deformations therefore affect the midfacial complex in an anteroposterior axis, in combination with vertical compression and localised distortions. These deformations may represent a developmental diminution of the palatal complex anteroposteriorly that, allied with vertical shortening of midfacial height posteriorly, results in class III malocclusions with a retrusive midfacial profile.

  1. Sparkle/PM3 for the modeling of europium(III), gadolinium(III), and terbium(III) complexes

    International Nuclear Information System (INIS)

    Freire, Ricardo O.; Rocha, Gerd B.; Simas, Alfredo M.

    2009-01-01

    The Sparkle/PM3 model is extended to europium(III), gadolinium(III), and terbium(III) complexes. The validation procedure was carried out using only high quality crystallographic structures, for a total of ninety-six Eu(III) complexes, seventy Gd(III) complexes, and forty-two Tb(III) complexes. The Sparkle/PM3 unsigned mean error, for all interatomic distances between the trivalent lanthanide ion and the ligand atoms of the first sphere of coordination, is: 0.080 Å for Eu(III); 0.063 Å for Gd(III); and 0.070 Å for Tb(III). These figures are similar to the Sparkle/AM1 ones of 0.082 Å, 0.061 Å, and 0.068 Å respectively, indicating they are all comparable parameterizations. Moreover, their accuracy is similar to what can be obtained by present-day ab initio effective core potential full geometry optimization calculations on such lanthanide complexes. Finally, we report a preliminary attempt to show that Sparkle/PM3 geometry predictions are reliable. For one of the Eu(III) complexes, BAFZEO, we created hundreds of different input geometries by randomly varying the distances and angles of the ligands to the central Eu(III) ion, which were all subsequently fully optimized. A significant trend was unveiled, indicating that more accurate local minima geometries cluster at lower total energies, thus reinforcing the validity of sparkle model calculations. (author)

  2. A score to estimate the likelihood of detecting advanced colorectal neoplasia at colonoscopy.

    Science.gov (United States)

    Kaminski, Michal F; Polkowski, Marcin; Kraszewska, Ewa; Rupinski, Maciej; Butruk, Eugeniusz; Regula, Jaroslaw

    2014-07-01

    This study aimed to develop and validate a model to estimate the likelihood of detecting advanced colorectal neoplasia in Caucasian patients. We performed a cross-sectional analysis of database records for 40-year-old to 66-year-old patients who entered a national primary colonoscopy-based screening programme for colorectal cancer in 73 centres in Poland in the year 2007. We used multivariate logistic regression to investigate the associations between clinical variables and the presence of advanced neoplasia in a randomly selected test set, and confirmed the associations in a validation set. We used model coefficients to develop a risk score for detection of advanced colorectal neoplasia. Advanced colorectal neoplasia was detected in 2544 of the 35,918 included participants (7.1%). In the test set, a logistic-regression model showed that independent risk factors for advanced colorectal neoplasia were age, sex, family history of colorectal cancer, and cigarette smoking. The model was well calibrated (ratio of expected to observed advanced neoplasia: 1.00, 95% CI 0.95 to 1.06) and had moderate discriminatory power (c-statistic 0.62). We developed a score that estimated the likelihood of detecting advanced neoplasia in the validation set, from 1.32% for patients scoring 0 to 19.12% for patients scoring 7-8. The developed and internally validated score, consisting of simple clinical factors, successfully estimates the likelihood of detecting advanced colorectal neoplasia in asymptomatic Caucasian patients. Once externally validated, it may be useful for counselling or designing primary prevention studies. Published by the BMJ Publishing Group Limited.
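
    To illustrate how a points-based score of this kind can be derived from logistic-regression coefficients, the sketch below scales fitted coefficients to small integers on synthetic stand-in data; none of the variables or numbers correspond to the Polish screening cohort.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic stand-ins for the predictors (age band, sex, family history,
# smoking); these are NOT the study's data or coefficients.
rng = np.random.default_rng(2)
n = 5000
X = np.column_stack([rng.integers(0, 3, n),    # age band: 0, 1, 2
                     rng.integers(0, 2, n),    # male sex
                     rng.integers(0, 2, n),    # family history of CRC
                     rng.integers(0, 2, n)])   # current smoker
lin = -3.0 + 0.4 * X[:, 0] + 0.5 * X[:, 1] + 0.6 * X[:, 2] + 0.5 * X[:, 3]
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-lin))).astype(int)

fit = sm.Logit(y, sm.add_constant(X)).fit(disp=False)
beta = fit.params[1:]
points = np.round(beta / np.abs(beta).min()).astype(int)  # integer points
total = X @ points
for s in np.unique(total):  # observed detection rate per score stratum
    print(f"score {s}: detection rate {y[total == s].mean():.3f}")
```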

  3. Likelihood ratio data to report the validation of a forensic fingerprint evaluation method

    NARCIS (Netherlands)

    Ramos, Daniel; Haraksim, Rudolf; Meuwly, Didier

    2017-01-01

    Data to which the authors refer to throughout this article are likelihood ratios (LR) computed from the comparison of 5–12 minutiae fingermarks with fingerprints. These LRs data are used for the validation of a likelihood ratio (LR) method in forensic evidence evaluation. These data present a

  4. Medial Condyle Fracture (Kilfoyle Type III) of the Distal Humerus with Transient Fishtail Deformity after Surgery

    Directory of Open Access Journals (Sweden)

    Motoki Sonohata

    2017-01-01

    Full Text Available A "fishtail deformity" is one of the well-known complications following pediatric lateral condyle or supracondylar fractures of the humerus. We herein report a case of medial condyle fracture (Kilfoyle type III) in an 11-year-old boy. He had a transient "fishtail deformity" of the trochlear groove after open reduction and internal fixation. As in the current case, bone remodeling and the resolution of trochlear ischemia after a medial condyle fracture may be associated with the likelihood of recovery from a transient "fishtail deformity."

  5. Purification of chicken carbonic anhydrase isozyme-III (CA-III) and its measurement in White Leghorn chickens

    Directory of Open Access Journals (Sweden)

    Nishita Toshiho

    2011-11-01

    Full Text Available Abstract Background The developmental profile of chicken carbonic anhydrase-III (CA-III) blood levels has not been previously determined or reported. We isolated CA-III from chicken muscle and investigated age-related changes in the levels of CA-III in blood. Methods CA-III was purified from chicken muscle. The levels of CA-III in plasma and erythrocytes from 278 female chickens (aged 1-93 weeks) and 68 male chickens (aged 3-59 weeks) were determined by ELISA. Results The mean level of CA-III in female chicken erythrocytes (1 week old) was 4.6 μg/g of Hb, and the CA-III level did not change until 16 weeks of age. The level then increased until 63 weeks of age (11.8 μg/g of Hb), decreased to 4.7 μg/g of Hb at 73 weeks of age, and increased again until 93 weeks of age (8.6 μg/g of Hb). The mean level of CA-III in erythrocytes from male chickens (3 weeks old) was 2.4 μg/g of Hb, and this level remained steady until 59 weeks of age. The mean plasma level of CA-III in 1-week-old female chickens was 60 ng/mL, and this level was increased at 3 weeks of age (141 ng/mL) and then remained steady until 80 weeks of age (122 ng/mL). The mean plasma level of CA-III in 3-week-old male chickens was 58 ng/mL, and this level remained steady until 59 weeks of age. Conclusion We observed both developmental changes and sex differences in CA-III concentrations in White Leghorn (WL) chicken erythrocytes and plasma. Simple linear regression analysis showed a significant association between the erythrocyte CA-III level and egg-laying rate in WL chickens 16-63 weeks of age (p

  6. Mapping grey matter reductions in schizophrenia: an anatomical likelihood estimation analysis of voxel-based morphometry studies.

    Science.gov (United States)

    Fornito, A; Yücel, M; Patti, J; Wood, S J; Pantelis, C

    2009-03-01

    Voxel-based morphometry (VBM) is a popular tool for mapping neuroanatomical changes in schizophrenia patients. Several recent meta-analyses have identified the brain regions in which patients most consistently show grey matter reductions, although they have not examined whether such changes reflect differences in grey matter concentration (GMC) or grey matter volume (GMV). These measures assess different aspects of grey matter integrity, and may therefore reflect different pathological processes. In this study, we used the Anatomical Likelihood Estimation procedure to analyse significant differences reported in 37 VBM studies of schizophrenia patients, incorporating data from 1646 patients and 1690 controls, and compared the findings of studies using either GMC or GMV to index grey matter differences. Analysis of all studies combined indicated that grey matter reductions in a network of frontal, temporal, thalamic and striatal regions are among the most frequently reported in literature. GMC reductions were generally larger and more consistent than GMV reductions, and were more frequent in the insula, medial prefrontal, medial temporal and striatal regions. GMV reductions were more frequent in dorso-medial frontal cortex, and lateral and orbital frontal areas. These findings support the primacy of frontal, limbic, and subcortical dysfunction in the pathophysiology of schizophrenia, and suggest that the grey matter changes observed with MRI may not necessarily result from a unitary pathological process.

  7. Maximum likelihood estimation of the attenuated ultrasound pulse

    DEFF Research Database (Denmark)

    Rasmussen, Klaus Bolding

    1994-01-01

    The attenuated ultrasound pulse is divided into two parts: a stationary basic pulse and a nonstationary attenuation pulse. A standard ARMA model is used for the basic pulse, and a nonstandard ARMA model is derived for the attenuation pulse. The maximum likelihood estimator of the attenuated...

  8. Corporate governance effect on financial distress likelihood: Evidence from Spain

    Directory of Open Access Journals (Sweden)

    Montserrat Manzaneque

    2016-01-01

    Full Text Available The paper explores some mechanisms of corporate governance (ownership and board characteristics) in Spanish listed companies and their impact on the likelihood of financial distress. An empirical study was conducted between 2007 and 2012 using a matched-pairs research design with 308 observations, half of them classified as distressed and half as non-distressed. Based on the previous study by Pindado, Rodrigues, and De la Torre (2008), a broader concept of bankruptcy is used to define business failure. Employing several conditional logistic models, and in line with previous studies on bankruptcy, the results confirm that in difficult situations prior to bankruptcy, the impact of board ownership and the proportion of independent directors on the likelihood of business failure is similar to that exerted in more extreme situations. The results go a step further, showing a negative relationship between board size and the likelihood of financial distress. This result is interpreted as a means of creating diversity and improving access to information and resources, especially in contexts where ownership is highly concentrated and large shareholders have great power to influence the board structure. However, the results confirm that ownership concentration does not have a significant impact on the likelihood of financial distress in the Spanish context. It is argued that large shareholders are passive as regards enhanced monitoring of management and, alternatively, do not have enough incentive to stave off financial distress. These findings have important implications in the Spanish context, where several changes in regulatory listing requirements have been carried out with respect to corporate governance, and where empirical evidence on this question has been lacking.

  9. Rationalization of paclitaxel insensitivity of yeast β-tubulin and human βIII-tubulin isotype using principal component analysis

    Directory of Open Access Journals (Sweden)

    Das Lalita

    2012-08-01

    Full Text Available Abstract Background The chemotherapeutic agent paclitaxel arrests cell division by binding to the hetero-dimeric protein tubulin. Subtle differences in tubulin sequences, across eukaryotes and among β-tubulin isotypes, can have profound impact on paclitaxel-tubulin binding. To capture the experimentally observed paclitaxel-resistance of human βIII tubulin isotype and yeast β-tubulin, within a common theoretical framework, we have performed structural principal component analyses of β-tubulin sequences across eukaryotes. Results The paclitaxel-resistance of human βIII tubulin isotype and yeast β-tubulin uniquely mapped on to the lowest two principal components, defining the paclitaxel-binding site residues of β-tubulin. The molecular mechanisms behind paclitaxel-resistance, mediated through key residues, were identified from structural consequences of characteristic mutations that confer paclitaxel-resistance. Specifically, Ala277 in βIII isotype was shown to be crucial for paclitaxel-resistance. Conclusions The present analysis captures the origin of two apparently unrelated events, paclitaxel-insensitivity of yeast tubulin and human βIII tubulin isotype, through two common collective sequence vectors.
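
    The central numerical step, principal component analysis of aligned sequences, can be sketched as follows; the one-hot encoding is one common choice, and the five short fragments are toy placeholders rather than β-tubulin data.

```python
import numpy as np
from sklearn.decomposition import PCA

# Toy aligned fragments; placeholders, not real beta-tubulin sequences.
seqs = ["ARNDCQ", "ARNDCE", "GRNDCQ", "ARNECQ", "GRNECE"]
alphabet = sorted(set("".join(seqs)))
idx = {a: i for i, a in enumerate(alphabet)}

def one_hot(seq):
    # Encode each aligned position as an indicator vector, then flatten.
    m = np.zeros((len(seq), len(alphabet)))
    m[np.arange(len(seq)), [idx[a] for a in seq]] = 1.0
    return m.ravel()

X = np.array([one_hot(s) for s in seqs])
pcs = PCA(n_components=2).fit_transform(X)  # sequences on the first two PCs
print(pcs)
```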

  10. Uranium (III)-Plutonium (III) co-precipitation in molten chloride

    Science.gov (United States)

    Vigier, Jean-François; Laplace, Annabelle; Renard, Catherine; Miguirditchian, Manuel; Abraham, Francis

    2018-02-01

    Co-management of the actinides in an integrated closed fuel cycle by a pyrochemical process is studied at the laboratory scale in France in the CEA-ATALANTE facility. In this context, the co-precipitation of U(III) and Pu(III) by wet argon sparging in LiCl-CaCl2 (30-70 mol%) molten salt at 705 °C is studied. Pu(III) is prepared in situ in the molten salt by carbochlorination of PuO2, and U(III) is then introduced as UCl3 after the chlorine has been purged with argon, to avoid any oxidation of uranium up to U(VI) by Cl2. The oxide conversion yield through wet argon sparging is quantitative. However, the preferential oxidation of U(III) over Pu(III) is responsible for a successive conversion of the two actinides, giving a mixture of UO2 and PuO2 oxides. Surprisingly, the conversion of Pu(III) alone under the same conditions leads to a mixture of PuO2 and PuOCl, characteristic of a partial oxidation of Pu(III) to Pu(IV). This contrasts with the co-conversion of U(III)-Pu(III) mixtures but agrees with the conversion of Ce(III).

  11. Block Empirical Likelihood for Longitudinal Single-Index Varying-Coefficient Model

    Directory of Open Access Journals (Sweden)

    Yunquan Song

    2013-01-01

    Full Text Available In this paper, we consider a single-index varying-coefficient model with application to longitudinal data. In order to accommodate the within-group correlation, we apply the block empirical likelihood procedure to longitudinal single-index varying-coefficient model, and prove a nonparametric version of Wilks’ theorem which can be used to construct the block empirical likelihood confidence region with asymptotically correct coverage probability for the parametric component. In comparison with normal approximations, the proposed method does not require a consistent estimator for the asymptotic covariance matrix, making it easier to conduct inference for the model's parametric component. Simulations demonstrate how the proposed method works.

  12. Ethnicity and skeletal Class III morphology: a pubertal growth analysis using thin-plate spline analysis.

    Science.gov (United States)

    Alkhamrah, B; Terada, K; Yamaki, M; Ali, I M; Hanada, K

    2001-01-01

    A longitudinal retrospective study using thin-plate spline analysis was used to investigate skeletal Class III etiology in Japanese female adolescents. Headfilms of 40 subjects were chosen from the archives of the Orthodontic department at Niigata University Dental Hospital, and were traced at IIIB and IVA Hellman dental ages. Twenty-eight homologous landmarks, representing hard and soft tissue, were digitized. These were used to reproduce a consensus for the profilogram, craniomaxillary complex, mandible, and soft tissue for each age and skeletal group. Generalized least-square analysis revealed a significant shape difference between age-matched groups. Total spline analysis and partial warps (PW) 3 and 2 showed a maxillary retrusion at stage IIIB, opposite an acute cranial base at stage IVA. Mandibular total spline and PW 4 and 5 showed changes affecting most landmarks and their spatial interrelationship, especially a stretch along the articulare-pogonion axis. In the soft tissue analysis, PW 8 showed large and local changes which paralleled the underlying hard tissue components. Allometry of the mandible and anisotropy of the cranial base, the maxilla, and the mandible asserted the complexity of craniofacial growth and the difficulty of predicting its outcome.

  13. Optimizing Likelihood Models for Particle Trajectory Segmentation in Multi-State Systems.

    Science.gov (United States)

    Young, Dylan Christopher; Scrimgeour, Jan

    2018-06-19

    Particle tracking offers significant insight into the molecular mechanics that govern the behavior of living cells. The analysis of molecular trajectories that transition between different motive states, such as diffusive, driven and tethered modes, is of considerable importance, with even single trajectories containing significant amounts of information about a molecule's environment and its interactions with cellular structures. Hidden Markov models (HMM) have been widely adopted to perform the segmentation of such complex tracks. In this paper, we show that extensive analysis of hidden Markov model outputs using data derived from multi-state Brownian dynamics simulations can be used both for the optimization of the likelihood models used to describe the states of the system and for characterization of the technique's failure mechanisms. This analysis was made possible by the implementation of a parallelized adaptive direct search algorithm on an Nvidia graphics processing unit. This approach provides critical information for the visualization of HMM failure and the successful design of particle tracking experiments where trajectories contain multiple mobile states. © 2018 IOP Publishing Ltd.
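
    For readers who want to experiment with HMM segmentation of trajectories on a CPU (without the paper's GPU-parallelized likelihood optimization), a minimal sketch using the hmmlearn package might look like this; the simulated two-state displacement series is an invented stand-in.

```python
import numpy as np
from hmmlearn import hmm

# Simulated two-state track: slow diffusive steps interrupted by a
# faster, drifting segment (an invented stand-in for real data).
rng = np.random.default_rng(3)
steps = np.vstack([rng.normal(0.0, 0.05, (300, 2)),
                   rng.normal(0.10, 0.15, (300, 2)),
                   rng.normal(0.0, 0.05, (300, 2))])

model = hmm.GaussianHMM(n_components=2, covariance_type="full", n_iter=100)
model.fit(steps)                 # EM fit on the displacement series
states = model.predict(steps)    # Viterbi segmentation into motive states
```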

  14. Evolutionary analysis of apolipoprotein E by Maximum Likelihood and complex network methods

    Directory of Open Access Journals (Sweden)

    Leandro de Jesus Benevides

    Full Text Available Abstract Apolipoprotein E (apo E) is a human glycoprotein with 299 amino acids, and it is a major component of very low density lipoproteins (VLDL) and a group of high-density lipoproteins (HDL). Phylogenetic studies are important to clarify how various apo E proteins are related in groups of organisms and whether they evolved from a common ancestor. Here, we aimed at performing a phylogenetic study on apo E carrying organisms. We employed a classical and robust method, such as Maximum Likelihood (ML), and compared the results using a more recent approach based on complex networks. Thirty-two apo E amino acid sequences were downloaded from NCBI. A clear separation could be observed among three major groups: mammals, fish and amphibians. The results obtained from the ML method, as well as from the constructed networks, showed two different groups: one with mammals only (C1) and another with fish (C2), and a single node with the single sequence available for an amphibian. The accordance in results from the different methods shows that the complex networks approach is effective in phylogenetic studies. Furthermore, our results revealed the conservation of apo E among animal groups.

  15. Is there a critical lesion site for unilateral spatial neglect? A meta-analysis using activation likelihood estimation.

    Directory of Open Access Journals (Sweden)

    Pascal Molenberghs

    2012-04-01

    Full Text Available The critical lesion site responsible for the syndrome of unilateral spatial neglect has been debated for more than a decade. Here we performed an activation likelihood estimation (ALE) to provide for the first time an objective quantitative index of the consistency of lesion sites across anatomical group studies of spatial neglect. The analysis revealed several distinct regions in which damage has consistently been associated with spatial neglect symptoms. Lesioned clusters were located in several cortical and subcortical regions of the right hemisphere, including the middle and superior temporal gyrus, inferior parietal lobule, intraparietal sulcus, precuneus, middle occipital gyrus, caudate nucleus and posterior insula, as well as in the white matter pathway corresponding to the posterior part of the superior longitudinal fasciculus. Further analyses suggested that separate lesion sites are associated with impairments in different behavioural tests, such as line bisection and target cancellation. Similarly, specific subcomponents of the heterogeneous neglect syndrome, such as extinction and allocentric and personal neglect, are associated with distinct lesion sites. Future progress in delineating the neuropathological correlates of spatial neglect will depend upon the development of more refined measures of perceptual and cognitive functions than those currently available in the clinical setting.

  16. An Elaboration Likelihood Model Based Longitudinal Analysis of Attitude Change during the Process of IT Acceptance via Education Program

    Science.gov (United States)

    Lee, Woong-Kyu

    2012-01-01

    The principal objective of this study was to gain insight into attitude changes occurring during IT acceptance from the perspective of elaboration likelihood model (ELM). In particular, the primary target of this study was the process of IT acceptance through an education program. Although the Internet and computers are now quite ubiquitous, and…

  17. From reads to genes to pathways: differential expression analysis of RNA-Seq experiments using Rsubread and the edgeR quasi-likelihood pipeline [version 2; referees: 5 approved

    Directory of Open Access Journals (Sweden)

    Yunshun Chen

    2016-08-01

    Full Text Available In recent years, RNA sequencing (RNA-seq) has become a very widely used technology for profiling gene expression. One of the most common aims of RNA-seq profiling is to identify genes or molecular pathways that are differentially expressed (DE) between two or more biological conditions. This article demonstrates a computational workflow for the detection of DE genes and pathways from RNA-seq data by providing a complete analysis of an RNA-seq experiment profiling epithelial cell subsets in the mouse mammary gland. The workflow uses R software packages from the open-source Bioconductor project and covers all steps of the analysis pipeline, including alignment of read sequences, data exploration, differential expression analysis, visualization and pathway analysis. Read alignment and count quantification is conducted using the Rsubread package and the statistical analyses are performed using the edgeR package. The differential expression analysis uses the quasi-likelihood functionality of edgeR.

  18. Validation of DNA-based identification software by computation of pedigree likelihood ratios.

    Science.gov (United States)

    Slooten, K

    2011-08-01

    Disaster victim identification (DVI) can be aided by DNA-evidence, by comparing the DNA-profiles of unidentified individuals with those of surviving relatives. The DNA-evidence is used optimally when such a comparison is done by calculating the appropriate likelihood ratios. Though conceptually simple, the calculations can be quite involved, especially with large pedigrees, precise mutation models etc. In this article we describe a series of test cases designed to check if software designed to calculate such likelihood ratios computes them correctly. The cases include both simple and more complicated pedigrees, among which inbred ones. We show how to calculate the likelihood ratio numerically and algebraically, including a general mutation model and possibility of allelic dropout. In Appendix A we show how to derive such algebraic expressions mathematically. We have set up these cases to validate new software, called Bonaparte, which performs pedigree likelihood ratio calculations in a DVI context. Bonaparte has been developed by SNN Nijmegen (The Netherlands) for the Netherlands Forensic Institute (NFI). It is available free of charge for non-commercial purposes (see www.dnadvi.nl for details). Commercial licenses can also be obtained. The software uses Bayesian networks and the junction tree algorithm to perform its calculations. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
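
    The flavor of such test cases can be conveyed by the simplest possible example: a single-locus, mutation-free likelihood ratio for a parent-child duo versus an unrelated pair, computed by direct enumeration. The allele frequencies and genotypes below are illustrative; software such as Bonaparte handles full pedigrees, mutation models and dropout on top of this.

```python
def p_genotype(g, freq):
    # Hardy-Weinberg probability of an unordered genotype (a, b).
    a, b = g
    return freq[a] ** 2 if a == b else 2.0 * freq[a] * freq[b]

def p_child_given_parent(child, parent, freq):
    # P(child genotype | one typed parent, other parent random), single
    # locus, no mutation: the typed parent transmits each of its alleles
    # with probability 1/2; the random parent contributes an allele
    # drawn from the population frequencies.
    p = 0.0
    for transmitted in parent:
        for other, f in freq.items():
            if sorted((transmitted, other)) == sorted(child):
                p += 0.5 * f
    return p

freq = {"A": 0.1, "B": 0.3, "C": 0.6}   # illustrative allele frequencies
child, candidate = ("A", "B"), ("A", "C")
lr = p_child_given_parent(child, candidate, freq) / p_genotype(child, freq)
print(lr)  # 2.5 here, matching the textbook value 1 / (4 * freq["A"])
```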

  19. Major Accidents (Gray Swans) Likelihood Modeling Using Accident Precursors and Approximate Reasoning.

    Science.gov (United States)

    Khakzad, Nima; Khan, Faisal; Amyotte, Paul

    2015-07-01

    Compared to the remarkable progress in risk analysis of normal accidents, the risk analysis of major accidents has not been so well established, partly due to the complexity of such accidents and partly due to the low probabilities involved. The issue of low probabilities normally arises from the scarcity of relevant data, since major accidents are few and far between. In this work, knowing that major accidents are frequently preceded by accident precursors, a novel precursor-based methodology has been developed for likelihood modeling of major accidents in critical infrastructures, based on a unique combination of accident precursor data, information theory, and approximate reasoning. For this purpose, we have introduced an innovative application of information analysis to identify the most informative near accident of a major accident. The observed data of the near accident were then used to establish predictive scenarios to foresee the occurrence of the major accident. We verified the methodology using offshore blowouts in the Gulf of Mexico, and then demonstrated its application to dam breaches in the United States. © 2015 Society for Risk Analysis.

  20. Greenery in the university environment: Students’ preferences and perceived restoration likelihood

    Science.gov (United States)

    2018-01-01

    A large body of evidence shows that interaction with greenery can be beneficial for human stress reduction, emotional states, and improved cognitive function. It can, therefore, be expected that university students might benefit from greenery in the university environment. Before investing in real-life interventions in a university environment, it is necessary to first explore students’ perceptions of greenery in the university environment. This study examined (1) preference for university indoor and outdoor spaces with and without greenery (2) perceived restoration likelihood of university outdoor spaces with and without greenery and (3) if preference and perceived restoration likelihood ratings were modified by demographic characteristics or connectedness to nature in Dutch university students (N = 722). Digital photographic stimuli represented four university spaces (lecture hall, classroom, study area, university outdoor space). For each of the three indoor spaces there were four or five stimuli conditions: (1) the standard design (2) the standard design with a colorful poster (3) the standard design with a nature poster (4) the standard design with a green wall (5) the standard design with a green wall plus interior plants. The university outdoor space included: (1) the standard design (2) the standard design with seating (3) the standard design with colorful artifacts (4) the standard design with green elements (5) the standard design with extensive greenery. Multi-level analyses showed that students gave higher preference ratings to the indoor spaces with a nature poster, a green wall, or a green wall plus interior plants than to the standard designs and the designs with the colorful posters. Students also rated preference and perceived restoration likelihood of the outdoor spaces that included greenery higher than those without. Preference and perceived restoration likelihood were not modified by demographic characteristics, but students with strong

  1. Greenery in the university environment: Students' preferences and perceived restoration likelihood.

    Directory of Open Access Journals (Sweden)

    Nicole van den Bogerd

    Full Text Available A large body of evidence shows that interaction with greenery can be beneficial for human stress reduction, emotional states, and improved cognitive function. It can, therefore, be expected that university students might benefit from greenery in the university environment. Before investing in real-life interventions in a university environment, it is necessary to first explore students' perceptions of greenery in the university environment. This study examined (1) preference for university indoor and outdoor spaces with and without greenery, (2) perceived restoration likelihood of university outdoor spaces with and without greenery, and (3) if preference and perceived restoration likelihood ratings were modified by demographic characteristics or connectedness to nature in Dutch university students (N = 722). Digital photographic stimuli represented four university spaces (lecture hall, classroom, study area, university outdoor space). For each of the three indoor spaces there were four or five stimuli conditions: (1) the standard design, (2) the standard design with a colorful poster, (3) the standard design with a nature poster, (4) the standard design with a green wall, (5) the standard design with a green wall plus interior plants. The university outdoor space included: (1) the standard design, (2) the standard design with seating, (3) the standard design with colorful artifacts, (4) the standard design with green elements, (5) the standard design with extensive greenery. Multi-level analyses showed that students gave higher preference ratings to the indoor spaces with a nature poster, a green wall, or a green wall plus interior plants than to the standard designs and the designs with the colorful posters. Students also rated preference and perceived restoration likelihood of the outdoor spaces that included greenery higher than those without. Preference and perceived restoration likelihood were not modified by demographic characteristics, but students with strong

  2. Greenery in the university environment: Students' preferences and perceived restoration likelihood.

    Science.gov (United States)

    van den Bogerd, Nicole; Dijkstra, S Coosje; Seidell, Jacob C; Maas, Jolanda

    2018-01-01

    A large body of evidence shows that interaction with greenery can be beneficial for human stress reduction, emotional states, and improved cognitive function. It can, therefore, be expected that university students might benefit from greenery in the university environment. Before investing in real-life interventions in a university environment, it is necessary to first explore students' perceptions of greenery in the university environment. This study examined (1) preference for university indoor and outdoor spaces with and without greenery (2) perceived restoration likelihood of university outdoor spaces with and without greenery and (3) if preference and perceived restoration likelihood ratings were modified by demographic characteristics or connectedness to nature in Dutch university students (N = 722). Digital photographic stimuli represented four university spaces (lecture hall, classroom, study area, university outdoor space). For each of the three indoor spaces there were four or five stimuli conditions: (1) the standard design (2) the standard design with a colorful poster (3) the standard design with a nature poster (4) the standard design with a green wall (5) the standard design with a green wall plus interior plants. The university outdoor space included: (1) the standard design (2) the standard design with seating (3) the standard design with colorful artifacts (4) the standard design with green elements (5) the standard design with extensive greenery. Multi-level analyses showed that students gave higher preference ratings to the indoor spaces with a nature poster, a green wall, or a green wall plus interior plants than to the standard designs and the designs with the colorful posters. Students also rated preference and perceived restoration likelihood of the outdoor spaces that included greenery higher than those without. Preference and perceived restoration likelihood were not modified by demographic characteristics, but students with strong

  3. Further Evaluation of Covariate Analysis using Empirical Bayes Estimates in Population Pharmacokinetics: the Perception of Shrinkage and Likelihood Ratio Test.

    Science.gov (United States)

    Xu, Xu Steven; Yuan, Min; Yang, Haitao; Feng, Yan; Xu, Jinfeng; Pinheiro, Jose

    2017-01-01

    Covariate analysis based on population pharmacokinetics (PPK) is used to identify clinically relevant factors. The likelihood ratio test (LRT) based on nonlinear mixed effect model fits is currently recommended for covariate identification, whereas individual empirical Bayesian estimates (EBEs) are considered unreliable due to the presence of shrinkage. The objectives of this research were to investigate the type I error for the LRT and EBE approaches, to confirm the similarity of power between the LRT and EBE approaches reported previously, and to explore the influence of shrinkage on LRT and EBE inferences. Using an oral one-compartment PK model with a single covariate impacting on clearance, we conducted a wide range of simulations according to a two-way factorial design. The results revealed that the EBE-based regression not only provided almost identical power for detecting a covariate effect, but also controlled the false positive rate better than the LRT approach. Shrinkage of EBEs is likely not the root cause of the decrease in power or the inflated false positive rate, although the size of the covariate effect tends to be underestimated at high shrinkage. In summary, contrary to the current recommendations, EBEs may be a better choice than LRT for statistical tests in PPK covariate analysis. We propose a three-step covariate modeling approach for population PK analysis that utilizes the advantages of EBEs while overcoming their shortcomings, which not only markedly reduces the run time for population PK analysis, but also provides more accurate covariate tests.

  4. [Analysis of prognostic factors after radical resection in 628 patients with stage II or III colon cancer].

    Science.gov (United States)

    Qin, Qiong; Yang, Lin; Zhou, Ai-ping; Sun, Yong-kun; Song, Yan; DU, Feng; Wang, Jin-wan

    2013-03-01

    To analyze the clinicopathologic factors related to recurrence and metastasis of stage II or III colon cancer after radical resection. The clinical and pathological data of 628 patients with stage II or III colon cancer after radical resection from Jan. 2005 to Dec. 2008 in our hospital were retrospectively reviewed and analyzed. The overall recurrence and metastasis rate was 28.5% (179/628). The 5-year disease-free survival (DFS) rate was 70.3% and the 5-year overall survival (OS) rate was 78.5%. Univariate analysis showed that age, smoking intensity, depth of tumor invasion, lymph node metastasis, TNM stage, gross classification, histological differentiation, blood vessel tumor embolus, tumor gross pathology, multiple primary tumors, preoperative and postoperative serum concentrations of CEA and CA19-9, and the regimen of adjuvant chemotherapy were correlated with recurrence and metastasis of colon cancer after radical resection. Multivariate analysis showed that regional lymph node metastasis, TNM stage, the regimen of postoperative adjuvant chemotherapy, and preoperative serum concentrations of CEA and CA19-9 were independent factors affecting the prognosis of colon cancer patients. Regional lymph node metastasis, TNM stage, elevated preoperative serum concentrations of CEA and CA19-9, and a regimen of postoperative adjuvant chemotherapy with a single fluorouracil-type drug are independent risk factors for recurrence and metastasis in patients with stage II-III colon cancer after radical resection.

  5. Multi-Channel Maximum Likelihood Pitch Estimation

    DEFF Research Database (Denmark)

    Christensen, Mads Græsbøll

    2012-01-01

    In this paper, a method for multi-channel pitch estimation is proposed. The method is a maximum likelihood estimator and is based on a parametric model where the signals in the various channels share the same fundamental frequency but can have different amplitudes, phases, and noise characteristics. … This essentially means that the model allows for different conditions in the various channels, like different signal-to-noise ratios, microphone characteristics, and reverberation. Moreover, the method does not assume that a certain array structure is used but rather relies on a more general model and is hence…
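
    Under per-channel white Gaussian noise, such a criterion reduces to least-squares fits of a shared-f0 harmonic basis in each channel, with the per-channel log-likelihoods summed; a minimal grid-search sketch (synthetic two-channel data, all parameters invented, not the paper's exact estimator):

      # Multi-channel ML pitch sketch: shared f0, per-channel amplitudes/phases/
      # noise. For each candidate f0, least-squares fit a harmonic basis per
      # channel and sum the per-channel Gaussian log-likelihoods.
      import numpy as np

      fs, N, L = 8000, 512, 5                   # sample rate, length, harmonics
      t = np.arange(N) / fs
      rng = np.random.default_rng(4)
      f0_true = 155.0
      channels = [sum(np.cos(2 * np.pi * f0_true * (k + 1) * t
                             + rng.uniform(0, 2 * np.pi)) / (k + 1)
                      for k in range(L)) + sigma * rng.normal(size=N)
                  for sigma in (0.5, 1.5)]      # two channels, different SNRs

      def loglik(f0):
          Z = np.column_stack(
              [np.cos(2 * np.pi * f0 * (k + 1) * t) for k in range(L)]
              + [np.sin(2 * np.pi * f0 * (k + 1) * t) for k in range(L)])
          ll = 0.0
          for x in channels:
              resid = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
              ll += -N / 2 * np.log(np.mean(resid ** 2))
          return ll

      grid = np.arange(60.0, 400.0, 0.5)
      print("f0_hat =", grid[np.argmax([loglik(f) for f in grid])])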

  6. Solar neutrino oscillation parameters after SNO Phase-III and SAGE Part-III

    International Nuclear Information System (INIS)

    Yang Ping; Liu Qiuyu

    2009-01-01

    We analyse the recently published results from the solar neutrino experiments SNO Phase-III and SAGE Part-III and show their constraints on solar neutrino oscillation parameters, especially the mixing angle θ₁₂. Through a global analysis using all existing data from SK, SNO, the Ga and Cl radiochemical experiments, and the long-baseline reactor experiment KamLAND, we obtain the parameters Δm²₁₂ = 7.684 (+0.212/−0.208) × 10⁻⁵ eV² and tan²θ₁₂ = 0.440 (+0.059/−0.057). We also find that the discrepancy between the KamLAND and solar neutrino results can be reduced by choosing a small non-zero value for the mixing angle θ₁₃. (authors)

  7. Maximum likelihood estimation for cytogenetic dose-response curves

    International Nuclear Information System (INIS)

    Frome, E.L.; DuFrain, R.J.

    1986-01-01

    In vitro dose-response curves are used to describe the relation between chromosome aberrations and radiation dose for human lymphocytes. The lymphocytes are exposed to low-LET radiation, and the resulting dicentric chromosome aberrations follow the Poisson distribution. The expected yield depends on both the magnitude and the temporal distribution of the dose. A general dose-response model that describes this relation has been presented by Kellerer and Rossi (1972, Current Topics on Radiation Research Quarterly 8, 85-158; 1978, Radiation Research 75, 471-488) using the theory of dual radiation action. Two special cases of practical interest are split-dose and continuous exposure experiments, and the resulting dose-time-response models are intrinsically nonlinear in the parameters. A general-purpose maximum likelihood estimation procedure is described, and estimation for the nonlinear models is illustrated with numerical examples from both experimental designs. Poisson regression analysis is used for estimation, hypothesis testing, and regression diagnostics. Results are discussed in the context of exposure assessment procedures for both acute and chronic human radiation exposure
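
    A minimal sketch of a Poisson maximum-likelihood fit of a linear-quadratic yield, λ(D) = c + αD + βD² dicentrics per cell (all data and starting values are invented for illustration; the paper's models also cover split-dose and continuous exposure):

      # Poisson maximum-likelihood fit of a linear-quadratic dose-response curve.
      # Illustrative data, not from the paper.
      import numpy as np
      from scipy.optimize import minimize

      dose = np.array([0.0, 0.5, 1.0, 2.0, 3.0, 4.0])          # Gy
      cells = np.array([5000, 4000, 3000, 2000, 1500, 1000])   # cells scored
      dicentrics = np.array([4, 14, 27, 60, 90, 110])          # aberrations seen

      def negloglik(theta):
          c, a, b2 = theta
          lam = cells * (c + a * dose + b2 * dose ** 2)   # expected yield per point
          if np.any(lam <= 0):
              return np.inf
          return np.sum(lam - dicentrics * np.log(lam))   # Poisson NLL, constant dropped

      fit = minimize(negloglik, x0=[1e-3, 1e-2, 5e-3], method="Nelder-Mead")
      print("c, alpha, beta =", fit.x)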

  8. Superfast maximum-likelihood reconstruction for quantum tomography

    Science.gov (United States)

    Shang, Jiangwei; Zhang, Zhengyun; Ng, Hui Khoon

    2017-06-01

    Conventional methods for computing maximum-likelihood estimators (MLE) often converge slowly in practical situations, leading to a search for simplifying methods that rely on additional assumptions for their validity. In this work, we provide a fast and reliable algorithm for maximum-likelihood reconstruction that avoids this slow convergence. Our method utilizes the state-of-the-art convex optimization scheme, an accelerated projected-gradient method, that allows one to accommodate the quantum nature of the problem in a different way than in the standard methods. We demonstrate the power of our approach by comparing its performance with other algorithms for n -qubit state tomography. In particular, an eight-qubit situation that purportedly took weeks of computation time in 2005 can now be completed in under a minute for a single set of data, with far higher accuracy than previously possible. This refutes the common claim that MLE reconstruction is slow and reduces the need for alternative methods that often come with difficult-to-verify assumptions. In fact, recent methods assuming Gaussian statistics or relying on compressed sensing ideas are demonstrably inapplicable for the situation under consideration here. Our algorithm can be applied to general optimization problems over the quantum state space; the philosophy of projected gradients can further be utilized for optimization contexts with general constraints.
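
    The heart of such a scheme, a gradient step on the log-likelihood followed by projection back onto the density-matrix set, can be sketched as follows (a plain, unaccelerated variant for brevity; the single-qubit POVM and counts are invented, not the paper's benchmark):

      # Sketch of ML state reconstruction by projected gradient ascent.
      import numpy as np

      def project_simplex(v):
          # Euclidean projection of a real vector onto the probability simplex.
          u = np.sort(v)[::-1]
          css = np.cumsum(u)
          j = np.arange(1, len(v) + 1)
          rho = np.nonzero(u + (1 - css) / j > 0)[0][-1]
          theta = (1 - css[rho]) / (rho + 1)
          return np.maximum(v + theta, 0)

      def project_density(M):
          # Project a Hermitian matrix onto {rho >= 0, tr(rho) = 1}.
          w, V = np.linalg.eigh(M)
          return (V * project_simplex(w)) @ V.conj().T

      # Single-qubit example: POVM built from the six Pauli eigenprojectors.
      sx = np.array([[0, 1], [1, 0]], dtype=complex)
      sy = np.array([[0, -1j], [1j, 0]])
      sz = np.array([[1, 0], [0, -1]], dtype=complex)
      povm = [(np.eye(2) + s * P) / 6 for P in (sx, sy, sz) for s in (+1, -1)]
      freqs = np.array([0.30, 0.03, 0.17, 0.17, 0.17, 0.16])  # invented counts

      rho, step = np.eye(2, dtype=complex) / 2, 0.2
      for _ in range(500):
          probs = np.array([np.trace(E @ rho).real for E in povm]).clip(1e-12)
          grad = sum(f / p * E for f, p, E in zip(freqs, probs, povm))
          rho = project_density(rho + step * grad)   # gradient step + projection
      print(np.round(rho, 3))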

  9. Maximum Likelihood Approach for RFID Tag Set Cardinality Estimation with Detection Errors

    DEFF Research Database (Denmark)

    Nguyen, Chuyen T.; Hayashi, Kazunori; Kaneko, Megumi

    2013-01-01

    Estimation schemes for Radio Frequency IDentification (RFID) tag set cardinality are studied in this paper using a Maximum Likelihood (ML) approach. We consider the estimation problem under a model of multiple independent reader sessions with detection errors due to unreliable radio … is evaluated under different system parameters and compared with that of the conventional method via computer simulations, assuming flat Rayleigh fading environments and a framed-slotted ALOHA based protocol. Keywords: RFID, tag cardinality estimation, maximum likelihood, detection error.
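
    In the simplest idealized reduction, with K independent sessions each detecting any given tag with a known probability q, the ML estimate has a closed form (a toy version, not the paper's framed-slotted ALOHA model with detection errors):

      def ml_cardinality(m_distinct, q, k_sessions):
          # With K independent sessions and per-session detection probability q,
          # a tag is seen at least once with p = 1 - (1 - q)**K; the ML estimate
          # of the tag set cardinality given M distinct observed tags is M / p.
          p_seen = 1.0 - (1.0 - q) ** k_sessions
          return m_distinct / p_seen

      print(round(ml_cardinality(m_distinct=180, q=0.6, k_sessions=3), 1))  # 192.3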

  10. Cosmic shear measurement with maximum likelihood and maximum a posteriori inference

    Science.gov (United States)

    Hall, Alex; Taylor, Andy

    2017-06-01

    We investigate the problem of noise bias in maximum likelihood and maximum a posteriori estimators for cosmic shear. We derive the leading and next-to-leading order biases and compute them in the context of galaxy ellipticity measurements, extending previous work on maximum likelihood inference for weak lensing. We show that a large part of the bias on these point estimators can be removed using information already contained in the likelihood when a galaxy model is specified, without the need for external calibration. We test these bias-corrected estimators on simulated galaxy images similar to those expected from planned space-based weak lensing surveys, with promising results. We find that the introduction of an intrinsic shape prior can help with mitigation of noise bias, such that the maximum a posteriori estimate can be made less biased than the maximum likelihood estimate. Second-order terms offer a check on the convergence of the estimators, but are largely subdominant. We show how biases propagate to shear estimates, demonstrating in our simple set-up that shear biases can be reduced by orders of magnitude and potentially to within the requirements of planned space-based surveys at mild signal-to-noise ratio. We find that second-order terms can exhibit significant cancellations at low signal-to-noise ratio when Gaussian noise is assumed, which has implications for inferring the performance of shear-measurement algorithms from simplified simulations. We discuss the viability of our point estimators as tools for lensing inference, arguing that they allow for the robust measurement of ellipticity and shear.

  11. Maximum likelihood estimation for Cox's regression model under nested case-control sampling

    DEFF Research Database (Denmark)

    Scheike, Thomas Harder; Juul, Anders

    2004-01-01

    Nested case-control sampling is designed to reduce the costs of large cohort studies. It is important to estimate the parameters of interest as efficiently as possible. We present a new maximum likelihood estimator (MLE) for nested case-control sampling in the context of Cox's proportional hazards model. The MLE is computed by the EM-algorithm, which is easy to implement in the proportional hazards setting. Standard errors are estimated by a numerical profile likelihood approach based on EM aided differentiation. The work was motivated by a nested case-control study that hypothesized that insulin-like growth factor I was associated with ischemic heart disease. The study was based on a population of 3784 Danes and 231 cases of ischemic heart disease where controls were matched on age and gender. We illustrate the use of the MLE for these data and show how the maximum likelihood framework can be used…

  12. Noncentral Chi-Square versus Normal Distributions in Describing the Likelihood Ratio Statistic: The Univariate Case and Its Multivariate Implication

    Science.gov (United States)

    Yuan, Ke-Hai

    2008-01-01

    In the literature of mean and covariance structure analysis, noncentral chi-square distribution is commonly used to describe the behavior of the likelihood ratio (LR) statistic under alternative hypothesis. Due to the inaccessibility of the rather technical literature for the distribution of the LR statistic, it is widely believed that the…

  13. The skewed weak lensing likelihood: why biases arise, despite data and theory being sound.

    Science.gov (United States)

    Sellentin, Elena; Heymans, Catherine; Harnois-Déraps, Joachim

    2018-04-01

    We derive the essentials of the skewed weak lensing likelihood via a simple Hierarchical Forward Model. Our likelihood passes four objective and cosmology-independent tests which a standard Gaussian likelihood fails. We demonstrate that sound weak lensing data are naturally biased low, since they are drawn from a skewed distribution. This occurs already in the framework of ΛCDM. Mathematically, the biases arise because noisy two-point functions follow skewed distributions. This form of bias is already known from CMB analyses, where the low multipoles have asymmetric error bars. Weak lensing is more strongly affected by this asymmetry as galaxies form a discrete set of shear tracer particles, in contrast to a smooth shear field. We demonstrate that the biases can be up to 30% of the standard deviation per data point, dependent on the properties of the weak lensing survey and the employed filter function. Our likelihood provides a versatile framework with which to address this bias in future weak lensing analyses.
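
    The mechanism is easy to reproduce: a power (variance) estimated from a finite number of Gaussian modes follows a scaled chi-square distribution, whose median lies below its mean, so the typical realization scatters low (a purely pedagogical demonstration, not the paper's hierarchical model):

      # Why noisy two-point functions are "biased low": the ML variance estimate
      # from few Gaussian modes is chi-square distributed, hence skewed.
      import numpy as np

      rng = np.random.default_rng(1)
      true_power, n_modes, n_real = 1.0, 10, 100_000
      modes = rng.normal(0.0, np.sqrt(true_power), (n_real, n_modes))
      power_hat = (modes ** 2).mean(axis=1)   # ML power estimate per realization

      print(f"mean   = {power_hat.mean():.3f}")      # unbiased: ~1.00
      print(f"median = {np.median(power_hat):.3f}")  # typical value: ~0.93, low
      print(f"positively skewed: {((power_hat - 1.0) ** 3).mean() > 0}")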

  14. Extraction behaviour of Am(III) and Eu(III) from nitric acid medium in CMPO-HDEHP impregnated resins

    Energy Technology Data Exchange (ETDEWEB)

    Saipriya, K.; Kumar, T. [Bhabha Atomic Research Centre Facilities (India). Kalpakkam Reproscessing Plants; Kumaresan, R.; Nayak, P.K.; Venkatesan, K.A.; Antony, M.P. [Indira Gandhi Center for Atomic Research, Kalpakkam (India). Fuel Chemistry Div.

    2016-05-01

    Chromatographic resins containing the extractants octyl(phenyl)-N,N-diisobutylcarbamoylmethylphosphine oxide (CMPO), bis-(2-ethylhexyl)phosphoric acid (HDEHP), or a mixture of the two (CMPO + HDEHP) in an acrylic polymer matrix were prepared and studied for the extraction of Am(III) and Eu(III) over a range of nitric acid concentrations. The effects of various parameters, such as the concentration of nitric acid in the aqueous phase and the concentrations of CMPO and HDEHP in the resin phase, were studied. The distribution coefficients of Am(III) and Eu(III) in the CMPO-impregnated resin increased with increasing nitric acid concentration, whereas the reverse trend was observed in the HDEHP-impregnated resin. In the case of the resin containing both extractants, synergism was observed at low nitric acid concentration and antagonism at high nitric acid concentration. The mechanism of extraction was probed by the slope analysis method at 0.01 and 2 M nitric acid. Citrate-buffered DTPA was used for the selective separation of Am(III), and a separation factor of 3-4 was obtained at pH 3.

  15. A likelihood-based time series modeling approach for application in dendrochronology to examine the growth-climate relations and forest disturbance history

    Science.gov (United States)

    A time series intervention analysis (TSIA) of dendrochronological data to infer the tree growth-climate-disturbance relations and forest disturbance history is described. Maximum likelihood is used to estimate the parameters of a structural time series model with components for ...

  16. Synergistic solvent extraction of Eu(III) and Tb(III) with mixtures of various organophosphorus extractants

    International Nuclear Information System (INIS)

    Reddy, B.V.; Reddy, L.K.; Reddy, A.S.

    1994-01-01

    Synergistic solvent extraction of Eu(III) and Tb(III) from thiocyanate solutions with mixtures of 2-ethylhexylphosphonic acid mono-2-ethylhexyl ester (EHPNA) and di-2-ethylhexylphosphoric acid (DEHPA) or tributyl phosphate (TBP) or trioctylphosphine oxide (TOPO) or triphenylphosphine oxide (TPhPO) in benzene has been studied. The mechanism of extraction can be explained by a simple chemically based model. The equilibrium constants of the mixed-ligand species of the various neutral donors have been determined by non-linear regression analysis. (author) 13 refs.; 9 figs.; 2 tabs

  17. Average Likelihood Methods of Classification of Code Division Multiple Access (CDMA)

    Science.gov (United States)

    2016-05-01

    …subject to code matrices that follow the structure given by (113). The stacked real/imaginary received vector obeys (rows of each block matrix separated by semicolons): [y⃗_R; y⃗_I] = √(E_s/(2L)) [G_R1, −G_I1; G_I2, G_R2] [Q_R, −Q_I; Q_I, Q_R] [b⃗_R; b⃗_I] + [n⃗_R; n⃗_I] … [b⃗_+; b⃗_−] + [n⃗_+; n⃗_−] (115). The average likelihood for type 4 CDMA (116) is a special case of type 1 CDMA with twice the code length and…

  18. Review of Elaboration Likelihood Model of persuasion

    OpenAIRE

    藤原, 武弘; 神山, 貴弥

    1989-01-01

    This article mainly introduces the Elaboration Likelihood Model (ELM) proposed by Petty & Cacioppo, a general theory of attitude change. The ELM postulates two routes to persuasion: the central and the peripheral route. Attitude change via the central route is viewed as resulting from diligent consideration of the issue-relevant information presented. Attitude change via the peripheral route, on the other hand, is viewed as resulting from peripheral cues in the persuasion context. Secondly, we compare these tw…

  19. Thermodecomposition of lanthanides (III) and yttrium (III) glucoheptonates

    International Nuclear Information System (INIS)

    Giolito, J.

    1987-01-01

    The lanthanide(III) and yttrium(III) glucoheptonates, as well as D-glucoheptono-1,4-lactone, were studied using common analytical methods: elemental microanalysis of carbon and hydrogen, thermogravimetry, and differential scanning calorimetry. These compounds were prepared by reaction between the lanthanide(III) and yttrium(III) hydroxides and an aqueous glucoheptonic acid solution obtained by hydrolysis of the delta-lactone of this acid. After stoichiometric reaction the compounds were precipitated by the addition of absolute ethanol, washed with the same solvent, and dried in a desiccator. The thermogravimetric (TG) curves of the ceric-group lanthanide glucoheptonates present thermal profiles different enough to permit easy characterization of each compound, and the TG curve of yttrium(III) glucoheptonate shows great similarity to that of the erbium(III) compound. The differential scanning calorimetry (DSC) curves show endothermic and exothermic peaks whose shape, height, and position (temperature) permit easy and rapid identification of each compound, especially if the DSC and TG curves are examined simultaneously. (author) [pt

  20. Validation of software for calculating the likelihood ratio for parentage and kinship.

    Science.gov (United States)

    Drábek, J

    2009-03-01

    Although the likelihood ratio is a well-known statistical technique, commercial off-the-shelf (COTS) software products for its calculation are not sufficiently validated to suit general requirements for the competence of testing and calibration laboratories (EN/ISO/IEC 17025:2005 norm) per se. The software in question can be considered critical, as it directly weighs the forensic evidence allowing judges to decide on guilt or innocence, or to identify a person or kin (e.g., in mass fatalities). For these reasons, accredited laboratories shall validate likelihood ratio software in accordance with the above norm. To validate software for calculating the likelihood ratio in parentage/kinship scenarios, I assessed available vendors, chose two programs (Paternity Index and familias) for testing, and finally validated them using tests derived from elaboration of the available guidelines for the fields of forensics, biomedicine, and software engineering. MS Excel calculations using known likelihood-ratio formulas, or peer-reviewed results of difficult paternity cases, were used as references. Using seven test cases, it was found that both programs satisfied the requirements for basic paternity cases. However, only a combination of the two software programs fulfills the criteria needed for our purpose across the whole spectrum of functions under validation, with the exception of providing algebraic formulas in cases of mutation and/or silent alleles.
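
    The kind of reference calculation described, a known likelihood-ratio formula evaluated directly, is straightforward for the unambiguous single-locus trio case (a textbook formula, not the algorithm of either validated program):

      # Single-locus paternity index, PI = P(E | AF is father) / P(E | random man),
      # for the unambiguous case where the obligate paternal allele is identified.
      import math

      def paternity_index(paternal_allele_freq, transmit_prob=1.0):
          # transmit_prob is 1.0 if the alleged father is homozygous for the
          # obligate paternal allele, 0.5 if heterozygous.
          return transmit_prob / paternal_allele_freq

      # Mother a/b, child a/c, alleged father c/c, frequency of allele c = 0.05:
      print(paternity_index(0.05, transmit_prob=1.0))   # PI = 20.0
      # The combined LR over independent loci is the product of per-locus PIs:
      print(math.prod([20.0, 4.2, 7.5]))                # 630.0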

  1. The Moessbauer effect in Fe(III) HEDTA, Fe(III) EDTA, and Fe(III) CDTA compounds

    International Nuclear Information System (INIS)

    Prado, F.R.

    1989-01-01

    The dependence of the Moessbauer spectra of Fe(III)HEDTA and Fe(III)CDTA compounds on pH is studied. Information on the formation of LFe-O-FeL (L = ligand) type dimers is obtained by relating the titration curves of the Fe(III)EDTA, Fe(III)HEDTA, and Fe(III)CDTA compounds to the series of Moessbauer spectra. Some information on the Fe-O-Fe bond structure is also obtained. Comparing the titration curves with the series of Moessbauer spectra, it is concluded that the dimerization process begins when a species of the form FeXOH^α (X = EDTA, HEDTA, CDTA; α = -1, -2) arises. (M.C.K.) [pt

  2. A guideline for the validation of likelihood ratio methods used for forensic evidence evaluation

    NARCIS (Netherlands)

    Meuwly, Didier; Ramos, Daniel; Haraksim, Rudolf

    2017-01-01

    This Guideline proposes a protocol for the validation of forensic evaluation methods at the source level, using the Likelihood Ratio framework as defined within the Bayes’ inference model. In the context of the inference of identity of source, the Likelihood Ratio is used to evaluate the strength of

  3. Characterization of Fe (III)-reducing enrichment culture and isolation of Fe (III)-reducing bacterium Enterobacter sp. L6 from marine sediment.

    Science.gov (United States)

    Liu, Hongyan; Wang, Hongyu

    2016-07-01

    To enrich Fe(III)-reducing bacteria, sludge from marine sediment was inoculated into a medium using Fe(OH)3 as the sole electron acceptor. The efficiency of Fe(III) reduction and the composition of the Fe(III)-reducing enrichment culture were analyzed. The results indicated that the Fe(III)-reducing enrichment culture, with dominant bacteria related to Clostridium and Enterobacter sp., had a high Fe(III) reduction of (2.73 ± 0.13) mmol/L Fe(II). A new Fe(III)-reducing bacterium was isolated from the enrichment culture and identified as Enterobacter sp. L6 by 16S rRNA gene sequence analysis. The Fe(III)-reducing ability of strain L6 under different culture conditions was investigated. The results indicated that strain L6 had high Fe(III)-reducing activity using glucose and pyruvate as carbon sources. Strain L6 could reduce Fe(III) over the range of NaCl concentrations tested and showed the highest Fe(III) reduction of (4.63 ± 0.27) mmol/L Fe(II) at a NaCl concentration of 4 g/L. Strain L6 reduces Fe(III) and adapts to salt variation, indicating that it can be used as a model organism to study the Fe(III)-reducing activity of isolates from marine environments.

  4. Likelihood for transcriptions in a genetic regulatory system under asymmetric stable Lévy noise.

    Science.gov (United States)

    Wang, Hui; Cheng, Xiujun; Duan, Jinqiao; Kurths, Jürgen; Li, Xiaofan

    2018-01-01

    This work investigates the evolution of concentration in a genetic regulation system when the synthesis reaction rate is under additive and multiplicative asymmetric stable Lévy fluctuations. By focusing on the impact of skewness (i.e., non-symmetry) in the probability distributions of the noise, we find, by examining the mean first exit time (MFET) and the first escape probability (FEP), that the asymmetric fluctuations, interacting with the nonlinearity of the system, lead to peculiar likelihoods for transcription. In the additive noise case this includes a higher likelihood of transcription for larger positive skewness (asymmetry) index β, a stochastic bifurcation at the non-Gaussianity index value α = 1 (a separating point or line for the likelihood of transcription), and a turning point at the threshold value β ≈ −0.5, beyond which the likelihood of transcription suddenly reverses across α values. The stochastic bifurcation and turning-point phenomena do not occur in the symmetric noise case (β = 0). In the multiplicative noise case, the non-Gaussianity index value α = 1 is a separating point or line for both the MFET and the FEP. We also investigate the noise-enhanced stability phenomenon. Additionally, we specify the regions of the whole parameter space of the asymmetric noise in which the desired likelihood of transcription is attained. We have conducted a series of numerical experiments in "regulating" the likelihood of gene transcription by tuning the asymmetric stable Lévy noise indexes. This work offers insights into possible ways of achieving gene regulation in experimental research.
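
    Quantities such as the MFET can be estimated by Monte Carlo with an Euler-Maruyama scheme driven by asymmetric stable increments; a sketch (a generic bistable drift stands in for the gene-regulation nonlinearity, and all parameters are illustrative):

      # Monte Carlo estimate of the mean first exit time from an interval under
      # additive asymmetric alpha-stable Levy noise. Illustrative parameters.
      import numpy as np
      from scipy.stats import levy_stable

      def mfet(alpha, beta, eps=0.5, x0=-1.0, dom=(-2.0, 0.0), dt=1e-2, n_paths=100):
          drift = lambda x: x - x ** 3          # generic bistable drift (stand-in)
          times = []
          for _ in range(n_paths):
              x, t = x0, 0.0
              while dom[0] < x < dom[1]:
                  # stable increments over a step dt scale as dt**(1/alpha)
                  for jump in levy_stable.rvs(alpha, beta, size=2000):
                      x += drift(x) * dt + eps * dt ** (1 / alpha) * jump
                      t += dt
                      if not dom[0] < x < dom[1]:
                          break
              times.append(t)
          return np.mean(times)

      print(mfet(alpha=1.5, beta=0.5))          # positively skewed noise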

  5. Reconceptualizing Social Influence in Counseling: The Elaboration Likelihood Model.

    Science.gov (United States)

    McNeill, Brian W.; Stoltenberg, Cal D.

    1989-01-01

    Presents Elaboration Likelihood Model (ELM) of persuasion (a reconceptualization of the social influence process) as alternative model of attitude change. Contends ELM unifies conflicting social psychology results and can potentially account for inconsistent research findings in counseling psychology. Provides guidelines on integrating…

  6. The Likelihood of Confusion in the United State Ninth Circuit and the doctrine of Confusable Marks in the Andean Tribunal

    Directory of Open Access Journals (Sweden)

    Francisco José Cabrera Perdomo

    2016-06-01

    The article analyzes the most important cases within the jurisdiction of California regarding trademark infringement and its likelihood-of-confusion standard. Finally, it compares the conclusions with the confusable-marks doctrine in the Andean Community's recent cases resolving the issue.

  7. Physician Bayesian updating from personal beliefs about the base rate and likelihood ratio.

    Science.gov (United States)

    Rottman, Benjamin Margolin

    2017-02-01

    Whether humans can accurately make decisions in line with Bayes' rule has been one of the most important yet contentious topics in cognitive psychology. Though a number of paradigms have been used for studying Bayesian updating, rarely have subjects been allowed to use their own preexisting beliefs about the prior and the likelihood. A study is reported in which physicians judged the posttest probability of a diagnosis for a patient vignette after receiving a test result, and the physicians' posttest judgments were compared to the normative posttest calculated from their own beliefs in the sensitivity and false positive rate of the test (likelihood ratio) and prior probability of the diagnosis. On the one hand, the posttest judgments were strongly related to the physicians' beliefs about both the prior probability as well as the likelihood ratio, and the priors were used considerably more strongly than in previous research. On the other hand, both the prior and the likelihoods were still not used quite as much as they should have been, and there was evidence of other nonnormative aspects to the updating, such as updating independent of the likelihood beliefs. By focusing on how physicians use their own prior beliefs for Bayesian updating, this study provides insight into how well experts perform probabilistic inference in settings in which they rely upon their own prior beliefs rather than experimenter-provided cues. It suggests that there is reason to be optimistic about experts' abilities, but that there is still considerable need for improvement.
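
    The normative benchmark used in the study is Bayes' rule in odds form, posterior odds = prior odds × likelihood ratio; a one-function illustration (numbers invented):

      def posttest_probability(prior, sensitivity, false_positive_rate):
          lr = sensitivity / false_positive_rate   # likelihood ratio of a positive test
          post_odds = prior / (1 - prior) * lr     # posterior odds = prior odds * LR
          return post_odds / (1 + post_odds)

      # Believing prior = 5%, sensitivity = 90%, FPR = 10% (LR = 9), the normative
      # post-test probability after a positive result is about 32%:
      print(round(posttest_probability(0.05, 0.90, 0.10), 3))   # 0.321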

  8. Powdered alcohol: Awareness and likelihood of use among a sample of college students.

    Science.gov (United States)

    Vail-Smith, Karen; Chaney, Beth H; Martin, Ryan J; Don Chaney, J

    2016-01-01

    In March 2015, the Alcohol and Tobacco Tax and Trade Bureau approved the sale of Palcohol, the first powdered alcohol product to be marketed and sold in the U.S. Powdered alcohol is freeze-dried, and one individual-serving-size packet added to 6 ounces of liquid is equivalent to a standard drink. This study assessed awareness of powdered alcohol and likelihood to use and/or misuse powdered alcohol among college students. Surveys were administered to a convenience sample of 1,841 undergraduate students. Only 16.4% of respondents had heard of powdered alcohol. After being provided a brief description of powdered alcohol, 23% indicated that they would use the product if available, and of those, 62.1% also indicated likelihood of misusing the product (e.g., snorting it, mixing it with alcohol). Caucasian students (OR = 1.5) and hazardous drinkers (based on AUDIT-C scores; OR = 4.7) were significantly more likely to indicate likelihood of use. Hazardous drinkers were also six times more likely to indicate likelihood to misuse the product. These findings can inform upstream prevention efforts in states debating bans on powdered alcohol. In states where powdered alcohol will soon be available, alcohol education initiatives should be updated to include information on the potential risks of use and be targeted to those populations most likely to misuse. This is the first peer-reviewed study to assess awareness of and likelihood to use and/or misuse powdered alcohol, a potentially emerging form of alcohol.

  9. Observation Likelihood Model Design and Failure Recovery Scheme toward Reliable Localization of Mobile Robots

    Directory of Open Access Journals (Sweden)

    Chang-bae Moon

    2011-01-01

    Although there has been much research on mobile robot localization, it is still difficult to obtain reliable localization performance in a real environment where humans co-exist. Reliability of localization is highly dependent upon the developer's experience, because uncertainty is caused by a variety of reasons. We have developed a range-sensor-based integrated localization scheme for various indoor service robots. Through this experience, we found that there are several significant experimental issues. In this paper, we provide useful solutions for the following questions, which are frequently faced in practical applications: (1) How to design an observation likelihood model? (2) How to detect localization failure? (3) How to recover from localization failure? We present design guidelines for the observation likelihood model. Localization failure detection and recovery schemes are presented, focusing on abrupt wheel slippage. Experiments were carried out in a typical office building environment. The proposed scheme to identify the localizer status is useful in practical environments. Moreover, the semi-global localization is a computationally efficient recovery scheme from localization failure. The results of experiments and analysis clearly present the usefulness of the proposed solutions.
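
    One common way to realize such an observation likelihood model for a range sensor is the classic per-beam mixture of a Gaussian "hit" term, a uniform "random" term, and a max-range term (the weights below are placeholders, not the authors' values):

      # Per-beam observation likelihood: mixture of hit / random / max-range terms.
      import math

      def beam_likelihood(z, z_expected, z_max=10.0, sigma=0.1,
                          w_hit=0.8, w_rand=0.15, w_max=0.05):
          # Gaussian "hit" around the ray-cast expected range
          p_hit = math.exp(-0.5 * ((z - z_expected) / sigma) ** 2) \
                  / (sigma * math.sqrt(2 * math.pi))
          p_rand = 1.0 / z_max if 0.0 <= z <= z_max else 0.0   # random clutter
          p_max = 1.0 if z >= z_max else 0.0                   # max-range readings
          return w_hit * p_hit + w_rand * p_rand + w_max * p_max

      # Log-likelihood of a scan given a pose: sum over (sub-sampled) beams.
      scan, expected = [2.1, 3.9, 10.0], [2.0, 4.0, 9.0]
      print(sum(math.log(beam_likelihood(z, ze)) for z, ze in zip(scan, expected)))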

  11. Practical Statistics for LHC Physicists: Descriptive Statistics, Probability and Likelihood (1/3)

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    These lectures cover those principles and practices of statistics that are most relevant for work at the LHC. The first lecture discusses the basic ideas of descriptive statistics, probability and likelihood. The second lecture covers the key ideas in the frequentist approach, including confidence limits, profile likelihoods, p-values, and hypothesis testing. The third lecture covers inference in the Bayesian approach. Throughout, real-world examples will be used to illustrate the practical application of the ideas. No previous knowledge is assumed.

  12. A composite likelihood approach for spatially correlated survival data

    Science.gov (United States)

    Paik, Jane; Ying, Zhiliang

    2013-01-01

    The aim of this paper is to provide a composite likelihood approach to handle spatially correlated survival data using pairwise joint distributions. With e-commerce data, a recent question of interest in marketing research has been to describe spatially clustered purchasing behavior and to assess whether geographic distance is the appropriate metric to describe purchasing dependence. We present a model for the dependence structure of time-to-event data subject to spatial dependence to characterize purchasing behavior from the motivating example from e-commerce data. We assume the Farlie-Gumbel-Morgenstern (FGM) distribution and then model the dependence parameter as a function of geographic and demographic pairwise distances. For estimation of the dependence parameters, we present pairwise composite likelihood equations. We prove that the resulting estimators exhibit key properties of consistency and asymptotic normality under certain regularity conditions in the increasing-domain framework of spatial asymptotic theory. PMID:24223450

  14. PALM: a paralleled and integrated framework for phylogenetic inference with automatic likelihood model selectors.

    Directory of Open Access Journals (Sweden)

    Shu-Hwa Chen

    BACKGROUND: Selecting an appropriate substitution model and deriving a tree topology for a given sequence set are essential in phylogenetic analysis. However, such time-consuming, computationally intensive tasks rely on knowledge of substitution model theories and related expertise to run through all possible combinations of several separate programs. To ensure a thorough and efficient analysis and avert tedious manipulation of various programs, this work presents an intuitive framework, phylogenetic reconstruction with automatic likelihood model selectors (PALM), with convincing, updated algorithms and a best-fit model selection mechanism for seamless phylogenetic analysis. METHODOLOGY: As an integrated framework of ClustalW, PhyML, MODELTEST, ProtTest, and several in-house programs, PALM evaluates the fitness of 56 substitution models for nucleotide sequences and 112 substitution models for protein sequences with scores on various criteria. The input for PALM can be either sequences in FASTA format or a sequence alignment in PHYLIP format. To accelerate the computation of maximum likelihood and bootstrapping, this work integrates MPICH2/PhyML, PalmMonitor, and the Palm job controller across several machines with multiple processors, adopting a task-parallelism approach. Moreover, an intuitive and interactive web component, PalmTree, is developed for displaying and operating the output tree, with options for tree rooting, branch swapping, viewing branch length values and bootstrapping scores, and removing nodes to restart the analysis iteratively. SIGNIFICANCE: The workflow of PALM is straightforward and coherent. Via a succinct, user-friendly interface, researchers unfamiliar with phylogenetic analysis can easily use this server to submit sequences, retrieve the output, and re-submit a job based on a previous result if some sequences are to be deleted or added for phylogenetic reconstruction. PALM results in an inference of…

  15. Direct spectrophotometric analysis of low level Pu (III) in Pu(IV) nitrate solution

    International Nuclear Information System (INIS)

    Mageswaran, P.; Suresh Kumar, K.; Kumar, T.; Gayen, J.K.; Shreekumar, B.; Dey, P.K.

    2010-01-01

    Among the various methods demonstrated for the conversion of plutonium nitrate to its oxide, the oxalate precipitation process, either as Pu(III) or Pu(IV) oxalate, has gained wide acceptance. Since uranous nitrate is the most successful partitioning agent used in the PUREX process for the separation of Pu from the bulk of the U, Pu(III) oxalate precipitation of the purified nitrate solution will not give the required decontamination from U. Hence the Pu(IV) oxalate precipitation process is a better option for achieving the end user's specified PuO2 product. Prior to the precipitation process, ensuring the Pu(IV) oxidation state is essential. Monitoring the level of the Pu oxidation state, either Pu(III) or Pu(IV), in the feed solution therefore plays a significant role in establishing complete conversion of Pu(III). The method in vogue to estimate the Pu(IV) content is extractive radiometry using thenoyl trifluoroacetone (TTA). As that method requires sample preparation with respect to acidity, a precise measurement of Pu(IV) without affecting the Pu(III) level in the feed sample is difficult. The present study focuses on direct spectrophotometry, using a fiber-optic probe with a path length of 40 mm, to monitor the low level of Pu(III) after removing the bulk Pu(IV), which interferes with the Pu(III) absorption spectrum, using a TTA-TBP synergistic mixture without changing the sample acidity.

  16. Maximum Likelihood Compton Polarimetry with the Compton Spectrometer and Imager

    Energy Technology Data Exchange (ETDEWEB)

    Lowell, A. W.; Boggs, S. E; Chiu, C. L.; Kierans, C. A.; Sleator, C.; Tomsick, J. A.; Zoglauer, A. C. [Space Sciences Laboratory, University of California, Berkeley (United States); Chang, H.-K.; Tseng, C.-H.; Yang, C.-Y. [Institute of Astronomy, National Tsing Hua University, Taiwan (China); Jean, P.; Ballmoos, P. von [IRAP Toulouse (France); Lin, C.-H. [Institute of Physics, Academia Sinica, Taiwan (China); Amman, M. [Lawrence Berkeley National Laboratory (United States)

    2017-10-20

    Astrophysical polarization measurements in the soft gamma-ray band are becoming more feasible as detectors with high position and energy resolution are deployed. Previous work has shown that the minimum detectable polarization (MDP) of an ideal Compton polarimeter can be improved by ∼21% when an unbinned, maximum likelihood method (MLM) is used instead of the standard approach of fitting a sinusoid to a histogram of azimuthal scattering angles. Here we outline a procedure for implementing this maximum likelihood approach for real, nonideal polarimeters. As an example, we use the recent observation of GRB 160530A with the Compton Spectrometer and Imager. We find that the MDP for this observation is reduced by 20% when the MLM is used instead of the standard method.
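
    For an idealized polarimeter, the unbinned MLM amounts to maximizing the log-likelihood of the modulation-curve density p(φ) = (1/2π)[1 + a cos 2(φ − φ₀)] over the azimuthal scattering angles (a sketch with synthetic angles; a real instrument's response must be folded in, as the paper stresses):

      # Unbinned ML fit of a polarization modulation curve to azimuthal angles.
      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(2)
      a_true, phi0_true = 0.4, 0.3          # synthetic modulation amplitude, phase
      phi = rng.uniform(0, 2 * np.pi, 20000)
      accept = rng.uniform(0, 1 + a_true, phi.size) \
               < 1 + a_true * np.cos(2 * (phi - phi0_true))
      phi = phi[accept]                     # rejection-sampled azimuthal angles

      def nll(theta):                       # the 1/(2*pi) normalization is constant
          a, phi0 = theta
          return -np.sum(np.log1p(a * np.cos(2 * (phi - phi0))))

      fit = minimize(nll, x0=[0.1, 0.0], bounds=[(-0.99, 0.99), (-np.pi, np.pi)])
      print("a_hat, phi0_hat =", fit.x)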

  17. Synthesis, structure, luminescent, and magnetic properties of carbonato-bridged Zn(II)2Ln(III)2 complexes [(μ4-CO3)2{Zn(II)L(n)Ln(III)(NO3)}2] (Ln(III) = Gd(III), Tb(III), Dy(III); L(1) = N,N'-bis(3-methoxy-2-oxybenzylidene)-1,3-propanediaminato, L(2) = N,N'-bis(3-ethoxy-2-oxybenzylidene)-1,3-propanediaminato).

    Science.gov (United States)

    Ehama, Kiyomi; Ohmichi, Yusuke; Sakamoto, Soichiro; Fujinami, Takeshi; Matsumoto, Naohide; Mochida, Naotaka; Ishida, Takayuki; Sunatsuki, Yukinari; Tsuchimoto, Masanobu; Re, Nazzareno

    2013-11-04

    … of Zn(II)2Dy(III)2 were not detected. The fine structure assignable to the (5)D4 → (7)F6 transition of ZnTb1 and ZnTb2 is in good accord with the energy pattern from the magnetic analysis. The Zn(II)2Ln(III)2 complexes (Ln(III) = Tb(III), Dy(III)) showed a frequency-dependent out-of-phase signal in the alternating-current susceptibility, indicative of single-molecule-magnet behavior. Under a dc bias field of 1000 Oe, the signals became significantly more intense, and the energy barrier for magnetic relaxation, Δ/kB, was estimated from the Arrhenius plot to be 39(1) and 42(8) K for ZnTb1 and ZnTb2, and 52(2) and 67(2) K for ZnDy1 and ZnDy2, respectively.

  18. Speech patterns in skeletal class I, II and III subjects

    Directory of Open Access Journals (Sweden)

    Pía Villanueva

    2009-09-01

    PURPOSE: to determine the consonant-phoneme articulation patterns in Chilean Spanish speakers of skeletal classes I, II, and III, and to compare the phonetic differences between the skeletal classes. METHODS: 54 individuals meeting the inclusion criteria, determined by intraoral clinical examination and Ricketts cephalometric analysis, were selected to form the study groups of skeletal class I, II, and III patients. A standardized phonoarticulatory examination was applied to each patient to determine the modified phonemes and the compensatory articulation patterns produced. RESULTS: changes in the point of articulation of consonant phonemes were observed in all three skeletal classes, with significant differences in the anterior and middle phoneme groups between class I and class II patients, and only in the anterior phoneme group between class I and class III patients. No significant differences were observed between class II and class III patients. Qualitatively distinct modifications and compensations are reported across the skeletal classes. CONCLUSIONS: relative to class I patients, class II and III patients present different degrees of modification in the point of articulation of consonant phonemes. The observed differences relate to the skeletal patterns characteristic of each class.

  19. Fluorescence Resonance Energy Transfer of the Tb(III)-Nd(III) Binary System in Molten LiCl-KCl Eutectic Salt

    Energy Technology Data Exchange (ETDEWEB)

    Kim, B. Y. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Yun, J. I. [KAIST, Daejeon (Korea, Republic of)

    2015-05-15

    The lanthanides act as neutron poisons in a nuclear reactor because of their large neutron absorption cross sections. For that reason, a very low lanthanide content is required in the U/TRU ingot recovered from the pyrochemical process. Accordingly, investigation of the thermodynamic properties and chemical behavior of lanthanides in molten chloride salt is necessary to estimate the performance efficiency of the pyrochemical process. However, there are uncertainties in the knowledge and understanding of its basic mechanisms, such as chemical speciation and redox behavior, due to the lack of in-situ monitoring methods for high-temperature molten salt. Spectroscopic analysis is a promising technique for in-situ qualitative and quantitative analysis. Recently, a few fluorescence spectroscopic measurements on single lanthanide elements in molten LiCl-KCl eutectic have been reported. The fluorescence intensity and fluorescence lifetime of Tb(III) decreased with increasing Nd(III) concentration, demonstrating collisional quenching between donor and acceptor ions. The Förster distance (R0) of the Tb(III)-Nd(III) binary system in molten LiCl-KCl eutectic was determined over the ranges … (0.1-1.0) and … (1.387-1.496).

  20. Composite likelihood and two-stage estimation in family studies

    DEFF Research Database (Denmark)

    Andersen, Elisabeth Anne Wreford

    2004-01-01

    In this paper register based family studies provide the motivation for linking a two-stage estimation procedure in copula models for multivariate failure time data with a composite likelihood approach. The asymptotic properties of the estimators in both parametric and semi-parametric models are d...

  1. Robust Gaussian Process Regression with a Student-t Likelihood

    NARCIS (Netherlands)

    Jylänki, P.P.; Vanhatalo, J.; Vehtari, A.

    2011-01-01

    This paper considers the robust and efficient implementation of Gaussian process regression with a Student-t observation model, which has a non-log-concave likelihood. The challenge with the Student-t model is the analytically intractable inference which is why several approximative methods have

  2. Absorption spectra analysis of hydrated uranium(III) complex chlorides

    Science.gov (United States)

    Karbowiak, M.; Gajek, Z.; Drożdżyński, J.

    2000-11-01

    Absorption spectra of powdered samples of hydrated uranium(III) complex chlorides of the formulas NH4UCl4 · 4H2O and CsUCl4 · 3H2O have been recorded at 4.2 K in the 4000-26 000 cm⁻¹ range. The analysis of the spectra enabled the determination of crystal-field parameters and the assignment of 83 and 77 crystal-field levels for the tetrahydrate and trihydrate, respectively. The energies of the levels were computed by applying a simplified angular overlap model (AOM) as well as a semiempirical Hamiltonian representing the combined atomic and crystal-field interactions. Ab initio calculations enabled the application of a simplified parameterization and the determination of the starting values of the AOM parameters. The results show that the AOM approach can predict quite well both the structure of the ground multiplet and the positions of the crystal-field levels in the 17 000-25 000 cm⁻¹ range, usually obscured by strong f-d bands.

  3. The Fear of Pain Questionnaire-III and the Fear of Pain Questionnaire-Short Form: a confirmatory factor analysis

    DEFF Research Database (Denmark)

    Vambheim, Sara M.; Lyby, Peter Solvoll; Aslaksen, Per M.

    2017-01-01

    Aims and methods: The purpose of the study was to investigate the model fit, reliability and validity of the FPQ-III and the FPQ-SF in a Norwegian nonclinical sample, using confirmatory factor analysis (CFA). The second aim was to explore the model fit of the two scales in male and female subgroups separately. … the questionnaires, the model fit, validity and reliability were compared across sex using CFA. Results: The results revealed that both models' original factor structures had poor fit. However, the FPQ-SF had a better fit overall, compared to the FPQ-III. The model fit of the two models differed across sex…

  4. [Cephalometric analysis in cases with Class III malocclusions].

    Science.gov (United States)

    Rak, D

    1989-01-01

    Various orthodontic class III anomalies, classified into several experimental groups, and eugnathic occlusions serving as controls were studied by roentgencephalometry. The objective of the study was to detect possible distinctions in the quantitative values of the chosen variables and to select the variables which most significantly discriminate the group of class III orthodontic anomalies. Attempts were also made to ascertain whether or not there were sex-related differences. The teleroentgenograms of 269 examinees, aged 10-18 years, of both sexes were analyzed. The experimental group consisted of 89 examinees with class III orthodontic anomalies. The control group consisted of 180 examinees with eugnathic occlusion. Latero-lateral skull roentgenograms were taken observing the rules of roentgencephalometry. Using acetate paper, drawings of the profile teleroentgenograms were elaborated and the reference points and lines were entered. A total of 38 variables were analyzed: 10 linear, 19 angular, 8 obtained by mathematical calculation, and the age variable. An electronic computer was used for the statistical analyses. The results are presented in tables and graphs. The results showed that, compared to the findings in the control group, the subjects in the experimental group displayed significant changes in the following craniofacial characteristics: a negative difference in the position of the apical bases of the jaws, manifest concavity of the osseous profile and diminished convexity of the soft-tissue profile, retroinclination of the lower incisors, mandibular prognathism, an increased mandibular angle, and an increased mandibular proportion relative to the maxilla and the anterior cranial base. With regard to the sex of the examinees, only four linear variables of significantly discriminating character were selected, so it can be concluded that there were no significant sex differences among the morphological…

  5. Sorption of trace amounts of gallium (III) on iron (III) oxide

    International Nuclear Information System (INIS)

    Music, S.; Gessner, M.; Wolf, R.H.H.

    1979-01-01

    The sorption of trace amounts of gallium(III) on iron(III) oxide has been studied as a function of pH. Optimum conditions have been found for the preconcentration of traces of gallium(III) by iron(III) oxide. The influence of surface active substances and of complexing agents on the sorption of trace amounts of gallium(III) on iron(III) oxide has been also studied. (orig.) [de

  7. LIKELIHOOD ESTIMATION OF PARAMETERS USING SIMULTANEOUSLY MONITORED PROCESSES

    DEFF Research Database (Denmark)

    Friis-Hansen, Peter; Ditlevsen, Ove Dalager

    2004-01-01

    The topic is maximum likelihood inference from several simultaneously monitored response processes of a structure, to obtain knowledge about the parameters of other, not monitored but important response processes when the structure is subject to some Gaussian load field in space and time. The considered example is a ship sailing with a given speed through a Gaussian wave field.

  8. Near-exact distributions for the block equicorrelation and equivariance likelihood ratio test statistic

    Science.gov (United States)

    Coelho, Carlos A.; Marques, Filipe J.

    2013-09-01

    In this paper the authors combine the equicorrelation and equivariance test introduced by Wilks [13] with the likelihood ratio test (l.r.t.) for independence of groups of variables to obtain the l.r.t. of block equicorrelation and equivariance. This test, or its single-block version, may find applications in many areas, such as psychology, education, medicine and genetics, and is important "in many tests of multivariate analysis, e.g. in MANOVA, Profile Analysis, Growth Curve analysis, etc" [12, 9]. By decomposing the overall hypothesis into the hypothesis of independence of groups of variables and the hypothesis of equicorrelation and equivariance, we are able to obtain expressions for the overall l.r.t. statistic and its moments. From these we obtain a suitable factorization of the characteristic function (c.f.) of the logarithm of the l.r.t. statistic, which enables us to develop highly manageable and precise near-exact distributions for the test statistic.

  9. The impact of Basel III on money creation: A synthetic analysis

    OpenAIRE

    Xiong, Wanting; Wang, Yougui

    2017-01-01

    Recent evidence provokes broad rethinking of the role of banks in money creation. The authors argue that, apart from the reserve requirement, prudential regulations also play important roles in constraining the money supply. Specifically, they study three Basel III regulations and theoretically analyze their standalone and collective impacts. The authors find that 1) the money multiplier under Basel III is not constant but a decreasing function of the monetary base; 2) the determinants of the…

  10. Regularization parameter selection methods for ill-posed Poisson maximum likelihood estimation

    International Nuclear Information System (INIS)

    Bardsley, Johnathan M; Goldes, John

    2009-01-01

    In image processing applications, image intensity is often measured via the counting of incident photons emitted by the object of interest. In such cases, image data noise is accurately modeled by a Poisson distribution. This motivates the use of Poisson maximum likelihood estimation for image reconstruction. However, when the underlying model equation is ill-posed, regularization is needed. Regularized Poisson likelihood estimation has been studied extensively by the authors, though a problem of high importance remains: the choice of the regularization parameter. We will present three statistically motivated methods for choosing the regularization parameter, and numerical examples will be presented to illustrate their effectiveness
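
    One statistically motivated rule, a Poisson discrepancy principle, can be sketched on a small 1-D deblurring problem: choose the regularization parameter so that the weighted residual matches its expected value (an illustration of the problem class under invented data, not the authors' specific selection methods):

      # Tikhonov-regularized Poisson likelihood with a discrepancy-style rule for
      # choosing the regularization parameter alpha. Invented 1-D toy problem.
      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(3)
      n = 40
      x_true = 50 * np.exp(-0.5 * ((np.arange(n) - 20) / 4.0) ** 2)
      A = np.exp(-0.5 * ((np.arange(n)[:, None] - np.arange(n)[None, :]) / 2.0) ** 2)
      A /= A.sum(axis=1, keepdims=True)          # blurring operator
      b = rng.poisson(A @ x_true)                # Poisson photon counts

      def solve(alpha):
          def f(x):
              Ax = A @ x + 1e-9
              return np.sum(Ax - b * np.log(Ax)) + 0.5 * alpha * np.sum(x ** 2)
          return minimize(f, x0=np.ones(n), bounds=[(0, None)] * n).x

      for alpha in (1e-4, 1e-3, 1e-2, 1e-1):
          Ax = A @ solve(alpha) + 1e-9
          disc = np.sum((Ax - b) ** 2 / Ax)      # ~ n when the noise level matches
          print(f"alpha={alpha:.0e}  discrepancy={disc:.1f}  (target ~ {n})")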

  11. Estimating the spatial scale of herbicide and soil interactions by nested sampling, hierarchical analysis of variance and residual maximum likelihood

    Energy Technology Data Exchange (ETDEWEB)

    Price, Oliver R., E-mail: oliver.price@unilever.co [Warwick-HRI, University of Warwick, Wellesbourne, Warwick, CV32 6EF (United Kingdom); University of Reading, Soil Science Department, Whiteknights, Reading, RG6 6UR (United Kingdom); Oliver, Margaret A. [University of Reading, Soil Science Department, Whiteknights, Reading, RG6 6UR (United Kingdom); Walker, Allan [Warwick-HRI, University of Warwick, Wellesbourne, Warwick, CV32 6EF (United Kingdom); Wood, Martin [University of Reading, Soil Science Department, Whiteknights, Reading, RG6 6UR (United Kingdom)

    2009-05-15

    An unbalanced nested sampling design was used to investigate the spatial scale of soil and herbicide interactions at the field scale. A hierarchical analysis of variance based on residual maximum likelihood (REML) was used to analyse the data and provide a first estimate of the variogram. Soil samples were taken at 108 locations at a range of separating distances in a 9 ha field to explore small and medium scale spatial variation. Soil organic matter content, pH, particle size distribution, microbial biomass and the degradation and sorption of the herbicide, isoproturon, were determined for each soil sample. A large proportion of the spatial variation in isoproturon degradation and sorption occurred at sampling intervals less than 60 m, however, the sampling design did not resolve the variation present at scales greater than this. A sampling interval of 20-25 m should ensure that the main spatial structures are identified for isoproturon degradation rate and sorption without too great a loss of information in this field.

  13. Likelihood inference for a fractionally cointegrated vector autoregressive model

    DEFF Research Database (Denmark)

    Johansen, Søren; Ørregård Nielsen, Morten

    2012-01-01

    We consider model-based inference in a fractionally cointegrated (or cofractional) vector autoregressive model with a restricted constant term, based on the Gaussian likelihood conditional on initial values. The model nests the I(d) VAR model. We give conditions on the parameters such that the process X_t is fractional of order d and cofractional of order d-b; that is, there exist vectors β for which β'X_t is fractional of order d-b, and no other fractionality order is possible. We define the statistical model by 0 < b ≤ d, … inference when the true values satisfy b₀ ≥ 1/2 and d₀-b₀ … process in the parameters when errors are i.i.d. with suitable moment conditions and initial values are bounded. When the limit is deterministic this implies uniform convergence in probability of the conditional likelihood function. If the true value b₀ > 1/2, we prove that the limit distribution of (β̂ …

  14. Organizational Justice and Men's Likelihood to Sexually Harass: The Moderating Role of Sexism and Personality

    Science.gov (United States)

    Krings, Franciska; Facchin, Stephanie

    2009-01-01

    This study demonstrated relations between men's perceptions of organizational justice and increased sexual harassment proclivities. Respondents reported higher likelihood to sexually harass under conditions of low interactional justice, suggesting that sexual harassment likelihood may increase as a response to perceived injustice. Moreover, the…

  15. Iron(III) oxyhydroxide in isopropyl alcohol: preparation, characterization and solvothermal treatment

    International Nuclear Information System (INIS)

    Carvalho, E.L.C.N.; Jafelicci Junior, M.

    1989-01-01

    Iron(III) nitrate hydrolysis was carried out in isopropyl alcohol solution by an aqueous ammonia gas stream, resulting in an iron(III) oxyhydroxide sol. The solvothermal treatment of this colloidal system at 120 °C for 24 hours was investigated in this work. Freshly obtained and solvothermally treated iron(III) oxyhydroxide samples were dried by lyophilization. The products obtained were characterized by the following techniques: spectrophotometric iron analysis by the 1,10-orthophenanthroline complexation method, powder X-ray diffraction, vibrational infrared spectra and differential thermal analysis. After solvothermal treatment the resulting product crystallized into hematite, while the freshly prepared iron(III) oxyhydroxide was non-crystalline. Both are very active powders, showing high water adsorption.

  16. Cost-utility analysis of adjuvant chemotherapy in patients with stage III colon cancer in Thailand.

    Science.gov (United States)

    Lerdkiattikorn, Panattharin; Chaikledkaew, Usa; Lausoontornsiri, Wirote; Chindavijak, Somjin; Khuhaprema, Thirawud; Tantai, Narisa; Teerawattananon, Yot

    2015-01-01

    In Thailand, there has been no economic evaluation study of adjuvant chemotherapy for stage III colon cancer patients after resection. This study aims to evaluate the cost-utility of all chemotherapy regimens currently used in Thailand compared with the adjuvant 5-fluorouracil/leucovorin (5-FU/LV) plus capecitabine as the first-line therapy for metastatic disease in patients with stage III colon cancer after resection. A cost-utility analysis was performed to estimate the relevant lifetime costs and health outcomes of chemotherapy regimens based on a societal perspective using a Markov model. The results suggested that the adjuvant 5-FU/LV plus capecitabine as the first-line therapy for metastatic disease would be the most cost-effective chemotherapy. The adjuvant FOLFOX and FOLFIRI as the first-line treatment for metastatic disease would be cost-effective with an incremental cost-effectiveness ratio of 299,365 Thai baht per QALY gained based on a societal perspective if both prices of FOLFOX and FOLFIRI were decreased by 40%.
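
    For readers unfamiliar with the method named above, a cost-utility analysis of this kind boils down to running a Markov cohort model for each regimen and forming an incremental cost-effectiveness ratio (ICER). The toy sketch below is purely illustrative: a three-state model with made-up transition probabilities, costs and utilities, not the study's actual inputs.

```python
# Miniature three-state Markov cohort model (disease-free, metastatic, dead)
# comparing two hypothetical regimens; all numbers are invented.
import numpy as np

def lifetime_outcomes(p_progress, p_die, cost_per_cycle, n_cycles=40,
                      u_df=0.85, u_met=0.60, disc=0.03):
    state = np.array([1.0, 0.0, 0.0])      # cohort shares per state
    P = np.array([[1 - p_progress, p_progress, 0.0],
                  [0.0, 1 - p_die,  p_die],
                  [0.0, 0.0,        1.0]])
    cost = qaly = 0.0
    for t in range(n_cycles):
        w = 1.0 / (1 + disc)**t             # annual discounting
        cost += w * state[0] * cost_per_cycle
        qaly += w * (state[0] * u_df + state[1] * u_met)
        state = state @ P                   # advance the cohort one cycle
    return cost, qaly

c0, q0 = lifetime_outcomes(p_progress=0.10, p_die=0.30, cost_per_cycle=40_000)
c1, q1 = lifetime_outcomes(p_progress=0.07, p_die=0.30, cost_per_cycle=90_000)
icer = (c1 - c0) / (q1 - q0)                # incremental cost per QALY gained
print(f"ICER = {icer:,.0f} baht per QALY gained")
```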

  17. FEMAXI-III. An axisymmetric finite element computer code for the analysis of fuel rod performance

    International Nuclear Information System (INIS)

    Ichikawa, M.; Nakajima, T.; Okubo, T.; Iwano, Y.; Ito, K.; Kashima, K.; Saito, H.

    1980-01-01

    For the analysis of local deformations of fuel rods, which are closely related to PCI failure in LWRs, FEMAXI-III has been developed as an improved version based on the essential models of the FEMAXI-II, MIPAC, and FEAST codes. The major features of FEMAXI-III are as follows: elasto-plasticity, creep, pellet cracking, relocation, densification, hot pressing, swelling, fission gas release, and their interrelated effects are considered. Contact conditions between pellet and cladding are treated exactly, where sliding or sticking is determined by iteration. Special emphasis is placed on creep and pellet cracking: in the former, an implicit algorithm is applied to improve numerical stability; in the latter, the pellet is assumed to be a non-tension material, and the recovery of pellet stiffness under compression is related to initial relocation. Quadratic isoparametric elements are used. The skyline method is applied to solve the linear stiffness equations, reducing the required core memory. The basic performance of the code has proven satisfactory. (author)

  18. Lanthanum(III) and praseodymium(III) complexes with bidentate and tetradentate Schiff base ligands containing indole ring

    International Nuclear Information System (INIS)

    Rai, Anita; Sengupta, Soumitra Kumar; Pandey, Om Prakash

    2000-01-01

    Complexes of lanthanum(III) and praseodymium(III) with Schiff bases prepared from isatin with aniline, 4-chloroaniline, 2-bromoaniline and 2-nitroaniline (HL), and with ethylenediamine, o-phenylenediamine and 4-methyl-o-phenylenediamine (H2L'), have been synthesised and their physico-chemical properties investigated using elemental analysis, molar conductivities, magnetic susceptibility measurements and spectral (visible, infrared and 1H NMR) data. The Schiff bases HL bind in a bidentate manner while the Schiff bases H2L' bind in a tetradentate manner. The probable structures of the complexes are proposed. (author)

  19. Menyoal Elaboration Likelihood Model (ELM) dan Teori Retorika

    Directory of Open Access Journals (Sweden)

    Yudi Perbawaningsih

    2012-06-01

    Full Text Available Abstract: Persuasion is a communication process for establishing or changing attitudes, which can be understood through the theory of Rhetoric and the Elaboration Likelihood Model (ELM). This study elaborates these theories in a public lecture series intended to persuade students in choosing their concentration of study. The results show that, in terms of persuasion effectiveness, it is not quite relevant to separate the message and its source: the quality of the source is determined by the quality of the message, and vice versa. Separating the two routes of the persuasion process as described in ELM theory would thus not be relevant. Abstrak (translated from Indonesian): Persuasion is a communication process to form or change attitudes, which can be understood with the theory of Rhetoric and the Elaboration Likelihood Model (ELM). This study elaborates these theories in a public lecture used as a means of persuading students to choose a concentration of study on the basis of information processing. Using a survey method, the result obtained is that it is not sufficiently relevant to separate the message and the source when assessing the effectiveness of persuasion. The two are unified, meaning that the quality of the source is determined by the quality of the message it delivers, and vice versa. Separating the persuasion process into two routes, as described in ELM theory, becomes irrelevant.

  20. Potentiometric studies on some ternary complexes of Nd(III), Sm(III), Gd(III) and Ho(III) with cyclohexanediaminetetraacetic acid as primary ligand

    International Nuclear Information System (INIS)

    Marathe, D.G.; Munshi, K.N.

    1983-01-01

    The formation constants of the ternary complexes of neodymium(III), samarium(III), gadolinium(III) and holmium(III) with cyclohexanediaminetetraacetic acid (CyDTA) as primary ligand and dihydroxynaphthalene (DHN), dihydroxynaphthalene-6-sulphonic acid (DHNSA) and catechol-3,5-disulphonic acid (CDSA) as secondary ligands have been investigated by the potentiometric titration technique. The values of the formation constants of the 1:1:1 ternary chelates are reported at three different temperatures and at a fixed ionic strength, μ = 0.1 M (NaClO4). (author)

  1. ZE3RA: the ZEPLIN-III Reduction and Analysis package

    International Nuclear Information System (INIS)

    Neves, F; Chepel, V; DeViveiros, L; Lindote, A; Lopes, M I; Akimov, D Yu; Belov, V A; Burenkov, A A; Kobyakin, A S; Kovalenko, A G; Araújo, H M; Currie, A; Horn, M; Lebedenko, V N; Barnes, E J; Ghag, C; Hollingsworth, A; Edwards, B; Kalmus, G E; Lüscher, R

    2011-01-01

    ZE3RA is the software package responsible for processing the raw data from the ZEPLIN-III dark matter experiment and its reduction into a set of parameters used in all subsequent analyses. The detector is a liquid xenon time projection chamber with scintillation and electroluminescence signals read out by an array of 31 photomultipliers. The dual range 62-channel data stream is optimised for the detection of scintillation pulses down to a single photoelectron and of ionisation signals as small as those produced by single electrons. We discuss in particular several strategies related to data filtering, pulse finding and pulse clustering which are tuned using calibration data to recover the best electron/nuclear recoil discrimination near the detection threshold, where most dark matter elastic scattering signatures are expected. The software was designed assuming only minimal knowledge of the physics underlying the detection principle, allowing an unbiased analysis of the experimental results and easy extension to other detectors with similar requirements.

  2. Computer-based guidelines for concrete pavements : HIPERPAV III : user manual

    Science.gov (United States)

    2009-10-01

    This user manual provides guidance on how to use the new High PERformance PAVing (HIPERPAV) III software program for the analysis of early-age Portland cement concrete pavement (PCCP) behavior. HIPERPAV III includes several improvements over previous versions.

  3. Joint maximum-likelihood magnitudes of presumed underground nuclear test explosions

    Science.gov (United States)

    Peacock, Sheila; Douglas, Alan; Bowers, David

    2017-08-01

    Body-wave magnitudes (mb) of 606 seismic disturbances caused by presumed underground nuclear test explosions at specific test sites between 1964 and 1996 have been derived from station amplitudes collected by the International Seismological Centre (ISC), by a joint inversion for mb and station-specific magnitude corrections. A maximum-likelihood method was used to reduce the upward bias of network mean magnitudes caused by data censoring, where arrivals at stations that do not report arrivals are assumed to be hidden by the ambient noise at the time. Threshold noise levels at each station were derived from the ISC amplitudes using the method of Kelly and Lacoss, which fits to the observed magnitude-frequency distribution a Gutenberg-Richter exponential decay truncated at low magnitudes by an error function representing the low-magnitude threshold of the station. The joint maximum-likelihood inversion is applied to arrivals from the sites: Semipalatinsk (Kazakhstan) and Novaya Zemlya, former Soviet Union; Singer (Lop Nor), China; Mururoa and Fangataufa, French Polynesia; and Nevada, USA. At sites where eight or more arrivals could be used to derive magnitudes and station terms for 25 or more explosions (Nevada, Semipalatinsk and Mururoa), the resulting magnitudes and station terms were fixed and a second inversion carried out to derive magnitudes for additional explosions with three or more arrivals. 93 more magnitudes were thus derived. During processing for station thresholds, many stations were rejected for sparsity of data, obvious errors in reported amplitude, or great departure of the reported amplitude-frequency distribution from the expected left-truncated exponential decay. Abrupt changes in monthly mean amplitude at a station apparently coincide with changes in recording equipment and/or analysis method at the station.
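
    The censoring correction described above can be illustrated in miniature. The hedged sketch below estimates a single event magnitude by maximum likelihood when some stations fail to report, treating each missing arrival as an amplitude hidden below that station's noise threshold; the station terms, thresholds and scatter value are invented for illustration.

```python
# Censored maximum-likelihood network magnitude for one event: reporting
# stations contribute a Gaussian density, non-reporting stations contribute
# the probability that their amplitude fell below the station threshold.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

sigma = 0.35                                    # inter-station scatter (mag units)
station_corr = np.array([0.1, -0.2, 0.0, 0.3, -0.1, 0.2])
thresholds   = np.array([4.4,  4.6, 4.5, 4.3,  4.7, 4.5])      # noise thresholds
reported     = np.array([4.9, np.nan, 4.7, 4.8, np.nan, 4.6])  # NaN = no report

def neg_loglik(m):
    mu = m + station_corr
    obs = ~np.isnan(reported)
    ll = norm.logpdf(reported[obs], mu[obs], sigma).sum()
    # censored stations: arrival assumed hidden below the noise threshold
    ll += norm.logcdf(thresholds[~obs], mu[~obs], sigma).sum()
    return -ll

m_ml = minimize_scalar(neg_loglik, bounds=(3.0, 7.0), method="bounded").x
m_naive = np.nanmean(reported - station_corr)   # ignores the censoring
print(f"censored ML mb = {m_ml:.2f}, naive network mean = {m_naive:.2f}")
```

    The naive mean is biased upward because only the stations that happened to see a large amplitude report at all; the censored likelihood pulls the estimate back down.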

  4. L.U.St: a tool for approximated maximum likelihood supertree reconstruction.

    Science.gov (United States)

    Akanni, Wasiu A; Creevey, Christopher J; Wilkinson, Mark; Pisani, Davide

    2014-06-12

    Supertrees combine disparate, partially overlapping trees to generate a synthesis that provides a high-level perspective that cannot be attained from the inspection of individual phylogenies. Supertrees can be seen as meta-analytical tools that can be used to make inferences based on the results of previous scientific studies. Their meta-analytical application has increased in popularity since it was realised that the power of statistical tests for the study of evolutionary trends critically depends on the use of taxon-dense phylogenies. Further to that, supertrees have found applications in phylogenomics, where they are used to combine gene trees and recover species phylogenies based on genome-scale data sets. Here, we present the L.U.St package, a Python tool for approximate maximum likelihood supertree inference, and illustrate its application using a genomic data set for the placental mammals. L.U.St allows the calculation of the approximate likelihood of a supertree given a set of input trees, performs heuristic searches to look for the supertree of highest likelihood, and performs statistical tests of two or more supertrees. To this end, L.U.St implements a winning-sites test allowing ranking of a collection of a priori selected hypotheses, given as a collection of input supertree topologies. It also outputs a file of input-tree-wise likelihood scores that can be used as input to CONSEL for calculation of standard tests of two trees (e.g. the Kishino-Hasegawa, Shimodaira-Hasegawa and Approximately Unbiased tests). This is the first fully parametric implementation of a supertree method; it has clearly understood properties and provides several advantages over currently available supertree approaches. It is easy to implement and works on any platform that has Python installed. BitBucket page - https://afro-juju@bitbucket.org/afro-juju/l.u.st.git. Davide.Pisani@bristol.ac.uk.

  5. Performance of penalized maximum likelihood in estimation of genetic covariances matrices

    Directory of Open Access Journals (Sweden)

    Meyer Karin

    2011-11-01

    Full Text Available Abstract. Background: Estimation of genetic covariance matrices for multivariate problems comprising more than a few traits is inherently problematic, since sampling variation increases dramatically with the number of traits. This paper investigates the efficacy of regularized estimation of covariance components in a maximum likelihood framework, imposing a penalty on the likelihood designed to reduce sampling variation. In particular, penalties that "borrow strength" from the phenotypic covariance matrix are considered. Methods: An extensive simulation study was carried out to investigate the reduction in average 'loss', i.e. the deviation of estimated matrices from the population values, and the accompanying bias for a range of parameter values and sample sizes. A number of penalties are examined, penalizing either the canonical eigenvalues or the genetic covariance or correlation matrices. In addition, several strategies to determine the amount of penalization to be applied, i.e. to estimate the appropriate tuning factor, are explored. Results: It is shown that substantial reductions in loss for estimates of genetic covariance can be achieved for small to moderate sample sizes. While no penalty performed best overall, penalizing the variance among the estimated canonical eigenvalues on the logarithmic scale or shrinking the genetic towards the phenotypic correlation matrix appeared most advantageous. Estimating the tuning factor using cross-validation resulted in a loss reduction 10 to 15% less than that obtained if population values were known. Applying a mild penalty, chosen so that the deviation in likelihood from the maximum was non-significant, performed as well if not better than cross-validation and can be recommended as a pragmatic strategy. Conclusions: Penalized maximum likelihood estimation provides the means to 'make the most' of limited and precious data and facilitates more stable estimation for multi-dimensional analyses. It should
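
    As a minimal illustration of the "borrow strength" idea (not the paper's penalized REML machinery), the sketch below shrinks a noisy genetic correlation matrix towards the phenotypic correlation matrix and shows how the tuning factor controls the eigenvalue spread; both matrices are made up.

```python
# Shrinkage of a genetic correlation estimate towards the (better-estimated)
# phenotypic correlation matrix; rho plays the role of the tuning factor.
import numpy as np

R_gen = np.array([[1.0, 0.9, 0.1],
                  [0.9, 1.0, 0.7],
                  [0.1, 0.7, 1.0]])      # noisy genetic correlation estimate
R_phen = np.array([[1.0, 0.5, 0.2],
                   [0.5, 1.0, 0.4],
                   [0.2, 0.4, 1.0]])     # phenotypic correlation (target)

def shrink(rho):
    """Convex combination; rho = 0 returns the raw genetic estimate."""
    return (1 - rho) * R_gen + rho * R_phen

for rho in [0.0, 0.2, 0.5]:
    eig = np.linalg.eigvalsh(shrink(rho))
    print(f"rho={rho:.1f}: eigenvalue spread {eig.min():.2f}..{eig.max():.2f}")
```

    In the paper the amount of shrinkage is tied to a penalty on the likelihood and chosen by cross-validation or a likelihood-deviation criterion; the convex combination above only shows why shrinkage stabilises the estimate.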

  6. Statistical analysis of maximum likelihood estimator images of human brain FDG PET studies

    International Nuclear Information System (INIS)

    Llacer, J.; Veklerov, E.; Hoffman, E.J.; Nunez, J.; Coakley, K.J.

    1993-01-01

    The work presented in this paper evaluates the statistical characteristics of regional bias and expected error in reconstructions of real PET data from human brain fluorodeoxyglucose (FDG) studies carried out by the maximum likelihood estimator (MLE) method with a robust stopping rule, and compares them with the results of filtered backprojection (FBP) reconstructions and with the method of sieves. The task that the authors have investigated is that of quantifying radioisotope uptake in regions of interest (ROIs). They first describe a robust methodology for the use of the MLE method with clinical data which contains only one adjustable parameter: the kernel size for a Gaussian filtering operation that determines final resolution and expected regional error. Simulation results are used to establish the fundamental characteristics of the reconstructions obtained by our methodology, corresponding to the case in which the transition matrix is perfectly known. Then, data from 72 independent human brain FDG scans from four patients are used to show that the results obtained from real data are consistent with the simulation, although the quality of the data and of the transition matrix have an effect on the final outcome.

  7. Revision and extension to the analysis of the third spectrum of tellurium: Te III

    International Nuclear Information System (INIS)

    Tauheed, A.; Naz, A.

    2011-01-01

    The spectrum of the doubly ionized tellurium atom (Te III) has been investigated in the vacuum ultraviolet wavelength region. The ground configuration of Te III is 5s²5p² and the excited configurations are of the type 5s²5p nl. Core excitation leads to a 5s5p³ configuration. Cowan's multi-configuration interaction code was utilized to predict the ion structure. The observed spectrum of tellurium was recorded on a 3-m normal incidence vacuum spectrograph at the Antigonish Laboratory (Canada) in the wavelength region 300-2000 Å, using a triggered spark light source for the excitation of the spectrum. The 5s²5p² - [5s²5p(5d + 6d + 7d + 6s + 7s + 8s) + 5s5p³] transition array has been analyzed. Levels previously reported by Joshi et al. have been confirmed, while the older analysis by Crooker and Joshi has been revised and extended to include the 5s²5p(5d, 6d, 7d, 6s, 7s, 8s) and 5s5p³ configurations. Least-squares-fitted parametric calculations were used to interpret the final results. One hundred and fifty spectral lines have been identified to establish 60 energy levels. Our wavelength accuracy for unblended and sharp lines is better than ±0.005 Å. The ionization potential of Te III was found to be 224550 ± 300 cm⁻¹ (27.841 ± 0.037 eV).

  8. Analyzing multivariate survival data using composite likelihood and flexible parametric modeling of the hazard functions

    DEFF Research Database (Denmark)

    Nielsen, Jan; Parner, Erik

    2010-01-01

    In this paper, we model multivariate time-to-event data by a composite likelihood of pairwise frailty likelihoods and marginal hazards using natural cubic splines. Both right- and interval-censored data are considered. The suggested approach is applied to two types of family studies using the gamma…

  9. Solvent effects on extraction of aluminum(III), gallium(III), and indium(III) with decanoic acid

    International Nuclear Information System (INIS)

    Yamada, Hiromichi; Hayashi, Hisao; Fujii, Yukio; Mizuta, Masateru

    1986-01-01

    Extraction of aluminum(III) and indium(III) with decanoic acid in 1-octanol was carried out at 25 °C and at an aqueous ionic strength of 0.1 mol dm⁻³ (NaClO4). Monomeric and tetrameric aluminum(III) decanoates and monomeric indium(III) decanoate are responsible for the extraction. A comparison of the present results with those obtained in previous work shows that polymerization of the extracted species is more extensive in benzene than in 1-octanol, and that the metal decanoates are highly polymerized in the following order in both solvents: Al > Ga > In. (author)

  10. Parameterizing Spatial Models of Infectious Disease Transmission that Incorporate Infection Time Uncertainty Using Sampling-Based Likelihood Approximations.

    Directory of Open Access Journals (Sweden)

    Rajat Malik

    Full Text Available A class of discrete-time models of infectious disease spread, referred to as individual-level models (ILMs), are typically fitted in a Bayesian Markov chain Monte Carlo (MCMC) framework. These models quantify probabilistic outcomes regarding the risk of infection of susceptible individuals due to various susceptibility and transmissibility factors, including their spatial distance from infectious individuals. The infectious pressure from infected individuals exerted on susceptible individuals is intrinsic to these ILMs. Unfortunately, quantifying this infectious pressure for data sets containing many individuals can be computationally burdensome, leading to a time-consuming likelihood calculation and, thus, computationally prohibitive MCMC-based analysis. This problem worsens when using data augmentation to allow for uncertainty in infection times. In this paper, we develop sampling methods that can be used to calculate a fast, approximate likelihood when fitting such disease models. A simple random sampling approach is initially considered, followed by various spatially stratified schemes. We test and compare the performance of our methods with both simulated data and data from the 2001 foot-and-mouth disease (FMD) epidemic in the U.K. Our results indicate that substantial computational savings can be obtained, albeit with some information loss, suggesting that such techniques may be of use in the analysis of very large epidemic data sets.
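
    The core trick is easy to state: replace the sum of the spatial kernel over all infectives by a scaled sum over a random sample of them. The sketch below, with an illustrative power-law kernel and invented coordinates, compares the full and sampled infectious pressure for a few susceptibles.

```python
# Sampling-based approximation of the infectious pressure in a spatial ILM:
# the scaled sample sum is an unbiased estimator of the full kernel sum.
import numpy as np

rng = np.random.default_rng(2)
xy_sus = rng.uniform(0, 100, size=(5, 2))       # susceptible individuals
xy_inf = rng.uniform(0, 100, size=(10_000, 2))  # many infectious individuals
beta = 2.0                                      # spatial decay parameter

def pressure_full(s):
    d = np.linalg.norm(xy_inf - s, axis=1)
    return np.sum(d**-beta)

def pressure_sampled(s, m=500):
    idx = rng.choice(len(xy_inf), size=m, replace=False)
    d = np.linalg.norm(xy_inf[idx] - s, axis=1)
    return len(xy_inf) / m * np.sum(d**-beta)   # scale the sample sum up

for s in xy_sus:
    p_full = 1 - np.exp(-1e-3 * pressure_full(s))      # infection probability
    p_approx = 1 - np.exp(-1e-3 * pressure_sampled(s))
    print(f"P(infect): full={p_full:.4f}  sampled={p_approx:.4f}")
```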

  11. From reads to genes to pathways: differential expression analysis of RNA-Seq experiments using Rsubread and the edgeR quasi-likelihood pipeline [version 1; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Yunshun Chen

    2016-06-01

    Full Text Available In recent years, RNA sequencing (RNA-seq) has become a very widely used technology for profiling gene expression. One of the most common aims of RNA-seq profiling is to identify genes or molecular pathways that are differentially expressed (DE) between two or more biological conditions. This article demonstrates a computational workflow for the detection of DE genes and pathways from RNA-seq data by providing a complete analysis of an RNA-seq experiment profiling epithelial cell subsets in the mouse mammary gland. The workflow uses R software packages from the open-source Bioconductor project and covers all steps of the analysis pipeline, including alignment of read sequences, data exploration, differential expression analysis, visualization and pathway analysis. Read alignment and count quantification are conducted using the Rsubread package and the statistical analyses are performed using the edgeR package. The differential expression analysis uses the quasi-likelihood functionality of edgeR.

  12. MAXIMUM-LIKELIHOOD-ESTIMATION OF THE ENTROPY OF AN ATTRACTOR

    NARCIS (Netherlands)

    SCHOUTEN, JC; TAKENS, F; VANDENBLEEK, CM

    In this paper, a maximum-likelihood estimate of the (Kolmogorov) entropy of an attractor is proposed that can be obtained directly from a time series. Also, the relative standard deviation of the entropy estimate is derived; it is dependent on the entropy and on the number of samples used in the

  13. Maximum likelihood estimation for cytogenetic dose-response curves

    International Nuclear Information System (INIS)

    Frome, E.L; DuFrain, R.J.

    1983-10-01

    In vitro dose-response curves are used to describe the relation between the yield of dicentric chromosome aberrations and radiation dose for human lymphocytes. The dicentric yields follow the Poisson distribution, and the expected yield depends on both the magnitude and the temporal distribution of the dose for low-LET radiation. A general dose-response model that describes this relation has been obtained by Kellerer and Rossi using the theory of dual radiation action. The yield of elementary lesions is κ[γd + g(t, τ)d²], where t is the time and d is the dose. The coefficient of the d² term is determined by the recovery function and the temporal mode of irradiation. Two special cases of practical interest are split-dose and continuous-exposure experiments, and the resulting models are intrinsically nonlinear in the parameters. A general-purpose maximum likelihood estimation procedure is described and illustrated with numerical examples from both experimental designs. Poisson regression analysis is used for estimation, hypothesis testing, and regression diagnostics. Results are discussed in the context of exposure assessment procedures for both acute and chronic human radiation exposure.

  14. Maximum likelihood estimation for cytogenetic dose-response curves

    Energy Technology Data Exchange (ETDEWEB)

    Frome, E.L; DuFrain, R.J.

    1983-10-01

    In vitro dose-response curves are used to describe the relation between the yield of dicentric chromosome aberrations and radiation dose for human lymphocytes. The dicentric yields follow the Poisson distribution, and the expected yield depends on both the magnitude and the temporal distribution of the dose for low-LET radiation. A general dose-response model that describes this relation has been obtained by Kellerer and Rossi using the theory of dual radiation action. The yield of elementary lesions is κ[γd + g(t, τ)d²], where t is the time and d is the dose. The coefficient of the d² term is determined by the recovery function and the temporal mode of irradiation. Two special cases of practical interest are split-dose and continuous-exposure experiments, and the resulting models are intrinsically nonlinear in the parameters. A general-purpose maximum likelihood estimation procedure is described and illustrated with numerical examples from both experimental designs. Poisson regression analysis is used for estimation, hypothesis testing, and regression diagnostics. Results are discussed in the context of exposure assessment procedures for both acute and chronic human radiation exposure.
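
    As a hedged sketch of the estimation step (simulated data, illustrative coefficients, and direct optimization rather than any specific software used by the authors), the following fits the linear-quadratic Poisson model λ(d) = c + αd + βd² by maximizing the Poisson log-likelihood.

```python
# Poisson maximum likelihood for a linear-quadratic dose-response curve.
# The identity-scale mean c + alpha*d + beta*d^2 makes the model nonlinear
# in the usual GLM sense, so we optimize the likelihood directly.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
dose = np.repeat([0.25, 0.5, 1.0, 2.0, 3.0, 4.0], 500)   # Gy, one cell per row
lam_true = 0.001 + 0.03 * dose + 0.06 * dose**2          # dicentrics per cell
y = rng.poisson(lam_true)

def nll(p):
    c, a, b = p
    lam = c + a * dose + b * dose**2
    if np.any(lam <= 0):
        return np.inf                     # keep the mean positive
    return np.sum(lam - y * np.log(lam))  # Poisson NLL up to a constant

res = minimize(nll, x0=[0.01, 0.01, 0.01], method="Nelder-Mead")
print("background, alpha, beta =", res.x)
```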

  15. Narrow band interference cancelation in OFDM: A structured maximum likelihood approach

    KAUST Repository

    Sohail, Muhammad Sadiq; Al-Naffouri, Tareq Y.; Al-Ghadhban, Samir N.

    2012-01-01

    This paper presents a maximum likelihood (ML) approach to mitigate the effect of narrow band interference (NBI) in a zero padded orthogonal frequency division multiplexing (ZP-OFDM) system. The NBI is assumed to be time variant and asynchronous

  16. Predicting Likelihood of Surgery Prior to First Visit in Patients with Back and Lower Extremity Symptoms: A simple mathematical model based on over 8000 patients.

    Science.gov (United States)

    Boden, Lauren M; Boden, Stephanie A; Premkumar, Ajay; Gottschalk, Michael B; Boden, Scott D

    2018-02-09

    Retrospective analysis of prospectively collected data. The objective was to create a data-driven triage system stratifying patients by the likelihood of undergoing spinal surgery within one year of presentation. Low back pain (LBP) and radicular lower extremity (LE) symptoms are common musculoskeletal problems. There is currently no standard data-derived triage process, based on information that can be obtained prior to the initial physician-patient encounter, to direct patients to the optimal physician type. We analyzed patient-reported data from 8006 patients with a chief complaint of LBP and/or LE radicular symptoms who presented to surgeons at a large multidisciplinary spine center between September 1, 2005 and June 30, 2016. Univariate and multivariate analyses identified independent risk factors for undergoing spinal surgery within one year of the initial visit. A model incorporating these risk factors was created using a random sample of 80% of the total patients in our cohort, and validated on the remaining 20%. The baseline one-year surgery rate within our cohort was 39% for all patients and 42% for patients with LE symptoms. Those identified as high likelihood by the center's existing triage process had a surgery rate of 45%. The new triage scoring system proposed in this study was able to identify a high-likelihood group in which 58% underwent surgery, which is a 46% higher surgery rate than in non-triaged patients and a 29% improvement over our institution's existing triage system. The data-driven triage model and scoring system derived and validated in this study (Spine Surgery Likelihood model [SSL-11]) significantly improved on existing processes in predicting the likelihood of undergoing spinal surgery within one year of initial presentation. This triage system will allow centers to screen more selectively for surgical candidates and to direct patients more effectively to surgeons or non-operative spine specialists. Level of evidence: 4.

  17. Likelihood analysis of the sub-GUT MSSM in light of LHC 13-TeV data

    Science.gov (United States)

    Costa, J. C.; Bagnaschi, E.; Sakurai, K.; Borsato, M.; Buchmueller, O.; Citron, M.; De Roeck, A.; Dolan, M. J.; Ellis, J. R.; Flächer, H.; Heinemeyer, S.; Lucio, M.; Santos, D. Martínez; Olive, K. A.; Richards, A.; Weiglein, G.

    2018-02-01

    We describe a likelihood analysis using MasterCode of variants of the MSSM in which the soft supersymmetry-breaking parameters are assumed to have universal values at some scale M_in below the supersymmetric grand unification scale M_GUT, as can occur in mirage mediation and other models. In addition to M_in, such 'sub-GUT' models have the 4 parameters of the CMSSM, namely a common gaugino mass m_1/2, a common soft supersymmetry-breaking scalar mass m_0, a common trilinear mixing parameter A and the ratio of MSSM Higgs vevs tan β, assuming that the Higgs mixing parameter μ > 0. We take into account constraints on strongly- and electroweakly-interacting sparticles from ~36/fb of LHC data at 13 TeV and the LUX and 2017 PICO, XENON1T and PandaX-II searches for dark matter scattering, in addition to the previous LHC and dark matter constraints as well as full sets of flavour and electroweak constraints. We find a preference for M_in ~ 10^5 to 10^9 GeV, with M_in ~ M_GUT disfavoured by Δχ² ~ 3 due to the BR(B_{s,d} → μ⁺μ⁻) constraint. The lower limits on strongly-interacting sparticles are largely determined by LHC searches, and similar to those in the CMSSM. We find a preference for the LSP to be a Bino or Higgsino with m_{χ̃₁⁰} ~ 1 TeV, with annihilation via heavy Higgs bosons H/A and stop coannihilation, or chargino coannihilation, bringing the cold dark matter density into the cosmological range. We find that spin-independent dark matter scattering is likely to be within reach of the planned LUX-Zeplin and XENONnT experiments. We probe the impact of the (g-2)_μ constraint, finding similar results whether or not it is included.

  18. Thermal decomposition of yttrium(III) propionate and butyrate

    DEFF Research Database (Denmark)

    Grivel, Jean-Claude

    2013-01-01

    The thermal decompositions of yttrium(III) propionate monohydrate (Y(C2H5CO2)3·H2O) and yttrium(III) butyrate dihydrate (Y(C3H7CO2)3·2H2O) were studied in argon by means of thermogravimetry, differential thermal analysis, IR-spectroscopy, X-ray diffraction and hot-stage microscopy. These two...

  19. On the insufficiency of arbitrarily precise covariance matrices: non-Gaussian weak-lensing likelihoods

    Science.gov (United States)

    Sellentin, Elena; Heavens, Alan F.

    2018-01-01

    We investigate whether a Gaussian likelihood, as routinely assumed in the analysis of cosmological data, is supported by simulated survey data. We define test statistics based on a novel method that first destroys Gaussian correlations in a data set, and then measures the non-Gaussian correlations that remain. This procedure flags pairs of data points that depend on each other in a non-Gaussian fashion, and thereby identifies where the assumption of a Gaussian likelihood breaks down. Using this diagnosis, we find that non-Gaussian correlations in the CFHTLenS cosmic shear correlation functions are significant. With a simple exclusion of the most contaminated data points, the posterior for σ8 is shifted without broadening, but we find no significant reduction in the tension with σ8 derived from Planck cosmic microwave background data. However, we also show that the one-point distributions of the correlation statistics are noticeably skewed, such that sound weak-lensing data sets are intrinsically likely to lead to a systematically low lensing amplitude being inferred. The detected non-Gaussianities get larger with increasing angular scale, such that for future wide-angle surveys such as Euclid or LSST, with their very small statistical errors, the large-scale modes are expected to be increasingly affected. The shifts in posteriors may then not be negligible, and we recommend that these diagnostic tests be run as part of future analyses.

  20. Can Asperger syndrome be distinguished from autism? An anatomic likelihood meta-analysis of MRI studies.

    Science.gov (United States)

    Yu, Kevin K; Cheung, Charlton; Chua, Siew E; McAlonan, Gráinne M

    2011-11-01

    The question of whether Asperger syndrome can be distinguished from autism has attracted much debate and may even incur delay in diagnosis and intervention. Accordingly, there has been a proposal for Asperger syndrome to be subsumed under autism in the forthcoming Diagnostic and Statistical Manual of Mental Disorders, fifth edition, in 2013. One approach to resolve this question has been to adopt the criterion of absence of clinically significant language or cognitive delay--essentially, the "absence of language delay." To our knowledge, this is the first meta-analysis of magnetic resonance imaging (MRI) studies of people with autism to compare absence with presence of language delay. It capitalizes on the voxel-based morphometry (VBM) approach to systematically explore the whole brain for anatomic correlates of delay and no delay in language acquisition in people with autism spectrum disorders. We conducted a systematic search for VBM MRI studies of grey matter volume in people with autism. Studies with a majority (at least 70%) of participants with autism diagnoses and a history of language delay were assigned to the autism group (n = 151, control n = 190). Those with a majority (at least 70%) of individuals with autism diagnoses and no language delay were assigned to the Asperger syndrome group (n = 149, control n = 214). We entered study coordinates into anatomic likelihood estimation meta-analysis software with sampling size weighting to compare grey matter summary maps driven by Asperger syndrome or autism. The summary autism grey matter map showed lower volumes in the cerebellum, right uncus, dorsal hippocampus and middle temporal gyrus compared with controls; grey matter volumes were greater in the bilateral caudate, prefrontal lobe and ventral temporal lobe. The summary Asperger syndrome map indicated lower grey matter volumes in the bilateral amygdala/hippocampal gyrus and prefrontal lobe, left occipital gyrus, right cerebellum, putamen and precuneus

  1. Likelihood-based methods for evaluating principal surrogacy in augmented vaccine trials.

    Science.gov (United States)

    Liu, Wei; Zhang, Bo; Zhang, Hui; Zhang, Zhiwei

    2017-04-01

    There is growing interest in assessing immune biomarkers, which are quick to measure and potentially predictive of long-term efficacy, as surrogate endpoints in randomized, placebo-controlled vaccine trials. This can be done under a principal stratification approach, with principal strata defined using a subject's potential immune responses to vaccine and placebo (the latter may be assumed to be zero). In this context, principal surrogacy refers to the extent to which vaccine efficacy varies across principal strata. Because a placebo recipient's potential immune response to vaccine is unobserved in a standard vaccine trial, augmented vaccine trials have been proposed to produce the information needed to evaluate principal surrogacy. This article reviews existing methods based on an estimated likelihood and a pseudo-score (PS) and proposes two new methods based on a semiparametric likelihood (SL) and a pseudo-likelihood (PL), for analyzing augmented vaccine trials. Unlike the PS method, the SL method does not require a model for missingness, which can be advantageous when immune response data are missing by happenstance. The SL method is shown to be asymptotically efficient, and it performs similarly to the PS and PL methods in simulation experiments. The PL method appears to have a computational advantage over the PS and SL methods.

  2. An Efficient UD-Based Algorithm for the Computation of Maximum Likelihood Sensitivity of Continuous-Discrete Systems

    DEFF Research Database (Denmark)

    Boiroux, Dimitri; Juhl, Rune; Madsen, Henrik

    2016-01-01

    This paper addresses maximum likelihood parameter estimation of continuous-time nonlinear systems with discrete-time measurements. We derive an efficient algorithm for the computation of the log-likelihood function and its gradient, which can be used in gradient-based optimization algorithms. This algorithm uses UD decomposition of symmetric matrices and the array algorithm for covariance update and gradient computation. We test our algorithm on the Lotka-Volterra equations. Compared to maximum likelihood estimation based on finite-difference gradient computation, we get a significant speedup…

  3. On-line Speciation of Cr(III) and Cr(VI) by Flow Injection Analysis With Spectrophotometric Detection and Chemometrics

    DEFF Research Database (Denmark)

    Diacu, Elena; Andersen, Jens Enevold Thaulov

    2003-01-01

    A flow injection system has been developed for the on-line speciation of Cr(III) and Cr(VI) by the diphenylcarbazide (DPC) method with H2O2 oxidation, followed by spectrophotometric detection at the 550 nm wavelength. The data thus obtained were subjected to a chemometric analysis (PLS), which showe…

  4. Mechatronic systems and materials III

    CERN Document Server

    Gosiewski, Zdzislaw

    2009-01-01

    This very interesting volume is divided into 24 sections; each of which covers, in detail, one aspect of the subject-matter: I. Industrial robots; II. Microrobotics; III. Mobile robots; IV. Teleoperation, telerobotics, teleoperated semi-autonomous systems; V. Sensors and actuators in mechatronics; VI. Control of mechatronic systems; VII. Analysis of vibration and deformation; VIII. Optimization, optimal design; IX. Integrated diagnostics; X. Failure analysis; XI. Tribology in mechatronic systems; XII. Analysis of signals; XIII. Measurement techniques; XIV. Multifunctional and smart materials;

  5. Practical aspects of a maximum likelihood estimation method to extract stability and control derivatives from flight data

    Science.gov (United States)

    Iliff, K. W.; Maine, R. E.

    1976-01-01

    A maximum likelihood estimation method was applied to flight data, and procedures to facilitate the routine analysis of large amounts of flight data are described. Techniques that can be used to obtain stability and control derivatives from aircraft maneuvers that are less than ideal for this purpose are described. The techniques involve detecting and correcting the effects of dependent or nearly dependent variables, structural vibration, data drift, inadequate instrumentation, and difficulties with the data acquisition system and the mathematical model. The use of uncertainty levels and multiple-maneuver analysis also proved useful in improving the quality of the estimated coefficients. The procedures used for editing the data and for the overall analysis are also discussed.

  6. An investigation on the fatigue behavior of DCB specimen bonded with aluminum foam at Mode III

    Energy Technology Data Exchange (ETDEWEB)

    Lee, J. H.; Cho, J. U. [Kongju University, Cheonan (Korea, Republic of); Zhao, G [School of Aerospace, Xian Jiaotong University, Xian (China); Cho, C. [Dept. of Mechanical Engineering, Inha University, Incheon (Korea, Republic of)

    2016-10-15

    Aluminum foam, with its excellent physical and mechanical characteristics, is a lightweight metallic material used in vehicle bumpers, internal shock absorbers on aircraft, materials for vessel joints, etc. On the other hand, when aluminum foam is used without sufficient investigation, there is a likelihood of damage to or destruction of the machine or mechanical structure, and in extreme cases it may even lead to human casualties. This study aims to analyze the characteristics of adhesively bonded structures made with the closed-type aluminum foam used primarily in shock absorbers. The fatigue analyses of DCB test specimens with aluminum foam at mode III are verified through a fatigue experiment. In the analysis results, test specimen models with thicknesses (t) of 35 mm, 45 mm and 55 mm showed the peak load occurring approximately within the first 0 to 50 cycles, after which the load gradually decreased as the number of cycles increased. The peak loads were ±0.80 kN for the specimen thickness of 35 mm, ±0.98 kN for 45 mm and ±1.18 kN for 55 mm, showing that the peak load increases with specimen thickness. Taking the 35 mm model as the basis, an increase in thickness of 10 mm raises the peak load by a factor of about 1.25, and an increase of 20 mm raises it by a factor of about 1.5. The analysis data and the experimental data showed similar results, so the analysis approach can be considered applicable in the field, and it is estimated that the mechanical characteristics of DCB test specimens at mode III under fatigue loading can be analyzed systematically and efficiently.

  7. Improved anomaly detection using multi-scale PLS and generalized likelihood ratio test

    KAUST Repository

    Madakyaru, Muddu

    2017-02-16

    Process monitoring has a central role in the process industry to enhance productivity, efficiency, and safety, and to avoid expensive maintenance. In this paper, a statistical approach that exploits the advantages of multiscale PLS (MSPLS) models and those of a generalized likelihood ratio (GLR) test to better detect anomalies is proposed. Specifically, to account for the multivariate and multiscale nature of process dynamics, an MSPLS algorithm combining PLS and wavelet analysis is used as the modeling framework. Then, GLR hypothesis testing is applied to the uncorrelated residuals obtained from the MSPLS model to improve the anomaly detection abilities of these latent-variable-based fault detection methods even further. Applications to simulated distillation column data are used to evaluate the proposed MSPLS-GLR algorithm.

  8. Improved anomaly detection using multi-scale PLS and generalized likelihood ratio test

    KAUST Repository

    Madakyaru, Muddu; Harrou, Fouzi; Sun, Ying

    2017-01-01

    Process monitoring has a central role in the process industry to enhance productivity, efficiency, and safety, and to avoid expensive maintenance. In this paper, a statistical approach that exploits the advantages of multiscale PLS (MSPLS) models and those of a generalized likelihood ratio (GLR) test to better detect anomalies is proposed. Specifically, to account for the multivariate and multiscale nature of process dynamics, an MSPLS algorithm combining PLS and wavelet analysis is used as the modeling framework. Then, GLR hypothesis testing is applied to the uncorrelated residuals obtained from the MSPLS model to improve the anomaly detection abilities of these latent-variable-based fault detection methods even further. Applications to simulated distillation column data are used to evaluate the proposed MSPLS-GLR algorithm.
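
    The GLR step can be illustrated compactly: for residuals that should be zero-mean noise under normal operation, the GLR statistic for an unknown mean shift in a window reduces to n·(mean)²/σ², thresholded at a chi-square(1) quantile. The sketch below applies this to simulated residuals; the MSPLS modeling stage is assumed to have already produced them.

```python
# Generalized likelihood ratio test for a mean shift in model residuals.
# 2*log(Lambda) = n * xbar^2 / sigma^2 under a known noise variance.
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(4)
resid = rng.normal(0, 1.0, 500)
resid[300:] += 1.5                        # simulated anomaly (mean shift)

sigma2, w = 1.0, 25                       # noise variance, window length
threshold = chi2.ppf(0.99, df=1)          # 1% false-alarm rate per window
for t in range(w, len(resid), w):
    window = resid[t - w:t]
    glr = w * window.mean()**2 / sigma2   # GLR statistic for this window
    if glr > threshold:
        print(f"anomaly flagged in window ending at t={t}, GLR={glr:.1f}")
```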

  9. An analysis on Public Service Announcements (PSA) within the scope of Elaboration Likelihood Model: Orange and Hazelnut Consumption Samples

    OpenAIRE

    Bical, Adil; Yılmaz, R. Ayhan

    2018-01-01

    The purpose of the study is to reveal that how persuasion works in public service announcements on hazelnut and orange consumption ones broadcasted in Turkey. According to Petty and Cacioppo, Elaboration Likelihood Model explains the process of persuasion on two routes: central and peripheral. In-depth interviews were conducted to obtain the goal of the study. Respondents were asked whether they process the message of the PSA centrally or peripherally. Advertisements on consumption of hazelnu...

  10. A simple route to maximum-likelihood estimates of two-locus recombination fractions under inequality restrictions

    Indian Academy of Sciences (India)

    A simple route to maximum-likelihood estimates of two-locus recombination fractions under inequality restrictions. Iain L. Macdonald and Philasande Nkalashe. Research Note, Journal of Genetics, Volume 94, Issue 3, September 2015, pp. 479-481.

  11. The asymptotic behaviour of the maximum likelihood function of Kriging approximations using the Gaussian correlation function

    CSIR Research Space (South Africa)

    Kok, S

    2012-07-01

    Full Text Available … continuously as the correlation function hyper-parameters approach zero. Since the global minimizer of the maximum likelihood function is an asymptote in this case, it is unclear if maximum likelihood estimation (MLE) remains valid. Numerical ill…
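
    A hedged sketch of the object under study: the concentrated log-likelihood of an ordinary-Kriging model with a Gaussian correlation function, evaluated as the hyper-parameter θ shrinks towards zero, where the correlation matrix approaches the all-ones matrix and the computation becomes ill-conditioned. The data and the jitter value are illustrative.

```python
# Concentrated log-likelihood of a constant-mean Kriging model with the
# Gaussian correlation R_ij = exp(-theta * |x_i - x_j|^2).
import numpy as np

rng = np.random.default_rng(5)
X = np.linspace(0, 1, 12)[:, None]
y = np.sin(4 * X[:, 0]) + 0.05 * rng.normal(size=12)
ones = np.ones(len(y))

def concentrated_loglik(theta):
    d2 = (X - X.T)**2                          # squared distances (12 x 12)
    R = np.exp(-theta * d2) + 1e-8 * np.eye(len(y))   # small jitter
    Ri = np.linalg.inv(R)
    beta = (ones @ Ri @ y) / (ones @ Ri @ ones)       # GLS mean
    r = y - beta
    sigma2 = (r @ Ri @ r) / len(y)                    # profiled variance
    sign, logdet = np.linalg.slogdet(R)
    return -0.5 * (len(y) * np.log(sigma2) + logdet)

for theta in [1e-6, 1e-3, 1e-1, 1.0, 10.0]:
    print(f"theta={theta:g}: ln L = {concentrated_loglik(theta):.2f}")
```

    As θ decreases, R tends to the singular all-ones matrix, so the inverse and log-determinant are dominated by the jitter; this is the numerical ill-conditioning the abstract refers to.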

  12. Multilevel maximum likelihood estimation with application to covariance matrices

    Czech Academy of Sciences Publication Activity Database

    Turčičová, Marie; Mandel, J.; Eben, Kryštof

    Published online: 23 January (2018) ISSN 0361-0926 R&D Projects: GA ČR GA13-34856S Institutional support: RVO:67985807 Keywords: Fisher information * High dimension * Hierarchical maximum likelihood * Nested parameter spaces * Spectral diagonal covariance model * Sparse inverse covariance model Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.311, year: 2016

  13. SMORN-III benchmark test on reactor noise analysis methods

    International Nuclear Information System (INIS)

    Shinohara, Yoshikuni; Hirota, Jitsuya

    1984-02-01

    A computational benchmark test was performed in conjunction with the Third Specialists Meeting on Reactor Noise (SMORN-III) which was held in Tokyo, Japan in October 1981. This report summarizes the results of the test as well as the works made for preparation of the test. (author)

  14. Approximate maximum parsimony and ancestral maximum likelihood.

    Science.gov (United States)

    Alon, Noga; Chor, Benny; Pardi, Fabio; Rapoport, Anat

    2010-01-01

    We explore the maximum parsimony (MP) and ancestral maximum likelihood (AML) criteria in phylogenetic tree reconstruction. Both problems are NP-hard, so we seek approximate solutions. We formulate the two problems as Steiner tree problems under appropriate distances. The gist of our approach is the succinct characterization of Steiner trees for a small number of leaves for the two distances. This enables the use of known Steiner tree approximation algorithms. The approach leads to a 16/9 approximation ratio for AML and asymptotically to a 1.55 approximation ratio for MP.

  15. Modelling maximum likelihood estimation of availability

    International Nuclear Information System (INIS)

    Waller, R.A.; Tietjen, G.L.; Rock, G.W.

    1975-01-01

    Suppose the performance of a nuclear-powered electrical generating plant is continuously monitored to record the sequence of failures and repairs during sustained operation. The purpose of this study is to assess one method of estimating the performance of the power plant when the measure of performance is availability, that is, the probability that the plant is operational at time t. To study the availability of a power plant, we first assume statistical models for the variables X and Y, which denote the time-to-failure and time-to-repair variables, respectively. Once those statistical models are specified, the availability A(t) can be expressed as a function of some or all of their parameters. Usually those parameters are unknown in practice and so A(t) is unknown. This paper discusses the maximum likelihood estimator of A(t) when the time-to-failure model for X is an exponential density with parameter λ and the time-to-repair model for Y is an exponential density with parameter θ. Under the assumption of exponential models for X and Y, it follows that the instantaneous availability at time t is A(t) = λ/(λ+θ) + [θ/(λ+θ)] exp[-(1/λ + 1/θ)t] for t > 0, and the steady-state availability is A(∞) = λ/(λ+θ). We use the observations from n failure-repair cycles of the power plant, say X₁, X₂, ..., X_n, Y₁, Y₂, ..., Y_n, to present the maximum likelihood estimators of A(t) and A(∞). The exact sampling distributions of those estimators and some statistical properties are discussed before a simulation model is used to determine 95% simulation intervals for A(t). The methodology is applied to two examples which approximate the operating history of two nuclear power plants. (author)
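
    The estimators implied by the abstract are simple enough to transcribe directly: with exponential time-to-failure (mean λ) and time-to-repair (mean θ), the maximum likelihood estimates from n failure-repair cycles are the sample means, and A(t) follows by plug-in. The data below are simulated for illustration.

```python
# Plug-in maximum likelihood estimate of instantaneous availability A(t)
# under exponential failure and repair models.
import numpy as np

rng = np.random.default_rng(6)
lam_true, theta_true, n = 200.0, 10.0, 50          # hours, number of cycles
X = rng.exponential(lam_true, n)                   # times to failure
Y = rng.exponential(theta_true, n)                 # times to repair

lam_hat, theta_hat = X.mean(), Y.mean()            # MLEs of the exponential means

def availability(t):
    s = lam_hat + theta_hat
    return lam_hat / s + (theta_hat / s) * np.exp(-(1/lam_hat + 1/theta_hat) * t)

print(f"A(infinity) = {lam_hat / (lam_hat + theta_hat):.4f}")
for t in [0.0, 5.0, 20.0, 100.0]:
    print(f"A({t:g} h) = {availability(t):.4f}")
```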

  16. Maximum likelihood positioning for gamma-ray imaging detectors with depth of interaction measurement

    International Nuclear Information System (INIS)

    Lerche, Ch.W.; Ros, A.; Monzo, J.M.; Aliaga, R.J.; Ferrando, N.; Martinez, J.D.; Herrero, V.; Esteve, R.; Gadea, R.; Colom, R.J.; Toledo, J.; Mateo, F.; Sebastia, A.; Sanchez, F.; Benlloch, J.M.

    2009-01-01

    The center of gravity algorithm leads to strong artifacts for gamma-ray imaging detectors that are based on monolithic scintillation crystals and position-sensitive photo-detectors. This is a consequence of using the centroids as position estimates. The fact that charge division circuits can also be used to compute the standard deviation of the scintillation light distribution opens a way out of this drawback. We studied the feasibility of maximum likelihood estimation for computing the true gamma-ray photo-conversion position from the centroids and the standard deviation of the light distribution. The method was evaluated on a test detector that consists of the position-sensitive photomultiplier tube H8500 and a monolithic LSO crystal (42 mm × 42 mm × 10 mm). Spatial resolution was measured for the centroids and the maximum likelihood estimates. The results suggest that maximum likelihood positioning is feasible and partially removes the strong artifacts of the center of gravity algorithm.

  17. Maximum likelihood positioning for gamma-ray imaging detectors with depth of interaction measurement

    Energy Technology Data Exchange (ETDEWEB)

    Lerche, Ch.W. [Grupo de Sistemas Digitales, ITACA, Universidad Politecnica de Valencia, 46022 Valencia (Spain)], E-mail: lerche@ific.uv.es; Ros, A. [Grupo de Fisica Medica Nuclear, IFIC, Universidad de Valencia-Consejo Superior de Investigaciones Cientificas, 46980 Paterna (Spain); Monzo, J.M.; Aliaga, R.J.; Ferrando, N.; Martinez, J.D.; Herrero, V.; Esteve, R.; Gadea, R.; Colom, R.J.; Toledo, J.; Mateo, F.; Sebastia, A. [Grupo de Sistemas Digitales, ITACA, Universidad Politecnica de Valencia, 46022 Valencia (Spain); Sanchez, F.; Benlloch, J.M. [Grupo de Fisica Medica Nuclear, IFIC, Universidad de Valencia-Consejo Superior de Investigaciones Cientificas, 46980 Paterna (Spain)

    2009-06-01

    The center of gravity algorithm leads to strong artifacts for gamma-ray imaging detectors that are based on monolithic scintillation crystals and position-sensitive photo-detectors. This is a consequence of using the centroids as position estimates. The fact that charge division circuits can also be used to compute the standard deviation of the scintillation light distribution opens a way out of this drawback. We studied the feasibility of maximum likelihood estimation for computing the true gamma-ray photo-conversion position from the centroids and the standard deviation of the light distribution. The method was evaluated on a test detector that consists of the position-sensitive photomultiplier tube H8500 and a monolithic LSO crystal (42 mm × 42 mm × 10 mm). Spatial resolution was measured for the centroids and the maximum likelihood estimates. The results suggest that maximum likelihood positioning is feasible and partially removes the strong artifacts of the center of gravity algorithm.

  18. Synthesis and characterization of La(III), Pr(III), Nd(III), Sm(III), Eu(III), Gd(III), Tb(III) and Dy(III) complexes of 2-acetylfuran-2-thenoylhydrazone

    International Nuclear Information System (INIS)

    Singh, B.; Singh, Praveen K.

    1998-01-01

    The reaction of 2-acetylfuran-2-thenoylhydrazone (afth) with Ln(III) trichlorides yields complexes of the type [Ln(afth)Cl2(H2O)(EtOH)]Cl, [Ln(III) = La, Pr, Nd, Sm, Eu, Gd, Tb and Dy]. The complexes have been characterized by molar conductance, magnetic susceptibility and TGA and DTA measurements, and by FAB mass, infrared, proton NMR, electronic absorption and emission spectra. The terbium complex is found to be a monomer from the FAB mass spectrum. The IR and NMR spectra suggest neutral tridentate behaviour of the Schiff base. A coordination number of seven is proposed around the metal ions. Emission spectra suggest C3v symmetry around the metal ion with a capped octahedron geometry for the europium complex. (author)

  19. Simulation-based marginal likelihood for cluster strong lensing cosmology

    Science.gov (United States)

    Killedar, M.; Borgani, S.; Fabjan, D.; Dolag, K.; Granato, G.; Meneghetti, M.; Planelles, S.; Ragone-Figueroa, C.

    2018-01-01

    Comparisons between observed and predicted strong lensing properties of galaxy clusters have been routinely used to claim either tension or consistency with Λ cold dark matter cosmology. However, standard approaches to such cosmological tests are unable to quantify the preference for one cosmology over another. We advocate approximating the relevant Bayes factor using a marginal likelihood that is based on the following summary statistic: the posterior probability distribution function for the parameters of the scaling relation between Einstein radii and cluster mass, α and β. We demonstrate, for the first time, a method of estimating the marginal likelihood using the X-ray selected z > 0.5 Massive Cluster Survey clusters as a case in point and employing both N-body and hydrodynamic simulations of clusters. We investigate the uncertainty in this estimate and consequential ability to compare competing cosmologies, which arises from incomplete descriptions of baryonic processes, discrepancies in cluster selection criteria, redshift distribution and dynamical state. The relation between triaxial cluster masses at various overdensities provides a promising alternative to the strong lensing test.

  20. Maximum Likelihood Blood Velocity Estimator Incorporating Properties of Flow Physics

    DEFF Research Database (Denmark)

    Schlaikjer, Malene; Jensen, Jørgen Arendt

    2004-01-01

    ...(RF) data under investigation. The flow-physics properties are exploited in the second term, as the range of velocity values investigated in the cross-correlation analysis is compared to the velocity estimates in the temporal and spatial neighborhood of the signal segment under investigation. The new estimator...... has been compared to the cross-correlation (CC) estimator and the previously developed maximum likelihood estimator (MLE). The results show that the CMLE can handle a larger velocity search range and is capable of estimating even low velocity levels from tissue motion. The CC and the MLE produce...... for the CC and the MLE. When the velocity search range is set to twice the limit of the CC and the MLE, the number of incorrect velocity estimates is 0%, 19.1%, and 7.2% for the CMLE, CC, and MLE, respectively. The ability to handle a larger search range and to estimate low velocity levels was confirmed...
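
    A toy sketch of the core operation (not the authors' CMLE): estimate the inter-pulse time shift by normalized cross-correlation over a restricted lag range, with a quadratic penalty standing in for the prior from neighboring velocity estimates; the blood velocity is proportional to the estimated shift.

      import numpy as np

      def xcorr_shift(seg0, seg1, max_lag, prior_lag=0, weight=0.0):
          """Normalized cross-correlation shift estimate restricted to
          +/- max_lag samples; `weight` pulls the estimate toward prior_lag,
          a crude stand-in for the flow-physics term."""
          lags = np.arange(-max_lag, max_lag + 1)
          score = np.empty(len(lags))
          for i, l in enumerate(lags):
              a, b = (seg0[l:], seg1[:len(seg1) - l]) if l >= 0 else (seg0[:l], seg1[-l:])
              score[i] = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)
          score = score - weight * (lags - prior_lag) ** 2
          return lags[np.argmax(score)]

      rng = np.random.default_rng(4)
      line0 = rng.normal(size=200)
      line1 = np.roll(line0, -3)                    # second echo arrives 3 samples earlier
      print(xcorr_shift(line0, line1, max_lag=10))  # -> 3; velocity ~ shift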

  1. Maximum likelihood pixel labeling using a spatially variant finite mixture model

    International Nuclear Information System (INIS)

    Gopal, S.S.; Hebert, T.J.

    1996-01-01

    We propose a spatially-variant mixture model for pixel labeling. Based on this spatially-variant mixture model we derive an expectation maximization algorithm for maximum likelihood estimation of the pixel labels. While most algorithms using mixture models entail the subsequent use of a Bayes classifier for pixel labeling, the proposed algorithm yields maximum likelihood estimates of the labels themselves and results in unambiguous pixel labels. The proposed algorithm is fast, robust, easy to implement, flexible in that it can be applied to any arbitrary image data where the number of classes is known and, most importantly, obviates the need for an explicit labeling rule. The algorithm is evaluated both quantitatively and qualitatively on simulated data and on clinical magnetic resonance images of the human brain
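
    The following is a minimal sketch of such an EM scheme (illustrative; a production implementation of the spatially-variant model would add regularization of the per-pixel weights): because every pixel carries its own mixing proportions, the converged responsibilities act directly as maximum likelihood labels.

      import numpy as np

      def sv_mixture_labels(img, k=3, iters=30):
          """EM for a spatially-variant finite mixture of Gaussian intensity
          classes; returns hard ML pixel labels."""
          x = img.ravel().astype(float)
          mu = np.quantile(x, np.linspace(0.1, 0.9, k))        # initial class means
          var = np.full(k, x.var())
          w = np.full((x.size, k), 1.0 / k)                    # per-pixel mixing weights
          for _ in range(iters):
              dens = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
              z = w * dens
              z /= z.sum(axis=1, keepdims=True)                # E-step responsibilities
              nk = z.sum(axis=0)
              mu = (z * x[:, None]).sum(axis=0) / nk           # M-step updates
              var = np.maximum((z * (x[:, None] - mu) ** 2).sum(axis=0) / nk, 1e-6)
              w = z                                            # spatially variant weights
          return z.argmax(axis=1).reshape(img.shape)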

  2. Preparation and characterisation of mixed ligand complexes of Co(III), Fe(III) and Cr(III) containing phthalimide and phenols

    International Nuclear Information System (INIS)

    Miah, M.A.J.; Islam, M.S.; Pal, S.C.; Barma, T.K.

    1996-01-01

    Some novel mixed ligand complexes of Co(III), Fe(III) and Cr(III) containing phthalimide as the primary ligand and 2-aminophenol or 3-aminophenol as secondary ligands have been synthesized and characterised on the basis of elemental analyses, conductivity and magnetic measurements, and infrared and electronic spectral studies. Complexes containing 2-aminophenol are 1:1 electrolytes in N,N-dimethylformamide. Spectral studies indicate that all the complexes exhibit octahedral geometry. The complexes have the general composition K[M(pim)2(L)2], where M = Co(III), Fe(III) or Cr(III), pim = anion of phthalimide and L = anion of 2-aminophenol or 3-aminophenol. (author)

  3. Iron and Arsenic Speciation During As(III) Oxidation by Manganese Oxides in the Presence of Fe(II): Molecular-Level Characterization Using XAFS, Mössbauer, and TEM Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Yun [Environmental Soil Chemistry Research Group, Delaware Environmental Institute, University of Delaware, Newark, Delaware 19716, United States; Kukkadapu, Ravi K. [Environmental Molecular Sciences Laboratory, Pacific Northwest National Laboratory, Richland, Washington 99354, United States; Livi, Kenneth J. T. [The High-Resolution Analytical Electron Microbeam Facility, Department of Earth and Planetary Sciences, The Johns Hopkins University, Baltimore, Maryland 21218, United States; Xu, Wenqian [Department of Chemistry, Brookhaven National Lab, Upton, New York 11796, United States; Li, Wei [Environmental Soil Chemistry Research Group, Delaware Environmental Institute, University of Delaware, Newark, Delaware 19716, United States; Key Laboratory of Surficial Geochemistry, Ministry of Education, School of Earth Sciences and Engineering, Nanjing University, Nanjing 210046, People’s Republic of China; Sparks, Donald L. [Environmental Soil Chemistry Research Group, Delaware Environmental Institute, University of Delaware, Newark, Delaware 19716, United States

    2018-01-17

    The redox state and speciation of the metalloid arsenic (As) determine its toxicity and mobility. Knowledge of biogeochemical processes influencing the As redox state is therefore important to understand and predict its environmental behavior. Many previous studies examined As(III) oxidation by various Mn-oxides, but little is known about environmental influences (e.g., co-existing ions) on this process. In this study, we investigated the mechanisms of As(III) oxidation by a poorly crystalline hexagonal birnessite (δ-MnO2) in the presence of Fe(II) using X-ray absorption spectroscopy (XAS), Mössbauer spectroscopy and transmission electron microscopy (TEM) coupled with energy-dispersive X-ray spectroscopy (EDS). As K-edge X-ray absorption near edge spectroscopy (XANES) analysis revealed that, at low Fe(II) concentration (100 μM), As(V) was the predominant As species on the solid phase, while at higher Fe(II) concentrations (200-1000 μM), both As(III) and As(V) were sorbed on the solid phase. As K-edge extended X-ray absorption fine structure spectroscopy (EXAFS) analysis showed an increasing As-Mn/Fe distance over time, indicating that As prefers to bind to the newly formed Fe(III)-(hydr)oxides. As adsorbed on the Fe(III)-(hydr)oxides as a bidentate binuclear corner-sharing complex. Both Mössbauer and TEM-EDS investigations demonstrated that the oxidized Fe(III) products formed during Fe(II) oxidation by δ-MnO2 were predominantly ferrihydrite, goethite, and ferric arsenate-like compounds. However, Fe EXAFS analysis also suggested the formation of a small amount of lepidocrocite. The Mn K-edge XANES data indicated that As(III) and Fe(II) oxidation occurs as a two-electron transfer with δ-MnO2 and that the observed Mn(III) is due to comproportionation of surface-sorbed Mn(II) with Mn(IV) in the δ-MnO2 structure. This study reveals that the mechanisms of As(III) oxidation by δ-MnO2 in the presence of Fe(II) are very complex, involving many simultaneous reactions, and the formation of

  4. Is Primary Prostate Cancer Treatment Influenced by Likelihood of Extraprostatic Disease? A Surveillance, Epidemiology and End Results Patterns of Care Study

    International Nuclear Information System (INIS)

    Holmes, Jordan A.; Wang, Andrew Z.; Hoffman, Karen E.; Hendrix, Laura H.; Rosenman, Julian G.; Carpenter, William R.; Godley, Paul A.; Chen, Ronald C.

    2012-01-01

    Purpose: To examine the patterns of primary treatment in a recent population-based cohort of prostate cancer patients, stratified by the likelihood of extraprostatic cancer as predicted by disease characteristics available at diagnosis. Methods and Materials: A total of 157,371 patients diagnosed from 2004 to 2008 with clinically localized and potentially curable (node-negative, nonmetastatic) prostate cancer, who had complete information on prostate-specific antigen, Gleason score, and clinical stage, were included. Patients with clinical T1/T2 disease were grouped into categories of predicted likelihood of having extraprostatic disease, up to >50%, using the Partin nomogram. Clinical T3/T4 patients were examined separately as the highest-risk group. Logistic regression was used to examine the association between patient group and receipt of each primary treatment, adjusting for age, race, year of diagnosis, marital status, Surveillance, Epidemiology and End Results database region, and county-level education. Separate models were constructed for primary surgery, external-beam radiotherapy (RT), and conservative management. Results: On multivariable analysis, increasing likelihood of extraprostatic disease was significantly associated with increasing use of RT and decreased conservative management. Use of surgery also increased: patients with >50% likelihood of extraprostatic cancer had almost twice the odds of receiving prostatectomy as those in the lowest-likelihood group. RT was received by only 34% of patients with >50% likelihood of extraprostatic cancer and 24% of those with clinical T3-T4 disease. The proportion of patients who received prostatectomy or conservative management was approximately 50% or slightly higher in all groups. Conclusions: There may be underutilization of RT in older prostate cancer patients and those with likely extraprostatic disease. Because more than half of prostate cancer patients do not consult with a radiation oncologist, a multidisciplinary consultation may affect the treatment decision-making process.
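
    The structure of the models described can be sketched as follows (synthetic data; the variable names, group coding and coefficients are all invented, and statsmodels is assumed to be available):

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(0)
      n = 2000
      df = pd.DataFrame({
          "group": rng.integers(0, 4, n),        # 0 = lowest likelihood ... 3 = cT3/T4
          "age": rng.normal(68, 8, n),
      })
      # Simulated receipt of radiotherapy, increasing with risk group
      logit_p = -2.0 + 0.5 * df["group"] + 0.02 * (df["age"] - 68)
      df["rt"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

      fit = smf.logit("rt ~ C(group) + age", data=df).fit(disp=0)
      print(np.exp(fit.params))                  # odds ratios for RT receipt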

  5. Bias Correction for the Maximum Likelihood Estimate of Ability. Research Report. ETS RR-05-15

    Science.gov (United States)

    Zhang, Jinming

    2005-01-01

    Lord's bias function and the weighted likelihood estimation method are effective in reducing the bias of the maximum likelihood estimate of an examinee's ability under the assumption that the true item parameters are known. This paper presents simulation studies to determine the effectiveness of these two methods in reducing the bias when the item…
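
    For the Rasch model both ideas are short to sketch (a minimal illustration, not the paper's simulation design): the ordinary MLE solves sum(x - p) = 0, and Warm's weighted likelihood estimator adds the bias-reducing term I'(θ)/(2I(θ)) to the score.

      import numpy as np

      def wle_ability(x, b, iters=20):
          """Warm's weighted likelihood estimate of ability for the Rasch model,
          with known item difficulties b and 0/1 responses x."""
          theta = 0.0
          for _ in range(iters):
              p = 1.0 / (1.0 + np.exp(-(theta - b)))
              info = np.sum(p * (1 - p))                 # Fisher information I(theta)
              dinfo = np.sum(p * (1 - p) * (1 - 2 * p))  # I'(theta)
              g = np.sum(x - p) + dinfo / (2 * info)     # weighted score
              theta += g / info                          # Newton-type step
          return theta

      b = np.array([-1.5, -0.5, 0.0, 0.5, 1.5])
      x = np.array([1, 1, 0, 1, 0])
      print(wle_ability(x, b))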

  6. Studies of Some Lanthanide(III) Nitrate Complexes of Schiff Base Ligands

    Directory of Open Access Journals (Sweden)

    Kishor Arora, Mukesh Sharma

    2009-01-01

    The studies of 16 new lanthanide(III) nitrate complexes of Schiff base ligands are discussed. Schiff bases were obtained by the condensation of 2-methyl-4-N,N-bis(2'-cyanoethyl)aminobenzaldehyde with aniline and 3 different substituted anilines. Lanthanide(III) nitrates, viz. gadolinium(III) nitrate, lanthanum(III) nitrate, samarium(III) nitrate and cerium(III) nitrate, were chosen to synthesize new complexes. The complexes were characterized on the basis of physicochemical studies, viz. elemental analysis, spectral (IR and electronic) and magnetic studies. TGA studies of some of the representative complexes were also done. Some of the representative complexes were also screened for antimicrobial activity.

  7. GENERALIZATION OF RAYLEIGH MAXIMUM LIKELIHOOD DESPECKLING FILTER USING QUADRILATERAL KERNELS

    Directory of Open Access Journals (Sweden)

    S. Sridevi

    2013-02-01

    Speckle noise is the most prevalent noise in clinical ultrasound images. It appears as light and dark spots and renders pixel intensities unreliable. In fetal ultrasound images, edges and local fine details are particularly important for obstetricians and gynecologists carrying out prenatal diagnosis of congenital heart disease. A robust despeckling filter must therefore suppress speckle noise proficiently while simultaneously preserving these features. The proposed filter generalizes the Rayleigh maximum likelihood filter by exploiting statistical tools as tuning parameters and by using differently shaped quadrilateral kernels to estimate the noise-free pixel from the neighborhood. The performance of various filters, namely Median, Kuwahara, Frost, Homogeneous mask filter and Rayleigh maximum likelihood filter, is compared with the proposed filter in terms of PSNR and image profile. The proposed filter surpasses the conventional filters.
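
    The Rayleigh maximum likelihood core is compact; the sketch below uses plain square kernels (the paper's quadrilateral kernels would replace the uniform footprint with shaped ones) and assumes scipy is available. Within each neighborhood the Rayleigh MLE is sigma² = Σx²/(2N), and the filter returns the distribution's mode, sigma, as the noise-free pixel estimate.

      import numpy as np
      from scipy.ndimage import uniform_filter

      def rayleigh_ml_filter(img, size=5):
          """Despeckle by the local Rayleigh ML estimate: sigma^2 = mean(x^2)/2
          over each size x size window; output the Rayleigh mode (= sigma)."""
          mean_sq = uniform_filter(img.astype(float) ** 2, size=size)
          return np.sqrt(mean_sq / 2.0)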

  8. Implementation and assessment of a likelihood ratio approach for the evaluation of LA-ICP-MS evidence in forensic glass analysis.

    Science.gov (United States)

    van Es, Andrew; Wiarda, Wim; Hordijk, Maarten; Alberink, Ivo; Vergeer, Peter

    2017-05-01

    For the comparative analysis of glass fragments, a method using Laser Ablation Inductively Coupled Plasma Mass Spectrometry (LA-ICP-MS) is in use at the NFI, giving measurements of the concentrations of 18 elements. An important question is how to evaluate the results as evidence that a glass sample originates from a known glass source or from an arbitrary different glass source. One approach is the use of matching criteria, e.g., based on a t-test or overlap of confidence intervals. An important drawback of this method is the fact that the rarity of the glass composition is not taken into account: a similar match can have widely different evidential values. In addition, the use of fixed matching criteria can give rise to a "fall off the cliff" effect, where small differences may result in a match or a non-match. In this work a likelihood ratio system is presented, largely based on the two-level model as proposed by Aitken and Lucy [1], and Aitken, Zadora and Lucy [2]. Results show that the output from the two-level model gives good discrimination between same- and different-source hypotheses, but a post-hoc calibration step is necessary to improve the accuracy of the likelihood ratios. Subsequently, the robustness and performance of the LR system are studied. Results indicate that the output of the LR system is robust to the sample properties of the dataset used for calibration. Furthermore, the empirical upper and lower bound method [3], designed to deal with extrapolation errors in the density models, results in minimum and maximum values of the LR outputted by the system of 3.1×10⁻³ and 3.4×10⁴. Calibration of the system, as measured by empirical cross-entropy, shows good behavior over the complete prior range. Rates of misleading evidence are small: for same-source comparisons, 0.3% of LRs support a different-source hypothesis; for different-source comparisons, 0.2% support a same-source hypothesis. The authors use the LR system in reporting of glass cases to
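
    A univariate sketch of the two-level idea (with a normal between-source distribution substituted for the kernel density of Aitken and Lucy, and illustrative parameters): the numerator integrates the unknown source mean out of both sets of measurements jointly, the denominator does so separately.

      import numpy as np
      from scipy.stats import multivariate_normal, norm

      def two_level_log10_lr(xbar, ybar, n1, n2, mu0, tau2, sw2):
          """log10 LR for means of n1 and n2 replicate measurements under
          within-source x ~ N(mu, sw2) and source means mu ~ N(mu0, tau2)."""
          v1, v2 = sw2 / n1, sw2 / n2
          same = multivariate_normal(
              mean=[mu0, mu0],
              cov=[[tau2 + v1, tau2], [tau2, tau2 + v2]],
          ).logpdf([xbar, ybar])
          diff = (norm(mu0, np.sqrt(tau2 + v1)).logpdf(xbar)
                  + norm(mu0, np.sqrt(tau2 + v2)).logpdf(ybar))
          return (same - diff) / np.log(10.0)

      print(two_level_log10_lr(xbar=1.02, ybar=0.98, n1=3, n2=3,
                               mu0=0.0, tau2=4.0, sw2=0.05))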

  9. Stratospheric Aerosol and Gas Experiment III on the International Space Station (SAGE III/ISS)

    Science.gov (United States)

    Gasbarre, Joseph; Walker, Richard; Cisewski, Michael; Zawodny, Joseph; Cheek, Dianne; Thornton, Brooke

    2015-01-01

    The Stratospheric Aerosol and Gas Experiment III on the International Space Station (SAGE III/ISS) mission will extend the SAGE data record from the ideal vantage point of the International Space Station (ISS). The ISS orbital inclination is well suited to SAGE measurements, providing coverage between 70 deg north and 70 deg south latitude. The SAGE data record includes an extensively validated data set, including aerosol optical depth data dating to the Stratospheric Aerosol Measurement (SAM) experiments in 1975 and 1978 and stratospheric ozone profile data dating to the Stratospheric Aerosol and Gas Experiment (SAGE) in 1979. These and subsequent data records, notably from the SAGE II experiment launched on the Earth Radiation Budget Satellite in 1984 and the SAGE III experiment launched on the Russian Meteor-3M satellite in 2001, have supported a robust, long-term assessment of key atmospheric constituents. These scientific measurements provide the basis for the analysis of five of the nine critical constituents (aerosols, ozone (O3), nitrogen dioxide (NO2), water vapor (H2O), and air density using O2) identified in the U.S. National Plan for Stratospheric Monitoring. SAGE III on ISS was originally scheduled to fly on the ISS in the same timeframe as the Meteor-3M mission, but was postponed due to delays in ISS construction. The project was re-established in 2009.

  10. Evaluating Fast Maximum Likelihood-Based Phylogenetic Programs Using Empirical Phylogenomic Data Sets

    Science.gov (United States)

    Zhou, Xiaofan; Shen, Xing-Xing; Hittinger, Chris Todd

    2018-01-01

    The sizes of the data matrices assembled to resolve branches of the tree of life have increased dramatically, motivating the development of programs for fast, yet accurate, inference. For example, several different fast programs have been developed in the very popular maximum likelihood framework, including RAxML/ExaML, PhyML, IQ-TREE, and FastTree. Although these programs are widely used, a systematic evaluation and comparison of their performance using empirical genome-scale data matrices has so far been lacking. To address this question, we evaluated these four programs on 19 empirical phylogenomic data sets with hundreds to thousands of genes and up to 200 taxa with respect to likelihood maximization, tree topology, and computational speed. For single-gene tree inference, we found that the more exhaustive and slower strategies (ten searches per alignment) outperformed faster strategies (one tree search per alignment) using RAxML, PhyML, or IQ-TREE. Interestingly, single-gene trees inferred by the three programs yielded comparable coalescent-based species tree estimations. For concatenation-based species tree inference, IQ-TREE consistently achieved the best-observed likelihoods for all data sets, and RAxML/ExaML was a close second. In contrast, PhyML often failed to complete concatenation-based analyses, whereas FastTree was the fastest but generated lower likelihood values and more dissimilar tree topologies in both types of analyses. Finally, data matrix properties, such as the number of taxa and the strength of phylogenetic signal, sometimes substantially influenced the programs' relative performance. Our results provide real-world gene and species tree phylogenetic inference benchmarks to inform the design and execution of large-scale phylogenomic data analyses. PMID:29177474

  11. Applying exclusion likelihoods from LHC searches to extended Higgs sectors

    International Nuclear Information System (INIS)

    Bechtle, Philip; Heinemeyer, Sven; Staal, Oscar; Stefaniak, Tim; Weiglein, Georg

    2015-01-01

    LHC searches for non-standard Higgs bosons decaying into tau lepton pairs constitute a sensitive experimental probe for physics beyond the Standard Model (BSM), such as supersymmetry (SUSY). Recently, the limits obtained from these searches have been presented by the CMS collaboration in a nearly model-independent fashion - as a narrow resonance model - based on the full 8 TeV dataset. In addition to publishing a 95 % C.L. exclusion limit, the full likelihood information for the narrow resonance model has been released. This provides valuable information that can be incorporated into global BSM fits. We present a simple algorithm that maps an arbitrary model with multiple neutral Higgs bosons onto the narrow resonance model and derives the corresponding value for the exclusion likelihood from the CMS search. This procedure has been implemented into the public computer code HiggsBounds (version 4.2.0 and higher). We validate our implementation by cross-checking against the official CMS exclusion contours in three Higgs benchmark scenarios in the Minimal Supersymmetric Standard Model (MSSM), and find very good agreement. Going beyond validation, we discuss the combined constraints of the ττ search and the rate measurements of the SM-like Higgs at 125 GeV in a recently proposed MSSM benchmark scenario, where the lightest Higgs boson obtains SM-like couplings independently of the decoupling of the heavier Higgs states. Technical details for how to access the likelihood information within HiggsBounds are given in the appendix. The program is available at http://higgsbounds.hepforge.org. (orig.)

  12. Existence and uniqueness of the maximum likelihood estimator for models with a Kronecker product covariance structure

    NARCIS (Netherlands)

    Ros, B.P.; Bijma, F.; de Munck, J.C.; de Gunst, M.C.M.

    2016-01-01

    This paper deals with multivariate Gaussian models for which the covariance matrix is a Kronecker product of two matrices. We consider maximum likelihood estimation of the model parameters, in particular of the covariance matrix. There is no explicit expression for the maximum likelihood estimator.
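
    The estimator is typically computed with the alternating "flip-flop" scheme sketched below (a standard algorithm for this model, not necessarily the paper's construction; note that the two factors are only identified up to a reciprocal scalar):

      import numpy as np

      def kronecker_mle(X, iters=50):
          """Flip-flop iteration for vec(X_i) ~ N(0, Psi kron Sigma), given
          n i.i.d. p x q data matrices X_i stacked in X with shape (n, p, q)."""
          n, p, q = X.shape
          Sigma, Psi = np.eye(p), np.eye(q)
          for _ in range(iters):
              iPsi = np.linalg.inv(Psi)
              Sigma = sum(Xi @ iPsi @ Xi.T for Xi in X) / (n * q)
              iSig = np.linalg.inv(Sigma)
              Psi = sum(Xi.T @ iSig @ Xi for Xi in X) / (n * p)
          return Sigma, Psi

      rng = np.random.default_rng(5)
      X = rng.normal(size=(100, 4, 3))      # 100 samples of 4 x 3 matrices
      Sigma, Psi = kronecker_mle(X)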

  13. Computation of the Likelihood in Biallelic Diffusion Models Using Orthogonal Polynomials

    Directory of Open Access Journals (Sweden)

    Claus Vogl

    2014-11-01

    In population genetics, parameters describing forces such as mutation, migration and drift are generally inferred from molecular data. Lately, approximate methods based on simulations and summary statistics have been widely applied for such inference, even though these methods waste information. In contrast, probabilistic methods of inference can be shown to be optimal, if their assumptions are met. In genomic regions where recombination rates are high relative to mutation rates, polymorphic nucleotide sites can be assumed to evolve independently from each other. The distribution of allele frequencies at a large number of such sites has been called the “allele-frequency spectrum” or “site-frequency spectrum” (SFS). Conditional on the allelic proportions, the likelihoods of such data can be modeled as binomial. A simple model representing the evolution of allelic proportions is the biallelic mutation-drift or mutation-directional selection-drift diffusion model. With a series of orthogonal polynomials, specifically Jacobi and Gegenbauer polynomials, or the related spheroidal wave function, the diffusion equations can be solved efficiently. In the neutral case, the product of the binomial likelihoods with the sum of such polynomials leads to finite series of polynomials, i.e., relatively simple equations, from which the exact likelihoods can be calculated. In this article, the use of orthogonal polynomials for inferring population genetic parameters is investigated.
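
    In the stationary neutral case the recipe reduces to a closed form that is easy to sketch (a simplification for illustration; the polynomial series in the article also covers the non-stationary solution): with a symmetric scaled mutation rate θ, allelic proportions are Beta(θ, θ) distributed, so binomial sampling makes each site's allele count beta-binomial.

      import numpy as np
      from scipy.special import betaln, gammaln
      from scipy.optimize import minimize_scalar

      def sfs_loglik(theta, counts, n):
          """Log-likelihood of derived-allele counts (n alleles sampled per
          site) under the stationary symmetric mutation-drift model."""
          k = np.asarray(counts)
          logbinom = gammaln(n + 1) - gammaln(k + 1) - gammaln(n - k + 1)
          ll = logbinom + betaln(k + theta, n - k + theta) - betaln(theta, theta)
          return ll.sum()

      counts = [0, 1, 0, 5, 9, 2, 0, 10, 3, 0]        # hypothetical SFS data
      res = minimize_scalar(lambda t: -sfs_loglik(t, counts, n=10),
                            bounds=(1e-3, 10), method="bounded")
      print(res.x)                                    # ML estimate of theta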

  14. Calculus III essentials

    CERN Document Server

    REA, Editors of

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams or doing homework, and they remain a lasting reference source for students, teachers, and professionals. Calculus III includes vector analysis, real valued functions, partial differentiation, multiple integrations, vector fields, and infinite series.

  15. A randomized phase III trial comparing concomitant chemoradiotherapy versus radiotherapy alone in advanced head and neck cancers - mature results: Yoodhvir Singh Nagar, Lucknow Cancer Institute, India

    International Nuclear Information System (INIS)

    Singh, S.; Kumar, S.; Datta, N.R.

    2003-01-01

    To evaluate the contribution of concomitant chemoradiotherapy (CTRT) over and above radiotherapy alone (RT) in previously untreated stage III/IV squamous cell carcinoma of the head and neck (SCCHN). Patients with cancers of the oral cavity (OC), oropharynx (OP), supraglottis (SG) and hypopharynx (HP) were randomized into the RT arm or the CTRT arm. Radiotherapy was identical in both arms (70 Gy/35 fractions/7 weeks). In the CTRT arm, concomitant cisplatin (35 mg/m²) was given weekly for seven cycles. Surgery was reserved for salvage purposes when required. From May 1996 to December 1998, 155 patients (RT=78, CTRT=77) were enrolled and 139 patients (RT=71, CTRT=68) were assessable. Over 90% of patients in both arms completed the planned treatment. The complete response rate was 51% in the RT arm and 71% in the CTRT arm (p=0.017). The median disease-free survival (DFS) and overall survival (OS) in the RT arm and CTRT arm were 3 months vs. 11 months (p=0.0009) and 9 months vs. 26 months (p=0.01), respectively. The 5-year DFS and OS in the two arms were 10% vs. 27% (p=0.000) and 16% vs. 29% (p=0.01), respectively. Acute grade III toxicity was comparable in both arms (12% vs. 16%, p=0.74). Late grade I/II toxicity was higher in the CTRT arm (70% vs. 51%, p=0.09). Serious late toxicities were not seen in either arm. On univariate analysis the favorable factors for immediate response were protocol (CTRT better), primary site (SG and OP better than OC and HP), T-stage (T1-T2 better), nodes (N0 better than N+), stage (III better than IV), KPS (>80) and lesser overall treatment time (OTT). Multivariate analysis retained protocol, T stage, N stage and OTT as factors independently affecting the immediate response. Addition of concomitant weekly cisplatin (35 mg/m²) to radiotherapy improves the likelihood of local control, DFS and OS with acceptable acute and late toxicities, and can be recommended as a standard form of treatment in advanced SCCHN

  16. Maximum likelihood estimation of the parameters of nonminimum phase and noncausal ARMA models

    DEFF Research Database (Denmark)

    Rasmussen, Klaus Bolding

    1994-01-01

    The well-known prediction-error-based maximum likelihood (PEML) method can only handle minimum phase ARMA models. This paper presents a new method known as the back-filtering-based maximum likelihood (BFML) method, which can handle nonminimum phase and noncausal ARMA models. The BFML method...... is identical to the PEML method in the case of a minimum phase ARMA model, and it turns out that the BFML method incorporates a noncausal ARMA filter with poles outside the unit circle for estimation of the parameters of a causal, nonminimum phase ARMA model...

  17. Reporting of the translation and cultural adaptation procedures of the Addenbrooke's Cognitive Examination version III (ACE-III) and its predecessors: a systematic review.

    Science.gov (United States)

    Mirza, Nadine; Panagioti, Maria; Waheed, Muhammad Wali; Waheed, Waquas

    2017-09-13

    The ACE-III, a gold standard for screening cognitive impairment, is restricted by language and culture, with no uniform set of guidelines for its adaptation. To develop guidelines, a compilation of all the adaptation procedures undertaken by adapters of the ACE-III and its predecessors is needed. We searched EMBASE, Medline and PsychINFO and screened publications from a previous review. We included publications on adapted versions of the ACE-III and its predecessors, extracting translation and cultural adaptation procedures and assessing their quality. We deemed 32 papers suitable for analysis. Seven translation steps were identified, and we determined which items of the ACE-III are culturally dependent. This review lists all adaptations of the ACE, ACE-R and ACE-III, rates the reporting of their adaptation procedures and summarises adaptation procedures into steps that can be undertaken by adapters.

  18. Two-Stage Maximum Likelihood Estimation (TSMLE) for MT-CDMA Signals in the Indoor Environment

    Directory of Open Access Journals (Sweden)

    Sesay, Abu B.

    2004-01-01

    This paper proposes a two-stage maximum likelihood estimation (TSMLE) technique suited to the multitone code division multiple access (MT-CDMA) system. Here, an analytical framework is presented for the indoor environment for determining the average bit error rate (BER) of the system over Rayleigh and Ricean fading channels. The analytical model is derived for the quadrature phase shift keying (QPSK) modulation technique by taking into account the number of tones, signal bandwidth (BW), bit rate, and transmission power. Numerical results are presented to validate the analysis, and to justify the approximations made therein. Moreover, these results are shown to agree completely with those obtained by simulation.

  19. Kuosheng Mark III containment analyses using GOTHIC

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Ansheng, E-mail: samuellin1999@iner.gov.tw; Chen, Yen-Shu; Yuann, Yng-Ruey

    2013-10-15

    Highlights: • The Kuosheng Mark III containment model is established using GOTHIC. • Containment pressure and temperature responses due to LOCA are presented. • The calculated results are all below the design values and compared with the FSAR results. • The calculated results can serve as an analysis reference for an SPU project in the future. -- Abstract: Kuosheng nuclear power plant in Taiwan is a twin-unit BWR/6 plant, and both units utilize the Mark III containment. Currently, the plant is performing a stretch power uprate (SPU) project to increase the core thermal power to 103.7% OLTP (original licensed thermal power). However, the containment response in the Kuosheng Final Safety Analysis Report (FSAR) was completed more than twenty-five years ago. The purpose of this study is to establish a Kuosheng Mark III containment model using the containment program GOTHIC. The containment pressure and temperature responses under the design-basis accidents, which are the main steam line break (MSLB) and the recirculation line break (RCLB) accidents, are investigated. Short-term and long-term analyses are presented in this study. The short-term analysis calculates the drywell peak pressure and temperature, which occur in the early stage of the LOCAs. The long-term analysis calculates the peak pressure and temperature of the reactor building space. In the short-term analysis, the calculated peak drywell-to-wetwell differential pressure is 140.6 kPa for the MSLB, which is below the design value of 189.6 kPa. The calculated peak drywell temperature is 158 °C, which is still below the design value of 165.6 °C. In addition, in the long-term analysis, the calculated peak containment pressure is 47 kPa G, which is below the design value of 103.4 kPa G. The calculated peak containment temperature is 74.7 °C, which is lower than the design value of 93.3 °C. Therefore, the Kuosheng Mark III containment can maintain its integrity after a LOCA.

  20. Microdosimetry measurements with the RME-III on the space shuttle

    International Nuclear Information System (INIS)

    Hardy, K.; Golightly, M.J.; Atwell, W.; Quam, W.

    1994-01-01

    Since December 1988 (STS-27), the USAF Armstrong Laboratory, in conjunction with the NASA Space Radiation Analysis Group, has been conducting microdosimetry measurements on selected high-altitude, high-inclination Space Shuttle missions with the RME-III. The RME-III is a portable, self-contained, active dosimeter system featuring a three-channel tissue equivalent proportional counter (TEPC), which measures particle fluence and computes dose and dose equivalent at operator-selected time intervals. The total accumulated absorbed dose and dose equivalent are displayed in real time, while the data and the times of the interval dose readings are stored in memory modules for later analysis. Analysis of the time-resolved data permits correlation of the radiation exposure with geographic position, altitude, and spacecraft shielding and orientation. The RME-III has flown on 15 Shuttle missions to date, and measurements are in good agreement with other dosimetry measurements made on the Shuttle

  1. Derivation of LDA log likelihood ratio one-to-one classifier

    NARCIS (Netherlands)

    Spreeuwers, Lieuwe Jan

    2014-01-01

    The common expression for the likelihood ratio classifier using LDA assumes that the reference class mean is available. In biometrics, this is often not the case and only a single sample of the reference class is available. In this paper, expressions are derived for biometric comparison between a probe sample and a single reference sample.
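
    A sketch of the one-to-one case (illustrative covariances; not the paper's derivation): with identity ~ N(0, B) and within-identity noise ~ N(0, W), integrating out the unknown identity makes a probe x and a single reference r jointly Gaussian with cross-covariance B under the same-source hypothesis.

      import numpy as np
      from scipy.stats import multivariate_normal

      def lda_log_lr(x, r, B, W):
          """Log likelihood ratio that probe x and one reference sample r
          share an identity, for sample = identity + noise."""
          d = len(x)
          joint = np.block([[B + W, B], [B, B + W]])
          same = multivariate_normal(np.zeros(2 * d), joint).logpdf(np.r_[x, r])
          indep = (multivariate_normal(np.zeros(d), B + W).logpdf(x)
                   + multivariate_normal(np.zeros(d), B + W).logpdf(r))
          return same - indep

      B, W = np.eye(2) * 4.0, np.eye(2)     # between- and within-identity covariances
      print(lda_log_lr(np.array([2.0, 1.5]), np.array([1.8, 1.2]), B, W))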

  2. Counseling Pretreatment and the Elaboration Likelihood Model of Attitude Change.

    Science.gov (United States)

    Heesacker, Martin

    1986-01-01

    Results of the application of the Elaboration Likelihood Model (ELM) to a counseling context revealed that more favorable attitudes toward counseling occurred as subjects' ego involvement increased and as intervention quality improved. Counselor credibility affected the degree to which subjects' attitudes reflected argument quality differences.…

  3. Likelihood updating of random process load and resistance parameters by monitoring

    DEFF Research Database (Denmark)

    Friis-Hansen, Peter; Ditlevsen, Ove Dalager

    2003-01-01

    Spectral parameters for a stationary Gaussian process are most often estimated by Fourier transformation of a realization followed by some smoothing procedure. This smoothing is often a weighted least square fitting of some prespecified parametric form of the spectrum. In this paper it is shown that maximum likelihood estimation is a rational alternative to an arbitrary weighting for least square fitting. The derived likelihood function gets singularities if the spectrum is prescribed with zero values at some frequencies. This is often the case for models of technically relevant processes...... The likelihood function, even though it is of complicated mathematical form, allows an approximate Bayesian updating and control of the time development of the parameters. Some of these parameters can be structural parameters that by too much change reveal progressing damage or other malfunctioning. Thus current process......

  4. Menyoal Elaboration Likelihood Model (ELM) dan Teori Retorika [Questioning the Elaboration Likelihood Model (ELM) and Rhetorical Theory]

    OpenAIRE

    Yudi Perbawaningsih

    2012-01-01

    Abstract: Persuasion is a communication process to establish or change attitudes, which can be understood through the theory of rhetoric and the Elaboration Likelihood Model (ELM). This study elaborates these theories in a public lecture series intended to persuade students in choosing their concentration of study. The result shows that, in terms of persuasion effectiveness, it is not quite relevant to separate the message and its source. The quality of the source is determined by the quality of ...

  6. Monte Carlo studies of ZEPLIN III

    CERN Document Server

    Dawson, J; Davidge, D C R; Gillespie, J R; Howard, A S; Jones, W G; Joshi, M; Lebedenko, V N; Sumner, T J; Quenby, J J

    2002-01-01

    A Monte Carlo simulation of the two-phase xenon dark matter detector ZEPLIN III has been developed. Results from the analysis of a simulated data set are presented, showing primary and secondary signal distributions from low-energy gamma-ray events.

  7. Maximum-likelihood fitting of data dominated by Poisson statistical uncertainties

    International Nuclear Information System (INIS)

    Stoneking, M.R.; Den Hartog, D.J.

    1996-06-01

    The fitting of data by χ²-minimization is valid only when the uncertainties in the data are normally distributed. When analyzing spectroscopic or particle counting data at very low signal level (e.g., a Thomson scattering diagnostic), the uncertainties are distributed with a Poisson distribution. The authors have developed a maximum-likelihood method for fitting data that correctly treats the Poisson statistical character of the uncertainties. This method maximizes the total probability that the observed data are drawn from the assumed fit function, using the Poisson probability function to determine the probability for each data point. The algorithm also returns uncertainty estimates for the fit parameters. They compare this method with a χ²-minimization routine applied to both simulated and real data. Differences in the returned fits are greater at low signal level (less than ∼20 counts per measurement). The maximum-likelihood method is found to be more accurate and robust, returning a narrower distribution of values for the fit parameters with fewer outliers
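
    The approach can be sketched in a few lines (a generic illustration, not the authors' code): minimize the negative Poisson log-likelihood, here with the data-only constant dropped, instead of χ².

      import numpy as np
      from scipy.optimize import minimize

      def poisson_nll(params, x, counts, model):
          """Negative Poisson log-likelihood (constant terms dropped)."""
          mu = np.clip(model(x, *params), 1e-9, None)
          return np.sum(mu - counts * np.log(mu))

      # Toy fit: Gaussian line on a flat background, at low counts
      model = lambda x, a, x0, s, b: a * np.exp(-0.5 * ((x - x0) / s) ** 2) + b
      x = np.linspace(-5, 5, 60)
      counts = np.random.default_rng(1).poisson(model(x, 12.0, 0.3, 1.0, 2.0))
      fit = minimize(poisson_nll, x0=[10.0, 0.0, 1.0, 1.0],
                     args=(x, counts, model), method="Nelder-Mead")
      print(fit.x)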

  8. Sequence analysis of mitochondrial DNA hypervariable region III of ...

    African Journals Online (AJOL)

    The aims of this research were to study mitochondrial DNA hypervariable region III and establish the degree of variation characteristic of a fragment. The mitochondrial DNA (mtDNA) is a small circular genome located within the mitochondria in the cytoplasm of the cell and a smaller 1.2 kb pair fragment, called the control ...

  9. Luminescence study on solvation of americium(III), curium(III) and several lanthanide(III) ions in nonaqueous and binary mixed solvents

    International Nuclear Information System (INIS)

    Kimura, T.; Nagaishi, R.; Kato, Y.; Yoshida, Z.

    2001-01-01

    The luminescence lifetimes of An(III) and Ln(III) ions [An = Am and Cm; Ln = Nd, Sm, Eu, Tb and Dy] were measured in dimethyl sulfoxide (DMSO), N,N-dimethylformamide (DMF), methanol (MeOH), water and their perdeuterated solvents. Nonradiative decay rates of the ions were in the order H2O > MeOH > DMF > DMSO, indicating that the O-H vibration is a more effective quencher than the C-H, C=O, and S=O vibrations in the solvent molecules. Maximal lifetime ratios τD/τH were observed for Eu(III) in H2O, for Sm(III) in MeOH and DMF, and for Sm(III) and Dy(III) in DMSO. The solvent composition in the first coordination sphere of Cm(III) and Ln(III) in binary mixed solvents was also studied by measuring the luminescence lifetime. Cm(III) and Ln(III) were preferentially solvated by DMSO in DMSO-H2O, by DMF in DMF-H2O, and by H2O in MeOH-H2O over the whole range of solvent composition. The order of preferential solvation, i.e., DMSO > DMF > H2O > MeOH, correlates with the relative basicity of these solvents. The Gibbs free energy of transfer of ions from water to nonaqueous solvents was further estimated from the degree of preferential solvation. (orig.)

  10. Analysis of novel silicon and III-V solar cells by simulation and experiment; Analyse neuartiger Silizium- und III-V-Solarzellen mittels Simulation und Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Hermle, Martin

    2008-11-27

    This work presents various simulation studies of silicon and III-V solar cells. For standard silicon solar cells, one of the critical parameters for good performance is the rear-side recombination velocity. The optical and electrical differences between the different cell structures were determined, and the optical differences and the effective recombination velocity S_back of the different rear-side structures for 1 Ohm cm material were extracted. Besides standard silicon solar cells, back-junction silicon solar cells were investigated; in particular, the influence of the front surface field and of electrical shading due to the rear side was examined. In the last two chapters, III-V solar cells are analysed. For the simulation of III-V multi-junction solar cells, the simulation of the tunnel diode is the basic prerequisite. In this work, the numerical calibration of a GaAs tunnel diode was achieved by using a non-local tunnel model. Using this model, it was possible to successfully simulate a III-V tandem solar cell. The last chapter deals with an optimization of the III-V 3-junction cell for space applications, in particular the influence of the GaAs middle cell. Due to structural changes, the end-of-life efficiency was drastically increased.

  11. Robust Likelihoods for Inflationary Gravitational Waves from Maps of Cosmic Microwave Background Polarization

    Science.gov (United States)

    Switzer, Eric Ryan; Watts, Duncan J.

    2016-01-01

    The B-mode polarization of the cosmic microwave background provides a unique window into tensor perturbations from inflationary gravitational waves. Survey effects complicate the estimation and description of the power spectrum on the largest angular scales. The pixel-space likelihood yields parameter distributions without the power spectrum as an intermediate step, but it does not have the large suite of tests available to power spectral methods. Searches for primordial B-modes must rigorously reject and rule out contamination. Many forms of contamination vary or are uncorrelated across epochs, frequencies, surveys, or other data treatment subsets. The cross power and the power spectrum of the difference of subset maps provide approaches to reject and isolate excess variance. We develop an analogous joint pixel-space likelihood. Contamination not modeled in the likelihood produces parameter-dependent bias and complicates the interpretation of the difference map. We describe a null test that consistently weights the difference map. Excess variance should either be explicitly modeled in the covariance or be removed through reprocessing the data.
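
    A minimal sketch of the pixel-space ingredient (illustrative only): the Gaussian log-likelihood of a map d with model covariance C = S + N; the joint subset likelihood follows by stacking subset maps and placing the common signal covariance in the off-diagonal blocks.

      import numpy as np

      def map_loglike(d, C):
          """-(1/2) (d^T C^{-1} d + ln det C + n ln 2*pi) for an n-pixel map."""
          n = d.size
          _, logdet = np.linalg.slogdet(C)
          return -0.5 * (d @ np.linalg.solve(C, d) + logdet + n * np.log(2 * np.pi))

      n = 50
      r = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
      C = 0.5 * np.exp(-r / 5.0) + np.eye(n)          # toy signal + noise covariance
      d = np.linalg.cholesky(C) @ np.random.default_rng(6).normal(size=n)
      print(map_loglike(d, C))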

  12. Cost analysis of surgically treated pressure sores stage III and IV.

    Science.gov (United States)

    Filius, A; Damen, T H C; Schuijer-Maaskant, K P; Polinder, S; Hovius, S E R; Walbeehm, E T

    2013-11-01

    Health-care costs associated with pressure sores are significant and their financial burden is likely to increase even further. The aim of this study was to analyse the direct medical costs of hospital care for surgical treatment of pressure sores stage III and IV. We performed a retrospective chart study of patients who were surgically treated for stage III and IV pressure sores between 2007 and 2010. Volumes of health-care use were obtained for all patients and direct medical costs were subsequently calculated. In addition, we evaluated the effect of location and number of pressure sores on total costs. A total of 52 cases were identified. Average direct medical costs in hospital were €20,957 for the surgical treatment of pressure sores stage III or IV; average direct medical costs for patients with one pressure sore on an extremity (group 1, n = 5) were €30,286, €10,113 for patients with one pressure sore on the trunk (group 2, n = 32) and €40,882 for patients with multiple pressure sores (group 3, n = 15). The additional costs for patients in group 1 and group 3 compared to group 2 were primarily due to longer hospitalisation. The average direct medical costs for surgical treatment of pressure sores stage III and IV were high. Large differences in costs were related to the location and number of pressure sores. Insight into the distribution of these costs allows identification of high-risk patients and enables the development of specific cost-reducing measures. Copyright © 2013 British Association of Plastic, Reconstructive and Aesthetic Surgeons. Published by Elsevier Ltd. All rights reserved.

  13. Luminescence studies of Sm(III) and Cm(III) complexes in NaSCN/DHDECMP extraction systems

    CERN Document Server

    Chung, D Y; Kimura, T

    1999-01-01

    Laser-induced fluorescence (LIF) studies of Sm(III) and Cm(III) complexes in the NaSCN/DHDECMP solvent extraction system were carried out. Luminescence lifetimes were measured to determine the number of water molecules coordinated to Sm(III), Tb(III), Dy(III), and Cm(III) in the sodium thiocyanate solution and in the DHDECMP phase. The hydration numbers of Sm(III), Tb(III), Dy(III), and Cm(III) in the sodium thiocyanate solution decreased linearly with increasing sodium thiocyanate concentration. The hydration numbers of Sm(III), Dy(III), and Cm(III) in the DHDECMP phase decreased with increasing sodium thiocyanate concentration. The water molecules in the inner coordination sphere of Sm(III) and Dy(III) extracted into the DHDECMP were not completely removed at low sodium thiocyanate concentration but decreased with increasing sodium thiocyanate concentration. However, in the case of Cm(III) extracted into the DHDECMP phase from the sodium thiocyanate solution, there was no water in the inner coordination sphere...

  14. A simplification of the likelihood ratio test statistic for testing ...

    African Journals Online (AJOL)

    The traditional likelihood ratio test statistic for testing hypotheses about the goodness of fit of multinomial probabilities in one-, two- and multi-dimensional contingency tables was simplified. Advantageously, using the simplified version of the statistic to test the null hypothesis is easier and faster because calculating the expected ...
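
    For reference, the traditional statistic itself is short to compute (a generic sketch, not the paper's simplified version):

      import numpy as np
      from scipy.stats import chi2

      def g_statistic(observed, expected):
          """Likelihood ratio statistic G = 2 * sum O * ln(O / E),
          asymptotically chi-square under the null."""
          o = np.asarray(observed, float)
          e = np.asarray(expected, float)
          m = o > 0                          # 0 * ln 0 = 0 by convention
          return 2.0 * np.sum(o[m] * np.log(o[m] / e[m]))

      obs, exp = [18, 55, 27], [20, 50, 30]
      g = g_statistic(obs, exp)
      print(g, chi2.sf(g, df=len(obs) - 1))  # statistic and p-value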

  15. Impact of Fe(III)-OM complexes and Fe(III) polymerization on SOM pools reactivity under different land uses

    Science.gov (United States)

    Giannetta, B.; Plaza, C.; Zaccone, C.; Siebecker, M. G.; Rovira, P.; Vischetti, C.; Sparks, D. L.

    2017-12-01

    Soil organic matter (SOM) protection and long-term accumulation are controlled by adsorption to mineral surfaces in different ways, depending on its molecular structure and pedo-climatic conditions. Iron (Fe) oxides are known to be key regulators of the soil carbon (C) cycle, and Fe speciation in soils is highly dependent on environmental conditions and chemical interactions with SOM. However, the molecular structure and hydrolysis of Fe species formed in association with SOM are still poorly described. We hypothesize the existence of two pools of Fe which interact with SOM: mononuclear Fe(III)-SOM complexes and precipitated Fe(III) hydroxides. To verify our hypothesis, we investigated the interactions between Fe(III) and physically isolated soil fractions by means of batch experiments at pH 7. Specifically, we examined the fine silt plus clay (FSi+C) fraction, obtained by ultrasonic dispersion and wet sieving. The soil samples spanned several land uses, including coniferous forest (CFS), grassland (GS), technosols (TS) and agricultural (AS) soils. Solid-phase products and supernatants were analyzed for C and Fe content. X-ray diffraction (XRD) and Brunauer-Emmett-Teller (BET) analyses were also performed. Attenuated total reflectance Fourier transform infrared spectroscopy (ATR-FTIR) was used to assess the main C functional groups involved in C complexation and desorption experiments. Preliminary linear combination fitting (LCF) of Fe K-edge extended X-ray absorption fine structure (EXAFS) spectra suggested the formation of ferrihydrite-like polymeric Fe(III) oxides in reacted CFS and GS samples, with higher C and Fe concentrations. Conversely, mononuclear Fe(III)-OM complexes dominated the speciation for TS and AS samples, characterized by lower C and Fe concentrations, inhibiting the hydrolysis and polymerization of Fe(III). This approach will help reveal the mechanisms by which SOM pools can control Fe(III) speciation, and will elucidate how both Fe(III

  16. A Game Theoretical Approach to Hacktivism: Is Attack Likelihood a Product of Risks and Payoffs?

    Science.gov (United States)

    Bodford, Jessica E; Kwan, Virginia S Y

    2018-02-01

    The current study examines hacktivism (i.e., hacking to convey a moral, ethical, or social justice message) through a general game theoretic framework-that is, as a product of costs and benefits. Given the inherent risk of carrying out a hacktivist attack (e.g., legal action, imprisonment), it would be rational for the user to weigh these risks against perceived benefits of carrying out the attack. As such, we examined computer science students' estimations of risks, payoffs, and attack likelihood through a game theoretic design. Furthermore, this study aims at constructing a descriptive profile of potential hacktivists, exploring two predicted covariates of attack decision making, namely, peer prevalence of hacking and sex differences. Contrary to expectations, results suggest that participants' estimations of attack likelihood stemmed solely from expected payoffs, rather than subjective risks. Peer prevalence significantly predicted increased payoffs and attack likelihood, suggesting an underlying descriptive norm in social networks. Notably, we observed no sex differences in the decision to attack, nor in the factors predicting attack likelihood. Implications for policymakers and the understanding and prevention of hacktivism are discussed, as are the possible ramifications of widely communicated payoffs over potential risks in hacking communities.

  17. Likelihood-Based Inference of B Cell Clonal Families.

    Directory of Open Access Journals (Sweden)

    Duncan K Ralph

    2016-10-01

    The human immune system depends on a highly diverse collection of antibody-making B cells. B cell receptor sequence diversity is generated by a random recombination process called "rearrangement" forming progenitor B cells, then a Darwinian process of lineage diversification and selection called "affinity maturation." The resulting receptors can be sequenced in high throughput for research and diagnostics. Such a collection of sequences contains a mixture of various lineages, each of which may be quite numerous, or may consist of only a single member. As a step to understanding the process and result of this diversification, one may wish to reconstruct lineage membership, i.e. to cluster sampled sequences according to which came from the same rearrangement events. We call this clustering problem "clonal family inference." In this paper we describe and validate a likelihood-based framework for clonal family inference based on a multi-hidden Markov Model (multi-HMM) framework for B cell receptor sequences. We describe an agglomerative algorithm to find a maximum likelihood clustering, two approximate algorithms with various trade-offs of speed versus accuracy, and a third, fast algorithm for finding specific lineages. We show that under simulation these algorithms greatly improve upon existing clonal family inference methods, and that they also give significantly different clusters than previous methods when applied to two real data sets.

  18. Association of Eu(III) and Cm(III) with Bacillus subtilis and Halobacterium salinarum

    International Nuclear Information System (INIS)

    Ozaki, Takuo; Kimura, Takaumi; Ohnuki, Toshihiko; Yoshida, Zenko

    2002-01-01

    The adsorption behavior of Eu(III) and Cm(III) by Bacillus subtilis and Halobacterium salinarum was investigated. Both microorganisms showed almost identical pH dependence of the distribution ratio (K_d) of the metals examined, i.e., the K_d of Eu(III) and Cm(III) increased with an increase of pH. The coordination state of Eu(III) adsorbed on the microorganisms was studied by time-resolved laser-induced fluorescence spectroscopy (TRLFS). The coordination states of Eu(III) adsorbed on B. subtilis and H. salinarum had different characteristics: H. salinarum exhibited more outer-spherical interaction with Eu(III) than B. subtilis. (author)

  19. Quantitative risk analysis of the pipeline GASDUC III - solutions

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Edmilson P.; Bettoni, Izabel Cristina [PETROBRAS, Rio de Janeiro, RJ (Brazil)

    2009-07-01

    In this work, the quantitative risk analysis for the external public of the Cabiunas-REDUC pipeline (GASDUC III), 180 km long and linking the municipalities of Macae and Duque de Caxias - RJ, was performed by the companies PETROBRAS and ITSEMAP do Brasil. The pipeline has a large diameter (38 inches) and a high operating pressure (100 kgf/cm²), and transports natural gas through several densely populated areas. Initially, the individual risk contours were calculated without considering mitigating measures, obtaining as a result an individual risk contour with frequencies of 1×10⁻⁶ per year involving sensitive occupations, which was therefore considered unacceptable when compared with the INEA criterion. The societal risk was calculated for eight densely populated areas, with the respective FN-curves situated below the advised limit established by INEA, except for two areas that required the proposal of additional mitigating measures for the reduction of societal risk. Regarding societal risk, the FN-curve should be below the advised limit presented in the Technical Instruction of INEA. The individual and societal risks were reassessed incorporating some mitigating measures, the results fell below the advised limits established by INEA, and PETROBRAS obtained the license for installation of the pipeline. (author)

  20. Revision and extension to the analysis of the third spectrum of bromine: Br III

    Science.gov (United States)

    Jabeen, S.; Tauheed, A.

    2015-03-01

    The spectrum of doubly ionized bromine (Br III) has been investigated in the vacuum ultraviolet wavelength region. Br²⁺ is an As-like ion with the ground configuration 4s²4p³, thus a 3-electron system possessing a complex structure. The theoretical prediction was made using Cowan's quasi-relativistic Hartree-Fock code with superposition of configurations, involving the 4s4p⁴, 4s²4p²(4d+5d+6d+5s+6s+7s), 4s4p³(5p+4f), 4p⁴(4d+5s), 4s²4p5s5p, 4s4p²(4d²+5s²) and 4s4p²4f² configurations for the even parity matrix and the 4s²4p³ and 4s²4p²(5p+6p+4f+5f) configurations for the odd parity matrix. Several previously reported levels of Br III have been revised, and new configurations have been added to the analysis. The spectrum used for this work was recorded on a 3-m normal incidence spectrograph in the wavelength region of 400-1326 Å using a triggered vacuum spark source. One hundred and two energy levels belonging to the 4s²4p³, 4s4p⁴ and 4s²4p²(4d+5d+6d+5s+6s+7s) configurations have been established, eighty-six being new. Two hundred and seventy-eight lines have been identified in this spectrum. The accuracy of our wavelength measurements for sharp and unblended lines is ±0.006 Å. The ionization potential of Br III was found to be 281,250±100 cm⁻¹ (34.870±0.012 eV).

  1. A biclustering algorithm for binary matrices based on penalized Bernoulli likelihood

    KAUST Repository

    Lee, Seokho; Huang, Jianhua Z.

    2013-01-01

    We propose a new biclustering method for binary data matrices using the maximum penalized Bernoulli likelihood estimation. Our method applies a multi-layer model defined on the logits of the success probabilities, where each layer represents a

  2. Coalescent-based species tree inference from gene tree topologies under incomplete lineage sorting by maximum likelihood.

    Science.gov (United States)

    Wu, Yufeng

    2012-03-01

    Incomplete lineage sorting can cause incongruence between the phylogenetic history of genes (the gene tree) and that of the species (the species tree), which can complicate the inference of phylogenies. In this article, I present a new coalescent-based algorithm for species tree inference with maximum likelihood. I first describe an improved method for computing the probability of a gene tree topology given a species tree, which is much faster than an existing algorithm by Degnan and Salter (2005). Based on this method, I develop a practical algorithm that takes a set of gene tree topologies and infers species trees with maximum likelihood. This algorithm searches for the best species tree by starting from initial species trees and performing heuristic search to obtain better trees with higher likelihood. This algorithm, called STELLS (which stands for Species Tree InfErence with Likelihood for Lineage Sorting), has been implemented in a program that is downloadable from the author's web page. The simulation results show that the STELLS algorithm is more accurate than an existing maximum likelihood method for many datasets, especially when there is noise in gene trees. I also show that the STELLS algorithm is efficient and can be applied to real biological datasets.

  3. Modified Moment, Maximum Likelihood and Percentile Estimators for the Parameters of the Power Function Distribution

    Directory of Open Access Journals (Sweden)

    Azam Zaka

    2014-10-01

    This paper is concerned with modifications of the maximum likelihood, moments and percentile estimators of the two-parameter power function distribution. Sampling behavior of the estimators is indicated by a Monte Carlo simulation. For some combinations of parameter values, some of the modified estimators appear better than the traditional maximum likelihood, moments and percentile estimators with respect to bias, mean square error and total deviation.
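
    For the baseline estimators the computations are brief (a sketch with invented parameter values, not the paper's exact design): for f(x) = γx^(γ-1)/β^γ on (0, β), the MLEs are β̂ = max(x) and γ̂ = n / Σ ln(β̂/x), and their bias and MSE can be checked by Monte Carlo.

      import numpy as np

      rng = np.random.default_rng(2)

      def mle_power(x):
          """MLE for the two-parameter power function distribution."""
          b = x.max()
          g = len(x) / np.sum(np.log(b / x))
          return g, b

      g_true, b_true, n = 2.0, 5.0, 30
      est = np.array([mle_power(b_true * rng.random(n) ** (1 / g_true))
                      for _ in range(5000)])       # inverse-CDF sampling
      print("bias:", est.mean(axis=0) - [g_true, b_true])
      print("mse :", ((est - [g_true, b_true]) ** 2).mean(axis=0))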

  4. Failed refutations: further comments on parsimony and likelihood methods and their relationship to Popper's degree of corroboration.

    Science.gov (United States)

    de Queiroz, Kevin; Poe, Steven

    2003-06-01

    Kluge's (2001, Syst. Biol. 50:322-330) continued arguments that phylogenetic methods based on the statistical principle of likelihood are incompatible with the philosophy of science described by Karl Popper are based on false premises related to Kluge's misrepresentations of Popper's philosophy. Contrary to Kluge's conjectures, likelihood methods are not inherently verificationist; they do not treat every instance of a hypothesis as confirmation of that hypothesis. The historical nature of phylogeny does not preclude phylogenetic hypotheses from being evaluated using the probability of evidence. The low absolute probabilities of hypotheses are irrelevant to the correct interpretation of Popper's concept termed degree of corroboration, which is defined entirely in terms of relative probabilities. Popper did not advocate minimizing background knowledge; in any case, the background knowledge of both parsimony and likelihood methods consists of the general assumption of descent with modification and additional assumptions that are deterministic, concerning which tree is considered most highly corroborated. Although parsimony methods do not assume (in the sense of entailing) that homoplasy is rare, they do assume (in the sense of requiring to obtain a correct phylogenetic inference) certain things about patterns of homoplasy. Both parsimony and likelihood methods assume (in the sense of implying by the manner in which they operate) various things about evolutionary processes, although violation of those assumptions does not always cause the methods to yield incorrect phylogenetic inferences. Test severity is increased by sampling additional relevant characters rather than by character reanalysis, although either interpretation is compatible with the use of phylogenetic likelihood methods. Neither parsimony nor likelihood methods assess test severity (critical evidence) when used to identify a most highly corroborated tree(s) based on a single method or model and a

  5. Relationship between tumor gene expression and recurrence in four independent studies of patients with stage II/III colon cancer treated with surgery alone or surgery plus adjuvant fluorouracil plus leucovorin.

    Science.gov (United States)

    O'Connell, Michael J; Lavery, Ian; Yothers, Greg; Paik, Soonmyung; Clark-Langone, Kim M; Lopatin, Margarita; Watson, Drew; Baehner, Frederick L; Shak, Steven; Baker, Joffre; Cowens, J Wayne; Wolmark, Norman

    2010-09-01

    These studies were conducted to determine the relationship between quantitative tumor gene expression and risk of cancer recurrence in patients with stage II or III colon cancer treated with surgery alone or surgery plus fluorouracil (FU) and leucovorin (LV), and to develop multigene algorithms to quantify the risk of recurrence as well as the likelihood of differential treatment benefit of FU/LV adjuvant chemotherapy for individual patients. We performed quantitative reverse transcription polymerase chain reaction (RT-qPCR) on RNA extracted from fixed, paraffin-embedded (FPE) tumor blocks from patients with stage II or III colon cancer who were treated with surgery alone (n = 270 from National Surgical Adjuvant Breast and Bowel Project [NSABP] C-01/C-02 and n = 765 from Cleveland Clinic [CC]) or surgery plus FU/LV (n = 308 from NSABP C-04 and n = 508 from NSABP C-06). Overall, 761 candidate genes were studied in C-01/C-02 and C-04, and a subset of 375 genes was studied in CC/C-06. A combined analysis of the four studies identified 48 genes significantly associated with risk of recurrence and 66 genes significantly associated with FU/LV benefit (with four genes in common). Seven recurrence-risk genes, six FU/LV-benefit genes, and five reference genes were selected, and algorithms were developed to identify groups of patients with low, intermediate, and high likelihood of recurrence and benefit from FU/LV. RT-qPCR of FPE colon cancer tissue applied to four large independent populations has been used to develop multigene algorithms for estimating recurrence risk and benefit from FU/LV. These algorithms are being independently validated, and their clinical utility is being evaluated in the Quick and Simple and Reliable (QUASAR) study.

  6. Statistical Bias in Maximum Likelihood Estimators of Item Parameters.

    Science.gov (United States)

    1982-04-01

    (The abstract in this record is OCR-garbled; the legible fragments indicate only that the report examines bias in the maximum likelihood estimators of item parameters.)

  7. Performances of the likelihood-ratio classifier based on different data modelings

    NARCIS (Netherlands)

    Chen, C.; Veldhuis, Raymond N.J.

    2008-01-01

    The classical likelihood-ratio classifier easily collapses in many biometric applications, especially with independent training and test subjects. The reason lies in the inaccurate estimation of the underlying user-specific feature density. Firstly, the feature density estimation suffers from
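
    For context on what such a classifier computes: the likelihood ratio scores a feature vector x by p(x | user) / p(x | background) and accepts when the score exceeds a threshold. A minimal sketch with diagonal-Gaussian density models follows; the Gaussian choice and all numbers are illustrative assumptions, not the data modelings compared in the paper, and the user-specific parameters are exactly the quantities whose estimation the abstract identifies as fragile.

    ```python
    import math

    def log_gauss_diag(x, mean, var):
        # Log-density of a diagonal-covariance Gaussian.
        return sum(-0.5 * (math.log(2.0 * math.pi * v) + (xi - m) ** 2 / v)
                   for xi, m, v in zip(x, mean, var))

    def log_likelihood_ratio(x, user, background):
        # log LR = log p(x | user) - log p(x | background).
        return log_gauss_diag(x, *user) - log_gauss_diag(x, *background)

    # Toy usage with hypothetical estimated densities (mean, variance per dim).
    user = ([0.8, 1.1], [0.05, 0.04])      # estimated from the user's enrolment data
    background = ([0.0, 0.0], [1.0, 1.0])  # estimated from the population
    score = log_likelihood_ratio([0.7, 1.0], user, background)
    accept = score > 2.0                   # threshold set for a target error trade-off
    ```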

  8. Sorption of small amounts of europium(III) on iron(III) hydroxide and oxide

    International Nuclear Information System (INIS)

    Music, S.; Gessner, M.; Wolf, R.H.H.

    1979-01-01

    The sorption of small amounts of europium(III) on iron(III) hydroxide and oxide has been studied as a function of pH. The mechanism of sorption is discussed. Optimum conditions have been found for the preconcentration of small or trace amounts of europium(III) by iron(III) hydroxide and oxide. The influence of complexing agents (EDTA, oxalate, tartrate and 5-sulfosalicylic acid) on the sorption of small amounts of europium(III) on iron(III) oxide has also been studied. (author)

  9. Synthesis, Characterization and Antibacterial Studies of N-(Benzothiazol-2-yl)-4-chlorobenzenesulphonamide and Its Neodymium(III) and Thallium(III) Complexes.

    Science.gov (United States)

    Obasi, Lawrence Nnamdi; Oruma, Uchechukwu Susan; Al-Swaidan, Ibrahim Abdulrazak; Ramasami, Ponnadurai; Ezeorah, Chigozie Julius; Ochonogor, Alfred Ezinna

    2017-02-22

    N-(Benzothiazol-2-yl)-4-chlorobenzenesulphonamide (NBTCS) was synthesized by a condensation reaction of 4-chlorobenzenesulphonyl chloride and 2-aminobenzothiazole in acetone under reflux. Neodymium(III) and thallium(III) complexes of the ligand were also synthesized. Both ligand and metal complexes were characterized using UV-Vis, IR, ¹H- and ¹³C-NMR spectroscopies, elemental analysis and molar conductance measurement. IR studies revealed that the ligand is tridentate and coordinates to the metal ions through the nitrogen and oxygen atoms of the sulphonamide group and the nitrogen atom attached to the benzothiazole ring. The neodymium(III) complex displays a coordination number of eight while the thallium(III) complex displays a coordination number of six. The ligand and its complexes were screened in vitro for their antibacterial activities against Escherichia coli strains (E. coli 6 and E. coli 13), Proteus species, Staphylococcus aureus and Pseudomonas aeruginosa using the agar well diffusion technique. The synthesized compounds were found to be more active against the microorganisms screened relative to ciprofloxacin, gentamicin and co-trimoxazole.

  10. Rapid identification of Enterobacter hormaechei and Enterobacter cloacae genetic cluster III.

    Science.gov (United States)

    Ohad, S; Block, C; Kravitz, V; Farber, A; Pilo, S; Breuer, R; Rorman, E

    2014-05-01

    Enterobacter cloacae complex bacteria are of both clinical and environmental importance. Phenotypic methods are unable to distinguish between some of the species in this complex, which often renders their identification incomplete. The goal of this study was to develop molecular assays to identify Enterobacter hormaechei and Ent. cloacae genetic cluster III, which are relatively frequently encountered in clinical material. The molecular assays developed in this study are based on qPCR technology and serve to identify both Ent. hormaechei and Ent. cloacae genetic cluster III. qPCR results were compared to hsp60 sequence analysis. Most clinical isolates were assigned to Ent. hormaechei subsp. steigerwaltii and Ent. cloacae genetic cluster III. The latter was proportionately more frequently isolated from bloodstream infections than from other material (P < 0.05). The qPCR assays detecting Ent. hormaechei and Ent. cloacae genetic cluster III demonstrated high sensitivity and specificity. The presented qPCR assays allow accurate and rapid identification of clinical isolates of the Ent. cloacae complex. The improved identifications obtained can specifically assist analysis of Ent. hormaechei and Ent. cloacae genetic cluster III in nosocomial outbreaks and can promote rapid environmental monitoring. An association was observed between Ent. cloacae cluster III and systemic infection that deserves further attention. © 2014 The Society for Applied Microbiology.

  11. Disproportionation of hydroxylamine by water-soluble iron(III) porphyrinate compounds.

    Science.gov (United States)

    Bari, Sara E; Amorebieta, Valentín T; Gutiérrez, María M; Olabe, José A; Doctorovich, Fabio

    2010-01-01

    The reactions of hydroxylamine (HA) with several water-soluble iron(III) porphyrinate compounds, namely iron(III) meso-tetrakis-(N-ethylpyridinium-2yl)-porphyrinate ([Fe(III)(TEPyP)](5+)), iron(III) meso-tetrakis-(4-sulphonatophenyl)-porphyrinate ([Fe(III)(TPPS)](3-)), and microperoxidase 11 ([Fe(III)(MP11)]) were studied for different [Fe(III)(Porph)]/[HA] ratios, under anaerobic conditions at neutral pH. Efficient catalytic processes leading to the disproportionation of HA by these iron(III) porphyrinates were evidenced for the first time. As a common feature, only N(2) and N(2)O were found as gaseous, nitrogen-containing oxidation products, while NH(3) was the unique reduced species detected. Different N(2)/N(2)O ratios obtained with these three porphyrinates strongly suggest distinctive mechanistic scenarios: while [Fe(III)(TEPyP)](5+) and [Fe(III)(MP11)] formed unknown steady-state porphyrinic intermediates in the presence of HA, [Fe(III)(TPPS)](3-) led to the well characterized soluble intermediate, [Fe(II)(TPPS)NO](4-). Free-radical formation was only evidenced for [Fe(III)(TEPyP)](5+), as a consequence of a metal centered reduction. We discuss the catalytic pathways of HA disproportionation on the basis of the distribution of gaseous products, free radicals formation, the nature of porphyrinic intermediates, the Fe(II)/Fe(III) redox potential, the coordinating capabilities of each complex, and the kinetic analysis. The absence of NO(2)(-) revealed either that no HAO-like activity was operative under our reaction conditions, or that NO(2)(-), if formed, was consumed in the reaction milieu.

  12. Isolation, analysis and properties of three bradykinin-potentiating peptides (BPP-II, BPP-III, and BPP-V) from Bothrops neuwiedi venom.

    Science.gov (United States)

    Ferreira, L A; Galle, A; Raida, M; Schrader, M; Lebrun, I; Habermehl, G

    1998-04-01

    In the course of systematic investigations on low-molecular-weight compounds from the venom of Crotalidae and Viperidae, we have isolated and characterized at least three bradykinin-potentiating peptides (BPP-II, BPP-III, and BPP-V) from Bothrops neuwiedi venom by gel filtration on Sephadex G-25 M and Sephadex G-10, followed by HPLC. The peptides showed bradykinin-potentiating action on the isolated guinea-pig ileum (BPP-V being more active than BPP-II and BPP-III) and on rat arterial blood pressure, as well as a relevant angiotensin-converting enzyme (ACE) competitive inhibiting activity. The kinetic studies showed a Ki of the order of 9.7 x 10(-3) microM for BPP-II, 7 x 10(-3) microM for BPP-III, and 3.3 x 10(-3) microM for BPP-V. The amino acid sequence of BPP-III has been determined to be pGlu-Gly-Gly-Trp-Pro-Arg-Pro-Gly-Pro-Glu-Ile-Pro-Pro, and the amino acid compositions of BPP-II and BPP-V by amino acid analysis were 2Glu-2Gly-1Arg-4Pro-1Ile and 2Glu-2Gly-1Ser-3Pro-2Val-1Ile, with molecular weights of 1372, 1046, and 1078, respectively.

  13. SIMMER-III analytic thermophysical property model

    International Nuclear Information System (INIS)

    Morita, K.; Tobita, Y.; Kondo, Sa.; Fischer, E.A.

    1999-05-01

    An analytic thermophysical property model using general function forms is developed for a reactor safety analysis code, SIMMER-III. The function forms are designed to represent correct behavior of properties of reactor-core materials over wide temperature ranges, especially for the thermal conductivity and the viscosity near the critical point. The most up-to-date and reliable sources for uranium dioxide, mixed-oxide fuel, stainless steel, and sodium available at present are used to determine parameters in the proposed functions. This model is also designed to be consistent with a SIMMER-III model on thermodynamic properties and equations of state for reactor-core materials. (author)
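
    The report's actual function forms and coefficients are not given in this record; purely as an illustration of what an analytic property model provides, the sketch below fits a smooth closed form to hypothetical tabulated conductivity data so that the property can then be evaluated at any temperature in the fitted range. Both the table values and the cubic form are assumptions.

    ```python
    import numpy as np

    # Hypothetical tabulated data: temperature (K) vs. thermal conductivity (W/m/K).
    T_tab = np.array([600.0, 900.0, 1200.0, 1500.0, 1800.0, 2100.0])
    k_tab = np.array([22.0, 24.5, 26.1, 27.0, 27.4, 27.5])

    # Fit a low-order polynomial as the "general function form".
    coeffs = np.polyfit(T_tab, k_tab, deg=3)

    def thermal_conductivity(T):
        # Analytic evaluation at any temperature, clamped to the fitted range.
        return np.polyval(coeffs, np.clip(T, T_tab[0], T_tab[-1]))

    k_1000 = thermal_conductivity(1000.0)
    ```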

  14. Sensitivity, specificity and likelihood ratios of PCR in the diagnosis of syphilis: a systematic review and meta-analysis.

    Science.gov (United States)

    Gayet-Ageron, Angèle; Lautenschlager, Stephan; Ninet, Béatrice; Perneger, Thomas V; Combescure, Christophe

    2013-05-01

    To systematically review and estimate pooled sensitivity and specificity of the polymerase chain reaction (PCR) technique compared to recommended reference tests in the diagnosis of suspected syphilis at various stages and in various biological materials. Systematic review and meta-analysis. Search of three electronic bibliographic databases from January 1990 to January 2012 and the abstract books of five congresses specializing in the field of infectious diseases (1999-2011). Search key terms included syphilis, Treponema pallidum or neurosyphilis and molecular amplification, polymerase chain reaction or PCR. We included studies that used both reference tests to diagnose syphilis plus PCR, and we present pooled estimates of PCR sensitivity, specificity, and positive and negative likelihood ratios (LR) per syphilis stage and biological material. Of 1160 identified abstracts, 69 were selected and 46 studies used adequate reference tests to diagnose syphilis. Sensitivity was highest in the swabs from primary genital or anal chancres (78.4%; 95% CI: 68.2-86.0) and in blood from neonates with congenital syphilis (83.0%; 55.0-95.2). Most pooled specificities were ∼95%, except those in blood. A positive PCR is highly informative with a positive LR around 20 in ulcers or skin lesions. In the blood, the positive LR was markedly lower, so a positive PCR is chiefly informative for syphilis diagnosis in lesions. PCR is a useful diagnostic tool in ulcers, especially when serology is still negative and in medical settings with a high prevalence of syphilis.
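
    The likelihood ratios quoted above follow directly from sensitivity and specificity; the arithmetic is reproduced below with the pooled chancre-swab sensitivity from the abstract and an assumed specificity of 0.95 (the abstract only says most pooled specificities were ∼95%).

    ```python
    def diagnostic_lrs(sensitivity, specificity):
        # LR+ = sens / (1 - spec); LR- = (1 - sens) / spec.
        return sensitivity / (1.0 - specificity), (1.0 - sensitivity) / specificity

    # Pooled chancre-swab sensitivity 78.4%; specificity assumed to be 95%.
    lr_pos, lr_neg = diagnostic_lrs(0.784, 0.95)
    print(f"LR+ = {lr_pos:.1f}, LR- = {lr_neg:.2f}")  # LR+ ~ 15.7, LR- ~ 0.23
    ```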

  15. Likelihood analysis of the sub-GUT MSSM in light of LHC 13-TeV data

    Energy Technology Data Exchange (ETDEWEB)

    Costa, J.C.; Buchmueller, O.; Citron, M.; Richards, A. [Imperial College, High Energy Physics Group, Blackett Laboratory, London (United Kingdom); Bagnaschi, E.; Weiglein, G. [DESY, Hamburg (Germany); Sakurai, K. [University of Warsaw, Faculty of Physics, Institute of Theoretical Physics, Warsaw (Poland); Borsato, M.; Lucio, M.; Santos, D.M. [Universidade de Santiago de Compostela, Instituto Galego de Fisica de Altas Enerxias, Santiago de Compostela (Spain); Roeck, A. de [CERN, Experimental Physics Department, Geneva (Switzerland); Antwerp University, Wilrijk (Belgium); Dolan, M.J. [University of Melbourne, School of Physics, ARC Centre of Excellence for Particle Physics at the Terascale, Parkville (Australia); Ellis, J.R. [King's College London, Theoretical Particle Physics and Cosmology Group, Department of Physics, London (United Kingdom); National Institute of Chemical Physics and Biophysics, Tallinn (Estonia); CERN, Theoretical Physics Department, Geneva (Switzerland); Flaecher, H. [University of Bristol, H.H. Wills Physics Laboratory, Bristol (United Kingdom); Heinemeyer, S. [Campus of International Excellence UAM + CSIC, Madrid (Spain); Instituto de Fisica Teorica UAM-CSIC, Madrid (Spain); Instituto de Fisica de Cantabria (CSIC-UC), Santander (Spain); Olive, K.A. [University of Minnesota, School of Physics and Astronomy, William I. Fine Theoretical Physics Institute, Minneapolis, MN (United States)

    2018-02-15

    We describe a likelihood analysis using MasterCode of variants of the MSSM in which the soft supersymmetry-breaking parameters are assumed to have universal values at some scale M{sub in} below the supersymmetric grand unification scale M{sub GUT}, as can occur in mirage mediation and other models. In addition to M{sub in}, such 'sub-GUT' models have the 4 parameters of the CMSSM, namely a common gaugino mass m{sub 1/2}, a common soft supersymmetry-breaking scalar mass m{sub 0}, a common trilinear mixing parameter A and the ratio of MSSM Higgs vevs tan β, assuming that the Higgs mixing parameter μ > 0. We take into account constraints on strongly- and electroweakly-interacting sparticles from ≃ 36/fb of LHC data at 13 TeV and the LUX and 2017 PICO, XENON1T and PandaX-II searches for dark matter scattering, in addition to the previous LHC and dark matter constraints as well as full sets of flavour and electroweak constraints. We find a preference for M{sub in} ≃ 10{sup 5} to 10{sup 9} GeV, with M{sub in} ≃ M{sub GUT} disfavoured by Δχ{sup 2} ≃ 3 due to the BR(B{sub s,d} → μ{sup +}μ{sup -}) constraint. The lower limits on strongly-interacting sparticles are largely determined by LHC searches, and similar to those in the CMSSM. We find a preference for the LSP to be a Bino or Higgsino with m{sub χ{sup 0}{sub 1}} ≃ 1 TeV, with annihilation via heavy Higgs bosons H/A and stop coannihilation, or chargino coannihilation, bringing the cold dark matter density into the cosmological range. We find that spin-independent dark matter scattering is likely to be within reach of the planned LUX-Zeplin and XENONnT experiments. We probe the impact of the (g-2){sub μ} constraint, finding similar results whether or not it is included. (orig.)
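
    A MasterCode-style global fit is far beyond a snippet, but the profile-likelihood construction behind statements such as "disfavoured by Δχ{sup 2} ≃ 3" is simple to sketch: minimize χ{sup 2} over all other parameters at each value of the parameter of interest and quote the rise relative to the global minimum. The two-parameter χ{sup 2} surface below is purely illustrative.

    ```python
    import numpy as np

    def chi2(theta, nu):
        # Toy two-parameter chi-square surface (purely illustrative).
        return (theta - 1.0) ** 2 / 0.5 + (nu - theta) ** 2 / 2.0 + 0.1 * nu ** 2

    thetas = np.linspace(-2.0, 4.0, 241)
    nus = np.linspace(-4.0, 6.0, 401)
    grid = chi2(thetas[:, None], nus[None, :])  # chi2 over the full grid

    profile = grid.min(axis=1)                  # minimize over the nuisance nu
    delta_chi2 = profile - profile.min()        # rise relative to the best fit

    # Asymptotic 1-dof intervals: Delta chi^2 < 1 (68% CL) and < 3.84 (95% CL).
    theta_68 = thetas[delta_chi2 < 1.0]
    theta_95 = thetas[delta_chi2 < 3.84]
    ```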

  16. Simultaneous determination of exponential background and Gaussian peak functions in gamma ray scintillation spectrometers by maximum likelihood technique

    International Nuclear Information System (INIS)

    Eisler, P.; Youl, S.; Lwin, T.; Nelson, G.

    1983-01-01

    Simultaneous fitting of peak and background functions from gamma-ray spectrometry using multichannel pulse height analysis is considered. The specific case of a Gaussian peak and an exponential background is treated in detail with respect to simultaneous estimation of both functions by using a technique which incorporates the maximum likelihood method as well as a graphical method. Theoretical expressions for the standard errors of the estimates are also obtained. The technique is demonstrated for two experimental data sets. (orig.)
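
    As a minimal modern rendering of just the likelihood part of this technique, the sketch below maximizes a per-channel Poisson likelihood for a spectrum modelled as an exponential background plus a Gaussian peak, using a general-purpose minimizer rather than the paper's procedure. All numerical values are assumptions for illustration.

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import poisson

    def model(ch, amp, mu, sigma, bkg, lam):
        # Expected counts per channel: Gaussian peak plus exponential background.
        return amp * np.exp(-0.5 * ((ch - mu) / sigma) ** 2) + bkg * np.exp(-lam * ch)

    def nll(params, channels, counts):
        expected = model(channels, *params)
        if np.any(expected <= 0):
            return np.inf                      # the Poisson mean must stay positive
        return -poisson.logpmf(counts, expected).sum()

    channels = np.arange(200, dtype=float)
    rng = np.random.default_rng(0)
    true = (80.0, 100.0, 5.0, 50.0, 0.01)      # assumed "true" spectrum parameters
    counts = rng.poisson(model(channels, *true))

    start = (60.0, 95.0, 4.0, 40.0, 0.02)      # assumed starting values
    fit = minimize(nll, start, args=(channels, counts),
                   method="Nelder-Mead", options={"maxiter": 4000})
    amp, mu, sigma, bkg, lam = fit.x
    ```

    In practice one would also invert the observed-information (Hessian) matrix at the optimum to obtain standard errors, which the paper instead derives as closed-form expressions.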

  17. Stage III Melanoma in the Axilla: Patterns of Regional Recurrence After Surgery With and Without Adjuvant Radiation Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Pinkham, Mark B., E-mail: mark.pinkham@health.qld.gov.au [Department of Radiation Oncology, Princess Alexandra Hospital, Brisbane (Australia); University of Queensland, Brisbane (Australia); Foote, Matthew C. [Department of Radiation Oncology, Princess Alexandra Hospital, Brisbane (Australia); Queensland Melanoma Project, Princess Alexandra Hospital, Brisbane (Australia); Diamantina Institute, Brisbane (Australia); University of Queensland, Brisbane (Australia); Burmeister, Elizabeth [Nursing Practice Development Unit, Princess Alexandra Hospital, Brisbane (Australia); Research Centre for Clinical and Community Practice, Griffith University, Brisbane (Australia); Thomas, Janine [Queensland Melanoma Project, Princess Alexandra Hospital, Brisbane (Australia); Meakin, Janelle [Clinical Trials Research Unit, Princess Alexandra Hospital, Brisbane (Australia); Smithers, B. Mark [Queensland Melanoma Project, Princess Alexandra Hospital, Brisbane (Australia); University of Queensland, Brisbane (Australia); Burmeister, Bryan H. [Department of Radiation Oncology, Princess Alexandra Hospital, Brisbane (Australia); Queensland Melanoma Project, Princess Alexandra Hospital, Brisbane (Australia); University of Queensland, Brisbane (Australia)

    2013-07-15

    Purpose: To describe the anatomic distribution of regionally recurrent disease in patients with stage III melanoma in the axilla after curative-intent surgery with and without adjuvant radiation therapy. Methods and Materials: A single-institution, retrospective analysis of a prospective database of 277 patients undergoing curative-intent treatment for stage III melanoma in the axilla between 1992 and 2012 was completed. For patients who received radiation therapy and those who did not, patterns of regional recurrence were analyzed, and univariate analyses were performed to assess for potential factors associated with location of recurrence. Results: There were 121 patients who received adjuvant radiation therapy because their clinicopathologic features conferred a greater risk of regional recurrence. There were 156 patients who received no radiation therapy. The overall axillary control rate was 87%. There were 37 patients with regional recurrence; 17 (14%) had received adjuvant radiation therapy, and 20 (13%) had not. The likelihood of in-field nodal recurrence was significantly less in the adjuvant radiation therapy group (P=.01) and significantly greater in sites adjacent to the axilla (P=.02). Patients with high-risk clinicopathologic features who did not receive adjuvant radiation therapy also tended to experience in-field failure rather than adjacent-field failure. Conclusions: Patients who received adjuvant radiation therapy were more likely to experience recurrence in the adjacent-field regions rather than in the in-field regions. This may not simply reflect higher-risk pathology. Using these data, it may be possible to improve outcomes by reducing the number of adjacent-field recurrences after adjuvant radiation therapy.

  18. Stage III Melanoma in the Axilla: Patterns of Regional Recurrence After Surgery With and Without Adjuvant Radiation Therapy

    International Nuclear Information System (INIS)

    Pinkham, Mark B.; Foote, Matthew C.; Burmeister, Elizabeth; Thomas, Janine; Meakin, Janelle; Smithers, B. Mark; Burmeister, Bryan H.

    2013-01-01

    Purpose: To describe the anatomic distribution of regionally recurrent disease in patients with stage III melanoma in the axilla after curative-intent surgery with and without adjuvant radiation therapy. Methods and Materials: A single-institution, retrospective analysis of a prospective database of 277 patients undergoing curative-intent treatment for stage III melanoma in the axilla between 1992 and 2012 was completed. For patients who received radiation therapy and those who did not, patterns of regional recurrence were analyzed, and univariate analyses were performed to assess for potential factors associated with location of recurrence. Results: There were 121 patients who received adjuvant radiation therapy because their clinicopathologic features conferred a greater risk of regional recurrence. There were 156 patients who received no radiation therapy. The overall axillary control rate was 87%. There were 37 patients with regional recurrence; 17 (14%) had received adjuvant radiation therapy, and 20 (13%) had not. The likelihood of in-field nodal recurrence was significantly less in the adjuvant radiation therapy group (P=.01) and significantly greater in sites adjacent to the axilla (P=.02). Patients with high-risk clinicopathologic features who did not receive adjuvant radiation therapy also tended to experience in-field failure rather than adjacent-field failure. Conclusions: Patients who received adjuvant radiation therapy were more likely to experience recurrence in the adjacent-field regions rather than in the in-field regions. This may not simply reflect higher-risk pathology. Using these data, it may be possible to improve outcomes by reducing the number of adjacent-field recurrences after adjuvant radiation therapy.

  19. The gap between fatherhood and couplehood desires among Israeli gay men and estimations of their likelihood.

    Science.gov (United States)

    Shenkman, Geva

    2012-10-01

    This study examined the frequencies of the desires and likelihood estimations of Israeli gay men regarding fatherhood and couplehood, using a sample of 183 gay men aged 19-50. It follows previous research which indicated the existence of a gap in the United States with respect to fatherhood, and called for generalizability examinations in other countries and the exploration of possible explanations. As predicted, a gap was also found in Israel between fatherhood desires and their likelihood estimations, as well as between couplehood desires and their likelihood estimations. In addition, lower estimations of fatherhood likelihood were found to predict depression and to correlate with decreased subjective well-being. Possible psychosocial explanations are offered. Moreover, by mapping attitudes toward fatherhood and couplehood among Israeli gay men, the current study helps to extend our knowledge of several central human development motivations and their correlations with depression and subjective well-being in a less-studied sexual minority in a complex cultural climate. (PsycINFO Database Record (c) 2012 APA, all rights reserved).

  20. Biochemical and Structural Properties of Mouse Kynurenine Aminotransferase III

    Energy Technology Data Exchange (ETDEWEB)

    Han, Q.; Robinson, H; Cai, T; Tagle, D; Li, J

    2009-01-01

    Kynurenine aminotransferase III (KAT III) has been considered to be involved in the production of mammalian brain kynurenic acid (KYNA), which plays an important role in protecting neurons from overstimulation by excitatory neurotransmitters. The enzyme was identified based on its high sequence identity with mammalian KAT I, but its activity toward kynurenine and its structural characteristics have not been established. In this study, the biochemical and structural properties of mouse KAT III (mKAT III) were determined. Specifically, mKAT III cDNA was amplified from a mouse brain cDNA library, and its recombinant protein was expressed in an insect cell protein expression system. We established that mKAT III is able to efficiently catalyze the transamination of kynurenine to KYNA and has optimum activity under relatively basic conditions of around pH 9.0 and at relatively high temperatures of 50 to 60 °C. In addition, mKAT III is active toward a number of other amino acids. Its activity toward kynurenine is significantly decreased in the presence of methionine, histidine, glutamine, leucine, cysteine, and 3-hydroxykynurenine. Through macromolecular crystallography, we determined the mKAT III crystal structure and its structures in complex with kynurenine and glutamine. Structural analysis revealed the overall architecture of mKAT III and its cofactor binding site and active center residues. This is the first report concerning the biochemical characteristics and crystal structures of KAT III enzymes, and it provides a basis for understanding the overall physiological role of mammalian KAT III in vivo and insight into regulating the levels of endogenous KYNA through modulation of the enzyme in the mouse brain.