WorldWideScience

Sample records for high clinical likelihood

  1. Likelihood ratios: Clinical application in day-to-day practice

    Directory of Open Access Journals (Sweden)

    Parikh Rajul

    2009-01-01

    Full Text Available In this article we provide an introduction to the use of likelihood ratios in clinical ophthalmology. Likelihood ratios permit the best use of clinical test results to establish diagnoses for the individual patient. Examples and step-by-step calculations demonstrate the estimation of pretest probability, pretest odds, and calculation of posttest odds and posttest probability using likelihood ratios. The benefits and limitations of this approach are discussed.
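
    As a worked illustration of the calculation described above, the short sketch below converts a pretest probability into a posttest probability through the odds form of Bayes' theorem. The numbers and the function name are illustrative, not taken from the article.

    ```python
    def posttest_probability(pretest_prob: float, likelihood_ratio: float) -> float:
        """Posttest probability from a pretest probability and a likelihood ratio.

        Steps mirror the outline above: probability -> odds, multiply by LR, odds -> probability.
        """
        pretest_odds = pretest_prob / (1.0 - pretest_prob)
        posttest_odds = pretest_odds * likelihood_ratio
        return posttest_odds / (1.0 + posttest_odds)

    # Hypothetical example: 20% pretest probability and a positive test with LR+ = 8
    print(round(posttest_probability(0.20, 8.0), 3))  # 0.667
    ```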

  2. Constructing diagnostic likelihood: clinical decisions using subjective versus statistical probability.

    Science.gov (United States)

    Kinnear, John; Jackson, Ruth

    2017-07-01

    Although physicians are highly trained in the application of evidence-based medicine, and are assumed to make rational decisions, there is evidence that their decision making is prone to biases. One of the biases that has been shown to affect the accuracy of judgements is that of representativeness and base-rate neglect, where the saliency of a person's features leads to overestimation of their likelihood of belonging to a group. This results in the substitution of 'subjective' probability for statistical probability. This study examines clinicians' propensity to make estimations of subjective probability when presented with clinical information that is considered typical of a medical condition. The strength of the representativeness bias is tested by presenting choices in textual and graphic form. Understanding of statistical probability is also tested by omitting all clinical information. For the questions that included clinical information, 46.7% and 45.5% of clinicians made judgements of statistical probability for the textual and graphic forms, respectively. Where the question omitted clinical information, 79.9% of clinicians made a judgement consistent with statistical probability. There was a statistically significant difference in responses to the questions with and without representativeness information (χ2 (1, n=254)=54.45, p<0.001), indicating that the presence of typical clinical information markedly reduced the proportion of judgements consistent with statistical probability. One of the causes for this representativeness bias may be the way clinical medicine is taught, where stereotypic presentations are emphasised in diagnostic decision making.

  3. Supplementary Material for: High-Order Composite Likelihood Inference for Max-Stable Distributions and Processes

    KAUST Repository

    Castruccio, Stefano

    2016-01-01

    In multivariate or spatial extremes, inference for max-stable processes observed at a large collection of points is a very challenging problem and current approaches typically rely on less expensive composite likelihoods constructed from small subsets of data. In this work, we explore the limits of modern state-of-the-art computational facilities to perform full likelihood inference and to efficiently evaluate high-order composite likelihoods. With extensive simulations, we assess the loss of information of composite likelihood estimators with respect to a full likelihood approach for some widely used multivariate or spatial extreme models, we discuss how to choose composite likelihood truncation to improve the efficiency, and we also provide recommendations for practitioners. This article has supplementary material online.

  4. High-order Composite Likelihood Inference for Max-Stable Distributions and Processes

    KAUST Repository

    Castruccio, Stefano

    2015-09-29

    In multivariate or spatial extremes, inference for max-stable processes observed at a large collection of locations is a very challenging problem in computational statistics, and current approaches typically rely on less expensive composite likelihoods constructed from small subsets of data. In this work, we explore the limits of modern state-of-the-art computational facilities to perform full likelihood inference and to efficiently evaluate high-order composite likelihoods. With extensive simulations, we assess the loss of information of composite likelihood estimators with respect to a full likelihood approach for some widely-used multivariate or spatial extreme models, we discuss how to choose composite likelihood truncation to improve the efficiency, and we also provide recommendations for practitioners. This article has supplementary material online.
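
    For orientation, the generic pairwise (second-order) composite log-likelihood has the form below; higher-order versions replace pairs with larger tuples of sites. This is the textbook definition, not necessarily the exact truncation scheme studied in the article.

    ```latex
    \ell_C(\theta; y) \;=\; \sum_{i<j} w_{ij}\,\log f\!\left(y_i, y_j;\,\theta\right),
    \qquad
    \hat{\theta}_C \;=\; \arg\max_\theta\, \ell_C(\theta; y)
    ```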

  5. Quasi Maximum Likelihood Analysis of High Dimensional Constrained Factor Models

    OpenAIRE

    Li, Kunpeng; Li,Qi; Lu, Lina

    2016-01-01

    Factor models have been widely used in practice. However, an undesirable feature of a high dimensional factor model is that the model has too many parameters. An effective way to address this issue, proposed in a seminal work by Tsai and Tsay (2010), is to decompose the loadings matrix into a known high-dimensional matrix multiplied by a low-dimensional unknown matrix, which Tsai and Tsay (2010) call constrained factor models. This paper investigates the estimation and inferential theory ...
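
    In the notation commonly used for this class of models (assumed here, not quoted from the paper), the constraint decomposes the loadings as follows.

    ```latex
    x_t = \Lambda f_t + e_t, \qquad \Lambda = M\,C,
    \qquad M \in \mathbb{R}^{N\times m}\ \text{known}, \quad C \in \mathbb{R}^{m\times r}\ \text{unknown}, \quad m \ll N
    ```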

  6. Generalized Likelihood Ratio Statistics and Uncertainty Adjustments in Efficient Adaptive Design of Clinical Trials

    CERN Document Server

    Bartroff, Jay

    2011-01-01

    A new approach to adaptive design of clinical trials is proposed in a general multiparameter exponential family setting, based on generalized likelihood ratio statistics and optimal sequential testing theory. These designs are easy to implement, maintain the prescribed Type I error probability, and are asymptotically efficient. Practical issues involved in clinical trials allowing mid-course adaptation and the large literature on this subject are discussed, and comparisons between the proposed and existing designs are presented in extensive simulation studies of their finite-sample performance, measured in terms of the expected sample size and power functions.

  7. Likelihood ratio based verification in high dimensional spaces

    NARCIS (Netherlands)

    Hendrikse, Anne; Veldhuis, Raymond; Spreeuwers, Luuk

    2013-01-01

    The increase in the dimensionality of data sets often leads to problems during estimation, which are denoted as the curse of dimensionality. One of the problems of Second Order Statistics (SOS) estimation in high dimensional data is that the resulting covariance matrices are not full rank, so their i

  8. Likelihood ratio based verification in high dimensional spaces

    NARCIS (Netherlands)

    Hendrikse, A.J.; Veldhuis, Raymond N.J.; Spreeuwers, Lieuwe Jan

    The increase in the dimensionality of data sets often leads to problems during estimation, which are denoted as the curse of dimensionality. One of the problems of Second Order Statistics (SOS) estimation in high dimensional data is that the resulting covariance matrices are not full rank, so their

  9. Approximate Likelihood

    CERN Document Server

    CERN. Geneva

    2015-01-01

    Most physics results at the LHC end in a likelihood ratio test. This includes discovery and exclusion for searches as well as mass, cross-section, and coupling measurements. The use of Machine Learning (multivariate) algorithms in HEP is mainly restricted to searches, which can be reduced to classification between two fixed distributions: signal vs. background. I will show how we can extend the use of ML classifiers to distributions parameterized by physical quantities like masses and couplings as well as nuisance parameters associated with systematic uncertainties. This allows one to approximate the likelihood ratio while still using a high dimensional feature vector for the data. Both the MEM and ABC approaches mentioned above aim to provide inference on model parameters (like cross-sections, masses, couplings, etc.). ABC is fundamentally tied to Bayesian inference and focuses on the “likelihood free” setting where only a simulator is available and one cannot directly compute the likelihood for the dat...
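
    A minimal sketch of the classifier-based likelihood-ratio approximation alluded to above, using toy one-dimensional samples. The distributions, classifier choice, and function names are illustrative assumptions; in practice a calibrated, higher-capacity classifier would be used.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    x_sig = rng.normal(1.0, 1.0, size=(5000, 1))   # toy "signal" sample
    x_bkg = rng.normal(0.0, 1.0, size=(5000, 1))   # toy "background" sample

    X = np.vstack([x_sig, x_bkg])
    y = np.concatenate([np.ones(len(x_sig)), np.zeros(len(x_bkg))])
    clf = LogisticRegression().fit(X, y)

    def approx_likelihood_ratio(x: np.ndarray) -> np.ndarray:
        """For balanced training samples, s/(1 - s) estimates p_sig(x) / p_bkg(x)."""
        s = clf.predict_proba(x.reshape(-1, 1))[:, 1]
        return s / (1.0 - s)

    print(approx_likelihood_ratio(np.array([0.0, 1.0, 2.0])))
    ```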

  10. Quantitative comparison of OSEM and penalized likelihood image reconstruction using relative difference penalties for clinical PET.

    Science.gov (United States)

    Ahn, Sangtae; Ross, Steven G; Asma, Evren; Miao, Jun; Jin, Xiao; Cheng, Lishui; Wollenweber, Scott D; Manjeshwar, Ravindra M

    2015-08-07

    Ordered subset expectation maximization (OSEM) is the most widely used algorithm for clinical PET image reconstruction. OSEM is usually stopped early and post-filtered to control image noise and does not necessarily achieve optimal quantitation accuracy. As an alternative to OSEM, we have recently implemented a penalized likelihood (PL) image reconstruction algorithm for clinical PET using the relative difference penalty with the aim of improving quantitation accuracy without compromising visual image quality. Preliminary clinical studies have demonstrated that visual image quality, including lesion conspicuity, in images reconstructed by the PL algorithm is better than or at least as good as that in OSEM images. In this paper we evaluate the lesion quantitation accuracy of the PL algorithm with the relative difference penalty compared to OSEM by using various data sets, including phantom data acquired with an anthropomorphic torso phantom, an extended oval phantom and the NEMA image quality phantom; clinical data; and hybrid clinical data generated by adding simulated lesion data to clinical data. We focus on mean standardized uptake values and compare them for PL and OSEM using both time-of-flight (TOF) and non-TOF data. The results demonstrate improvements of PL in lesion quantitation accuracy compared to OSEM, with a particular improvement in cold background regions such as the lungs.
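
    For reference, the relative difference penalty between neighbouring voxels is commonly written as below (the exact weights and notation used in the paper may differ); the PL objective then maximizes the Poisson log-likelihood minus a regularization strength times this penalty, with gamma controlling the degree of edge preservation.

    ```latex
    R(f) \;=\; \sum_{j}\sum_{k\in\mathcal{N}_j} w_{jk}\,
    \frac{(f_j - f_k)^2}{\,f_j + f_k + \gamma\,\lvert f_j - f_k\rvert\,}
    ```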

  11. Maximum Likelihood Estimation of Time-Varying Loadings in High-Dimensional Factor Models

    DEFF Research Database (Denmark)

    Mikkelsen, Jakob Guldbæk; Hillebrand, Eric; Urga, Giovanni

    In this paper, we develop a maximum likelihood estimator of time-varying loadings in high-dimensional factor models. We specify the loadings to evolve as stationary vector autoregressions (VAR) and show that consistent estimates of the loadings parameters can be obtained by a two-step maximum likelihood estimation procedure. In the first step, principal components are extracted from the data to form factor estimates. In the second step, the parameters of the loadings VARs are estimated as a set of univariate regression models with time-varying coefficients. We document the finite...
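
    A schematic of the model described above, in assumed notation (not taken verbatim from the paper): the loadings of each series follow a stationary autoregression around a mean level.

    ```latex
    x_{it} = \lambda_{it}'\,F_t + e_{it},
    \qquad
    \lambda_{it} = \mu_i + A_i\,(\lambda_{i,t-1} - \mu_i) + \eta_{it}
    ```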

  12. Bayesian penalized log-likelihood ratio approach for dose response clinical trial studies.

    Science.gov (United States)

    Tang, Yuanyuan; Cai, Chunyan; Sun, Liangrui; He, Jianghua

    2017-02-13

    In the literature, there are a few unified approaches to test proof of concept and estimate a target dose, including the multiple comparison procedure using a modeling approach, and the permutation approach proposed by Klingenberg. We discuss and compare the operating characteristics of these unified approaches and further develop an alternative approach in a Bayesian framework based on the posterior distribution of a penalized log-likelihood ratio test statistic. Our Bayesian approach is much more flexible in handling linear or nonlinear dose-response relationships and is more efficient than the permutation approach. The operating characteristics of our Bayesian approach are comparable to, and sometimes better than, both approaches across a wide range of dose-response relationships. It yields credible intervals as well as a predictive distribution for the response rate at a specific dose level for the target dose estimation. Our Bayesian approach can be easily extended to continuous, categorical, and time-to-event responses. We illustrate the performance of our proposed method with extensive simulations and Phase II clinical trial data examples.

  13. A Scoring Tool to Identify East African HIV-1 Serodiscordant Partnerships with a High Likelihood of Pregnancy.

    Directory of Open Access Journals (Sweden)

    Renee Heffron

    Full Text Available HIV-1 prevention programs targeting HIV-1 serodiscordant couples need to identify couples that are likely to become pregnant to facilitate discussions about methods to minimize HIV-1 risk during pregnancy attempts (i.e. safer conception) or effective contraception when pregnancy is unintended. A clinical prediction tool could be used to identify HIV-1 serodiscordant couples with a high likelihood of pregnancy within one year. Using standardized clinical prediction methods, we developed and validated a tool to identify heterosexual East African HIV-1 serodiscordant couples with an increased likelihood of becoming pregnant in the next year. Datasets were from three prospectively followed cohorts, including nearly 7,000 couples from Kenya and Uganda participating in HIV-1 prevention trials and delivery projects. The final score encompassed the age of the woman, the woman's number of living children, partnership duration, having had condomless sex in the past month, and non-use of an effective contraceptive. The area under the curve (AUC) for the probability of the score to correctly predict pregnancy was 0.74 (95% CI 0.72-0.76). Scores ≥ 7 predicted a pregnancy incidence of >17% per year and captured 78% of the pregnancies. Internal and external validation confirmed the predictive ability of the score. A pregnancy likelihood score encompassing basic demographic, clinical and behavioral factors defined African HIV-1 serodiscordant couples with high one-year pregnancy incidence rates. This tool could be used to engage African HIV-1 serodiscordant couples in counseling discussions about fertility intentions in order to offer services for safer conception or contraception that align with their reproductive goals.

  14. The profile likelihood ratio and the look elsewhere effect in high energy physics

    CERN Document Server

    Ranucci, Gioacchino

    2012-01-01

    The experimental issue of the search for new particles of unknown mass poses the challenge of exploring a wide interval to look for the usual signatures represented by an excess of events above the background. A side effect of such a broad-range quest is that the significance calculations valid for signals of known location are no longer applicable when such information is missing. This circumstance is commonly termed in high energy physics applications the look elsewhere effect. How it concretely manifests in a specific problem of signal search depends upon the particular strategy adopted to unravel the sought-after signal from the underlying background. In this respect an increasingly popular method is the profile likelihood ratio, especially because of its asymptotic behavior dictated by one of the most famous results in statistics, Wilks' theorem. This work is centered on the description of the look elsewhere effect in the framework of the profile likelihood methodology, in particular proposing a conjectu...

  15. Methods for flexible sample-size design in clinical trials: Likelihood, weighted, dual test, and promising zone approaches.

    Science.gov (United States)

    Shih, Weichung Joe; Li, Gang; Wang, Yining

    2016-03-01

    Sample size plays a crucial role in clinical trials. Flexible sample-size designs, as part of the more general category of adaptive designs that utilize interim data, have been a popular topic in recent years. In this paper, we give a comparative review of four related methods for such a design. The likelihood method uses the likelihood ratio test with an adjusted critical value. The weighted method adjusts the test statistic with given weights rather than the critical value. The dual test method requires both the likelihood ratio statistic and the weighted statistic to be greater than the unadjusted critical value. The promising zone approach uses the likelihood ratio statistic with the unadjusted value and other constraints. All four methods preserve the type-I error rate. In this paper we explore their properties and compare their relationships and merits. We show that the sample size rules for the dual test are in conflict with the rules of the promising zone approach. We delineate what is necessary to specify in the study protocol to ensure the validity of the statistical procedure and what can be kept implicit in the protocol so that more flexibility can be attained for confirmatory phase III trials in meeting regulatory requirements. We also prove that under mild conditions, the likelihood ratio test still preserves the type-I error rate when the actual sample size is larger than the re-calculated one.
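
    For concreteness, the weighted method referred to above is typically based on a fixed-weight combination of stage-wise Z statistics of the Cui–Hung–Wang type (generic form shown; the paper's exact weights and notation may differ). Because the weights are prespecified, the combined statistic remains standard normal under the null hypothesis even if the second-stage sample size is changed at the interim analysis.

    ```latex
    Z_w \;=\; w_1 Z_1 + w_2 Z_2, \qquad w_1^2 + w_2^2 = 1, \qquad w_1, w_2 \ \text{fixed in advance}
    ```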

  16. Penalized Likelihood Methods for Estimation of Sparse High Dimensional Directed Acyclic Graphs

    CERN Document Server

    Shojaie, Ali

    2009-01-01

    Directed acyclic graphs (DAGs) are commonly used to represent causal relationships among random variables in graphical models. Applications of these models arise in the study of physical as well as biological systems, where directed edges between nodes represent the influence of components of the system on each other. The general problem of estimating DAGs from observed data is computationally NP-hard; moreover, two directed graphs may be observationally equivalent. When the nodes exhibit a natural ordering, the problem of estimating directed graphs reduces to the problem of estimating the structure of the network. In this paper, we propose a penalized likelihood approach that directly estimates the adjacency matrix of DAGs. Both lasso and adaptive lasso penalties are considered and an efficient algorithm is proposed for estimation of high dimensional DAGs. We study variable selection consistency of the two penalties when the number of variables grows to infinity with the sample size. We show that although la...
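
    When the natural ordering mentioned above is known, penalized estimation of the adjacency matrix reduces to a sequence of per-node lasso regressions on the preceding nodes. The sketch below shows that generic reduction for Gaussian data; it is an illustration under stated assumptions, not the authors' algorithm or their adaptive-lasso variant.

    ```python
    import numpy as np
    from sklearn.linear_model import Lasso

    def estimate_dag_adjacency(X: np.ndarray, alpha: float = 0.1) -> np.ndarray:
        """X: n-by-p data matrix whose columns are already in topological order.

        Returns a p-by-p matrix A in which A[k, j] (k < j) is the penalized
        regression coefficient of node k when node j is regressed on its
        predecessors, i.e. a sparse estimate of the weighted adjacency matrix.
        """
        n, p = X.shape
        A = np.zeros((p, p))
        for j in range(1, p):
            fit = Lasso(alpha=alpha).fit(X[:, :j], X[:, j])
            A[:j, j] = fit.coef_
        return A
    ```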

  17. Frequency-Domain Maximum-Likelihood Estimation of High-Voltage Pulse Transformer Model Parameters

    CERN Document Server

    Aguglia, D

    2014-01-01

    This paper presents an offline frequency-domain nonlinear and stochastic identification method for equivalent model parameter estimation of high-voltage pulse transformers. Such transformers are widely used in the pulsed-power domain, and the difficulty in deriving pulsed-power converter optimal control strategies is directly linked to the accuracy of the equivalent circuit parameters. These components require models which take into account the electric field energy represented by stray capacitances in the equivalent circuit. These capacitive elements must be accurately identified, since they greatly influence the general converter performance. A nonlinear frequency-based identification method, based on maximum-likelihood estimation, is presented, and a sensitivity analysis of the best experimental test to be considered is carried out. The procedure takes into account magnetic saturation and skin effects occurring in the windings during the frequency tests. The presented method is validated by experim...

  18. Direct reconstruction of the source intensity distribution of a clinical linear accelerator using a maximum likelihood expectation maximization algorithm.

    Science.gov (United States)

    Papaconstadopoulos, P; Levesque, I R; Maglieri, R; Seuntjens, J

    2016-02-07

    Direct determination of the source intensity distribution of clinical linear accelerators is still a challenging problem for small field beam modeling. Current techniques most often involve special equipment and are difficult to implement in the clinic. In this work we present a maximum-likelihood expectation-maximization (MLEM) approach to the source reconstruction problem utilizing small fields and a simple experimental set-up. The MLEM algorithm iteratively ray-traces photons from the source plane to the exit plane and extracts corrections based on photon fluence profile measurements. The photon fluence profiles were determined by dose profile film measurements in air using a high density thin foil as build-up material and an appropriate point spread function (PSF). The effect of other beam parameters and scatter sources was minimized by using the smallest field size ([Formula: see text] cm(2)). The source occlusion effect was reproduced by estimating the position of the collimating jaws during this process. The method was first benchmarked against simulations for a range of typical accelerator source sizes. The sources were reconstructed with an accuracy better than 0.12 mm in the full width at half maximum (FWHM) to the respective electron sources incident on the target. The estimated jaw positions agreed within 0.2 mm with the expected values. The reconstruction technique was also tested against measurements on a Varian Novalis Tx linear accelerator and compared to a previously commissioned Monte Carlo model. The reconstructed FWHM of the source agreed within 0.03 mm and 0.11 mm to the commissioned electron source in the crossplane and inplane orientations respectively. The impact of the jaw positioning, experimental and PSF uncertainties on the reconstructed source distribution was evaluated with the former presenting the dominant effect.
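
    The iterative correction step described above follows the standard MLEM multiplicative update, shown here in generic form (s indexes source-plane elements, d detector measurements, m_d the measured fluence, and a_ds the ray-traced system matrix; the paper's implementation details may differ).

    ```latex
    \lambda_s^{(k+1)} \;=\; \frac{\lambda_s^{(k)}}{\sum_d a_{ds}}
    \sum_d a_{ds}\,\frac{m_d}{\sum_{s'} a_{ds'}\,\lambda_{s'}^{(k)}}
    ```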

  19. Empirical likelihood

    CERN Document Server

    Owen, Art B

    2001-01-01

    Empirical likelihood provides inferences whose validity does not depend on specifying a parametric model for the data. Because it uses a likelihood, the method has certain inherent advantages over resampling methods: it uses the data to determine the shape of the confidence regions, and it makes it easy to combine data from multiple sources. It also facilitates incorporating side information, and it simplifies accounting for censored, truncated, or biased sampling. One of the first books published on the subject, Empirical Likelihood offers an in-depth treatment of this method for constructing confidence regions and testing hypotheses. The author applies empirical likelihood to a range of problems, from those as simple as setting a confidence region for a univariate mean under IID sampling, to problems defined through smooth functions of means, regression models, generalized linear models, estimating equations, or kernel smooths, and to sampling with non-identically distributed data. Abundant figures offer vi...
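
    The simplest case mentioned above, a confidence region for a univariate mean under IID sampling, is defined through the profile empirical likelihood ratio below; by the empirical-likelihood analogue of Wilks' theorem, -2 log R(mu) converges to a chi-squared distribution with one degree of freedom at the true mean.

    ```latex
    \mathcal{R}(\mu) \;=\; \max\Big\{ \prod_{i=1}^{n} n\,w_i \;:\; w_i \ge 0,\ \sum_{i=1}^{n} w_i = 1,\ \sum_{i=1}^{n} w_i X_i = \mu \Big\}
    ```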

  20. A 35-40% Likelihood of a Highly Damaging Tokyo Earthquake in Next 30 Years

    Science.gov (United States)

    Stein, R. S.; Toda, S.; Parsons, T.; Bozkurt, S. B.

    2005-12-01

    Tokyo and its outlying cities are home to one-quarter of Japan's 127 million people. Highly destructive earthquakes struck the capital in 1703, 1855 and 1923, the last of which took 105,000 lives. Reoccurrence of any of these shocks today would cost about one trillion dollars, of which perhaps 10% is insured. Fueled by Tokyo's rich data trove but hindered by its complexity, we carried out a new hazard assessment. We used the prehistoric record of great earthquakes preserved in uplifted marine terraces and tsunami deposits (17 M~8 shocks in the past 7,000 years), historical shaking (10,000 intensity observations in the past 400 years), the dense modern seismic network (300,000 earthquakes in the past 30 years), and the world's best geodetic array (150 GPS vectors spanning the past 10 years). We propose that a dislodged block of the Pacific plate is jammed between the Pacific, Philippine Sea and Eurasian plates beneath Tokyo, and controls much of Tokyo's seismic behavior for M≤7.5 shocks, including the damaging 1855 M~7.3 Ansei-Edo shock. On the basis of frequency-magnitude curves, earthquakes similar to the Ansei-Edo event should be quite frequent (25-35% likelihood in an average 30-yr period), and so such events dominate the combined probabilities. In contrast, our renewal model for the great 1703 and 1923 type plate boundary shocks yields a ~1% probability for the next 30 yr, with a time-averaged 30-yr probability of ~8%. The resulting net likelihood for severe shaking in Tokyo, Kawasaki, and Yokohama for the next 30 years is 25%-40%, but how can it be validated? The long historical record in Kanto affords a rare opportunity to calculate the probability of shaking in an alternative manner, based almost exclusively on intensity observations. This approach permits robust estimates for the spatial distribution of shaking, even for sites with few observations. The resulting probability of severe shaking over an average 30-yr period is ~35% in the Tokyo, Kawasaki

  1. Rising Above Chaotic Likelihoods

    CERN Document Server

    Du, Hailiang

    2014-01-01

    Berliner (Likelihood and Bayesian prediction for chaotic systems, J. Am. Stat. Assoc. 1991) identified a number of difficulties in using the likelihood function within the Bayesian paradigm for state estimation and parameter estimation of chaotic systems. Even when the equations of the system are given, he demonstrated "chaotic likelihood functions" of initial conditions and parameter values in the 1-D Logistic Map. Chaotic likelihood functions, while ultimately smooth, have such complicated small scale structure as to cast doubt on the possibility of identifying high likelihood estimates in practice. In this paper, the challenge of chaotic likelihoods is overcome by embedding the observations in a higher dimensional sequence-space, which is shown to allow good state estimation with finite computational power. An Importance Sampling approach is introduced, where Pseudo-orbit Data Assimilation is employed in the sequence-space in order first to identify relevant pseudo-orbits and then relevant trajectories. Es...
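
    A minimal sketch of the kind of likelihood surface being described (illustrative parameters and noise model assumed, not taken from the paper): the Gaussian log-likelihood of an initial condition of the Logistic Map, which becomes extremely ragged in x0 as the orbit length grows.

    ```python
    import numpy as np

    def logistic_orbit(x0: float, n: int, r: float = 4.0) -> np.ndarray:
        """Iterate the Logistic Map x_{t+1} = r * x_t * (1 - x_t) for n steps."""
        xs = np.empty(n)
        x = x0
        for i in range(n):
            xs[i] = x
            x = r * x * (1.0 - x)
        return xs

    def log_likelihood(x0: float, obs: np.ndarray, sigma: float = 0.05) -> float:
        """Gaussian log-likelihood of initial condition x0 given noisy observations."""
        resid = obs - logistic_orbit(x0, len(obs))
        return float(-0.5 * np.sum(resid ** 2) / sigma ** 2
                     - len(obs) * np.log(sigma * np.sqrt(2.0 * np.pi)))

    rng = np.random.default_rng(1)
    obs = logistic_orbit(0.3, 20) + rng.normal(0.0, 0.05, 20)
    # Nearby initial conditions can have wildly different likelihoods:
    print(log_likelihood(0.3, obs), log_likelihood(0.3001, obs))
    ```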

  2. Likelihood of Condom Use When Sexually Transmitted Diseases Are Suspected: Results from a Clinic Sample

    Science.gov (United States)

    Crosby, Richard A.; Milhausen, Robin R.; Graham, Cynthia A.; Yarber, William L.; Sanders, Stephanie A.; Charnigo, Richard; Shrier, Lydia A.

    2014-01-01

    Objective: To determine the event-level associations between perceived risk of sexually transmitted disease (STD) acquisition/transmission and condom use during penile-vaginal intercourse (PVI) among STD clinic attendees. Method: A convenience sample (N = 622) completed daily electronic assessments. Two questions were proxies of perceived risk:…

  3. Reducing the likelihood of future human activities that could affect geologic high-level waste repositories

    Energy Technology Data Exchange (ETDEWEB)

    1984-05-01

    The disposal of radioactive wastes in deep geologic formations provides a means of isolating the waste from people until the radioactivity has decayed to safe levels. However, isolating people from the wastes is a different problem, since we do not know what the future condition of society will be. The Human Interference Task Force was convened by the US Department of Energy to determine whether reasonable means exist (or could be developed) to reduce the likelihood of future humans unintentionally intruding on radioactive waste isolation systems. The task force concluded that significant reductions in the likelihood of human interference could be achieved, for perhaps thousands of years into the future, if appropriate steps are taken to communicate the existence of the repository. Consequently, for two years the task force directed most of its study toward the area of long-term communication. Methods are discussed for achieving long-term communication by using permanent markers and widely disseminated records, with various steps taken to provide multiple levels of protection against loss, destruction, and major language/societal changes. Also developed is the concept of a universal symbol to denote "Caution - Biohazardous Waste Buried Here". If used for the thousands of non-radioactive biohazardous waste sites in this country alone, such a symbol could transcend generations and language changes, thereby vastly improving the likelihood of successful isolation of all buried biohazardous wastes.

  4. THE GENERALIZED MAXIMUM LIKELIHOOD METHOD APPLIED TO HIGH PRESSURE PHASE EQUILIBRIUM

    Directory of Open Access Journals (Sweden)

    Lúcio CARDOZO-FILHO

    1997-12-01

    Full Text Available The generalized maximum likelihood method was used to determine binary interaction parameters between carbon dioxide and components of orange essential oil. Vapor-liquid equilibrium was modeled with Peng-Robinson and Soave-Redlich-Kwong equations, using a methodology proposed in 1979 by Asselineau, Bogdanic and Vidal. Experimental vapor-liquid equilibrium data on binary mixtures formed with carbon dioxide and compounds usually found in orange essential oil were used to test the model. These systems were chosen to demonstrate that the maximum likelihood method produces binary interaction parameters for cubic equations of state capable of satisfactorily describing phase equilibrium, even for a binary such as ethanol/CO2. Results corroborate that the Peng-Robinson, as well as the Soave-Redlich-Kwong, equation can be used to describe phase equilibrium for the following systems: components of essential oil of orange/CO2.

  5. Risk assessment models in genetics clinic for array comparative genomic hybridization: Clinical information can be used to predict the likelihood of an abnormal result in patients.

    Science.gov (United States)

    Marano, Rachel M; Mercurio, Laura; Kanter, Rebecca; Doyle, Richard; Abuelo, Dianne; Morrow, Eric M; Shur, Natasha

    2013-03-01

    Array comparative genomic hybridization (aCGH) testing can diagnose chromosomal microdeletions and duplications too small to be detected by conventional cytogenetic techniques. We need to consider which patients are more likely to receive a diagnosis from aCGH testing and which patients have a lower likelihood and may benefit instead from broader genome-wide scanning. We retrospectively reviewed the charts of a population of 200 patients, 117 boys and 83 girls, who underwent aCGH testing in the Genetics Clinic at Rhode Island Hospital between 1 January 2008 and 31 December 2010. Data collected included sex, age at initial clinical presentation, aCGH result, history of seizures, autism, dysmorphic features, global developmental delay/intellectual disability, hypotonia and failure to thrive. aCGH analysis revealed abnormal results in 34 (17%) and variants of unknown significance in 24 (12%). Patients with three or more clinical diagnoses had a 25.0% incidence of abnormal aCGH findings, while patients with two or fewer clinical diagnoses had a 12.5% incidence of abnormal aCGH findings. Currently, we provide families with an estimated 10-30% likelihood of obtaining a diagnosis with aCGH testing. With increased clinical complexity, patients have an increased probability of having an abnormal aCGH result. With this, we can provide individualized risk estimates for each patient.

  6. Maximum likelihood estimation of the negative binomial dispersion parameter for highly overdispersed data, with applications to infectious diseases.

    Directory of Open Access Journals (Sweden)

    James O Lloyd-Smith

    Full Text Available BACKGROUND: The negative binomial distribution is used commonly throughout biology as a model for overdispersed count data, with attention focused on the negative binomial dispersion parameter, k. A substantial literature exists on the estimation of k, but most attention has focused on datasets that are not highly overdispersed (i.e., those with k ≥ 1), and the accuracy of confidence intervals estimated for k is typically not explored. METHODOLOGY: This article presents a simulation study exploring the bias, precision, and confidence interval coverage of maximum-likelihood estimates of k from highly overdispersed distributions. In addition to exploring small-sample bias on negative binomial estimates, the study addresses estimation from datasets influenced by two types of event under-counting, and from disease transmission data subject to selection bias for successful outbreaks. CONCLUSIONS: Results show that maximum likelihood estimates of k can be biased upward by small sample size or under-reporting of zero-class events, but are not biased downward by any of the factors considered. Confidence intervals estimated from the asymptotic sampling variance tend to exhibit coverage below the nominal level, with overestimates of k comprising the great majority of coverage errors. Estimation from outbreak datasets does not increase the bias of k estimates, but can add significant upward bias to estimates of the mean. Because k varies inversely with the degree of overdispersion, these findings show that overestimation of the degree of overdispersion is very rare for these datasets.
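
    A minimal sketch of joint maximum-likelihood estimation of the mean and dispersion k from count data (not the authors' code): the (mean, k) pair is mapped onto scipy's (n, p) convention with n = k and p = k / (k + mean), and the log-likelihood is maximized numerically.

    ```python
    import numpy as np
    from scipy import optimize, stats

    def fit_negative_binomial(counts):
        """Return (k_hat, mean_hat) maximizing the negative binomial likelihood."""
        counts = np.asarray(counts)

        def neg_log_lik(params):
            k, mean = np.exp(params)              # optimize on the log scale for positivity
            p = k / (k + mean)
            return -np.sum(stats.nbinom.logpmf(counts, k, p))

        start = np.log([1.0, counts.mean() + 1e-8])
        res = optimize.minimize(neg_log_lik, start, method="Nelder-Mead")
        return tuple(np.exp(res.x))

    # Hypothetical highly overdispersed sample (true k = 0.3, mean = 2.0):
    sample = stats.nbinom.rvs(0.3, 0.3 / 2.3, size=200, random_state=2)
    print(fit_negative_binomial(sample))
    ```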

  7. Augmented Likelihood Image Reconstruction.

    Science.gov (United States)

    Stille, Maik; Kleine, Matthias; Hägele, Julian; Barkhausen, Jörg; Buzug, Thorsten M

    2016-01-01

    The presence of high-density objects remains an open problem in medical CT imaging. Data of projections passing through objects of high density, such as metal implants, are dominated by noise and are highly affected by beam hardening and scatter. Reconstructed images become less diagnostically conclusive because of pronounced artifacts that manifest as dark and bright streaks. A new reconstruction algorithm is proposed with the aim to reduce these artifacts by incorporating information about shape and known attenuation coefficients of a metal implant. Image reconstruction is considered as a variational optimization problem. The afore-mentioned prior knowledge is introduced in terms of equality constraints. An augmented Lagrangian approach is adapted in order to minimize the associated log-likelihood function for transmission CT. During iterations, temporally appearing artifacts are reduced with a bilateral filter and new projection values are calculated, which are used later on for the reconstruction. A detailed evaluation in cooperation with radiologists is performed on software and hardware phantoms, as well as on clinically relevant patient data of subjects with various metal implants. Results show that the proposed reconstruction algorithm is able to outperform contemporary metal artifact reduction methods such as normalized metal artifact reduction.
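
    Schematically, incorporating the known implant shape and attenuation values as equality constraints leads to an augmented Lagrangian of the generic form below, which is minimized over the image f (notation assumed, not taken from the paper: l is the transmission log-likelihood, Cf = c encodes the prior knowledge, nu the multipliers, and rho the penalty parameter).

    ```latex
    L_\rho(f, \nu) \;=\; -\,\ell(f) \;+\; \nu^{\mathsf T}\!\left(Cf - c\right) \;+\; \frac{\rho}{2}\,\lVert Cf - c \rVert_2^2
    ```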

  8. PhyPA: Phylogenetic method with pairwise sequence alignment outperforms likelihood methods in phylogenetics involving highly diverged sequences.

    Science.gov (United States)

    Xia, Xuhua

    2016-09-01

    While pairwise sequence alignment (PSA) by dynamic programming is guaranteed to generate one of the optimal alignments, multiple sequence alignment (MSA) of highly divergent sequences often results in poorly aligned sequences, plaguing all subsequent phylogenetic analysis. One way to avoid this problem is to use only PSA to reconstruct phylogenetic trees, which can only be done with distance-based methods. I compared the accuracy of this new computational approach (named PhyPA, for phylogenetics by pairwise alignment) against the maximum likelihood method using MSA (the ML+MSA approach), based on nucleotide, amino acid and codon sequences simulated with different topologies and tree lengths. I present a surprising discovery that the fast PhyPA method consistently outperforms the slow ML+MSA approach for highly diverged sequences, even when all optimization options were turned on for the ML+MSA approach. Only when sequences are not highly diverged (i.e., when a reliable MSA can be obtained) does the ML+MSA approach outperform PhyPA. The true topologies are always recovered by ML with the true alignment from the simulation. However, with MSA derived from alignment programs such as MAFFT or MUSCLE, the recovered topology consistently has higher likelihood than that for the true topology. Thus, the failure to recover the true topology by the ML+MSA approach is not because of insufficient search of tree space, but because of the distortion of phylogenetic signal by MSA methods. I have implemented in DAMBE PhyPA and two approaches making use of multi-gene data sets to derive phylogenetic support for subtrees, equivalent to resampling techniques such as bootstrapping and jackknifing.

  9. Likelihood of Bone Recurrence in Prior Sites of Metastasis in Patients With High-Risk Neuroblastoma

    Energy Technology Data Exchange (ETDEWEB)

    Polishchuk, Alexei L. [Department of Radiation Oncology, University of California at San Francisco School of Medicine and UCSF Benioff Children's Hospital, San Francisco, California (United States); Li, Richard [Division of Radiation Oncology, Dana Farber/Boston Children's Cancer and Blood Disorders Center, Brigham and Women's Hospital, Harvard Medical School, Boston, Massachusetts (United States); Hill-Kayser, Christine [Department of Radiation Oncology, University of Pennsylvania School of Medicine, Philadelphia, Pennsylvania (United States); Little, Anthony [Division of Oncology, Children's Hospital of Philadelphia, Philadelphia, Pennsylvania (United States); Hawkins, Randall A. [Department of Radiology, University of California at San Francisco School of Medicine and UCSF Benioff Children's Hospital, San Francisco, California (United States); Hamilton, Jeffrey; Lau, Michael [Department of Radiation Oncology, University of California at San Francisco School of Medicine and UCSF Benioff Children's Hospital, San Francisco, California (United States); Tran, Hung Chi [Division of Hematology/Oncology, Children's Hospital of Los Angeles, Los Angeles, California (United States); Strahlendorf, Caron [Division of Hematology and Oncology, Department of Pediatrics, The University of British Columbia, Vancouver, British Columbia (Canada); Lemons, Richard S. [Division of Pediatric Hematology/Oncology, University of Utah School of Medicine, Salt Lake City, Utah (United States); Weinberg, Vivian [Department of Radiation Oncology, University of California at San Francisco School of Medicine and UCSF Benioff Children's Hospital, San Francisco, California (United States); Matthay, Katherine K.; DuBois, Steven G. [Department of Pediatrics, University of California at San Francisco School of Medicine and UCSF Benioff Children's Hospital, San Francisco, California (United States); and others

    2014-07-15

    Purpose/Objectives: Despite recent improvements in outcomes, 40% of children with high-risk neuroblastoma will experience relapse, facing a guarded prognosis for long-term cure. Whether recurrences are at new sites or sites of original disease may guide decision making during initial therapy. Methods and Materials: Eligible patients were retrospectively identified from institutional databases at first metastatic relapse of high-risk neuroblastoma. Included patients had disease involving metaiodobenzylguanidine (MIBG)-avid metastatic sites at diagnosis and first relapse, achieved a complete or partial response with no more than one residual MIBG-avid site before first relapse, and received no total body irradiation or therapy with ¹³¹I-MIBG before first relapse. Anatomically defined metastatic sites were tracked from diagnosis through first relapse to determine the tendency of disease to recur at previously involved versus uninvolved sites and to assess whether this pattern was influenced by site irradiation. Results: Of 159 MIBG-avid metastatic sites identified among 43 patients at first relapse, 131 (82.4%) overlapped anatomically with the set of 525 sites present at diagnosis. This distribution was similar for bone sites, but patterns of relapse were more varied for the smaller subset of soft tissue metastases. Among all metastatic sites at diagnosis in our subsequently relapsed patient cohort, only 3 of 19 irradiated sites (15.8%) recurred, as compared with 128 of 506 (25.3%) unirradiated sites. Conclusions: Metastatic bone relapse in neuroblastoma usually occurs at anatomic sites of previous disease. Metastatic sites identified at diagnosis that did not receive radiation during frontline therapy appeared to have a higher risk of involvement at first relapse relative to previously irradiated metastatic sites. These observations support the current paradigm of irradiating metastases that persist after induction chemotherapy in high-risk patients. Furthermore

  10. Maximum-Likelihood Sequence Detector for Dynamic Mode High Density Probe Storage

    CERN Document Server

    Kumar, Naveen; Ramamoorthy, Aditya; Salapaka, Murti

    2009-01-01

    There is an ever increasing need for storing data in smaller and smaller form factors driven by the ubiquitous use and increased demands of consumer electronics. A new approach of achieving a few Tb per in2 areal densities, utilizes a cantilever probe with a sharp tip that can be used to deform and assess the topography of the material. The information may be encoded by means of topographic profiles on a polymer medium. The prevalent mode of using the cantilever probe is the static mode that is known to be harsh on the probe and the media. In this paper, the high quality factor dynamic mode operation, which is known to be less harsh on the media and the probe, is analyzed for probe based high density data storage purposes. It is demonstrated that an appropriate level of abstraction is possible that obviates the need for an involved physical model. The read operation is modeled as a communication channel which incorporates the inherent system memory due to the intersymbol interference and the cantilever state ...

  11. SU-C-207A-01: A Novel Maximum Likelihood Method for High-Resolution Proton Radiography/proton CT

    Energy Technology Data Exchange (ETDEWEB)

    Collins-Fekete, C [Universite Laval, Quebec, Quebec (Canada); Centre Hospitalier University de Quebec, Quebec, QC (Canada); Mass General Hospital (United States); Harvard Medical, Boston MA (United States); Schulte, R [Loma Linda University, Loma Linda, CA (United States); Beaulieu, L [Universite Laval, Quebec, Quebec (Canada); Centre Hospitalier University de Quebec, Quebec, QC (Canada); Seco, J [Mass General Hospital (United States); Harvard Medical, Boston MA (United States); Department of Medical Physics in Radiooncology, DKFZ German Cancer Research Center, Heidelberg (Germany)

    2016-06-15

    Purpose: Multiple Coulomb scattering is the largest contributor to blurring in proton imaging. Here we tested a maximum likelihood least squares estimator (MLLSE) to improve the spatial resolution of proton radiography (pRad) and proton computed tomography (pCT). Methods: The object is discretized into voxels and the average relative stopping power through voxel columns defined from the source to the detector pixels is optimized such that it maximizes the likelihood of the proton energy loss. The length spent by individual protons in each column is calculated through an optimized cubic spline estimate. pRad images were first produced using Geant4 simulations. An anthropomorphic head phantom and the Catphan line-pair module for 3-D spatial resolution were studied and resulting images were analyzed. Both parallel and conical beam have been investigated for simulated pRad acquisition. Then, experimental data of a pediatric head phantom (CIRS) were acquired using a recently completed experimental pCT scanner. Specific filters were applied on proton angle and energy loss data to remove proton histories that underwent nuclear interactions. The MTF10% (lp/mm) was used to evaluate and compare spatial resolution. Results: Numerical simulations showed improvement in the pRad spatial resolution for the parallel (2.75 to 6.71 lp/cm) and conical beam (3.08 to 5.83 lp/cm) reconstructed with the MLLSE compared to averaging detector pixel signals. For full tomographic reconstruction, the improved pRad were used as input into a simultaneous algebraic reconstruction algorithm. The Catphan pCT reconstruction based on the MLLSE-enhanced projection showed spatial resolution improvement for the parallel (2.83 to 5.86 lp/cm) and conical beam (3.03 to 5.15 lp/cm). The anthropomorphic head pCT displayed important contrast gains in high-gradient regions. Experimental results also demonstrated significant improvement in spatial resolution of the pediatric head radiography. Conclusion: The

  12. SkyFACT: high-dimensional modeling of gamma-ray emission with adaptive templates and penalized likelihoods

    Science.gov (United States)

    Storm, Emma; Weniger, Christoph; Calore, Francesca

    2017-08-01

    We present SkyFACT (Sky Factorization with Adaptive Constrained Templates), a new approach for studying, modeling and decomposing diffuse gamma-ray emission. Like most previous analyses, the approach relies on predictions from cosmic-ray propagation codes like GALPROP and DRAGON. However, in contrast to previous approaches, we account for the fact that models are not perfect and allow for a very large number (≳ 10⁵) of nuisance parameters to parameterize these imperfections. We combine methods of image reconstruction and adaptive spatio-spectral template regression in one coherent hybrid approach. To this end, we use penalized Poisson likelihood regression, with regularization functions that are motivated by the maximum entropy method. We introduce methods to efficiently handle the high dimensionality of the convex optimization problem as well as the associated semi-sparse covariance matrix, using the L-BFGS-B algorithm and Cholesky factorization. We test the method both on synthetic data and on gamma-ray emission from the inner Galaxy, and obtain a reference model that removes most of the residual emission from the inner Galaxy, based on conventional diffuse emission components as well as components for the Fermi bubbles, the Fermi Galactic center excess, and extended sources along the Galactic disk. Variants of this reference model can serve as a basis for future studies of diffuse emission in and outside the Galactic disk.
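
    The regression backbone is, schematically, a penalized Poisson log-likelihood of the generic form below (notation assumed, not taken from the paper: d_c are counts per pixel/energy bin, mu_c(theta) the model expectation, and R collects the regularization terms acting on the nuisance parameters).

    ```latex
    \ln\mathcal{L}(\theta) \;=\; \sum_c \big[\, d_c \ln \mu_c(\theta) - \mu_c(\theta) \,\big] \;-\; \lambda\,R(\theta)
    ```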

  13. The likelihood of reaching minimum clinically important difference and substantial clinical benefit at 2 years following a 3-column osteotomy: analysis of 140 patients.

    Science.gov (United States)

    Fakurnejad, Shayan; Scheer, Justin K; Lafage, Virginie; Smith, Justin S; Deviren, Vedat; Hostin, Richard; Mundis, Gregory M; Burton, Douglas C; Klineberg, Eric; Gupta, Munish; Kebaish, Khaled; Shaffrey, Christopher I; Bess, Shay; Schwab, Frank; Ames, Christopher P

    2015-09-01

    Three-column osteotomies (3COs) are technically challenging techniques for correcting severe rigid spinal deformities. The impact of these interventions on outcomes reaching minimum clinically important difference (MCID) or substantial clinical benefit (SCB) is unclear. The objective of this study was to determine the rates of MCID and SCB in standard health-related quality of life (HRQOL) measures after 3COs in patients with adult spinal deformity (ASD). The impacts of the location of the uppermost instrumented vertebra (UIV) on clinical outcomes and of maintenance of sagittal correction at 2 years postoperatively were also examined. The authors conducted a retrospective multicenter analysis of the records from adult patients who underwent 3CO with complete 2-year radiographic and clinical follow-ups. Cases were categorized according to established radiographic thresholds for pelvic tilt (> 22°), sagittal vertical axis (> 4.7 cm), and the mismatch between pelvic incidence and lumbar lordosis (> 11°). The cases were also analyzed on the basis of a UIV in the upper thoracic (T1-6) or thoracolumbar (T9-L1) region. Patient-reported outcome measures evaluated preoperatively and 2 years postoperatively included Oswestry Disability Index (ODI) scores, the Physical Component Summary and Mental Component Summary (MCS) scores of the 36-Item Short Form Health Survey, and Scoliosis Research Society-22 questionnaire (SRS-22) scores. The percentages of patients whose outcomes for these measures met MCID and SCB were compared among the groups. Data from 140 patients (101 women and 39 men) were included in the analysis; the average patient age was 57.3 ± 12.4 years (range 20-82 years). Of these patients, 94 had undergone only pedicle subtraction osteotomy (PSO) and 42 only vertebral column resection (VCR); 113 patients had a UIV in the upper thoracic (n = 63) or thoracolumbar (n = 50) region. On average, 2 years postoperatively the patients had significantly improved in all HRQOL

  14. A Combined Maximum-likelihood Analysis of the High-energy Astrophysical Neutrino Flux Measured with IceCube

    Science.gov (United States)

    Aartsen, M. G.; Abraham, K.; Ackermann, M.; Adams, J.; Aguilar, J. A.; Ahlers, M.; Ahrens, M.; Altmann, D.; Anderson, T.; Archinger, M.; Arguelles, C.; Arlen, T. C.; Auffenberg, J.; Bai, X.; Barwick, S. W.; Baum, V.; Bay, R.; Beatty, J. J.; Becker Tjus, J.; Becker, K.-H.; Beiser, E.; BenZvi, S.; Berghaus, P.; Berley, D.; Bernardini, E.; Bernhard, A.; Besson, D. Z.; Binder, G.; Bindig, D.; Bissok, M.; Blaufuss, E.; Blumenthal, J.; Boersma, D. J.; Bohm, C.; Börner, M.; Bos, F.; Bose, D.; Böser, S.; Botner, O.; Braun, J.; Brayeur, L.; Bretz, H.-P.; Brown, A. M.; Buzinsky, N.; Casey, J.; Casier, M.; Cheung, E.; Chirkin, D.; Christov, A.; Christy, B.; Clark, K.; Classen, L.; Coenders, S.; Cowen, D. F.; Cruz Silva, A. H.; Daughhetee, J.; Davis, J. C.; Day, M.; de André, J. P. A. M.; De Clercq, C.; Dembinski, H.; De Ridder, S.; Desiati, P.; de Vries, K. D.; de Wasseige, G.; de With, M.; DeYoung, T.; Díaz-Vélez, J. C.; Dumm, J. P.; Dunkman, M.; Eagan, R.; Eberhardt, B.; Ehrhardt, T.; Eichmann, B.; Euler, S.; Evenson, P. A.; Fadiran, O.; Fahey, S.; Fazely, A. R.; Fedynitch, A.; Feintzeig, J.; Felde, J.; Filimonov, K.; Finley, C.; Fischer-Wasels, T.; Flis, S.; Fuchs, T.; Gaisser, T. K.; Gaior, R.; Gallagher, J.; Gerhardt, L.; Ghorbani, K.; Gier, D.; Gladstone, L.; Glagla, M.; Glüsenkamp, T.; Goldschmidt, A.; Golup, G.; Gonzalez, J. G.; Goodman, J. A.; Góra, D.; Grant, D.; Gretskov, P.; Groh, J. C.; Gross, A.; Ha, C.; Haack, C.; Haj Ismail, A.; Hallgren, A.; Halzen, F.; Hansmann, B.; Hanson, K.; Hebecker, D.; Heereman, D.; Helbing, K.; Hellauer, R.; Hellwig, D.; Hickford, S.; Hignight, J.; Hill, G. C.; Hoffman, K. D.; Hoffmann, R.; Holzapfel, K.; Homeier, A.; Hoshina, K.; Huang, F.; Huber, M.; Huelsnitz, W.; Hulth, P. O.; Hultqvist, K.; In, S.; Ishihara, A.; Jacobi, E.; Japaridze, G. S.; Jero, K.; Jurkovic, M.; Kaminsky, B.; Kappes, A.; Karg, T.; Karle, A.; Kauer, M.; Keivani, A.; Kelley, J. L.; Kemp, J.; Kheirandish, A.; Kiryluk, J.; Kläs, J.; Klein, S. R.; Kohnen, G.; Kolanoski, H.; Konietz, R.; Koob, A.; Köpke, L.; Kopper, C.; Kopper, S.; Koskinen, D. J.; Kowalski, M.; Krings, K.; Kroll, G.; Kroll, M.; Kunnen, J.; Kurahashi, N.; Kuwabara, T.; Labare, M.; Lanfranchi, J. L.; Larson, M. J.; Lesiak-Bzdak, M.; Leuermann, M.; Leuner, J.; Lünemann, J.; Madsen, J.; Maggi, G.; Mahn, K. B. M.; Maruyama, R.; Mase, K.; Matis, H. S.; Maunu, R.; McNally, F.; Meagher, K.; Medici, M.; Meli, A.; Menne, T.; Merino, G.; Meures, T.; Miarecki, S.; Middell, E.; Middlemas, E.; Miller, J.; Mohrmann, L.; Montaruli, T.; Morse, R.; Nahnhauer, R.; Naumann, U.; Niederhausen, H.; Nowicki, S. C.; Nygren, D. R.; Obertacke, A.; Olivas, A.; Omairat, A.; O'Murchadha, A.; Palczewski, T.; Paul, L.; Pepper, J. A.; Pérez de los Heros, C.; Pfendner, C.; Pieloth, D.; Pinat, E.; Posselt, J.; Price, P. B.; Przybylski, G. T.; Pütz, J.; Quinnan, M.; Rädel, L.; Rameez, M.; Rawlins, K.; Redl, P.; Reimann, R.; Relich, M.; Resconi, E.; Rhode, W.; Richman, M.; Richter, S.; Riedel, B.; Robertson, S.; Rongen, M.; Rott, C.; Ruhe, T.; Ruzybayev, B.; Ryckbosch, D.; Saba, S. M.; Sabbatini, L.; Sander, H.-G.; Sandrock, A.; Sandroos, J.; Sarkar, S.; Schatto, K.; Scheriau, F.; Schimp, M.; Schmidt, T.; Schmitz, M.; Schoenen, S.; Schöneberg, S.; Schönwald, A.; Schukraft, A.; Schulte, L.; Seckel, D.; Seunarine, S.; Shanidze, R.; Smith, M. W. E.; Soldin, D.; Spiczak, G. M.; Spiering, C.; Stahlberg, M.; Stamatikos, M.; Stanev, T.; Stanisha, N. A.; Stasik, A.; Stezelberger, T.; Stokstad, R. G.; Stössl, A.; Strahler, E. A.; Ström, R.; Strotjohann, N. 
L.; Sullivan, G. W.; Sutherland, M.; Taavola, H.; Taboada, I.; Ter-Antonyan, S.; Terliuk, A.; Tešić, G.; Tilav, S.; Toale, P. A.; Tobin, M. N.; Tosi, D.; Tselengidou, M.; Unger, E.; Usner, M.; Vallecorsa, S.; Vandenbroucke, J.; van Eijndhoven, N.; Vanheule, S.; van Santen, J.; Veenkamp, J.; Vehring, M.; Voge, M.; Vraeghe, M.; Walck, C.; Wallace, A.; Wallraff, M.; Wandkowsky, N.; Weaver, Ch.; Wendt, C.; Westerhoff, S.; Whelan, B. J.; Whitehorn, N.; Wichary, C.; Wiebe, K.; Wiebusch, C. H.; Wille, L.; Williams, D. R.; Wissing, H.; Wolf, M.; Wood, T. R.; Woschnagg, K.; Xu, D. L.; Xu, X. W.; Xu, Y.; Yanez, J. P.; Yodh, G.; Yoshida, S.; Zarzhitsky, P.; Zoll, M.; IceCube Collaboration

    2015-08-01

    Evidence for an extraterrestrial flux of high-energy neutrinos has now been found in multiple searches with the IceCube detector. The first solid evidence was provided by a search for neutrino events with deposited energies ≳ 30 TeV and interaction vertices inside the instrumented volume. Recent analyses suggest that the extraterrestrial flux extends to lower energies and is also visible with throughgoing, νμ-induced tracks from the Northern Hemisphere. Here, we combine the results from six different IceCube searches for astrophysical neutrinos in a maximum-likelihood analysis. The combined event sample features high-statistics samples of shower-like and track-like events. The data are fit in up to three observables: energy, zenith angle, and event topology. Assuming the astrophysical neutrino flux to be isotropic and to consist of equal flavors at Earth, the all-flavor spectrum with neutrino energies between 25 TeV and 2.8 PeV is well described by an unbroken power law with best-fit spectral index -2.50 ± 0.09 and a flux at 100 TeV of (6.7 +1.1/-1.2) × 10⁻¹⁸ GeV⁻¹ s⁻¹ sr⁻¹ cm⁻². Under the same assumptions, an unbroken power law with index -2 is disfavored with a significance of 3.8σ (p = 0.0066%) with respect to the best fit. This significance is reduced to 2.1σ (p = 1.7%) if instead we compare the best fit to a spectrum with index -2 that has an exponential cut-off at high energies. Allowing the electron-neutrino flux to deviate from the other two flavors, we find a νe fraction of 0.18 ± 0.11 at Earth. The sole production of electron neutrinos, which would be characteristic of neutron-decay-dominated sources, is rejected with a significance of 3.6σ (p = 0.014%).

  15. Use of Drop-In Clinic Versus Appointment-Based Care for LGBT Youth: Influences on the Likelihood to Access Different Health-Care Structures.

    Science.gov (United States)

    Newman, Bernie S; Passidomo, Kim; Gormley, Kate; Manley, Alecia

    2014-06-01

    The structure of health-care service delivery can address barriers that make it difficult for lesbian, gay, bisexual, and transgender (LGBT) adolescents to use health services. This study explores the differences among youth who access care in one of two service delivery structures in an LGBT health-care center: the drop-in clinic or the traditional appointment-based model. Analysis of 578 records of LGBT and straight youth (aged 14-24) who accessed health care either through a drop-in clinic or appointment-based care within the first year of offering the drop-in clinic reveals patterns of use when both models are available. We studied demographic variables previously shown to be associated with general health-care access to determine how each correlated with a tendency to use the drop-in structure versus routine appointments. Once the covariates were identified, we conducted a logistic regression analysis to identify its association with likelihood of using the drop-in clinic. Insurance status, housing stability, education, race, and gender identity were most strongly associated with the type of clinic used. Youth who relied on Medicaid, those in unstable housing, and African Americans were most likely to use the drop-in clinic. Transgender youth and those with higher education were more likely to use the appointment-based clinic. Although sexual orientation and HIV status were not related to type of clinic used, youth who were HIV positive used the appointment-based clinic more frequently. Both routes to health care served distinct populations who often experience barriers to accessible, affordable, and knowledgeable care. Further study of the factors related to accessing health care may clarify the extent to which drop-in hours in a youth-friendly context may increase the use of health care by the most socially marginalized youth.

  16. Evaluation of the likelihood of reflux developing in patients with recurrent upper respiratory infections, recurrent sinusitis or recurrent otitis seen in ear-nose-throat outpatient clinics.

    Science.gov (United States)

    Önal, Zerrin; Çullu-Çokuğraş, Fügen; Işıldak, Hüseyin; Kaytaz, Asım; Kutlu, Tufan; Erkan, Tülay; Doğusoy, Gülen

    2015-01-01

    Gastroesophageal reflux is considered a risk factor for recurrent or persistent upper and lower respiratory tract conditions including asthma, chronic cough, sinusitis, laryngitis, serous otitis and paroxysmal laryngospasm. Fifty-one subjects with recurrent (more than three) episodes of upper respiratory tract infection (URTI), serous otitis or sinusitis who had been admitted to an ear-nose-throat (ENT) outpatient clinic during the previous year were enrolled in the present study to evaluate the presence of laryngeal and/or esophageal reflux. The participants, who were randomly selected, were questioned about symptoms of reflux, including vomiting, abdominal pain, failure to thrive, halitosis, bitter taste in the mouth, chronic cough, heartburn, constipation and hoarseness. All subjects had an endoscopic examination, an otoscopic examination, a tympanogram and upper GI system endoscopy. Esophagitis was diagnosed endoscopically and histologically. The likelihood of occurrence of esophagitis was found to be higher only among subjects with postglottic edema/erythema as determined by pathological laryngeal examination. The reflux complaints reported did not predict the development of esophagitis, but the odds of esophagitis occurring were ninefold greater among subjects with recurrent otitis. Of the subjects, 45.1% were Helicobacter pylori-positive. However, no association was found between esophagitis and Helicobacter pylori positivity. The likelihood of the occurrence of esophagitis was found to be increased in the presence of recurrent otitis media and/or postglottic edema, irrespective of the presence of reflux symptoms. We concluded that, in contrast to the situation where adults are concerned, the boundaries for discriminating laryngopharyngeal reflux from gastroesophageal reflux are somewhat blurred in pediatric patients.

  17. Clinical high risk for psychosis

    DEFF Research Database (Denmark)

    van der Steen, Y; Gimpel-Drees, J; Lataster, T

    2017-01-01

    OBJECTIVE: The aim of this study was to assess associations between momentary stress and both affective and psychotic symptoms in everyday life of individuals at clinical high risk (CHR), compared to chronic psychotic patients and healthy controls, in search for evidence of early stress sensitiza...

  18. Does the use of a prescriptive clinical prediction rule increase the likelihood of applying inappropriate treatments? A survey using clinical vignettes.

    Science.gov (United States)

    Learman, Kenneth; Showalter, Christopher; Cook, Chad

    2012-12-01

    Clinical prediction rules (CPR) have been promoted as a natural progression in treatment decision-making. Methodological limitations of derivation and validation studies have resulted in some researchers questioning the indiscriminate use of CPRs. The purpose of this study was to explore the influence of lumbar spine manipulation CPR (LCPR) use on clinical decision making through a survey of practicing clinicians. A sample of 535 physiotherapists from the United States, who routinely use thrust manipulation (TM), agreed to participate in this study. Group designation was determined by whether or not subjects used the LCPR. A 9-step clinical vignette progressed a fictitious patient meeting the LCPR from no medical concern to significant concern for general health. A 2 × 9 chi-square was used to analyze the progression of decision-making. APTA board certification (P = 0.04), gender (P < 0.01), and manual therapy course attendance (P = 0.04) may increase and following the McKenzie philosophy (P < 0.01) may decrease the use of the LCPR. Subjects using the LCPR were more likely to choose to manipulate the patient (P < 0.01 and P = 0.02) during the first 2 scenarios of the vignette but both groups avoided TM equally as the medical concerns progressed. The results would suggest that subjects who routinely use TM would modify their decision-making to accommodate medical complications that preclude the indication for TM, and hence a potentially harmful intervention. This propensity to modify behaviour was seen in both groups, regardless of their initial tendency to use the LCPR.

  19. Predicting reattendance at a high-risk breast cancer clinic.

    Science.gov (United States)

    Ormseth, Sarah R; Wellisch, David K; Aréchiga, Adam E; Draper, Taylor L

    2015-10-01

    The research about follow-up patterns of women attending high-risk breast-cancer clinics is sparse. This study sought to profile daughters of breast-cancer patients who are likely to return versus those unlikely to return for follow-up care in a high-risk clinic. Our investigation included 131 patients attending the UCLA Revlon Breast Center High Risk Clinic. Predictor variables included age, computed breast-cancer risk, participants' perceived personal risk, clinically significant depressive symptomatology (CES-D score ≥ 16), current level of anxiety (State-Trait Anxiety Inventory), and survival status of participants' mothers (survived or passed away from breast cancer). A greater likelihood of reattendance was associated with older age (adjusted odds ratio [AOR] = 1.07, p = 0.004), computed breast-cancer risk (AOR = 1.10, p = 0.017), absence of depressive symptomatology (AOR = 0.25, p = 0.009), past psychiatric diagnosis (AOR = 3.14, p = 0.029), and maternal loss to breast cancer (AOR = 2.59, p = 0.034). Also, an interaction was found between mother's survival and perceived risk (p = 0.019), such that reattendance was associated with higher perceived risk among participants whose mothers survived (AOR = 1.04, p = 0.002), but not those whose mothers died (AOR = 0.99, p = 0.685). Furthermore, a nonlinear inverted "U" relationship was observed between state anxiety and reattendance (p = 0.037); participants with moderate anxiety were more likely to reattend than those with low or high anxiety levels. Demographic, medical, and psychosocial factors were found to be independently associated with reattendance to a high-risk breast-cancer clinic. Explication of the profiles of women who may or may not reattend may serve to inform the development and implementation of interventions to increase the likelihood of follow-up care.
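    The adjusted odds ratios quoted above come from a multivariable logistic regression. The sketch below shows, on synthetic data (not the clinic's dataset), how such AORs are obtained by exponentiating the fitted coefficients; the variable names and effect sizes are illustrative assumptions only.

        # Illustrative only: adjusted odds ratios from a multivariable logistic
        # regression on synthetic data standing in for the reattendance study.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        n = 131
        age = rng.normal(45, 10, n)
        computed_risk = rng.normal(15, 5, n)          # e.g. percent lifetime risk
        depressed = rng.binomial(1, 0.3, n)           # CES-D >= 16
        linpred = -4.0 + 0.07 * age + 0.10 * computed_risk - 1.4 * depressed
        reattended = rng.binomial(1, 1.0 / (1.0 + np.exp(-linpred)))

        X = sm.add_constant(np.column_stack([age, computed_risk, depressed]))
        fit = sm.Logit(reattended, X).fit(disp=False)
        aor = np.exp(fit.params[1:])                  # adjusted odds ratios
        print(dict(zip(["age", "computed_risk", "depression"], np.round(aor, 2))))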

  20. A combined maximum-likelihood analysis of the high-energy astrophysical neutrino flux measured with IceCube

    CERN Document Server

    Aartsen, M G; Ackermann, M; Adams, J; Aguilar, J A; Ahlers, M; Ahrens, M; Altmann, D; Anderson, T; Archinger, M; Arguelles, C; Arlen, T C; Auffenberg, J; Bai, X; Barwick, S W; Baum, V; Bay, R; Beatty, J J; Tjus, J Becker; Becker, K -H; Beiser, E; BenZvi, S; Berghaus, P; Berley, D; Bernardini, E; Bernhard, A; Besson, D Z; Binder, G; Bindig, D; Bissok, M; Blaufuss, E; Blumenthal, J; Boersma, D J; Bohm, C; Börner, M; Bos, F; Bose, D; Böser, S; Botner, O; Braun, J; Brayeur, L; Bretz, H -P; Brown, A M; Buzinsky, N; Casey, J; Casier, M; Cheung, E; Chirkin, D; Christov, A; Christy, B; Clark, K; Classen, L; Coenders, S; Cowen, D F; Silva, A H Cruz; Daughhetee, J; Davis, J C; Day, M; de André, J P A M; De Clercq, C; Dembinski, H; De Ridder, S; Desiati, P; de Vries, K D; de Wasseige, G; de With, M; DeYoung, T; Díaz-Vélez, J C; Dumm, J P; Dunkman, M; Eagan, R; Eberhardt, B; Ehrhardt, T; Eichmann, B; Euler, S; Evenson, P A; Fadiran, O; Fahey, S; Fazely, A R; Fedynitch, A; Feintzeig, J; Felde, J; Filimonov, K; Finley, C; Fischer-Wasels, T; Flis, S; Fuchs, T; Gaisser, T K; Gaior, R; Gallagher, J; Gerhardt, L; Ghorbani, K; Gier, D; Gladstone, L; Glagla, M; Glüsenkamp, T; Goldschmidt, A; Golup, G; Gonzalez, J G; Goodman, J A; Góra, D; Grant, D; Gretskov, P; Groh, J C; Groß, A; Ha, C; Haack, C; Ismail, A Haj; Hallgren, A; Halzen, F; Hansmann, B; Hanson, K; Hebecker, D; Heereman, D; Helbing, K; Hellauer, R; Hellwig, D; Hickford, S; Hignight, J; Hill, G C; Hoffman, K D; Hoffmann, R; Holzapfel, K; Homeier, A; Hoshina, K; Huang, F; Huber, M; Huelsnitz, W; Hulth, P O; Hultqvist, K; In, S; Ishihara, A; Jacobi, E; Japaridze, G S; Jero, K; Jurkovic, M; Kaminsky, B; Kappes, A; Karg, T; Karle, A; Kauer, M; Keivani, A; Kelley, J L; Kemp, J; Kheirandish, A; Kiryluk, J; Kläs, J; Klein, S R; Kohnen, G; Kolanoski, H; Konietz, R; Koob, A; Köpke, L; Kopper, C; Kopper, S; Koskinen, D J; Kowalski, M; Krings, K; Kroll, G; Kroll, M; Kunnen, J; Kurahashi, N; Kuwabara, T; Labare, M; Lanfranchi, J L; Larson, M J; Lesiak-Bzdak, M; Leuermann, M; Leuner, J; Lünemann, J; Madsen, J; Maggi, G; Mahn, K B M; Maruyama, R; Mase, K; Matis, H S; Maunu, R; McNally, F; Meagher, K; Medici, M; Meli, A; Menne, T; Merino, G; Meures, T; Miarecki, S; Middell, E; Middlemas, E; Miller, J; Mohrmann, L; Montaruli, T; Morse, R; Nahnhauer, R; Naumann, U; Niederhausen, H; Nowicki, S C; Nygren, D R; Obertacke, A; Olivas, A; Omairat, A; O'Murchadha, A; Palczewski, T; Paul, L; Pepper, J A; Heros, C Pérez de los; Pfendner, C; Pieloth, D; Pinat, E; Posselt, J; Price, P B; Przybylski, G T; Pütz, J; Quinnan, M; Rädel, L; Rameez, M; Rawlins, K; Redl, P; Reimann, R; Relich, M; Resconi, E; Rhode, W; Richman, M; Richter, S; Riedel, B; Robertson, S; Rongen, M; Rott, C; Ruhe, T; Ruzybayev, B; Ryckbosch, D; Saba, S M; Sabbatini, L; Sander, H -G; Sandrock, A; Sandroos, J; Sarkar, S; Schatto, K; Scheriau, F; Schimp, M; Schmidt, T; Schmitz, M; Schoenen, S; Schöneberg, S; Schönwald, A; Schukraft, A; Schulte, L; Seckel, D; Seunarine, S; Shanidze, R; Smith, M W E; Soldin, D; Spiczak, G M; Spiering, C; Stahlberg, M; Stamatikos, M; Stanev, T; Stanisha, N A; Stasik, A; Stezelberger, T; Stokstad, R G; Stößl, A; Strahler, E A; Ström, R; Strotjohann, N L; Sullivan, G W; Sutherland, M; Taavola, H; Taboada, I; Ter-Antonyan, S; Terliuk, A; Tešić, G; Tilav, S; Toale, P A; Tobin, M N; Tosi, D; Tselengidou, M; Unger, E; Usner, M; Vallecorsa, S; Vandenbroucke, J; van Eijndhoven, N; Vanheule, S; van Santen, J; Veenkamp, J; Vehring, M; Voge, M; Vraeghe, M; Walck, C; Wallace, A; 
Wallraff, M; Wandkowsky, N; Weaver, C; Wendt, C; Westerhoff, S; Whelan, B J; Whitehorn, N; Wichary, C; Wiebe, K; Wiebusch, C H; Wille, L; Williams, D R; Wissing, H; Wolf, M; Wood, T R; Woschnagg, K; Xu, D L; Xu, X W; Xu, Y; Yanez, J P; Yodh, G; Yoshida, S; Zarzhitsky, P; Zoll, M

    2015-01-01

    Evidence for an extraterrestrial flux of high-energy neutrinos has now been found in multiple searches with the IceCube detector. The first solid evidence was provided by a search for neutrino events with deposited energies ≳ 30 TeV and interaction vertices inside the instrumented volume. Recent analyses suggest that the extraterrestrial flux extends to lower energies and is also visible with throughgoing, νμ-induced tracks from the Northern Hemisphere.

  1. Incremental value of myocardial perfusion over coronary angiography by spectral computed tomography in patients with intermediate to high likelihood of coronary artery disease

    Energy Technology Data Exchange (ETDEWEB)

    Carrascosa, Patricia M., E-mail: investigacion@diagnosticomaipu.com.ar; Deviggiano, Alejandro; Capunay, Carlos; Campisi, Roxana; López Munain, Marina de; Vallejos, Javier; Tajer, Carlos; Rodriguez-Granillo, Gaston A.

    2015-04-15

    Highlights: •We evaluated myocardial perfusion by dual energy computed tomography (DECT). •We included patients with intermediate to high likelihood of coronary artery disease. •Stress myocardial perfusion by DECT had a reliable accuracy for the detection of ischemia. •Stress myocardial perfusion with DECT showed an incremental value over anatomical evaluation. •DECT imaging was associated with a significant reduction in radiation dose compared to SPECT. -- Abstract: Purpose: We sought to explore the diagnostic performance of dual energy computed tomography (DECT) for the evaluation of myocardial perfusion in patients with intermediate to high likelihood of coronary artery disease (CAD). Materials and methods: Consecutive patients with known or suspected CAD referred for myocardial perfusion imaging by single-photon emission computed tomography (SPECT) constituted the study population and were scanned using a DECT scanner equipped with gemstone detectors for spectral imaging, and a SPECT camera. The same pharmacological stress was used for both scans. Results: Twenty-five patients were prospectively included in the study protocol. The mean age was 63.4 ± 10.6 years. The total mean effective radiation dose was 7.5 ± 1.2 mSv with DECT and 8.2 ± 1.7 mSv with SPECT (p = 0.007). A total of 425 left ventricular segments were evaluated by DECT, showing a reliable accuracy for the detection of reversible perfusion defects [area under ROC curve (AUC) 0.84 (0.80–0.87)]. Furthermore, adding stress myocardial perfusion provided a significant incremental value over anatomical evaluation alone by computed tomography coronary angiography [AUC 0.70 (0.65–0.74), p = 0.003]. Conclusions: In this pilot investigation, stress myocardial perfusion by DECT demonstrated a significant incremental value over anatomical evaluation alone by CTCA for the detection of reversible perfusion defects.
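    The "incremental value" reported above is an increase in the area under the ROC curve when perfusion information is added to the anatomical assessment. A minimal sketch of that comparison on synthetic segment-level data (assumed labels and scores, not the study's DECT/SPECT measurements) is given below.

        # Sketch of an incremental-value comparison: ROC AUC of an anatomy-only
        # score versus anatomy plus perfusion, on synthetic segment-level data.
        import numpy as np
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(2)
        n_segments = 425
        ischemic = rng.binomial(1, 0.2, n_segments)       # reference label
        anatomy = 0.8 * ischemic + rng.normal(0, 1, n_segments)
        perfusion = 1.6 * ischemic + rng.normal(0, 1, n_segments)

        print("anatomy only      AUC:", round(roc_auc_score(ischemic, anatomy), 2))
        print("anatomy+perfusion AUC:", round(roc_auc_score(ischemic, anatomy + perfusion), 2))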

  2. Equalized near maximum likelihood detector

    OpenAIRE

    2012-01-01

    This paper presents a new detector used to mitigate intersymbol interference introduced by bandlimited channels. This detector, named the equalized near maximum likelihood detector, combines a nonlinear equalizer with a near maximum likelihood detector. Simulation results show that the equalized near maximum likelihood detector performs better than the nonlinear equalizer but worse than the near maximum likelihood detector.
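    The abstract above is brief, so the sketch below illustrates the underlying idea with exact maximum-likelihood sequence detection for a short block of BPSK symbols over an assumed 2-tap ISI channel with Gaussian noise; a "near maximum likelihood" detector approximates this exhaustive search at lower complexity. The channel taps and noise level are invented for illustration.

        # Exhaustive ML sequence detection over a short ISI channel (illustrative).
        import itertools
        import numpy as np

        rng = np.random.default_rng(3)
        h = np.array([1.0, 0.5])                     # assumed known channel taps
        bits = rng.integers(0, 2, 8)
        x = 2 * bits - 1                             # BPSK symbols in {-1, +1}
        y = np.convolve(x, h)[: x.size] + 0.3 * rng.normal(size=x.size)

        best, best_metric = None, np.inf
        for cand in itertools.product((-1, 1), repeat=x.size):
            r = np.convolve(cand, h)[: x.size]
            metric = np.sum((y - r) ** 2)            # ML metric under Gaussian noise
            if metric < best_metric:
                best, best_metric = np.array(cand), metric

        print("bit errors:", int(np.sum(best != x)))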

  3. Maximum Likelihood Associative Memories

    OpenAIRE

    Gripon, Vincent; Rabbat, Michael

    2013-01-01

    Associative memories are structures that store data in such a way that it can later be retrieved given only a part of its content -- a sort-of error/erasure-resilience property. They are used in applications ranging from caches and memory management in CPUs to database engines. In this work we study associative memories built on the maximum likelihood principle. We derive minimum residual error rates when the data stored comes from a uniform binary source. Second, we determine the minimum amo...

  4. Likelihood estimators for multivariate extremes

    KAUST Repository

    Huser, Raphaël

    2015-11-17

    The main approach to inference for multivariate extremes consists in approximating the joint upper tail of the observations by a parametric family arising in the limit for extreme events. The latter may be expressed in terms of componentwise maxima, high threshold exceedances or point processes, yielding different but related asymptotic characterizations and estimators. The present paper clarifies the connections between the main likelihood estimators, and assesses their practical performance. We investigate their ability to estimate the extremal dependence structure and to predict future extremes, using exact calculations and simulation, in the case of the logistic model.

  5. Likelihood approaches for proportional likelihood ratio model with right-censored data.

    Science.gov (United States)

    Zhu, Hong

    2014-06-30

    Regression methods for survival data with right censoring have been extensively studied under semiparametric transformation models such as the Cox regression model and the proportional odds model. However, their practical application could be limited because of possible violation of model assumption or lack of ready interpretation for the regression coefficients in some cases. As an alternative, in this paper, the proportional likelihood ratio model introduced by Luo and Tsai is extended to flexibly model the relationship between survival outcome and covariates. This model has a natural connection with many important semiparametric models such as generalized linear model and density ratio model and is closely related to biased sampling problems. Compared with the semiparametric transformation model, the proportional likelihood ratio model is appealing and practical in many ways because of its model flexibility and quite direct clinical interpretation. We present two likelihood approaches for the estimation and inference on the target regression parameters under independent and dependent censoring assumptions. Based on a conditional likelihood approach using uncensored failure times, a numerically simple estimation procedure is developed by maximizing a pairwise pseudo-likelihood. We also develop a full likelihood approach, and the most efficient maximum likelihood estimator is obtained by a profile likelihood. Simulation studies are conducted to assess the finite-sample properties of the proposed estimators and compare the efficiency of the two likelihood approaches. An application to survival data for bone marrow transplantation patients of acute leukemia is provided to illustrate the proposed method and other approaches for handling non-proportionality. The relative merits of these methods are discussed in concluding remarks.

  6. Receiver-operating characteristic curves and likelihood ratios: improvements over traditional methods for the evaluation and application of veterinary clinical pathology tests

    DEFF Research Database (Denmark)

    Gardner, Ian A.; Greiner, Matthias

    2006-01-01

    Receiver-operating characteristic (ROC) curves provide a cutoff-independent method for the evaluation of continuous or ordinal tests used in clinical pathology laboratories. The area under the curve is a useful overall measure of test accuracy and can be used to compare different tests (or differ...

  7. Decision Making for Borderline Cases in Pass/Fail Clinical Anatomy Courses: The Practical Value of the Standard Error of Measurement and Likelihood Ratio in a Diagnostic Test

    Science.gov (United States)

    Severo, Milton; Silva-Pereira, Fernanda; Ferreira, Maria Amelia

    2013-01-01

    Several studies have shown that the standard error of measurement (SEM) can be used as an additional “safety net” to reduce the frequency of false-positive or false-negative student grading classifications. Practical examinations in clinical anatomy are often used as diagnostic tests to admit students to course final examinations. The aim of this…

  8. In all likelihood statistical modelling and inference using likelihood

    CERN Document Server

    Pawitan, Yudi

    2001-01-01

    Based on a course in the theory of statistics this text concentrates on what can be achieved using the likelihood/Fisherian method of taking account of uncertainty when studying a statistical problem. It takes the concept of the likelihood as providing the best methods for unifying the demands of statistical modelling and the theory of inference. Every likelihood concept is illustrated by realistic examples, which are not compromised by computational problems. Examples range from a simple comparison of two accident rates, to complex studies that require generalised linear or semiparametric mode

  9. Inference in HIV dynamics models via hierarchical likelihood

    CERN Document Server

    Commenges, D; Putter, H; Thiebaut, R

    2010-01-01

    HIV dynamical models are often based on non-linear systems of ordinary differential equations (ODE), which do not have analytical solution. Introducing random effects in such models leads to very challenging non-linear mixed-effects models. To avoid the numerical computation of multiple integrals involved in the likelihood, we propose a hierarchical likelihood (h-likelihood) approach, treated in the spirit of a penalized likelihood. We give the asymptotic distribution of the maximum h-likelihood estimators (MHLE) for fixed effects, a result that may be relevant in a more general setting. The MHLE are slightly biased but the bias can be made negligible by using a parametric bootstrap procedure. We propose an efficient algorithm for maximizing the h-likelihood. A simulation study, based on a classical HIV dynamical model, confirms the good properties of the MHLE. We apply it to the analysis of a clinical trial.

  10. Likelihood Analysis of Seasonal Cointegration

    DEFF Research Database (Denmark)

    Johansen, Søren; Schaumburg, Ernst

    1999-01-01

    The error correction model for seasonal cointegration is analyzed. Conditions are found under which the process is integrated of order 1 and cointegrated at seasonal frequency, and a representation theorem is given. The likelihood function is analyzed and the numerical calculation of the maximum...... likelihood estimators is discussed. The asymptotic distribution of the likelihood ratio test for cointegrating rank is given. It is shown that the estimated cointegrating vectors are asymptotically mixed Gaussian. The results resemble the results for cointegration at zero frequency when expressed in terms...

  11. Analytic Methods for Cosmological Likelihoods

    OpenAIRE

    Taylor, A. N.; Kitching, T. D.

    2010-01-01

    We present general, analytic methods for Cosmological likelihood analysis and solve the "many-parameters" problem in Cosmology. Maxima are found by Newton's Method, while marginalization over nuisance parameters, and parameter errors and covariances are estimated by analytic marginalization of an arbitrary likelihood function with flat or Gaussian priors. We show that information about remaining parameters is preserved by marginalization. Marginalizing over all parameters, we find an analytic...

  12. Composite likelihood estimation of demographic parameters

    Directory of Open Access Journals (Sweden)

    Garrigan Daniel

    2009-11-01

    accuracy, demographic parameters from three simulated data sets that vary in the magnitude of a founder event and a skew in the effective population size of the X chromosome relative to the autosomes. The behavior of the Markov chain is also examined and shown to converge to its stationary distribution, while also showing high levels of parameter mixing. The analysis of three pairwise comparisons of sub-Saharan African human populations with non-African human populations does not provide unequivocal support for a strong non-African founder event from these nuclear data. The estimates do however suggest a skew in the ratio of X chromosome to autosome effective population size that is greater than one. However in all three cases, the 95% highest posterior density interval for this ratio does include three-fourths, the value expected under an equal breeding sex ratio. Conclusion The implementation of composite and approximate likelihood methods in a framework that includes MCMCMC demographic parameter estimation shows great promise for being flexible and computationally efficient enough to scale up to the level of whole-genome polymorphism and divergence analysis. Further work must be done to characterize the effects of the assumption of linkage equilibrium among genomic regions that is crucial to the validity of applying the composite likelihood method.

  13. On the likelihood of forests

    Science.gov (United States)

    Shang, Yilun

    2016-08-01

    How complex a network is crucially impacts its function and performance. In many modern applications, the networks involved have a growth property and sparse structures, which pose challenges to physicists and applied mathematicians. In this paper, we introduce the forest likelihood as a plausible measure to gauge how difficult it is to construct a forest in a non-preferential attachment way. Based on the notions of admittable labeling and path construction, we propose algorithms for computing the forest likelihood of a given forest. Concrete examples as well as the distributions of forest likelihoods for all forests with some fixed numbers of nodes are presented. Moreover, we illustrate the ideas on real-life networks, including a benzenoid tree, a mathematical family tree, and a peer-to-peer network.

  14. Obtaining reliable Likelihood Ratio tests from simulated likelihood functions

    DEFF Research Database (Denmark)

    Andersen, Laura Mørch

    It is standard practice by researchers and the default option in many statistical programs to base test statistics for mixed models on simulations using asymmetric draws (e.g. Halton draws). This paper shows that when the estimated likelihood functions depend on standard deviations of mixed...

  15. Maximum Likelihood Identification of Nonlinear Model for High-speed Train

    Institute of Scientific and Technical Information of China (English)

    衷路生; 李兵; 龚锦红; 张永贤; 祝振敏

    2014-01-01

    A maximum likelihood (ML) identification method for a nonlinear model of high-speed trains is proposed, suitable for estimating the parameters of the nonlinear model when the train is disturbed by non-Gaussian noise. First, a stochastic discrete nonlinear state-space model describing the single-point-mass dynamics of a high-speed train is constructed, and the ML estimation of the train parameters is recast as an expectation-maximization (EM) optimization problem. Then, designs for a particle filter and a particle smoother for train state estimation are given; the conditional expectation for the train is constructed from them, and a gradient search method for maximizing this expectation is provided, yielding the parameter identification algorithm, whose convergence rate is analyzed. Finally, numerical comparison experiments on estimating the resistance coefficients of a high-speed train are carried out. The results demonstrate the effectiveness of the proposed identification method.

  16. Silence that can be dangerous: a vignette study to assess healthcare professionals' likelihood of speaking up about safety concerns.

    Science.gov (United States)

    Schwappach, David L B; Gehring, Katrin

    2014-01-01

    To investigate the likelihood of speaking up about patient safety in oncology and to clarify the effect of clinical and situational context factors on the likelihood of voicing concerns. 1013 nurses and doctors in oncology rated four clinical vignettes describing coworkers' errors and rule violations in a self-administered factorial survey (65% response rate). Multiple regression analysis was used to model the likelihood of speaking up as outcome of vignette attributes, responder's evaluations of the situation and personal characteristics. Respondents reported a high likelihood of speaking up about patient safety but the variation between and within types of errors and rule violations was substantial. Staff without managerial function provided significantly higher levels of decision difficulty and discomfort to speak up. Based on the information presented in the vignettes, 74%-96% would speak up towards a supervisor failing to check a prescription, 45%-81% would point a coworker to a missed hand disinfection, 82%-94% would speak up towards nurses who violate a safety rule in medication preparation, and 59%-92% would question a doctor violating a safety rule in lumbar puncture. Several vignette attributes predicted the likelihood of speaking up. Perceived potential harm, anticipated discomfort, and decision difficulty were significant predictors of the likelihood of speaking up. Clinicians' willingness to speak up about patient safety is considerably affected by contextual factors. Physicians and nurses without managerial function report substantial discomfort with speaking up. Oncology departments should provide staff with clear guidance and trainings on when and how to voice safety concerns.

  17. Accurate structural correlations from maximum likelihood superpositions.

    Directory of Open Access Journals (Sweden)

    Douglas L Theobald

    2008-02-01

    Full Text Available The cores of globular proteins are densely packed, resulting in complicated networks of structural interactions. These interactions in turn give rise to dynamic structural correlations over a wide range of time scales. Accurate analysis of these complex correlations is crucial for understanding biomolecular mechanisms and for relating structure to function. Here we report a highly accurate technique for inferring the major modes of structural correlation in macromolecules using likelihood-based statistical analysis of sets of structures. This method is generally applicable to any ensemble of related molecules, including families of nuclear magnetic resonance (NMR models, different crystal forms of a protein, and structural alignments of homologous proteins, as well as molecular dynamics trajectories. Dominant modes of structural correlation are determined using principal components analysis (PCA of the maximum likelihood estimate of the correlation matrix. The correlations we identify are inherently independent of the statistical uncertainty and dynamic heterogeneity associated with the structural coordinates. We additionally present an easily interpretable method ("PCA plots" for displaying these positional correlations by color-coding them onto a macromolecular structure. Maximum likelihood PCA of structural superpositions, and the structural PCA plots that illustrate the results, will facilitate the accurate determination of dynamic structural correlations analyzed in diverse fields of structural biology.
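    The final step described above, principal components analysis of an estimated correlation matrix, can be sketched as follows on synthetic coordinates; an ordinary sample correlation matrix stands in for the paper's maximum-likelihood superposition estimate, and the ensemble itself is simulated.

        # Sketch: dominant correlation mode from PCA of an (ordinary) correlation
        # matrix computed over a synthetic ensemble of structures.
        import numpy as np

        rng = np.random.default_rng(4)
        n_models, n_atoms = 50, 30
        ensemble = rng.normal(size=(n_models, n_atoms))      # one coordinate per atom
        ensemble[:, :10] += rng.normal(size=(n_models, 1))   # a correlated block of atoms

        corr = np.corrcoef(ensemble, rowvar=False)           # atoms x atoms correlation
        eigvals, eigvecs = np.linalg.eigh(corr)              # ascending eigenvalues
        top_mode = eigvecs[:, -1]                            # dominant correlation mode
        print("variance explained by top mode:", round(float(eigvals[-1] / eigvals.sum()), 2))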

  18. Factors Associated with Young Adults’ Pregnancy Likelihood

    Science.gov (United States)

    Kitsantas, Panagiota; Lindley, Lisa L.; Wu, Huichuan

    2014-01-01

    OBJECTIVES While progress has been made to reduce adolescent pregnancies in the United States, rates of unplanned pregnancy among young adults (18–29 years) remain high. In this study, we assessed factors associated with perceived likelihood of pregnancy (likelihood of getting pregnant/getting partner pregnant in the next year) among sexually experienced young adults who were not trying to get pregnant and had ever used contraceptives. METHODS We conducted a secondary analysis of 660 young adults, 18–29 years old in the United States, from the cross-sectional National Survey of Reproductive and Contraceptive Knowledge. Logistic regression and classification tree analyses were conducted to generate profiles of young adults most likely to report anticipating a pregnancy in the next year. RESULTS Nearly one-third (32%) of young adults indicated they believed they had at least some likelihood of becoming pregnant in the next year. Young adults who believed that avoiding pregnancy was not very important were most likely to report pregnancy likelihood (odds ratio [OR], 5.21; 95% CI, 2.80–9.69), as were young adults for whom avoiding a pregnancy was important but not satisfied with their current contraceptive method (OR, 3.93; 95% CI, 1.67–9.24), attended religious services frequently (OR, 3.0; 95% CI, 1.52–5.94), were uninsured (OR, 2.63; 95% CI, 1.31–5.26), and were likely to have unprotected sex in the next three months (OR, 1.77; 95% CI, 1.04–3.01). DISCUSSION These results may help guide future research and the development of pregnancy prevention interventions targeting sexually experienced young adults. PMID:25782849

  19. The Sherpa Maximum Likelihood Estimator

    Science.gov (United States)

    Nguyen, D.; Doe, S.; Evans, I.; Hain, R.; Primini, F.

    2011-07-01

    A primary goal for the second release of the Chandra Source Catalog (CSC) is to include X-ray sources with as few as 5 photon counts detected in stacked observations of the same field, while maintaining acceptable detection efficiency and false source rates. Aggressive source detection methods will result in detection of many false positive source candidates. Candidate detections will then be sent to a new tool, the Maximum Likelihood Estimator (MLE), to evaluate the likelihood that a detection is a real source. MLE uses the Sherpa modeling and fitting engine to fit a model of a background and source to multiple overlapping candidate source regions. A background model is calculated by simultaneously fitting the observed photon flux in multiple background regions. This model is used to determine the quality of the fit statistic for a background-only hypothesis in the potential source region. The statistic for a background-plus-source hypothesis is calculated by adding a Gaussian source model convolved with the appropriate Chandra point spread function (PSF) and simultaneously fitting the observed photon flux in each observation in the stack. Since a candidate source may be located anywhere in the field of view of each stacked observation, a different PSF must be used for each observation because of the strong spatial dependence of the Chandra PSF. The likelihood of a valid source being detected is a function of the two statistics (for background alone, and for background-plus-source). The MLE tool is an extensible Python module with potential for use by the general Chandra user.
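    A stripped-down version of the two-hypothesis comparison described above is sketched below: Poisson likelihoods of counts in a one-dimensional strip of pixels under a flat background model and under background plus a Gaussian source of assumed width and position. The real tool fits per-observation PSFs in two dimensions across stacked observations; everything here is synthetic.

        # Background-only versus background-plus-source Poisson likelihood fit.
        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import poisson

        rng = np.random.default_rng(5)
        pix = np.arange(32)
        shape = np.exp(-0.5 * ((pix - 16) / 1.5) ** 2)    # assumed source profile
        counts = rng.poisson(2.0 + 6.0 * shape)

        def nll(params, with_source):
            bkg = params[0]
            amp = params[1] if with_source else 0.0
            model = bkg + amp * shape
            return -np.sum(poisson.logpmf(counts, model))

        bkg_only = minimize(nll, x0=[1.0], args=(False,), bounds=[(1e-3, None)])
        bkg_src = minimize(nll, x0=[1.0, 1.0], args=(True,), bounds=[(1e-3, None)] * 2)
        delta = 2.0 * (bkg_only.fun - bkg_src.fun)        # likelihood-ratio statistic
        print(f"2*Delta(logL) = {delta:.1f} (large values favour a real source)")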

  20. Dimension-Independent Likelihood-Informed MCMC

    KAUST Repository

    Cui, Tiangang

    2015-01-07

    Many Bayesian inference problems require exploring the posterior distribution of high-dimensional parameters, which in principle can be described as functions. By exploiting low-dimensional structure in the change from prior to posterior [distributions], we introduce a suite of MCMC samplers that can adapt to the complex structure of the posterior distribution, yet are well-defined on function space. Posterior sampling in nonlinear inverse problems arising from various partial differential equations and also a stochastic differential equation are used to demonstrate the efficiency of these dimension-independent likelihood-informed samplers.

  1. Section 9: Ground Water - Likelihood of Release

    Science.gov (United States)

    HRS training. The ground water pathway likelihood of release factor category reflects the likelihood that there has been, or will be, a release of hazardous substances in any of the aquifers underlying the site.

  2. Phylogenetic estimation with partial likelihood tensors

    CERN Document Server

    Sumner, J G

    2008-01-01

    We present an alternative method for calculating likelihoods in molecular phylogenetics. Our method is based on partial likelihood tensors, which are generalizations of partial likelihood vectors, as used in Felsenstein's approach. Exploiting a lexicographic sorting and partial likelihood tensors, it is possible to obtain significant computational savings. We show this on a range of simulated data by enumerating all numerical calculations that are required by our method and the standard approach.
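    The partial likelihood vectors being generalized here are those of Felsenstein's pruning algorithm. A minimal sketch for a single site on a two-leaf tree under the Jukes-Cantor model (branch lengths and bases chosen arbitrarily) shows the vector version that the tensor approach builds on.

        # Felsenstein-style partial likelihood vectors for one site, two leaves,
        # Jukes-Cantor substitution model, uniform root frequencies.
        import numpy as np

        def jc_matrix(t):
            """Jukes-Cantor transition probabilities for branch length t."""
            same = 0.25 + 0.75 * np.exp(-4.0 * t / 3.0)
            diff = 0.25 - 0.25 * np.exp(-4.0 * t / 3.0)
            return np.full((4, 4), diff) + np.diag([same - diff] * 4)

        BASES = {"A": 0, "C": 1, "G": 2, "T": 3}

        def leaf_vector(base):
            v = np.zeros(4)
            v[BASES[base]] = 1.0
            return v

        # Site with 'A' on a branch of length 0.1 and 'G' on a branch of length 0.2.
        partial1 = jc_matrix(0.1) @ leaf_vector("A")     # partial likelihoods at the root
        partial2 = jc_matrix(0.2) @ leaf_vector("G")
        site_likelihood = float(np.sum(0.25 * partial1 * partial2))
        print(f"site likelihood: {site_likelihood:.4f}")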

  3. Workshop on Likelihoods for the LHC Searches

    CERN Document Server

    2013-01-01

    The primary goal of this 3‐day workshop is to educate the LHC community about the scientific utility of likelihoods. We shall do so by describing and discussing several real‐world examples of the use of likelihoods, including a one‐day in‐depth examination of likelihoods in the Higgs boson studies by ATLAS and CMS.

  4. Parametric likelihood inference for interval censored competing risks data.

    Science.gov (United States)

    Hudgens, Michael G; Li, Chenxi; Fine, Jason P

    2014-03-01

    Parametric estimation of the cumulative incidence function (CIF) is considered for competing risks data subject to interval censoring. Existing parametric models of the CIF for right censored competing risks data are adapted to the general case of interval censoring. Maximum likelihood estimators for the CIF are considered under the assumed models, extending earlier work on nonparametric estimation. A simple naive likelihood estimator is also considered that utilizes only part of the observed data. The naive estimator enables separate estimation of models for each cause, unlike full maximum likelihood in which all models are fit simultaneously. The naive likelihood is shown to be valid under mixed case interval censoring, but not under an independent inspection process model, in contrast with full maximum likelihood which is valid under both interval censoring models. In simulations, the naive estimator is shown to perform well and yield comparable efficiency to the full likelihood estimator in some settings. The methods are applied to data from a large, recent randomized clinical trial for the prevention of mother-to-child transmission of HIV.

  5. Clinical characteristics of high grade foveal hypoplasia.

    Science.gov (United States)

    Park, Kyung-Ah; Oh, Sei Yeul

    2013-02-01

    To report clinical characteristics of high grade foveal hypoplasia. Patients with foveal hypoplasia of grade 3 or 4 on spectral domain optical coherence tomography according to a previously published scheme were enrolled. All patients underwent a full ophthalmologic assessment including visual acuity testing, slit lamp biomicroscopy, fundus examination, and evaluation of ocular alignment. The underlying causes of foveal hypoplasia were identified as albinism in five patients and aniridia in six patients. The mean logMAR visual acuity was 0.57 ± 0.24 (range 0.22-1.00) in the right eyes and 0.58 ± 0.21 (range 0.30-1.00) in the left eyes. On fundus examination in patients with albinism, two patients showed marked transparency, one patient showed moderate transparency, and two patients showed minimal transparency. Among six patients with aniridia, five patients showed normal macular pigmentation without macular reflex and one patient showed decreased macular pigmentation with no macular reflex. Patients with high grade macular hypoplasia tended to have poor visual acuities; however, the range of visual acuity was quite variable. Other factors associated with underlying disease could be the reason for this variability. Therefore, careful consideration should be given when assessing visual prognosis in foveal hypoplasia using optical coherence tomography.

  6. Empirical likelihood method in survival analysis

    CERN Document Server

    Zhou, Mai

    2015-01-01

    Add the Empirical Likelihood to Your Nonparametric ToolboxEmpirical Likelihood Method in Survival Analysis explains how to use the empirical likelihood method for right censored survival data. The author uses R for calculating empirical likelihood and includes many worked out examples with the associated R code. The datasets and code are available for download on his website and CRAN.The book focuses on all the standard survival analysis topics treated with empirical likelihood, including hazard functions, cumulative distribution functions, analysis of the Cox model, and computation of empiric

  7. Orthogonal NGS for High Throughput Clinical Diagnostics.

    Science.gov (United States)

    Chennagiri, Niru; White, Eric J; Frieden, Alexander; Lopez, Edgardo; Lieber, Daniel S; Nikiforov, Anastasia; Ross, Tristen; Batorsky, Rebecca; Hansen, Sherry; Lip, Va; Luquette, Lovelace J; Mauceli, Evan; Margulies, David; Milos, Patrice M; Napolitano, Nichole; Nizzari, Marcia M; Yu, Timothy; Thompson, John F

    2016-04-19

    Next generation sequencing is a transformative technology for discovering and diagnosing genetic disorders. However, high-throughput sequencing remains error-prone, necessitating variant confirmation in order to meet the exacting demands of clinical diagnostic sequencing. To address this, we devised an orthogonal, dual platform approach employing complementary target capture and sequencing chemistries to improve speed and accuracy of variant calls at a genomic scale. We combined DNA selection by bait-based hybridization followed by Illumina NextSeq reversible terminator sequencing with DNA selection by amplification followed by Ion Proton semiconductor sequencing. This approach yields genomic scale orthogonal confirmation of ~95% of exome variants. Overall variant sensitivity improves as each method covers thousands of coding exons missed by the other. We conclude that orthogonal NGS offers improvements in variant calling sensitivity when two platforms are used, better specificity for variants identified on both platforms, and greatly reduces the time and expense of Sanger follow-up, thus enabling physicians to act on genomic results more quickly.

  8. Improving and Evaluating Nested Sampling Algorithm for Marginal Likelihood Estimation

    Science.gov (United States)

    Ye, M.; Zeng, X.; Wu, J.; Wang, D.; Liu, J.

    2016-12-01

    With the growing impacts of climate change and human activities on the cycle of water resources, an increasing number of studies focus on the quantification of modeling uncertainty. Bayesian model averaging (BMA) provides a popular framework for quantifying conceptual model and parameter uncertainty. The ensemble prediction is generated by combining each plausible model's prediction, and each model is assigned a weight determined by the model's prior weight and marginal likelihood. Thus, the estimation of a model's marginal likelihood is crucial for reliable and accurate BMA prediction. The nested sampling estimator (NSE) is a newly proposed method for marginal likelihood estimation. NSE proceeds by gradually searching the parameter space from low-likelihood to high-likelihood regions, and this evolution is carried out iteratively via a local sampling procedure. Thus, the efficiency of NSE is dominated by the strength of the local sampling procedure. Currently, the Metropolis-Hastings (M-H) algorithm is often used for local sampling. However, M-H is not an efficient sampling algorithm for high-dimensional or complicated parameter spaces. To improve the efficiency of NSE, it is therefore attractive to incorporate the robust and efficient DREAMzs sampling algorithm into the local sampling step of NSE. The comparison results demonstrated that the improved NSE increases the efficiency of marginal likelihood estimation significantly. However, both the improved and the original NSE suffer from heavy instability. In addition, the heavy computational cost of the huge number of model executions is overcome by using adaptive sparse grid surrogates.
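    The sketch below shows the basic nested sampling bookkeeping on a toy one-dimensional problem with a uniform prior and Gaussian likelihood, where the evidence is known approximately in closed form. New live points are drawn by naive rejection from the prior; the local sampling step discussed in the abstract (M-H or DREAMzs moves) replaces exactly that rejection loop in realistic problems.

        # Toy nested sampling estimate of the marginal likelihood (evidence).
        import numpy as np

        rng = np.random.default_rng(6)
        prior_lo, prior_hi, n_live = -5.0, 5.0, 100

        def loglike(theta):
            return -0.5 * theta**2 - 0.5 * np.log(2.0 * np.pi)   # standard normal

        live = rng.uniform(prior_lo, prior_hi, n_live)
        live_ll = loglike(live)

        Z, X_prev = 0.0, 1.0
        for i in range(1, 601):
            worst = int(np.argmin(live_ll))
            X_i = np.exp(-i / n_live)                 # expected remaining prior mass
            Z += np.exp(live_ll[worst]) * (X_prev - X_i)
            X_prev = X_i
            # Replace the worst point with a prior draw above the threshold;
            # this is where an M-H or DREAM(ZS) step would go instead.
            new = rng.uniform(prior_lo, prior_hi)
            while loglike(new) <= live_ll[worst]:
                new = rng.uniform(prior_lo, prior_hi)
            live[worst], live_ll[worst] = new, loglike(new)

        Z += float(np.mean(np.exp(live_ll))) * X_prev  # remaining live points
        print("estimated evidence:", round(Z, 3),
              "| approx. analytic:", round(1.0 / (prior_hi - prior_lo), 3))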

  9. Employee Likelihood of Purchasing Health Insurance using Fuzzy Inference System

    Directory of Open Access Journals (Sweden)

    Lazim Abdullah

    2012-01-01

    Full Text Available Many believe that employees' health and economic factors play an important role in their likelihood of purchasing health insurance. However, the decision to purchase health insurance is not a trivial matter, as many risk factors influence it. This paper presents a decision model using a fuzzy inference system to identify the likelihood of purchasing health insurance based on selected risk factors. To build the likelihoods, data from one hundred and twenty-eight employees at five organizations under the purview of Kota Star Municipality, Malaysia, were collected to provide the input data. Three risk factors were considered as inputs to the system: age, salary and risk of having illness. The likelihood of purchasing health insurance was the output of the system and was defined in three linguistic terms: Low, Medium and High. Input and output data were governed by the Mamdani inference rules of the system to decide the best linguistic term. The linguistic terms describing the likelihood of purchasing health insurance were identified by the system based on the three risk factors. It was found that twenty-seven employees were likely to purchase health insurance at the Low level and fifty-six employees at the High level. The use of a fuzzy inference system offers a new approach to identifying prospective health insurance purchasers.
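    A minimal Mamdani-style inference step of the kind described can be sketched in plain Python as below. The triangular membership functions and the two rules are invented for illustration and are not the rule base, universes or data used in the study.

        # Minimal Mamdani fuzzy inference: two illustrative rules, centroid defuzzification.
        import numpy as np

        def tri(x, a, b, c):
            """Triangular membership with feet a, c and peak b (a < b < c)."""
            return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

        def purchase_likelihood(age, salary, illness_risk):
            out = np.linspace(0.0, 1.0, 101)              # output universe
            low_out = tri(out, -0.5, 0.0, 0.5)
            high_out = tri(out, 0.5, 1.0, 1.5)

            # Rule 1: IF age is high AND illness risk is high THEN likelihood is High.
            fire_high = min(tri(age, 35.0, 55.0, 75.0), tri(illness_risk, 0.4, 0.8, 1.2))
            # Rule 2: IF salary is low THEN likelihood is Low.
            fire_low = tri(salary, -1.0, 0.0, 3000.0)

            agg = np.maximum(np.minimum(high_out, fire_high),
                             np.minimum(low_out, fire_low))   # Mamdani max-min aggregation
            return float(np.sum(agg * out) / np.sum(agg)) if agg.sum() > 0 else 0.5

        print("likelihood score:", round(purchase_likelihood(52, 2500, 0.7), 2))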

  10. Vestige: Maximum likelihood phylogenetic footprinting

    Directory of Open Access Journals (Sweden)

    Maxwell Peter

    2005-05-01

    Full Text Available Abstract Background Phylogenetic footprinting is the identification of functional regions of DNA by their evolutionary conservation. This is achieved by comparing orthologous regions from multiple species and identifying the DNA regions that have diverged less than neutral DNA. Vestige is a phylogenetic footprinting package built on the PyEvolve toolkit that uses probabilistic molecular evolutionary modelling to represent aspects of sequence evolution, including the conventional divergence measure employed by other footprinting approaches. In addition to measuring the divergence, Vestige allows the expansion of the definition of a phylogenetic footprint to include variation in the distribution of any molecular evolutionary processes. This is achieved by displaying the distribution of model parameters that represent partitions of molecular evolutionary substitutions. Examination of the spatial incidence of these effects across regions of the genome can identify DNA segments that differ in the nature of the evolutionary process. Results Vestige was applied to a reference dataset of the SCL locus from four species and provided clear identification of the known conserved regions in this dataset. To demonstrate the flexibility to use diverse models of molecular evolution and dissect the nature of the evolutionary process Vestige was used to footprint the Ka/Ks ratio in primate BRCA1 with a codon model of evolution. Two regions of putative adaptive evolution were identified illustrating the ability of Vestige to represent the spatial distribution of distinct molecular evolutionary processes. Conclusion Vestige provides a flexible, open platform for phylogenetic footprinting. Underpinned by the PyEvolve toolkit, Vestige provides a framework for visualising the signatures of evolutionary processes across the genome of numerous organisms simultaneously. By exploiting the maximum-likelihood statistical framework, the complex interplay between mutational

  11. The Laplace Likelihood Ratio Test for Heteroscedasticity

    Directory of Open Access Journals (Sweden)

    J. Martin van Zyl

    2011-01-01

    Full Text Available It is shown that the likelihood ratio test for heteroscedasticity, assuming the Laplace distribution, gives good results for Gaussian and fat-tailed data. The likelihood ratio test, assuming normality, is very sensitive to any deviation from normality, especially when the observations are from a distribution with fat tails. Such a likelihood test can also be used as a robust test for a constant variance in residuals or a time series if the data is partitioned into groups.
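    A sketch of such a likelihood ratio test for equal Laplace scales in two groups is given below; under the Laplace model the scale MLE is the mean absolute deviation from the median, and the statistic is referred to a chi-square distribution with one degree of freedom. The data are simulated and deliberately heteroscedastic.

        # Likelihood-ratio test for equal scale in two groups under a Laplace model.
        import numpy as np
        from scipy.stats import chi2

        def laplace_profile_ll(abs_dev):
            """Laplace log-likelihood profiled over the scale, given |x - median|."""
            b = np.mean(abs_dev)                      # scale MLE
            return -abs_dev.size * (np.log(2.0 * b) + 1.0)

        rng = np.random.default_rng(7)
        g1 = rng.laplace(0.0, 1.0, 200)
        g2 = rng.laplace(0.0, 1.8, 200)               # genuinely heteroscedastic example

        d1 = np.abs(g1 - np.median(g1))
        d2 = np.abs(g2 - np.median(g2))
        ll_alt = laplace_profile_ll(d1) + laplace_profile_ll(d2)   # separate scales
        ll_null = laplace_profile_ll(np.concatenate([d1, d2]))     # common scale
        lr = 2.0 * (ll_alt - ll_null)
        print("LR =", round(lr, 2), " p =", float(chi2.sf(lr, df=1)))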

  12. Dishonestly increasing the likelihood of winning

    Directory of Open Access Journals (Sweden)

    Shaul Shalvi

    2012-05-01

    Full Text Available People not only seek to avoid losses or secure gains; they also attempt to create opportunities for obtaining positive outcomes. When distributing money between gambles with equal probabilities, people often invest in turning negative gambles into positive ones, even at a cost of reduced expected value. Results of an experiment revealed that (1) the preference to turn a negative outcome into a positive outcome exists when people's ability to do so depends on their performance levels (rather than merely on their choice), (2) this preference is amplified when the likelihood to turn negative into positive is high rather than low, and (3) this preference is attenuated when people can lie about their performance levels, allowing them to turn negative into positive not by performing better but rather by lying about how well they performed.

  13. Likelihood analysis of earthquake focal mechanism distributions

    CERN Document Server

    Kagan, Y Y

    2014-01-01

    In our paper published earlier we discussed forecasts of earthquake focal mechanism and ways to test the forecast efficiency. Several verification methods were proposed, but they were based on ad-hoc, empirical assumptions, thus their performance is questionable. In this work we apply a conventional likelihood method to measure a skill of forecast. The advantage of such an approach is that earthquake rate prediction can in principle be adequately combined with focal mechanism forecast, if both are based on the likelihood scores, resulting in a general forecast optimization. To calculate the likelihood score we need to compare actual forecasts or occurrences of predicted events with the null hypothesis that the mechanism's 3-D orientation is random. For double-couple source orientation the random probability distribution function is not uniform, which complicates the calculation of the likelihood value. To better understand the resulting complexities we calculate the information (likelihood) score for two rota...

  14. Silence that can be dangerous: a vignette study to assess healthcare professionals' likelihood of speaking up about safety concerns.

    Directory of Open Access Journals (Sweden)

    David L B Schwappach

    Full Text Available To investigate the likelihood of speaking up about patient safety in oncology and to clarify the effect of clinical and situational context factors on the likelihood of voicing concerns. 1013 nurses and doctors in oncology rated four clinical vignettes describing coworkers' errors and rule violations in a self-administered factorial survey (65% response rate). Multiple regression analysis was used to model the likelihood of speaking up as outcome of vignette attributes, responder's evaluations of the situation and personal characteristics. Respondents reported a high likelihood of speaking up about patient safety but the variation between and within types of errors and rule violations was substantial. Staff without managerial function provided significantly higher levels of decision difficulty and discomfort to speak up. Based on the information presented in the vignettes, 74%-96% would speak up towards a supervisor failing to check a prescription, 45%-81% would point a coworker to a missed hand disinfection, 82%-94% would speak up towards nurses who violate a safety rule in medication preparation, and 59%-92% would question a doctor violating a safety rule in lumbar puncture. Several vignette attributes predicted the likelihood of speaking up. Perceived potential harm, anticipated discomfort, and decision difficulty were significant predictors of the likelihood of speaking up. Clinicians' willingness to speak up about patient safety is considerably affected by contextual factors. Physicians and nurses without managerial function report substantial discomfort with speaking up. Oncology departments should provide staff with clear guidance and trainings on when and how to voice safety concerns.

  15. Silence That Can Be Dangerous: A Vignette Study to Assess Healthcare Professionals’ Likelihood of Speaking up about Safety Concerns

    Science.gov (United States)

    Schwappach, David L. B.; Gehring, Katrin

    2014-01-01

    Purpose To investigate the likelihood of speaking up about patient safety in oncology and to clarify the effect of clinical and situational context factors on the likelihood of voicing concerns. Patients and Methods 1013 nurses and doctors in oncology rated four clinical vignettes describing coworkers’ errors and rule violations in a self-administered factorial survey (65% response rate). Multiple regression analysis was used to model the likelihood of speaking up as outcome of vignette attributes, responder’s evaluations of the situation and personal characteristics. Results Respondents reported a high likelihood of speaking up about patient safety but the variation between and within types of errors and rule violations was substantial. Staff without managerial function provided significantly higher levels of decision difficulty and discomfort to speak up. Based on the information presented in the vignettes, 74%−96% would speak up towards a supervisor failing to check a prescription, 45%−81% would point a coworker to a missed hand disinfection, 82%−94% would speak up towards nurses who violate a safety rule in medication preparation, and 59%−92% would question a doctor violating a safety rule in lumbar puncture. Several vignette attributes predicted the likelihood of speaking up. Perceived potential harm, anticipated discomfort, and decision difficulty were significant predictors of the likelihood of speaking up. Conclusions Clinicians’ willingness to speak up about patient safety is considerably affected by contextual factors. Physicians and nurses without managerial function report substantial discomfort with speaking up. Oncology departments should provide staff with clear guidance and trainings on when and how to voice safety concerns. PMID:25116338

  16. Classification and Clinical Management of Variants of Uncertain Significance in High Penetrance Cancer Predisposition Genes.

    Science.gov (United States)

    Moghadasi, Setareh; Eccles, Diana M; Devilee, Peter; Vreeswijk, Maaike P G; van Asperen, Christi J

    2016-04-01

    In 2008, the International Agency for Research on Cancer (IARC) proposed a system for classifying sequence variants in highly penetrant breast and colon cancer susceptibility genes, linked to clinical actions. This system uses a multifactorial likelihood model to calculate the posterior probability that an altered DNA sequence is pathogenic. Variants with a posterior probability between 5% and 94.9% (class 3) are categorized as variants of uncertain significance (VUS). This interval is wide and might include variants with a substantial difference in pathogenicity at either end of the spectrum. We think that carriers of class 3 variants would benefit from a fine-tuning of this classification. Classification of a VUS into a category with a defined clinical significance is very important because, for carriers of a pathogenic mutation, full surveillance and risk-reducing surgery can reduce cancer incidence. Counselees who are not carriers of a pathogenic mutation can be discharged from intensive follow-up and avoid unnecessary risk-reducing surgery. By means of examples, we show how, in selected cases, additional data can lead to reclassification of some variants to a different class with different recommendations for surveillance and therapy. To improve the clinical utility of this classification system, we suggest a pragmatic adaptation to clinical practice.
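    The arithmetic behind such a multifactorial likelihood model is a simple odds update: a prior probability of pathogenicity is converted to odds, multiplied by the likelihood ratios contributed by each (assumed independent) line of evidence, and converted back to a posterior probability that is compared with the class boundaries. The numbers below are invented for illustration.

        # Combining a prior probability with component likelihood ratios
        # (e.g. co-segregation, co-occurrence, tumour pathology) into a
        # posterior probability of pathogenicity. Values are illustrative only.
        def posterior_probability(prior_prob, likelihood_ratios):
            odds = prior_prob / (1.0 - prior_prob)
            for lr in likelihood_ratios:
                odds *= lr
            return odds / (1.0 + odds)

        p = posterior_probability(prior_prob=0.10, likelihood_ratios=[8.0, 3.5, 1.2])
        print(f"posterior probability of pathogenicity: {p:.3f}")
        # IARC class 3 (VUS) spans 0.05-0.949; a posterior outside that interval
        # would move the variant into a class with defined clinical recommendations.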

  17. [Acupuncture clinical trials published in high impact factor journals].

    Science.gov (United States)

    Hu, Min; Liu, Jian-Ping; Wu, Xiao-Ke

    2014-12-01

    Acupuncture clinical trials are designed to provide reliable evidence of clinical efficacy, and SCI papers are among the high-quality outputs of clinical acupuncture efficacy research. By analyzing papers on acupuncture clinical trials published in high impact factor journals, we can study clinical trials from design to implementation and the efficacy of prevention and treatment, combined with international standard practices for evaluating the effectiveness and safety of acupuncture. This is the core of acupuncture clinical trials, as well as a prerequisite for outstanding academic output. A scientific and complete acupuncture clinical trial should be topically novel, innovatively designed, logically clear and linguistically refined; most importantly, it should make a genuine discovery and solve a pragmatic problem. All of these are critical for papers to be published in high impact factor journals, and they directly affect the international evaluation and promotion of acupuncture.

  18. Maximum likelihood for genome phylogeny on gene content.

    Science.gov (United States)

    Zhang, Hongmei; Gu, Xun

    2004-01-01

    With the rapid growth of whole-genome data, reconstructing the phylogenetic relationships among different genomes has become a hot topic in comparative genomics. The maximum likelihood approach is one of the various approaches available and has been very successful. However, there are no reported applications to genome tree-making, mainly because of the lack of an analytical form of the probability model and/or the complicated calculation burden. In this paper we study the mathematical structure of the stochastic model of genome evolution, and then develop a simplified likelihood function for observing a specific phylogenetic pattern in the four-genome situation using gene content information. We use the maximum likelihood approach to identify phylogenetic trees. Simulation results indicate that the proposed method works well and can identify trees with a high rate of correct identification. Application to real data provides satisfactory results. The approach developed in this paper can serve as the basis for reconstructing phylogenies of more than four genomes.

  19. Adaptive Parallel Tempering for Stochastic Maximum Likelihood Learning of RBMs

    CERN Document Server

    Desjardins, Guillaume; Bengio, Yoshua

    2010-01-01

    Restricted Boltzmann Machines (RBM) have attracted a lot of attention of late, as one of the principal building blocks of deep networks. Training RBMs remains problematic however, because of the intractability of their partition function. The maximum likelihood gradient requires a very robust sampler which can accurately sample from the model despite the loss of ergodicity often incurred during learning. While using Parallel Tempering in the negative phase of Stochastic Maximum Likelihood (SML-PT) helps address the issue, it imposes a trade-off between computational complexity and high ergodicity, and requires careful hand-tuning of the temperatures. In this paper, we show that this trade-off is unnecessary. The choice of optimal temperatures can be automated by minimizing average return time (a concept first proposed by [Katzgraber et al., 2006]) while chains can be spawned dynamically, as needed, thus minimizing the computational overhead. We show, on a synthetic dataset, that this results in better likelihood ...

  20. Likelihood inference for unions of interacting discs

    DEFF Research Database (Denmark)

    Møller, Jesper; Helisova, K.

    2010-01-01

    This is probably the first paper which discusses likelihood inference for a random set using a germ-grain model, where the individual grains are unobservable, edge effects occur and other complications appear. We consider the case where the grains form a disc process modelled by a marked point process, specified with respect to a given marked Poisson model (i.e. a Boolean model). We show how edge effects and other complications can be handled by considering a certain conditional likelihood. Our methodology is illustrated by analysing Peter Diggle's heather data set, where we discuss the results of simulation-based maximum likelihood inference and the effect of specifying different reference Poisson models.

  1. Dimension-independent likelihood-informed MCMC

    KAUST Repository

    Cui, Tiangang

    2015-10-08

    Many Bayesian inference problems require exploring the posterior distribution of high-dimensional parameters that represent the discretization of an underlying function. This work introduces a family of Markov chain Monte Carlo (MCMC) samplers that can adapt to the particular structure of a posterior distribution over functions. Two distinct lines of research intersect in the methods developed here. First, we introduce a general class of operator-weighted proposal distributions that are well defined on function space, such that the performance of the resulting MCMC samplers is independent of the discretization of the function. Second, by exploiting local Hessian information and any associated low-dimensional structure in the change from prior to posterior distributions, we develop an inhomogeneous discretization scheme for the Langevin stochastic differential equation that yields operator-weighted proposals adapted to the non-Gaussian structure of the posterior. The resulting dimension-independent and likelihood-informed (DILI) MCMC samplers may be useful for a large class of high-dimensional problems where the target probability measure has a density with respect to a Gaussian reference measure. Two nonlinear inverse problems are used to demonstrate the efficiency of these DILI samplers: an elliptic PDE coefficient inverse problem and path reconstruction in a conditioned diffusion.
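
    For orientation, the sketch below implements a preconditioned Crank-Nicolson (pCN) proposal, a standard dimension-independent, function-space MCMC move. It is a simplified stand-in, not the authors' operator-weighted DILI sampler, and the Gaussian prior covariance and log-likelihood used here are hypothetical placeholders.

```python
import numpy as np

def pcn_mcmc(log_like, prior_cov_sqrt, x0, beta=0.2, n_steps=5000, rng=None):
    """Preconditioned Crank-Nicolson MCMC: dimension-independent under a
    Gaussian prior N(0, C). `prior_cov_sqrt` maps white noise to a prior draw."""
    rng = rng or np.random.default_rng(0)
    x, ll = x0.copy(), log_like(x0)
    chain = []
    for _ in range(n_steps):
        xi = prior_cov_sqrt @ rng.standard_normal(x.size)   # prior-distributed noise
        prop = np.sqrt(1.0 - beta**2) * x + beta * xi        # pCN proposal
        ll_prop = log_like(prop)
        # Prior terms cancel exactly, so acceptance uses the likelihood alone.
        if np.log(rng.uniform()) < ll_prop - ll:
            x, ll = prop, ll_prop
        chain.append(x.copy())
    return np.array(chain)

# Hypothetical example: posterior over a discretized function under a smooth prior.
n = 100
C_sqrt = np.linalg.cholesky(np.exp(-np.abs(np.subtract.outer(range(n), range(n))) / 10.0))
data = np.zeros(n)
log_like = lambda u: -0.5 * np.sum((u - data) ** 2) / 0.1**2
samples = pcn_mcmc(log_like, C_sqrt, x0=np.zeros(n))
```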

  2. Introductory statistical inference with the likelihood function

    CERN Document Server

    Rohde, Charles A

    2014-01-01

    This textbook covers the fundamentals of statistical inference and statistical theory including Bayesian and frequentist approaches and methodology possible without excessive emphasis on the underlying mathematics. This book is about some of the basic principles of statistics that are necessary to understand and evaluate methods for analyzing complex data sets. The likelihood function is used for pure likelihood inference throughout the book. There is also coverage of severity and finite population sampling. The material was developed from an introductory statistical theory course taught by the author at the Johns Hopkins University’s Department of Biostatistics. Students and instructors in public health programs will benefit from the likelihood modeling approach that is used throughout the text. This will also appeal to epidemiologists and psychometricians.  After a brief introduction, there are chapters on estimation, hypothesis testing, and maximum likelihood modeling. The book concludes with secti...

  3. Maximum-likelihood method in quantum estimation

    CERN Document Server

    Paris, M G A; Sacchi, M F

    2001-01-01

    The maximum-likelihood method for quantum estimation is reviewed and applied to the reconstruction of density matrix of spin and radiation as well as to the determination of several parameters of interest in quantum optics.

  4. Maximum-Likelihood Detection Of Noncoherent CPM

    Science.gov (United States)

    Divsalar, Dariush; Simon, Marvin K.

    1993-01-01

    Simplified detectors are proposed for use in maximum-likelihood sequence detection of symbols, from an alphabet of size M, transmitted by uncoded, full-response continuous phase modulation (CPM) over a radio channel with additive white Gaussian noise. The receiver structures are derived from a particular interpretation of the maximum-likelihood metrics. The receivers include front ends whose structures depend only on M, analogous to those in receivers of coherent CPM. The parts of the receivers following the front ends have structures whose complexity depends on N.

  5. Robust Likelihood-Based Survival Modeling with Microarray Data

    Directory of Open Access Journals (Sweden)

    HyungJun Cho

    2008-09-01

    Full Text Available Gene expression data can be associated with various clinical outcomes. In particular, these data can be of importance in discovering survival-associated genes for medical applications. As alternatives to traditional statistical methods, sophisticated methods and software programs have been developed to overcome the high-dimensional difficulty of microarray data. Nevertheless, new algorithms and software programs are needed to include practical functions such as the discovery of multiple sets of survival-associated genes and the incorporation of risk factors, and to use in the R environment which many statisticians are familiar with. For survival modeling with microarray data, we have developed a software program (called rbsurv which can be used conveniently and interactively in the R environment. This program selects survival-associated genes based on the partial likelihood of the Cox model and separates training and validation sets of samples for robustness. It can discover multiple sets of genes by iterative forward selection rather than one large set of genes. It can also allow adjustment for risk factors in microarray survival modeling. This software package, the rbsurv package, can be used to discover survival-associated genes with microarray data conveniently.

  6. Likelihood Principle and Maximum Likelihood Estimator of Location Parameter for Cauchy Distribution.

    Science.gov (United States)

    1986-05-01

    The consistency (or strong consistency) of the maximum likelihood estimator has been studied by many researchers, for example, Wald (1949) and Wolfowitz (1953, 1965). References: Wald, A. (1949). Note on the consistency of maximum likelihood estimates. Ann. Math. Statist., Vol. 20, 595-601. Wolfowitz, J. (1953). The method of maximum likelihood and Wald theory of decision functions. Indag. Math., Vol. 15, 114-119.
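
    As a worked illustration of the estimation problem discussed in this record, the sketch below maximizes the Cauchy location likelihood numerically. The sample and the known scale are hypothetical, and the grid scan guards against the multimodality of the Cauchy likelihood.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def cauchy_neg_loglik(theta, x, scale=1.0):
    """Negative log-likelihood of a Cauchy(theta, scale) sample."""
    z = (x - theta) / scale
    return np.sum(np.log(np.pi * scale * (1.0 + z**2)))

# Hypothetical sample; the Cauchy likelihood can be multimodal, so scan a grid first.
x = np.array([-1.2, 0.3, 0.5, 0.8, 14.0])
grid = np.linspace(x.min(), x.max(), 400)
theta0 = grid[np.argmin([cauchy_neg_loglik(t, x) for t in grid])]

# Refine the grid minimum with a bounded one-dimensional optimizer.
res = minimize_scalar(cauchy_neg_loglik, args=(x,),
                      bounds=(theta0 - 1.0, theta0 + 1.0), method="bounded")
print("Maximum likelihood estimate of the location:", res.x)
```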

  7. GENERALIZATION OF RAYLEIGH MAXIMUM LIKELIHOOD DESPECKLING FILTER USING QUADRILATERAL KERNELS

    Directory of Open Access Journals (Sweden)

    S. Sridevi

    2013-02-01

    Full Text Available Speckle noise is the most prevalent noise in clinical ultrasound images. It appears as light and dark spots and degrades the pixel intensities. In fetal ultrasound images, edges and local fine details are especially important for obstetricians and gynaecologists carrying out prenatal diagnosis of congenital heart disease. A robust despeckling filter therefore has to be contrived to suppress speckle noise efficiently while preserving these features. The proposed filter generalizes the Rayleigh maximum likelihood filter by exploiting statistical tools as tuning parameters and uses differently shaped quadrilateral kernels to estimate the noise-free pixel from its neighbourhood. The performance of several filters, namely the Median, Kuwahara, Frost, Homogeneous mask and Rayleigh maximum likelihood filters, is compared with that of the proposed filter in terms of PSNR and image profile. The proposed filter surpasses the conventional filters.
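
    A minimal sketch of the underlying Rayleigh maximum likelihood estimate, applied here over a plain square sliding window; the paper's quadrilateral kernels and statistical tuning parameters are not reproduced, and the test image is synthetic.

```python
import numpy as np

def rayleigh_ml_despeckle(img, win=5):
    """Estimate each pixel as the Rayleigh ML scale parameter of its neighbourhood.
    For Rayleigh-distributed samples x_1..x_N the ML estimate is
    sigma_hat = sqrt(sum(x_i^2) / (2N)). Square windows are used here instead of
    the paper's quadrilateral kernels (a simplification)."""
    pad = win // 2
    padded = np.pad(img.astype(float), pad, mode="reflect")
    out = np.empty_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            patch = padded[i:i + win, j:j + win]
            out[i, j] = np.sqrt(np.mean(patch**2) / 2.0)
    return out

# Usage on a hypothetical speckled image:
noisy = np.random.default_rng(1).rayleigh(scale=50.0, size=(64, 64))
restored = rayleigh_ml_despeckle(noisy, win=5)
```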

  8. Planck 2013 results. XV. CMB power spectra and likelihood

    CERN Document Server

    Ade, P.A.R.; Armitage-Caplan, C.; Arnaud, M.; Ashdown, M.; Atrio-Barandela, F.; Aumont, J.; Baccigalupi, C.; Banday, A.J.; Barreiro, R.B.; Bartlett, J.G.; Battaner, E.; Benabed, K.; Benoit, A.; Benoit-Levy, A.; Bernard, J.P.; Bersanelli, M.; Bielewicz, P.; Bobin, J.; Bock, J.J.; Bonaldi, A.; Bonavera, L.; Bond, J.R.; Borrill, J.; Bouchet, F.R.; Boulanger, F.; Bridges, M.; Bucher, M.; Burigana, C.; Butler, R.C.; Calabrese, E.; Cardoso, J.F.; Catalano, A.; Challinor, A.; Chamballu, A.; Chiang, L.Y.; Chiang, H.C.; Christensen, P.R.; Church, S.; Clements, D.L.; Colombi, S.; Colombo, L.P.L.; Combet, C.; Couchot, F.; Coulais, A.; Crill, B.P.; Curto, A.; Cuttaia, F.; Danese, L.; Davies, R.D.; Davis, R.J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Delouis, J.M.; Desert, F.X.; Dickinson, C.; Diego, J.M.; Dole, H.; Donzelli, S.; Dore, O.; Douspis, M.; Dunkley, J.; Dupac, X.; Efstathiou, G.; Elsner, F.; Ensslin, T.A.; Eriksen, H.K.; Finelli, F.; Forni, O.; Frailis, M.; Fraisse, A.A.; Franceschi, E.; Gaier, T.C.; Galeotta, S.; Galli, S.; Ganga, K.; Giard, M.; Giardino, G.; Giraud-Heraud, Y.; Gjerlow, E.; Gonzalez-Nuevo, J.; Gorski, K.M.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Gudmundsson, J.E.; Hansen, F.K.; Hanson, D.; Harrison, D.; Helou, G.; Henrot-Versille, S.; Hernandez-Monteagudo, C.; Herranz, D.; Hildebrandt, S.R.; Hivon, E.; Hobson, M.; Holmes, W.A.; Hornstrup, A.; Hovest, W.; Huffenberger, K.M.; Hurier, G.; Jaffe, T.R.; Jaffe, A.H.; Jewell, J.; Jones, W.C.; Juvela, M.; Keihanen, E.; Keskitalo, R.; Kiiveri, K.; Kisner, T.S.; Kneissl, R.; Knoche, J.; Knox, L.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lahteenmaki, A.; Lamarre, J.M.; Lasenby, A.; Lattanzi, M.; Laureijs, R.J.; Lawrence, C.R.; Le Jeune, M.; Leach, S.; Leahy, J.P.; Leonardi, R.; Leon-Tavares, J.; Lesgourgues, J.; Liguori, M.; Lilje, P.B.; Lindholm, V.; Linden-Vornle, M.; Lopez-Caniego, M.; Lubin, P.M.; Macias-Perez, J.F.; Maffei, B.; Maino, D.; Mandolesi, N.; Marinucci, D.; Maris, M.; Marshall, D.J.; Martin, P.G.; Martinez-Gonzalez, E.; Masi, S.; Matarrese, S.; Matthai, F.; Mazzotta, P.; Meinhold, P.R.; Melchiorri, A.; Mendes, L.; Menegoni, E.; Mennella, A.; Migliaccio, M.; Millea, M.; Mitra, S.; Miville-Deschenes, M.A.; Molinari, D.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Moss, A.; Munshi, D.; Naselsky, P.; Nati, F.; Natoli, P.; Netterfield, C.B.; Norgaard-Nielsen, H.U.; Noviello, F.; Novikov, D.; Novikov, I.; O'Dwyer, I.J.; Orieux, F.; Osborne, S.; Oxborrow, C.A.; Paci, F.; Pagano, L.; Pajot, F.; Paladini, R.; Paoletti, D.; Partridge, B.; Pasian, F.; Patanchon, G.; Paykari, P.; Perdereau, O.; Perotto, L.; Perrotta, F.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Ponthieu, N.; Popa, L.; Poutanen, T.; Pratt, G.W.; Prezeau, G.; Prunet, S.; Puget, J.L.; Rachen, J.P.; Rahlin, A.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Ricciardi, S.; Riller, T.; Ringeval, C.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Roudier, G.; Rowan-Robinson, M.; Rubino-Martin, J.A.; Rusholme, B.; Sandri, M.; Sanselme, L.; Santos, D.; Savini, G.; Scott, D.; Seiffert, M.D.; Shellard, E.P.S.; Spencer, L.D.; Starck, J.L.; Stolyarov, V.; Stompor, R.; Sudiwala, R.; Sureau, F.; Sutton, D.; Suur-Uski, A.S.; Sygnet, J.F.; Tauber, J.A.; Tavagnacco, D.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Tucci, M.; Tuovinen, J.; Turler, M.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Varis, J.; Vielva, P.; Villa, F.; Vittorio, N.; Wade, L.A.; Wandelt, B.D.; 
Wehus, I.K.; White, M.; White, S.D.M.; Yvon, D.; Zacchei, A.; Zonca, A.

    2014-01-01

    We present the Planck likelihood, a complete statistical description of the two-point correlation function of the CMB temperature fluctuations. We use this likelihood to derive the Planck CMB power spectrum over three decades in l, covering 2 <= l <= 2500. For l >= 50, we employ a correlated Gaussian likelihood approximation based on angular cross-spectra derived from the 100, 143 and 217 GHz channels. We validate our likelihood through an extensive suite of consistency tests, and assess the impact of residual foreground and instrumental uncertainties on cosmological parameters. We find good internal agreement among the high-l cross-spectra with residuals of a few uK^2 at l <= 1000. We compare our results with foreground-cleaned CMB maps, and with cross-spectra derived from the 70 GHz Planck map, and find broad agreement in terms of spectrum residuals and cosmological parameters. The best-fit LCDM cosmology is in excellent agreement with preliminary Planck polarisation spectra. The standard LCDM cosmology is well constrained b...

  9. Likelihood inference for unions of interacting discs

    DEFF Research Database (Denmark)

    Møller, Jesper; Helisová, Katarina

    To the best of our knowledge, this is the first paper which discusses likelihood inference for a random set using a germ-grain model, where the individual grains are unobservable, edge effects occur, and other complications appear. We consider the case where the grains form a disc process modelled by a marked point process, which is specified with respect to a given marked Poisson model (i.e. a Boolean model). We show how edge effects and other complications can be handled by considering a certain conditional likelihood. Our methodology is illustrated by analyzing Peter Diggle's heather dataset, where we discuss the results of simulation-based maximum likelihood inference and the effect of specifying different reference Poisson models.

  10. Likelihood alarm displays. [for human operator

    Science.gov (United States)

    Sorkin, Robert D.; Kantowitz, Barry H.; Kantowitz, Susan C.

    1988-01-01

    In a likelihood alarm display (LAD) information about event likelihood is computed by an automated monitoring system and encoded into an alerting signal for the human operator. Operator performance within a dual-task paradigm was evaluated with two LADs: a color-coded visual alarm and a linguistically coded synthetic speech alarm. The operator's primary task was one of tracking; the secondary task was to monitor a four-element numerical display and determine whether the data arose from a 'signal' or 'no-signal' condition. A simulated 'intelligent' monitoring system alerted the operator to the likelihood of a signal. The results indicated that (1) automated monitoring systems can improve performance on primary and secondary tasks; (2) LADs can improve the allocation of attention among tasks and provide information integrated into operator decisions; and (3) LADs do not necessarily add to the operator's attentional load.

  11. A quantum framework for likelihood ratios

    CERN Document Server

    Bond, Rachael L; Ormerod, Thomas C

    2015-01-01

    The ability to calculate precise likelihood ratios is fundamental to many STEM areas, such as decision-making theory, biomedical science, and engineering. However, there is no assumption-free statistical methodology to achieve this. For instance, in the absence of data relating to covariate overlap, the widely used Bayes' theorem either defaults to the marginal probability driven "naive Bayes' classifier", or requires the use of compensatory expectation-maximization techniques. Equally, the use of alternative statistical approaches, such as multivariate logistic regression, may be confounded by other axiomatic conditions, e.g., low levels of co-linearity. This article takes an information-theoretic approach in developing a new statistical formula for the calculation of likelihood ratios based on the principles of quantum entanglement. In doing so, it is argued that this quantum approach demonstrates: that the likelihood ratio is a real quality of statistical systems; that the naive Bayes' classifier is a spec...

  12. CORA: Emission Line Fitting with Maximum Likelihood

    Science.gov (United States)

    Ness, Jan-Uwe; Wichmann, Rainer

    2011-12-01

    CORA analyzes emission line spectra with low count numbers and fits them to a line using the maximum likelihood technique. CORA uses a rigorous application of Poisson statistics. From the assumption of Poissonian noise, the software derives the probability for a model of the emission line spectrum to represent the measured spectrum. The likelihood function is used as a criterion for optimizing the parameters of the theoretical spectrum and a fixed point equation is derived allowing an efficient way to obtain line fluxes. CORA has been applied to an X-ray spectrum with the Low Energy Transmission Grating Spectrometer (LETGS) on board the Chandra observatory.
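
    A minimal sketch of the same idea, assuming Poisson noise in each spectral bin and a Gaussian line plus flat background model; the wavelength grid, parameter values and spectrum are hypothetical, and this is not the CORA code itself.

```python
import numpy as np
from scipy.optimize import minimize

def poisson_nll(params, wav, counts):
    """Negative log-likelihood of observed counts under a Gaussian emission line
    plus flat background, assuming Poisson noise in each bin (constants dropped)."""
    flux, center, sigma, bkg = params
    model = bkg + flux * np.exp(-0.5 * ((wav - center) / sigma) ** 2)
    model = np.clip(model, 1e-12, None)          # keep the log well defined
    return np.sum(model - counts * np.log(model))

# Hypothetical low-count spectrum around 13.5 A (illustrative numbers only).
rng = np.random.default_rng(2)
wav = np.linspace(13.3, 13.7, 80)
truth = 12.0 * np.exp(-0.5 * ((wav - 13.50) / 0.02) ** 2) + 0.5
counts = rng.poisson(truth)

res = minimize(poisson_nll, x0=[10.0, 13.49, 0.03, 1.0],
               args=(wav, counts), method="Nelder-Mead")
print("Fitted line amplitude, centre, width, background:", res.x)
```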

  13. Clinical and molecular features of high-grade osteosarcoma

    NARCIS (Netherlands)

    Anninga, Jakob Klaas

    2013-01-01

    It can be concluded from this thesis that high-grade osteosarcoma is a heterogeneous disease at the clinical, pathological and molecular levels. To treat high-grade osteosarcoma, neo-adjuvant chemotherapy should be combined with radical surgery, irrespective of the localization. There are only 4 effective c

  14. Clinical and molecular features of high-grade osteosarcoma

    NARCIS (Netherlands)

    Anninga, Jakob Klaas

    2013-01-01

    It can be concluded from this thesis that high-grade osteosarcoma is a heterogeneous disease at the clinical, pathological and molecular levels. To treat high-grade osteosarcoma, neo-adjuvant chemotherapy should be combined with radical surgery, irrespective of the localization. There are only 4 effective

  15. Maintaining symmetry of simulated likelihood functions

    DEFF Research Database (Denmark)

    Andersen, Laura Mørch

    This paper suggests solutions to two different types of simulation errors related to Quasi-Monte Carlo integration. Likelihood functions which depend on standard deviations of mixed parameters are symmetric in nature. This paper shows that antithetic draws preserve this symmetry and thereby...

  16. Likelihood analysis of the I(2) model

    DEFF Research Database (Denmark)

    Johansen, Søren

    1997-01-01

    The I(2) model is defined as a submodel of the general vector autoregressive model, by two reduced rank conditions. The model describes stochastic processes with stationary second difference. A parametrization is suggested which makes likelihood inference feasible. Consistency of the maximum like...

  17. Synthesizing Regression Results: A Factored Likelihood Method

    Science.gov (United States)

    Wu, Meng-Jia; Becker, Betsy Jane

    2013-01-01

    Regression methods are widely used by researchers in many fields, yet methods for synthesizing regression results are scarce. This study proposes using a factored likelihood method, originally developed to handle missing data, to appropriately synthesize regression models involving different predictors. This method uses the correlations reported…

  18. Maximum Likelihood Estimation of Search Costs

    NARCIS (Netherlands)

    J.L. Moraga-Gonzalez (José Luis); M.R. Wildenbeest (Matthijs)

    2006-01-01

    textabstractIn a recent paper Hong and Shum (forthcoming) present a structural methodology to estimate search cost distributions. We extend their approach to the case of oligopoly and present a maximum likelihood estimate of the search cost distribution. We apply our method to a data set of online p

  19. Maintaining symmetry of simulated likelihood functions

    DEFF Research Database (Denmark)

    Andersen, Laura Mørch

    This paper suggests solutions to two different types of simulation errors related to Quasi-Monte Carlo integration. Likelihood functions which depend on standard deviations of mixed parameters are symmetric in nature. This paper shows that antithetic draws preserve this symmetry and thereby...

  20. Likelihood based testing for no fractional cointegration

    DEFF Research Database (Denmark)

    Lasak, Katarzyna

    We consider two likelihood ratio tests, so-called maximum eigenvalue and trace tests, for the null of no cointegration when fractional cointegration is allowed under the alternative, which is a first step to generalize the so-called Johansen's procedure to the fractional cointegration case. The s...

  1. Maximum likelihood estimation of fractionally cointegrated systems

    DEFF Research Database (Denmark)

    Lasak, Katarzyna

    In this paper we consider a fractionally cointegrated error correction model and investigate asymptotic properties of the maximum likelihood (ML) estimators of the matrix of the cointegration relations, the degree of fractional cointegration, the matrix of the speed of adjustment...

  2. Maximum likelihood estimation for integrated diffusion processes

    DEFF Research Database (Denmark)

    Baltazar-Larios, Fernando; Sørensen, Michael

    An EM-algorithm is used to obtain maximum likelihood estimates of the parameters in the diffusion model. As part of the algorithm, we use a recent simple method for approximate simulation of diffusion bridges. In simulation studies for the Ornstein-Uhlenbeck process and the CIR process the proposed method works...

  3. A clinical gamma camera-based pinhole collimated system for high resolution small animal SPECT imaging

    Directory of Open Access Journals (Sweden)

    J. Mejia

    2010-12-01

    Full Text Available The main objective of the present study was to upgrade a clinical gamma camera to obtain high resolution tomographic images of small animal organs. The system is based on a clinical gamma camera to which we have adapted a special-purpose pinhole collimator and a device for positioning and rotating the target based on a computer-controlled step motor. We developed a software tool to reconstruct the target’s three-dimensional distribution of emission from a set of planar projections, based on the maximum likelihood algorithm. We present details on the hardware and software implementation. We imaged phantoms and heart and kidneys of rats. When using pinhole collimators, the spatial resolution and sensitivity of the imaging system depend on parameters such as the detector-to-collimator and detector-to-target distances and pinhole diameter. In this study, we reached an object voxel size of 0.6 mm and spatial resolution better than 2.4 and 1.7 mm full width at half maximum when 1.5- and 1.0-mm diameter pinholes were used, respectively. Appropriate sensitivity to study the target of interest was attained in both cases. Additionally, we show that as few as 12 projections are sufficient to attain good quality reconstructions, a result that implies a significant reduction of acquisition time and opens the possibility for radiotracer dynamic studies. In conclusion, a high resolution single photon emission computed tomography (SPECT system was developed using a commercial clinical gamma camera, allowing the acquisition of detailed volumetric images of small animal organs. This type of system has important implications for research areas such as Cardiology, Neurology or Oncology.
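
    A minimal sketch of the maximum likelihood (MLEM) reconstruction step referred to above, assuming a known system matrix that maps voxel activities to expected projection counts; the matrix, geometry and data here are hypothetical toys, not the authors' software.

```python
import numpy as np

def mlem(system_matrix, projections, n_iter=20):
    """Basic MLEM reconstruction: x_{k+1} = x_k / (A^T 1) * A^T (y / (A x_k)).
    `system_matrix` (A) maps voxel activities to expected projection counts."""
    A = system_matrix
    x = np.ones(A.shape[1])                      # uniform initial estimate
    sensitivity = np.clip(A.T @ np.ones(A.shape[0]), 1e-12, None)   # A^T 1
    for _ in range(n_iter):
        expected = np.clip(A @ x, 1e-12, None)
        x *= (A.T @ (projections / expected)) / sensitivity
    return x

# Toy example with a random (hypothetical) system matrix.
rng = np.random.default_rng(3)
A = rng.uniform(size=(240, 100))                 # e.g. 12 projections x 20 bins, 100 voxels
true_activity = rng.uniform(size=100)
y = rng.poisson(A @ true_activity)
recon = mlem(A, y, n_iter=50)
```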

  4. A clinical gamma camera-based pinhole collimated system for high resolution small animal SPECT imaging

    Energy Technology Data Exchange (ETDEWEB)

    Mejia, J.; Galvis-Alonso, O.Y., E-mail: mejia_famerp@yahoo.com.b [Faculdade de Medicina de Sao Jose do Rio Preto (FAMERP), SP (Brazil). Dept. de Biologia Molecular; Castro, A.A. de; Simoes, M.V. [Faculdade de Medicina de Sao Jose do Rio Preto (FAMERP), SP (Brazil). Dept. de Clinica Medica; Leite, J.P. [Universidade de Sao Paulo (FMRP/USP), Ribeirao Preto, SP (Brazil). Fac. de Medicina. Dept. de Neurociencias e Ciencias do Comportamento; Braga, J. [Instituto Nacional de Pesquisas Espaciais (INPE), Sao Jose dos Campos, SP (Brazil). Div. de Astrofisica

    2010-11-15

    The main objective of the present study was to upgrade a clinical gamma camera to obtain high resolution tomographic images of small animal organs. The system is based on a clinical gamma camera to which we have adapted a special-purpose pinhole collimator and a device for positioning and rotating the target based on a computer-controlled step motor. We developed a software tool to reconstruct the target's three-dimensional distribution of emission from a set of planar projections, based on the maximum likelihood algorithm. We present details on the hardware and software implementation. We imaged phantoms and heart and kidneys of rats. When using pinhole collimators, the spatial resolution and sensitivity of the imaging system depend on parameters such as the detector-to-collimator and detector-to-target distances and pinhole diameter. In this study, we reached an object voxel size of 0.6 mm and spatial resolution better than 2.4 and 1.7 mm full width at half maximum when 1.5- and 1.0-mm diameter pinholes were used, respectively. Appropriate sensitivity to study the target of interest was attained in both cases. Additionally, we show that as few as 12 projections are sufficient to attain good quality reconstructions, a result that implies a significant reduction of acquisition time and opens the possibility for radiotracer dynamic studies. In conclusion, a high resolution single photon emission computed tomography (SPECT) system was developed using a commercial clinical gamma camera, allowing the acquisition of detailed volumetric images of small animal organs. This type of system has important implications for research areas such as Cardiology, Neurology or Oncology. (author)

  5. Parallel Likelihood Function Evaluation on Heterogeneous Many-core Systems

    CERN Document Server

    Jarp, Sverre; Leduc, Julien; Nowak, Andrzej; Sneen Lindal, Yngve

    2011-01-01

    This paper describes a parallel implementation that allows the evaluations of the likelihood function for data analysis methods to run cooperatively on heterogeneous computational devices (i.e. CPU and GPU) belonging to a single computational node. The implementation is able to split and balance the workload needed for the evaluation of the likelihood function in corresponding sub-workloads to be executed in parallel on each computational device. The CPU parallelization is implemented using OpenMP, while the GPU implementation is based on OpenCL. The comparison of the performance of these implementations for different configurations and different hardware systems are reported. Tests are based on a real data analysis carried out in the high energy physics community.
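
    As a rough illustration of splitting a likelihood evaluation into sub-workloads, the sketch below distributes a negative log-likelihood sum over CPU worker processes with Python's multiprocessing; it is not the authors' OpenMP/OpenCL implementation, and the Gaussian event model is a hypothetical stand-in.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def chunk_nll(args):
    """Negative log-likelihood contribution of one chunk of events under a
    hypothetical Gaussian model; each worker handles one sub-workload."""
    data_chunk, mu, sigma = args
    return 0.5 * np.sum(((data_chunk - mu) / sigma) ** 2) + data_chunk.size * np.log(sigma)

def parallel_nll(data, mu, sigma, n_workers=4):
    chunks = np.array_split(data, n_workers)
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        partial = pool.map(chunk_nll, [(c, mu, sigma) for c in chunks])
    return sum(partial)

if __name__ == "__main__":
    events = np.random.default_rng(4).normal(1.0, 2.0, size=1_000_000)
    print(parallel_nll(events, mu=1.0, sigma=2.0))
```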

  6. $\ell_0$-penalized maximum likelihood for sparse directed acyclic graphs

    CERN Document Server

    van de Geer, Sara

    2012-01-01

    We consider the problem of regularized maximum likelihood estimation for the structure and parameters of a high-dimensional, sparse directed acyclic graphical (DAG) model with Gaussian distribution, or equivalently, of a Gaussian structural equation model. We show that the $\ell_0$-penalized maximum likelihood estimator of a DAG has about the same number of edges as the minimal-edge I-MAP (a DAG with minimal number of edges representing the distribution), and that it converges in Frobenius norm. We allow the number of nodes $p$ to be much larger than sample size $n$ but assume a sparsity condition and that any representation of the true DAG has at least a fixed proportion of its non-zero edge weights above the noise level. Our results do not rely on the restrictive strong faithfulness condition which is required for methods based on conditional independence testing such as the PC-algorithm.

  7. A model independent safeguard for unbinned Profile Likelihood

    CERN Document Server

    Priel, Nadav; Landsman, Hagar; Manfredini, Alessandro; Budnik, Ranny

    2016-01-01

    We present a general method to include residual un-modeled background shape uncertainties in profile likelihood based statistical tests for high energy physics and astroparticle physics counting experiments. This approach provides a simple and natural protection against undercoverage, thus lowering the chances of a false discovery or of an over constrained confidence interval, and allows a natural transition to unbinned space. Unbinned likelihood enhances the sensitivity and allows optimal usage of information for the data and the models. We show that the asymptotic behavior of the test statistic can be regained in cases where the model fails to describe the true background behavior, and present 1D and 2D case studies for model-driven and data-driven background models. The resulting penalty on sensitivities follows the actual discrepancy between the data and the models, and is asymptotically reduced to zero with increasing knowledge.

  8. Semidefinite Programming for Approximate Maximum Likelihood Sinusoidal Parameter Estimation

    Directory of Open Access Journals (Sweden)

    Kenneth W. K. Lui

    2009-01-01

    Full Text Available We study the convex optimization approach for parameter estimation of several sinusoidal models, namely, single complex/real tone, multiple complex sinusoids, and single two-dimensional complex tone, in the presence of additive Gaussian noise. The major difficulty for optimally determining the parameters is that the corresponding maximum likelihood (ML estimators involve finding the global minimum or maximum of multimodal cost functions because the frequencies are nonlinear in the observed signals. By relaxing the nonconvex ML formulations using semidefinite programs, high-fidelity approximate solutions are obtained in a globally optimum fashion. Computer simulations are included to contrast the estimation performance of the proposed semi-definite relaxation methods with the iterative quadratic maximum likelihood technique as well as Cramér-Rao lower bound.

  9. Semidefinite Programming for Approximate Maximum Likelihood Sinusoidal Parameter Estimation

    Science.gov (United States)

    Lui, Kenneth W. K.; So, H. C.

    2009-12-01

    We study the convex optimization approach for parameter estimation of several sinusoidal models, namely, single complex/real tone, multiple complex sinusoids, and single two-dimensional complex tone, in the presence of additive Gaussian noise. The major difficulty for optimally determining the parameters is that the corresponding maximum likelihood (ML) estimators involve finding the global minimum or maximum of multimodal cost functions because the frequencies are nonlinear in the observed signals. By relaxing the nonconvex ML formulations using semidefinite programs, high-fidelity approximate solutions are obtained in a globally optimum fashion. Computer simulations are included to contrast the estimation performance of the proposed semi-definite relaxation methods with the iterative quadratic maximum likelihood technique as well as Cramér-Rao lower bound.

  10. Model Selection Through Sparse Maximum Likelihood Estimation

    CERN Document Server

    Banerjee, Onureena; D'Aspremont, Alexandre

    2007-01-01

    We consider the problem of estimating the parameters of a Gaussian or binary distribution in such a way that the resulting undirected graphical model is sparse. Our approach is to solve a maximum likelihood problem with an added l_1-norm penalty term. The problem as formulated is convex but the memory requirements and complexity of existing interior point methods are prohibitive for problems with more than tens of nodes. We present two new algorithms for solving problems with at least a thousand nodes in the Gaussian case. Our first algorithm uses block coordinate descent, and can be interpreted as recursive l_1-norm penalized regression. Our second algorithm, based on Nesterov's first order method, yields a complexity estimate with a better dependence on problem size than existing interior point methods. Using a log determinant relaxation of the log partition function (Wainwright & Jordan (2006)), we show that these same algorithms can be used to solve an approximate sparse maximum likelihood problem for...
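
    The l1-penalized Gaussian maximum likelihood problem described here is also what scikit-learn's GraphicalLasso solves; the sketch below uses that off-the-shelf solver rather than the authors' block coordinate descent or Nesterov-based algorithms, on hypothetical data and for the Gaussian case only.

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

# Hypothetical data: 200 samples of a 30-variable Gaussian.
rng = np.random.default_rng(5)
X = rng.normal(size=(200, 30))

model = GraphicalLasso(alpha=0.1)       # alpha is the l1 penalty weight
model.fit(X)
precision = model.precision_            # estimated sparse inverse covariance
print("non-zero off-diagonal entries:",
      int((np.abs(precision) > 1e-8).sum() - precision.shape[0]))
```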

  11. Regions of constrained maximum likelihood parameter identifiability

    Science.gov (United States)

    Lee, C.-H.; Herget, C. J.

    1975-01-01

    This paper considers the parameter identification problem of general discrete-time, nonlinear, multiple-input/multiple-output dynamic systems with Gaussian-white distributed measurement errors. Knowledge of the system parameterization is assumed to be known. Regions of constrained maximum likelihood (CML) parameter identifiability are established. A computation procedure employing interval arithmetic is proposed for finding explicit regions of parameter identifiability for the case of linear systems. It is shown that if the vector of true parameters is locally CML identifiable, then with probability one, the vector of true parameters is a unique maximal point of the maximum likelihood function in the region of parameter identifiability and the CML estimation sequence will converge to the true parameters.

  12. Composite likelihood method for inferring local pedigrees

    Science.gov (United States)

    Nielsen, Rasmus

    2017-01-01

    Pedigrees contain information about the genealogical relationships among individuals and are of fundamental importance in many areas of genetic studies. However, pedigrees are often unknown and must be inferred from genetic data. Despite the importance of pedigree inference, existing methods are limited to inferring only close relationships or analyzing a small number of individuals or loci. We present a simulated annealing method for estimating pedigrees in large samples of otherwise seemingly unrelated individuals using genome-wide SNP data. The method supports complex pedigree structures such as polygamous families, multi-generational families, and pedigrees in which many of the member individuals are missing. Computational speed is greatly enhanced by the use of a composite likelihood function which approximates the full likelihood. We validate our method on simulated data and show that it can infer distant relatives more accurately than existing methods. Furthermore, we illustrate the utility of the method on a sample of Greenlandic Inuit. PMID:28827797

  13. Human transcriptome array for high-throughput clinical studies.

    Science.gov (United States)

    Xu, Weihong; Seok, Junhee; Mindrinos, Michael N; Schweitzer, Anthony C; Jiang, Hui; Wilhelmy, Julie; Clark, Tyson A; Kapur, Karen; Xing, Yi; Faham, Malek; Storey, John D; Moldawer, Lyle L; Maier, Ronald V; Tompkins, Ronald G; Wong, Wing Hung; Davis, Ronald W; Xiao, Wenzhong

    2011-03-01

    A 6.9 million-feature oligonucleotide array of the human transcriptome [Glue Grant human transcriptome (GG-H array)] has been developed for high-throughput and cost-effective analyses in clinical studies. This array allows comprehensive examination of gene expression and genome-wide identification of alternative splicing as well as detection of coding SNPs and noncoding transcripts. The performance of the array was examined and compared with mRNA sequencing (RNA-Seq) results over multiple independent replicates of liver and muscle samples. Compared with RNA-Seq of 46 million uniquely mappable reads per replicate, the GG-H array is highly reproducible in estimating gene and exon abundance. Although both platforms detect similar expression changes at the gene level, the GG-H array is more sensitive at the exon level. Deeper sequencing is required to adequately cover low-abundance transcripts. The array has been implemented in a multicenter clinical program and has generated high-quality, reproducible data. Considering the clinical trial requirements of cost, sample availability, and throughput, the GG-H array has a wide range of applications. An emerging approach for large-scale clinical genomic studies is to first use RNA-Seq to the sufficient depth for the discovery of transcriptome elements relevant to the disease process followed by high-throughput and reliable screening of these elements on thousands of patient samples using custom-designed arrays.

  14. Lessons about likelihood functions from nuclear physics

    CERN Document Server

    Hanson, Kenneth M

    2007-01-01

    Least-squares data analysis is based on the assumption that the normal (Gaussian) distribution appropriately characterizes the likelihood, that is, the conditional probability of each measurement d, given a measured quantity y, p(d | y). On the other hand, there is ample evidence in nuclear physics of significant disagreements among measurements, which are inconsistent with the normal distribution, given their stated uncertainties. In this study the histories of 99 measurements of the lifetimes of five elementary particles are examined to determine what can be inferred about the distribution of their values relative to their stated uncertainties. Taken as a whole, the variations in the data are somewhat larger than their quoted uncertainties would indicate. These data strongly support using a Student t distribution for the likelihood function instead of a normal. The most probable value for the order of the t distribution is 2.6 +/- 0.9. It is shown that analyses based on long-tailed t-distribution likelihood...
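
    To make the comparison concrete, the sketch below fits a common mean to a handful of measurements with stated uncertainties under a Gaussian likelihood and under a Student t likelihood of order 2.6, the value favoured in the study; the data are invented for illustration.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import t as student_t, norm

# Hypothetical measurements of one quantity with stated 1-sigma uncertainties.
y = np.array([10.1, 9.8, 10.3, 11.6, 9.9])
sig = np.array([0.2, 0.3, 0.2, 0.2, 0.3])

def nll_normal(mu):
    return -np.sum(norm.logpdf(y, loc=mu, scale=sig))

def nll_student(mu, df=2.6):          # order ~2.6, as suggested by the study
    return -np.sum(student_t.logpdf((y - mu) / sig, df) - np.log(sig))

mu_normal = minimize_scalar(nll_normal, bounds=(8, 13), method="bounded").x
mu_student = minimize_scalar(nll_student, bounds=(8, 13), method="bounded").x
# The long-tailed likelihood downweights the discrepant measurement (11.6).
print(f"Gaussian mean: {mu_normal:.3f}, Student-t mean: {mu_student:.3f}")
```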

  15. Maximum likelihood continuity mapping for fraud detection

    Energy Technology Data Exchange (ETDEWEB)

    Hogden, J.

    1997-05-01

    The author describes a novel time-series analysis technique called maximum likelihood continuity mapping (MALCOM), and focuses on one application of MALCOM: detecting fraud in medical insurance claims. Given a training data set composed of typical sequences, MALCOM creates a stochastic model of sequence generation, called a continuity map (CM). A CM maximizes the probability of sequences in the training set given the model constraints; CMs can be used to estimate the likelihood of sequences not found in the training set, enabling anomaly detection and sequence prediction--important aspects of data mining. Since MALCOM can be used on sequences of categorical data (e.g., sequences of words) as well as real-valued data, MALCOM is also a potential replacement for database search tools such as N-gram analysis. In a recent experiment, MALCOM was used to evaluate the likelihood of patient medical histories, where "medical history" means the sequence of medical procedures performed on a patient. Physicians whose patients had anomalous medical histories (according to MALCOM) were evaluated for fraud by an independent agency. Of the small sample (12 physicians) that has been evaluated, 92% have been determined fraudulent or abusive. Despite the small sample, these results are encouraging.

  16. Likelihood methods and classical burster repetition

    CERN Document Server

    Graziani, C; Graziani, Carlo; Lamb, Donald Q

    1995-01-01

    We develop a likelihood methodology which can be used to search for evidence of burst repetition in the BATSE catalog, and to study the properties of the repetition signal. We use a simplified model of burst repetition in which a number $N_{\rm r}$ of sources which repeat a fixed number of times $N_{\rm rep}$ are superposed upon a number $N_{\rm nr}$ of non-repeating sources. The instrument exposure is explicitly taken into account. By computing the likelihood for the data, we construct a probability distribution in parameter space that may be used to infer the probability that a repetition signal is present, and to estimate the values of the repetition parameters. The likelihood function contains contributions from all the bursts, irrespective of the size of their positional errors --- the more uncertain a burst's position is, the less constraining is its contribution. Thus this approach makes maximal use of the data, and avoids the ambiguities of sample selection associated with data cuts on error circle size. We...

  17. Database likelihood ratios and familial DNA searching

    CERN Document Server

    Slooten, Klaas

    2012-01-01

    Familial Searching is the process of searching in a DNA database for relatives of a given individual. It is well known that in order to evaluate the genetic evidence in favour of a certain given form of relatedness between two individuals, one needs to calculate the appropriate likelihood ratio, which is in this context called a Kinship Index. Suppose that the database contains, for a given type of relative, at most one related individual. Given prior probabilities of being the relative for all persons in the database, we derive the likelihood ratio for each database member in favour of being that relative. This likelihood ratio takes all the Kinship Indices between target and members of the database into account. We also compute the corresponding posterior probabilities. We then discuss two ways of selecting a subset from the database that contains the relative with a known probability, or at least a useful lower bound thereof. We discuss the relation between these approaches and illustrate them with Familia...
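
    A minimal sketch of one way such posterior probabilities can be computed from kinship indices (likelihood ratios) and prior probabilities, under the abstract's assumption that the database contains at most one relative of the given type; the numbers are hypothetical and the formula shown is a simple version, not necessarily the authors' exact derivation.

```python
import numpy as np

def familial_posteriors(kinship_index, prior):
    """Posterior probability that each database member is the (single) relative,
    given per-member likelihood ratios (kinship indices) and prior probabilities.
    Assumes at most one relative is present in the database."""
    kinship_index = np.asarray(kinship_index, dtype=float)
    prior = np.asarray(prior, dtype=float)
    weighted = prior * kinship_index
    p_no_relative = 1.0 - prior.sum()            # prior mass on "relative not in database"
    return weighted / (weighted.sum() + p_no_relative)

# Hypothetical example: three database members with equal priors of 0.1 each.
post = familial_posteriors(kinship_index=[250.0, 3.0, 0.4], prior=[0.1, 0.1, 0.1])
print(post)                                      # largest posterior for the first member
```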

  18. A clinical model to identify patients with high-risk coronary artery disease.

    Science.gov (United States)

    Yang, Yelin; Chen, Li; Yam, Yeung; Achenbach, Stephan; Al-Mallah, Mouaz; Berman, Daniel S; Budoff, Matthew J; Cademartiri, Filippo; Callister, Tracy Q; Chang, Hyuk-Jae; Cheng, Victor Y; Chinnaiyan, Kavitha; Cury, Ricardo; Delago, Augustin; Dunning, Allison; Feuchtner, Gudrun; Hadamitzky, Martin; Hausleiter, Jörg; Karlsberg, Ronald P; Kaufmann, Philipp A; Kim, Yong-Jin; Leipsic, Jonathon; LaBounty, Troy; Lin, Fay; Maffei, Erica; Raff, Gilbert L; Shaw, Leslee J; Villines, Todd C; Min, James K; Chow, Benjamin J W

    2015-04-01

    This study sought to develop a clinical model that identifies patients with and without high-risk coronary artery disease (CAD). Although current clinical models help to estimate a patient's pre-test probability of obstructive CAD, they do not accurately identify those patients with and without high-risk coronary anatomy. Retrospective analysis of a prospectively collected multinational coronary computed tomographic angiography (CTA) cohort was conducted. High-risk anatomy was defined as left main diameter stenosis ≥50%, 3-vessel disease with diameter stenosis ≥70%, or 2-vessel disease involving the proximal left anterior descending artery. Using a cohort of 27,125, patients with a history of CAD, cardiac transplantation, and congenital heart disease were excluded. The model was derived from 24,251 consecutive patients in the derivation cohort and an additional 7,333 nonoverlapping patients in the validation cohort. The risk score consisted of 9 variables: age, sex, diabetes, hypertension, current smoking, hyperlipidemia, family history of CAD, history of peripheral vascular disease, and chest pain symptoms. Patients were divided into 3 risk categories: low (≤7 points), intermediate (8 to 17 points) and high (≥18 points). The model was statistically robust with area under the curve of 0.76 (95% confidence interval [CI]: 0.75 to 0.78) in the derivation cohort and 0.71 (95% CI: 0.69 to 0.74) in the validation cohort. Patients who scored ≤7 points had a low negative likelihood ratio; the prevalence of high-risk CAD was 1% in patients with ≤7 points and 16.7% in those with ≥18 points. We propose a scoring system, based on clinical variables, that can be used to identify patients at high and low pre-test probability of having high-risk CAD. Identification of these populations may detect those who may benefit from a trial of medical therapy and those who may benefit most from an invasive strategy. Copyright © 2015 American College of Cardiology Foundation. Published by

  19. Comparisons of likelihood and machine learning methods of individual classification

    Science.gov (United States)

    Guinand, B.; Topchy, A.; Page, K.S.; Burnham-Curtis, M. K.; Punch, W.F.; Scribner, K.T.

    2002-01-01

    Classification methods used in machine learning (e.g., artificial neural networks, decision trees, and k-nearest neighbor clustering) are rarely used with population genetic data. We compare different nonparametric machine learning techniques with parametric likelihood estimations commonly employed in population genetics for purposes of assigning individuals to their population of origin (“assignment tests”). Classifier accuracy was compared across simulated data sets representing different levels of population differentiation (low and high FST), number of loci surveyed (5 and 10), and allelic diversity (average of three or eight alleles per locus). Empirical data for the lake trout (Salvelinus namaycush) exhibiting levels of population differentiation comparable to those used in simulations were examined to further evaluate and compare classification methods. Classification error rates associated with artificial neural networks and likelihood estimators were lower for simulated data sets compared to k-nearest neighbor and decision tree classifiers over the entire range of parameters considered. Artificial neural networks only marginally outperformed the likelihood method for simulated data (0–2.8% lower error rates). The relative performance of each machine learning classifier improved relative to likelihood estimators for empirical data sets, suggesting an ability to “learn” and utilize properties of empirical genotypic arrays intrinsic to each population. Likelihood-based estimation methods provide a more accessible option for reliable assignment of individuals to the population of origin due to the intricacies in development and evaluation of artificial neural networks. In recent years, characterization of highly polymorphic molecular markers such as mini- and microsatellites and development of novel methods of analysis have enabled researchers to extend investigations of ecological and evolutionary processes below the population level to the level of

  20. Corporate governance effect on financial distress likelihood: Evidence from Spain

    Directory of Open Access Journals (Sweden)

    Montserrat Manzaneque

    2016-01-01

    Full Text Available The paper explores some mechanisms of corporate governance (ownership and board characteristics) in Spanish listed companies and their impact on the likelihood of financial distress. An empirical study was conducted between 2007 and 2012 using a matched-pairs research design with 308 observations, half of them classified as distressed and half as non-distressed. Based on the previous study by Pindado, Rodrigues, and De la Torre (2008), a broader concept of bankruptcy is used to define business failure. Employing several conditional logistic models, and in line with other previous studies on bankruptcy, the results confirm that in difficult situations prior to bankruptcy, the impact of board ownership and the proportion of independent directors on the likelihood of business failure is similar to that exerted in more extreme situations. The results go one step further and show a negative relationship between board size and the likelihood of financial distress. This result is interpreted as a way of creating diversity and improving access to information and resources, especially in contexts where ownership is highly concentrated and large shareholders have great power to influence the board structure. However, the results confirm that ownership concentration does not have a significant impact on the likelihood of financial distress in the Spanish context. It is argued that large shareholders are passive with regard to enhanced monitoring of management and, alternatively, do not have enough incentives to hold back the financial distress. These findings have important implications in the Spanish context, where several changes in the regulatory listing requirements have been carried out with respect to corporate governance, and where there is no prior empirical evidence on this issue.

  1. Maximum Likelihood Inference for the Cox Regression Model with Applications to Missing Covariates.

    Science.gov (United States)

    Chen, Ming-Hui; Ibrahim, Joseph G; Shao, Qi-Man

    2009-10-01

    In this paper, we carry out an in-depth theoretical investigation for existence of maximum likelihood estimates for the Cox model (Cox, 1972, 1975) both in the full data setting as well as in the presence of missing covariate data. The main motivation for this work arises from missing data problems, where models can easily become difficult to estimate with certain missing data configurations or large missing data fractions. We establish necessary and sufficient conditions for existence of the maximum partial likelihood estimate (MPLE) for completely observed data (i.e., no missing data) settings as well as sufficient conditions for existence of the maximum likelihood estimate (MLE) for survival data with missing covariates via a profile likelihood method. Several theorems are given to establish these conditions. A real dataset from a cancer clinical trial is presented to further illustrate the proposed methodology.
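
    As a point of reference for the complete-data case, the sketch below fits a Cox proportional hazards model by maximizing the partial likelihood with the lifelines package; it does not implement the paper's profile-likelihood treatment of missing covariates, and the small clinical dataset is hypothetical.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical complete-case clinical data: survival time, event indicator, covariates.
df = pd.DataFrame({
    "time":  [5.0, 8.2, 3.1, 12.4, 7.7, 2.9, 10.1, 6.3],
    "event": [1,   0,   1,   0,    1,   1,   0,    1],
    "age":   [61,  54,  70,  48,   65,  72,  50,   58],
    "trt":   [0,   1,   0,   1,    1,   0,   1,    0],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")   # maximizes the partial likelihood
print(cph.summary[["coef", "exp(coef)", "p"]])
```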

  2. On divergences tests for composite hypotheses under composite likelihood

    OpenAIRE

    Martin, Nirian; Pardo, Leandro; Zografos, Konstantinos

    2016-01-01

    It is well known that in some situations it is not easy to compute the likelihood function, as the dataset might be large or the model too complex. In such contexts the composite likelihood, derived by multiplying the likelihoods of subsets of the variables, may be useful. The extension of the classical likelihood ratio test statistic to the framework of composite likelihoods is used as a procedure to solve the problem of testing in the composite likelihood context. In this paper we intro...

  3. Multi-Channel Maximum Likelihood Pitch Estimation

    DEFF Research Database (Denmark)

    Christensen, Mads Græsbøll

    2012-01-01

    In this paper, a method for multi-channel pitch estimation is proposed. The method is a maximum likelihood estimator and is based on a parametric model where the signals in the various channels share the same fundamental frequency but can have different amplitudes, phases, and noise characteristics....... This essentially means that the model allows for different conditions in the various channels, like different signal-to-noise ratios, microphone characteristics and reverberation. Moreover, the method does not assume that a certain array structure is used but rather relies on a more general model and is hence...

  4. CMB Power Spectrum Likelihood with ILC

    CERN Document Server

    Dick, Jason; Delabrouille, Jacques

    2012-01-01

    We extend the ILC method in harmonic space to include the error in its CMB estimate. This allows parameter estimation routines to take into account the effect of the foregrounds as well as the errors in their subtraction in conjunction with the ILC method. Our method requires the use of a model of the foregrounds which we do not develop here. The reduction of the foreground level makes this method less sensitive to unaccounted for errors in the foreground model. Simulations are used to validate the calculations and approximations used in generating this likelihood function.

  5. An improved likelihood model for eye tracking

    DEFF Research Database (Denmark)

    Hammoud, Riad I.; Hansen, Dan Witzner

    2007-01-01

    A common approach in such cases is to abandon the tracking routine and re-initialize eye detection. Of course this may be a difficult process due to the missed-data problem. Accordingly, what is needed is an efficient method of reliably tracking a person's eyes between successively produced video image frames, even when ... are challenging. The paper proposes a log likelihood-ratio function of foreground and background models in a particle filter-based eye tracking framework. It fuses key information from the even and odd infrared fields (dark and bright pupil) and their corresponding subtractive image into one single observation model...

  6. LIKEDM: Likelihood calculator of dark matter detection

    Science.gov (United States)

    Huang, Xiaoyuan; Tsai, Yue-Lin Sming; Yuan, Qiang

    2017-04-01

    With the large progress in searches for dark matter (DM) particles with indirect and direct methods, we develop a numerical tool that enables fast calculations of the likelihoods of specified DM particle models given a number of observational data, such as charged cosmic rays from space-borne experiments (e.g., PAMELA, AMS-02), γ-rays from the Fermi space telescope, and underground direct detection experiments. The purpose of this tool - LIKEDM, likelihood calculator for dark matter detection - is to bridge the gap between a particle model of DM and the observational data. The intermediate steps between these two, including the astrophysical backgrounds, the propagation of charged particles, the analysis of Fermi γ-ray data, as well as the DM velocity distribution and the nuclear form factor, have been dealt with in the code. We release the first version (v1.0) focusing on the constraints from indirect detection of DM with charged cosmic and gamma rays. Direct detection will be implemented in the next version. This manual describes the framework, usage, and related physics of the code.

  7. Multiplicative earthquake likelihood models incorporating strain rates

    Science.gov (United States)

    Rhoades, D. A.; Christophersen, A.; Gerstenberger, M. C.

    2017-01-01

    We examine the potential for strain-rate variables to improve long-term earthquake likelihood models. We derive a set of multiplicative hybrid earthquake likelihood models in which cell rates in a spatially uniform baseline model are scaled using combinations of covariates derived from earthquake catalogue data, fault data, and strain-rates for the New Zealand region. Three components of the strain rate estimated from GPS data over the period 1991-2011 are considered: the shear, rotational and dilatational strain rates. The hybrid model parameters are optimised for earthquakes of M 5 and greater over the period 1987-2006 and tested on earthquakes from the period 2012-2015, which is independent of the strain rate estimates. The shear strain rate is overall the most informative individual covariate, as indicated by Molchan error diagrams as well as multiplicative modelling. Most models including strain rates are significantly more informative than the best models excluding strain rates in both the fitting and testing period. A hybrid that combines the shear and dilatational strain rates with a smoothed seismicity covariate is the most informative model in the fitting period, and a simpler model without the dilatational strain rate is the most informative in the testing period. These results have implications for probabilistic seismic hazard analysis and can be used to improve the background model component of medium-term and short-term earthquake forecasting models.

  8. CORA - emission line fitting with Maximum Likelihood

    Science.gov (United States)

    Ness, J.-U.; Wichmann, R.

    2002-07-01

    The advent of pipeline-processed data both from space- and ground-based observatories often disposes of the need of full-fledged data reduction software with its associated steep learning curve. In many cases, a simple tool doing just one task, and doing it right, is all one wishes. In this spirit we introduce CORA, a line fitting tool based on the maximum likelihood technique, which has been developed for the analysis of emission line spectra with low count numbers and has successfully been used in several publications. CORA uses a rigorous application of Poisson statistics. From the assumption of Poissonian noise we derive the probability for a model of the emission line spectrum to represent the measured spectrum. The likelihood function is used as a criterion for optimizing the parameters of the theoretical spectrum and a fixed point equation is derived allowing an efficient way to obtain line fluxes. As an example we demonstrate the functionality of the program with an X-ray spectrum of Capella obtained with the Low Energy Transmission Grating Spectrometer (LETGS) on board the Chandra observatory and choose the analysis of the Ne IX triplet around 13.5 Å.

  9. Maximum Likelihood Analysis in the PEN Experiment

    Science.gov (United States)

    Lehman, Martin

    2013-10-01

    The experimental determination of the π+ -->e+ ν (γ) decay branching ratio currently provides the most accurate test of lepton universality. The PEN experiment at PSI, Switzerland, aims to improve the present world average experimental precision of 3 . 3 ×10-3 to 5 ×10-4 using a stopped beam approach. During runs in 2008-10, PEN has acquired over 2 ×107 πe 2 events. The experiment includes active beam detectors (degrader, mini TPC, target), central MWPC tracking with plastic scintillator hodoscopes, and a spherical pure CsI electromagnetic shower calorimeter. The final branching ratio will be calculated using a maximum likelihood analysis. This analysis assigns each event a probability for 5 processes (π+ -->e+ ν , π+ -->μ+ ν , decay-in-flight, pile-up, and hadronic events) using Monte Carlo verified probability distribution functions of our observables (energies, times, etc). A progress report on the PEN maximum likelihood analysis will be presented. Work supported by NSF grant PHY-0970013.
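
    A minimal sketch of the kind of per-event likelihood described here: each event receives a probability under each process PDF, and the process fractions are obtained by maximizing the summed log-likelihood. The one-dimensional Gaussian PDFs and the three-process setup are hypothetical stand-ins for the experiment's Monte Carlo-verified distributions over its full set of observables.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Hypothetical per-process PDFs of a single observable (e.g., calorimeter energy).
pdfs = [norm(70.0, 3.0).pdf, norm(30.0, 8.0).pdf, norm(50.0, 20.0).pdf]

def neg_log_likelihood(theta, x):
    """theta holds the first two process fractions; the last is 1 - sum(theta)."""
    fracs = np.append(theta, 1.0 - theta.sum())
    if np.any(fracs < 0):
        return np.inf
    mix = sum(f * p(x) for f, p in zip(fracs, pdfs))
    return -np.sum(np.log(np.clip(mix, 1e-300, None)))

rng = np.random.default_rng(6)
x = np.concatenate([rng.normal(70, 3, 900), rng.normal(30, 8, 80), rng.normal(50, 20, 20)])
res = minimize(neg_log_likelihood, x0=np.array([0.5, 0.3]), args=(x,), method="Nelder-Mead")
print("estimated fractions:", np.append(res.x, 1 - res.x.sum()))
```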

  10. Clinical examination is highly sensitive for detecting clinically significant spinal injuries after gunshot wounds.

    Science.gov (United States)

    Inaba, Kenji; Barmparas, Galinos; Ibrahim, David; Branco, Bernardino C; Gruen, Peter; Reddy, Sravanthi; Talving, Peep; Demetriades, Demetrios

    2011-09-01

    The optimal method for spinal evaluation after penetrating trauma is currently unknown. The goal of this study was to determine the sensitivity and specificity of a standardized clinical examination for the detection of spinal injuries after penetrating trauma. After Institutional Review Board approval, all evaluable penetrating trauma patients aged 15 years or more admitted to the Los Angeles County + University of Southern California Medical Center were prospectively evaluated for spinal pain, tenderness to palpation, deformity, and neurologic deficit. During the 6-month study period, 282 patients were admitted after sustaining a penetrating injury; 143 (50.7%) as a result of gunshot wound (GSW) and 139 (49.3%) as a result of stab wound (SW). None of the patients sustaining a SW had a spinal injury. Of the 112 evaluable GSW patients, 9 sustained an injury: 6 with a true-positive and 3 with a false-negative clinical examination. The overall sensitivity, specificity, positive predictive value, and negative predictive value were 66.7%, 89.6%, 46.2% and 95.2%, respectively. For clinically significant injuries requiring surgical intervention, cervical or thoracolumbar spine orthosis, or cord transections, however, the sensitivity of clinical examination was 100.0%, specificity 87.5%, positive predictive value 30.8%, and negative predictive value 87.5%. Clinically significant spinal injury, although rare after SWs, is not uncommon after GSWs. A structured clinical examination of the spine in evaluable patients who have sustained a GSW is highly reliable for identifying those with clinically significant injuries.
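
    Abstracts like this one report sensitivity, specificity and predictive values; the small helper below shows how those quantities, together with the positive and negative likelihood ratios, follow from a 2x2 table. The 6 true positives and 3 false negatives are quoted above, while the false-positive and true-negative counts in the example are hypothetical, since the abstract does not give the full table.

```python
def diagnostic_summary(tp, fp, fn, tn):
    """Sensitivity, specificity, predictive values and likelihood ratios from a 2x2 table."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return {
        "sensitivity": sens,
        "specificity": spec,
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "lr_positive": sens / (1.0 - spec),   # likelihood ratio of a positive examination
        "lr_negative": (1.0 - sens) / spec,   # likelihood ratio of a negative examination
    }

# The 6 true positives and 3 false negatives come from the abstract; fp and tn are hypothetical.
print(diagnostic_summary(tp=6, fp=7, fn=3, tn=96))
```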

  11. Recent developments in maximum likelihood estimation of MTMM models for categorical data

    Directory of Open Access Journals (Sweden)

    Minjeong eJeon

    2014-04-01

    Full Text Available Maximum likelihood (ML estimation of categorical multitrait-multimethod (MTMM data is challenging because the likelihood involves high-dimensional integrals over the crossed method and trait factors, with no known closed-form solution.The purpose of the study is to introduce three newly developed ML methods that are eligible for estimating MTMM models with categorical responses: Variational maximization-maximization, Alternating imputation posterior, and Monte Carlo local likelihood. Each method is briefly described and its applicability for MTMM models with categorical data are discussed.An illustration is provided using an empirical example.

  12. Maximum Likelihood Joint Tracking and Association in Strong Clutter

    Directory of Open Access Journals (Sweden)

    Leonid I. Perlovsky

    2013-01-01

    Full Text Available We have developed a maximum likelihood formulation for a joint detection, tracking and association problem. An efficient non-combinatorial algorithm for this problem is developed in case of strong clutter for radar data. By using an iterative procedure of the dynamic logic process “from vague-to-crisp” explained in the paper, the new tracker overcomes the combinatorial complexity of tracking in highly-cluttered scenarios and results in an orders-of-magnitude improvement in signal-to-clutter ratio.
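
    As a schematic of the soft, iteratively sharpened association such a tracker relies on, the sketch below re-estimates track positions from probabilistic measurement-to-track assignments while annealing the association width from vague to crisp. It is a one-dimensional toy, not the authors' algorithm; the Gaussian model, clutter density and annealing schedule are assumptions.

```python
import numpy as np

def vague_to_crisp_tracks(measurements, init_positions, clutter_density=0.02,
                          sigma_start=4.0, sigma_end=1.0, n_iter=15):
    """Refine track positions with soft measurement-to-track association probabilities,
    tightening the association width at each iteration ("vague to crisp")."""
    pos = np.asarray(init_positions, dtype=float)
    for it in range(n_iter):
        sigma = max(sigma_end, sigma_start * 0.8 ** it)       # anneal the association width
        # Gaussian likelihood of each measurement under each track, plus a clutter column
        lik = np.exp(-0.5 * ((measurements[:, None] - pos[None, :]) / sigma) ** 2)
        lik /= sigma * np.sqrt(2.0 * np.pi)
        lik = np.hstack([lik, np.full((measurements.size, 1), clutter_density)])
        assoc = lik / lik.sum(axis=1, keepdims=True)          # association probabilities
        for k in range(pos.size):                             # weighted position update per track
            w = assoc[:, k]
            pos[k] = np.sum(w * measurements) / np.sum(w)
    return pos, assoc

rng = np.random.default_rng(5)
meas = np.concatenate([rng.normal(-3.0, 1.0, 30),             # returns from track 1
                       rng.normal(4.0, 1.0, 30),              # returns from track 2
                       rng.uniform(-10.0, 10.0, 60)])         # clutter
print(vague_to_crisp_tracks(meas, init_positions=[-1.0, 1.0])[0])
```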

  13. Planck intermediate results: XVI. Profile likelihoods for cosmological parameters

    DEFF Research Database (Denmark)

    Bartlett, J.G.; Cardoso, J.-F.; Delabrouille, J.

    2014-01-01

    mass distribution. By applying the Feldman-Cousins prescription, we again obtain results very similar to those of the Bayesian methodology. However, the profile-likelihood analysis of the cosmic microwave background (CMB) combination (Planck+WP+highL) reveals a minimum well within the unphysical negative-mass region. We show that inclusion of the Planck CMB-lensing information regularizes this issue, and provide a robust frequentist upper limit Σmν ≤ 0.26 eV (95% confidence) from the CMB+lensing+BAO data combination. © ESO 2014.

  14. Maximum likelihood polynomial regression for robust speech recognition

    Institute of Scientific and Technical Information of China (English)

    LU Yong; WU Zhenyang

    2011-01-01

    The linear hypothesis is the main disadvantage of maximum likelihood linear regression (MLLR). This paper applies the polynomial regression method to model adaptation and establishes a nonlinear model adaptation algorithm using maximum likelihood polynomial regression.

  15. Composite likelihood and two-stage estimation in family studies

    DEFF Research Database (Denmark)

    Andersen, Elisabeth Anne Wreford

    2002-01-01

    Composite likelihood; Two-stage estimation; Family studies; Copula; Optimal weights; All possible pairs.

  16. Inference in HIV dynamics models via hierarchical likelihood

    OpenAIRE

    2010-01-01

    HIV dynamical models are often based on non-linear systems of ordinary differential equations (ODE), which do not have analytical solution. Introducing random effects in such models leads to very challenging non-linear mixed-effects models. To avoid the numerical computation of multiple integrals involved in the likelihood, we propose a hierarchical likelihood (h-likelihood) approach, treated in the spirit of a penalized likelihood. We give the asymptotic distribution of the maximum h-likelih...

  17. Experiments using machine learning to approximate likelihood ratios for mixture models

    Science.gov (United States)

    Cranmer, K.; Pavez, J.; Louppe, G.; Brooks, W. K.

    2016-10-01

    Likelihood ratio tests are a key tool in many fields of science. In order to evaluate the likelihood ratio, the likelihood function is needed. However, it is common in fields such as High Energy Physics to have complex simulations that describe the distribution while not having a description of the likelihood that can be directly evaluated. In this setting it is impossible or computationally expensive to evaluate the likelihood. It is, however, possible to construct an equivalent version of the likelihood ratio that can be evaluated by using discriminative classifiers. We show how this can be used to approximate the likelihood ratio when the underlying distribution is a weighted sum of probability distributions (e.g. signal plus background model). We demonstrate how the results can be considerably improved by decomposing the ratio and using a set of classifiers in a pairwise manner on the components of the mixture model, and how this can be used to estimate the unknown coefficients of the model, such as the signal contribution.
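
    The core trick, recovering a likelihood ratio from a probabilistic classifier trained to separate samples drawn from the two distributions, can be sketched in a few lines. The Gaussian toy distributions and the logistic-regression classifier below are illustrative choices, not the paper's setup.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
x0 = rng.normal(0.0, 1.0, size=(5000, 1))    # draws from the "background" simulator
x1 = rng.normal(1.0, 1.0, size=(5000, 1))    # draws from the "signal" simulator

X = np.vstack([x0, x1])
y = np.concatenate([np.zeros(len(x0)), np.ones(len(x1))])
clf = LogisticRegression().fit(X, y)          # probabilistic classifier separating the two samples

def approx_likelihood_ratio(x):
    """Recover r(x) ~ p_signal(x) / p_background(x) from the classifier score s via s/(1-s)."""
    s = clf.predict_proba(np.atleast_2d(x))[:, 1]
    return s / (1.0 - s)

# For these Gaussians the exact ratio is exp(x - 0.5), i.e. 1.0 at x = 0.5.
print(approx_likelihood_ratio([[0.5]]))
```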

  18. Clinical options for women at high risk for breast cancer.

    Science.gov (United States)

    Hartmann, L C; Sellers, T A; Schaid, D J; Nayfield, S; Grant, C S; Bjoraker, J A; Woods, J; Couch, F

    1999-10-01

    Women at hereditary risk of breast cancer face a difficult clinical decision. Each of the options available to them has unique advantages and disadvantages that are summarized in Table 9. Many components enter a high-risk woman's decision: her objective risk of breast cancer; clinical features, such as the consistency of breast tissue and resultant ease of examination; breast density on mammography; personal characteristics, including her experience with cancer within her family; her role and responsibilities within her own nuclear family; her values and goals; her experiences with the medical system; and her subjective assessment of risk. It is generally believed that women significantly overestimate their risk of breast cancer. Thus, it is vital that a woman at risk have access to a genetic counselor who can provide accurate assessment of her risk. Women should be encouraged to take time to understand their risk level and the advantages and disadvantages of the options before them.

  19. Smell identification in individuals at clinical high risk for schizophrenia.

    Science.gov (United States)

    Gill, Kelly Elizabeth; Evans, Elizabeth; Kayser, Jürgen; Ben-David, Shelly; Messinger, Julie; Bruder, Gerard; Malaspina, Dolores; Corcoran, Cheryl Mary

    2014-12-15

    Smell identification deficits exist in schizophrenia, and may be associated with its negative symptoms. Less is known about smell identification and its clinical correlates in individuals at clinical high risk (CHR) for schizophrenia and related psychotic disorders. We examined smell identification, symptoms and IQ in 71 clinical high-risk (CHR) subjects and 36 healthy controls. Smell identification was assessed using both the 40-item University of Pennsylvania Smell Identification Test (UPSIT; Doty, R.L., Shaman, P., Kimmelman, C.P., Dann, M.S., 1984. University of Pennsylvania Smell Identification Test: a rapid quantitative olfactory function test for the clinic. Laryngoscope 94, 176-178) and its extracted 12-item Brief Smell Identification Test (Goudsmit, N., Coleman, E., Seckinger, R.A., Wolitzky, R., Stanford, A.D., Corcoran, C., Goetz, R.R., Malaspina, D., 2003. A brief smell identification test discriminates between deficit and non-deficit schizophrenia. Psychiatry Research 120, 155-164). Smell identification did not significantly differ between CHR subjects and controls. Among CHR subjects, smell identification did not predict schizophrenia (N=19; 27%) within 2 years, nor was it associated with negative or positive symptoms. This is the third prospective cohort study to examine smell identification in CHR subjects, and overall, findings are inconclusive, similar to what is found for other disorders in adolescents, such as autism spectrum, attention deficit and anxiety disorders. Smell identification deficit may not have clear utility as a marker of emergent schizophrenia and related psychotic disorders. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  20. Nonparametric likelihood based estimation of linear filters for point processes

    DEFF Research Database (Denmark)

    Hansen, Niels Richard

    2015-01-01

    result is a representation of the gradient of the log-likelihood, which we use to derive computable approximations of the log-likelihood and the gradient by time discretization. These approximations are then used to minimize the approximate penalized log-likelihood. For time and memory efficiency...

  1. Likelihood ratios of multiple cutoff points of the Taipei City Developmental Checklist for Preschoolers, 2nd version

    Directory of Open Access Journals (Sweden)

    Hua-Fang Liao

    2014-03-01

    Conclusion: Taipei II with multiple cutoff points could give more useful clinical information than using a single cutoff point. The multiple likelihood ratios of Taipei II for children older than 3 years and in different cultural backgrounds need further study.
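
    For a screening checklist scored against a reference standard, stratum-specific (multiple-cutoff) likelihood ratios can be computed as in the sketch below; the three score bands and the counts are hypothetical and are not Taipei II data.

```python
def interval_likelihood_ratios(cases, noncases):
    """Stratum-specific likelihood ratios for a test reported in several score intervals.

    cases / noncases: counts of children with / without the condition in each interval.
    LR_i = (cases_i / total_cases) / (noncases_i / total_noncases)
    """
    total_cases, total_noncases = sum(cases), sum(noncases)
    return [(c / total_cases) / (n / total_noncases) for c, n in zip(cases, noncases)]

# Hypothetical counts in three score bands (e.g. pass / monitor / fail).
print(interval_likelihood_ratios(cases=[5, 15, 40], noncases=[120, 60, 20]))
```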

  2. Hierarchical Linear Modeling with Maximum Likelihood, Restricted Maximum Likelihood, and Fully Bayesian Estimation

    Science.gov (United States)

    Boedeker, Peter

    2017-01-01

    Hierarchical linear modeling (HLM) is a useful tool when analyzing data collected from groups. There are many decisions to be made when constructing and estimating a model in HLM including which estimation technique to use. Three of the estimation techniques available when analyzing data with HLM are maximum likelihood, restricted maximum…

  3. MLDS: Maximum Likelihood Difference Scaling in R

    Directory of Open Access Journals (Sweden)

    Kenneth Knoblauch

    2008-01-01

    Full Text Available The MLDS package in the R programming language can be used to estimate perceptual scales based on the results of psychophysical experiments using the method of difference scaling. In a difference scaling experiment, observers compare two supra-threshold differences (a,b) and (c,d) on each trial. The approach is based on a stochastic model of how the observer decides which perceptual difference (or interval, (a,b) or (c,d), is greater, and the parameters of the model are estimated using a maximum likelihood criterion. We also propose a method to test the model by evaluating the self-consistency of the estimated scale. The package includes an example in which an observer judges the differences in correlation between scatterplots. The example may be readily adapted to estimate perceptual scales for arbitrary physical continua.

  4. A Predictive Likelihood Approach to Bayesian Averaging

    Directory of Open Access Journals (Sweden)

    Tomáš Jeřábek

    2015-01-01

    Full Text Available Multivariate time series forecasting is applied in a wide range of economic activities related to regional competitiveness and is the basis of almost all macroeconomic analysis. In this paper we combine multivariate density forecasts of GDP growth, inflation and real interest rates from four different models: two types of Bayesian vector autoregression (BVAR) models, a New Keynesian dynamic stochastic general equilibrium (DSGE) model of a small open economy, and a DSGE-VAR model. The performance of the models is evaluated using historical data for the domestic economy and the foreign economy, which is represented by countries of the Eurozone. Because the forecast accuracy of the models differs, weighting schemes based on the predictive likelihood, the trace of the past MSE matrix, and model ranks are used to combine the models. The equal-weight scheme is used as a simple combination scheme. The results show that optimally combined densities are comparable to the best individual models.
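
    The predictive-likelihood weighting idea can be sketched as follows: per-model predictive log-likelihoods are turned into normalised weights, which then define a mixture of the models' forecast densities. The densities and log-likelihood values below are made up for illustration and do not come from the paper.

```python
import numpy as np
from scipy.stats import norm

def predictive_likelihood_weights(log_pred_likelihoods):
    """Combination weights proportional to each model's predictive likelihood
    (log values summed over a training window), normalised to one."""
    log_pl = np.asarray(log_pred_likelihoods, dtype=float)
    w = np.exp(log_pl - log_pl.max())          # subtract the max for numerical stability
    return w / w.sum()

def combine_densities(weights, model_densities):
    """Linear opinion pool: a weighted mixture of the models' forecast densities."""
    return lambda x: sum(w * f(x) for w, f in zip(weights, model_densities))

# Two hypothetical forecast densities for GDP growth and made-up predictive log-likelihoods.
models = [norm(2.0, 0.5).pdf, norm(1.0, 1.0).pdf]
w = predictive_likelihood_weights([-120.3, -125.9])
pooled = combine_densities(w, models)
print("weights:", w, "pooled density at 1.5:", pooled(1.5))
```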

  5. Groups, information theory, and Einstein's likelihood principle

    Science.gov (United States)

    Sicuro, Gabriele; Tempesta, Piergiulio

    2016-04-01

    We propose a unifying picture where the notion of generalized entropy is related to information theory by means of a group-theoretical approach. The group structure comes from the requirement that an entropy be well defined with respect to the composition of independent systems, in the context of a recently proposed generalization of the Shannon-Khinchin axioms. We associate to each member of a large class of entropies a generalized information measure, satisfying the additivity property on a set of independent systems as a consequence of the underlying group law. At the same time, we also show that Einstein's likelihood function naturally emerges as a byproduct of our informational interpretation of (generally nonadditive) entropies. These results confirm the adequacy of composable entropies both in physical and social science contexts.

  6. Parameter likelihood of intrinsic ellipticity correlations

    CERN Document Server

    Capranico, Federica; Schaefer, Bjoern Malte

    2012-01-01

    Subject of this paper are the statistical properties of ellipticity alignments between galaxies evoked by their coupled angular momenta. Starting from physical angular momentum models, we bridge the gap towards ellipticity correlations, ellipticity spectra and derived quantities such as aperture moments, comparing the intrinsic signals with those generated by gravitational lensing, with the projected galaxy sample of EUCLID in mind. We investigate the dependence of intrinsic ellipticity correlations on cosmological parameters and show that intrinsic ellipticity correlations give rise to non-Gaussian likelihoods as a result of nonlinear functional dependencies. Comparing intrinsic ellipticity spectra to weak lensing spectra we quantify the magnitude of their contaminating effect on the estimation of cosmological parameters and find that biases on dark energy parameters are very small in an angular-momentum based model in contrast to the linear alignment model commonly used. Finally, we quantify whether intrins...

  7. Maximum likelihood estimation in constrained parameter spaces for mixtures of factor analyzers

    OpenAIRE

    Greselin, Francesca; Ingrassia, Salvatore

    2013-01-01

    Mixtures of factor analyzers are becoming more and more popular in the area of model based clustering of high-dimensional data. According to the likelihood approach in data modeling, it is well known that the unconstrained log-likelihood function may present spurious maxima and singularities and this is due to specific patterns of the estimated covariance structure, when their determinant approaches 0. To reduce such drawbacks, in this paper we introduce a procedure for the parameter estimati...

  8. Lumbar disc herniation at high levels : MRI and clinical findings

    Energy Technology Data Exchange (ETDEWEB)

    Paek, Chung Ho; Kwon, Soon Tae; Lee, Jun Kyu; Ahn, Jae Sung; Lee, Hwan Do; Chung, Yon Su; Jeong, Ki Ho; Cho, Jun Sik [Chungnam National Univ. College of Medicine, Taejon (Korea, Republic of)

    1999-04-01

    To assess the frequency, location, associated MR findings, and clinical symptoms of high-level lumbar disc herniation (HLDH). A total of 1076 patients with lumbar disc herniation were retrospectively reviewed. MR images of 41 of these with HLDH (T12-L1, L1-2, L2-3) were analysed in terms of frequency, location, and associated MR findings, and correlated with clinical symptoms of HLDH. The prevalence of HLDH was 3.8% (41/1076). HLDH was located at the T12-L1 level in four patients (10%), at the L1-2 level in 14 (34%), at the L2-3 level in 21 (51%), and at both the L1-2 and L2-3 levels in two. The age of patients ranged from 20 to 72 years (mean, 44), and there were 26 men and 16 women. In 11 (27%), whose mean age was 32 years, isolated disc herniation was limited to these high lumbar segments. The remaining 30 patients had HLDH associated with variable involvement of the lower lumbar segments. Associated lesions were as follows: lower-level disc herniation (14 patients, 34%); apophyseal ring fracture (8 patients, 19%); Schmorl's node and spondylolisthesis (each 6 patients, each 14%); spondylolysis (3 patients, 7%); and retrolisthesis (2 patients, 5%). In 20 patients (49%) with HLDH (n=41), there was a previous history of trauma. Patients with HLDH showed a relatively high incidence of associated coexisting abnormalities such as lower lumbar disc herniation, apophyseal ring fracture, Schmorl's node, spondylolysis, and retrolisthesis. In about half of all patients with HLDH there was a previous history of trauma. The mean age of patients with isolated HLDH was lower; clinical symptoms of the condition were relatively nonspecific and their incidence was low.

  9. High-resolution multimodal clinical multiphoton tomography of skin

    Science.gov (United States)

    König, Karsten

    2011-03-01

    This review focuses on multimodal multiphoton tomography based on near infrared femtosecond lasers. Clinical multiphoton tomographs for 3D high-resolution in vivo imaging were placed on the market several years ago. The second generation of this Prism-Award winning High-Tech skin imaging tool (MPTflex) was introduced in 2010. The same year, the world's first clinical CARS studies were performed with a hybrid multimodal multiphoton tomograph. In particular, non-fluorescent lipids and water as well as mitochondrial fluorescent NAD(P)H, fluorescent elastin, keratin, and melanin as well as SHG-active collagen have been imaged with submicron resolution in patients suffering from psoriasis. Further multimodal approaches include the combination of multiphoton tomographs with low-resolution wide-field systems such as ultrasound, optoacoustical, OCT, and dermoscopy systems. Multiphoton tomographs are currently employed in Australia, Japan, the US, and in several European countries for early diagnosis of skin cancer, optimization of treatment strategies, and cosmetic research including long-term testing of sunscreen nanoparticles as well as anti-aging products.

  10. Labour Force Participation and the Likelihood of Abortion in Finland over Three Birth Cohorts

    Directory of Open Access Journals (Sweden)

    Väisänen, Heini

    2015-12-01

    Full Text Available There is a lack of studies on the association between labour force participation and abortion. This study examined how the likelihood of having an abortion depends on being employed, unemployed, a student, or outside the workforce, using Finnish register data from three birth cohorts (born in 1955-59, 1965-69 and 1975-79) of nearly 260,000 women. The results differed depending on whether all women or only pregnant women were studied and on the cohort analysed. Unemployed women had a high likelihood of abortion when all women were studied, but among pregnant women students had the highest likelihood. The direction and strength of the association varied by relationship status, age, and parity. The results show that the likelihood of abortion depends on women’s economic position. More studies on contraceptive use and pregnancy intentions in Finland are needed to identify the mechanisms behind these findings.

  11. Small-sample likelihood inference in extreme-value regression models

    CERN Document Server

    Ferrari, Silvia L P

    2012-01-01

    We deal with a general class of extreme-value regression models introduced by Barreto-Souza and Vasconcellos (2011). Our goal is to derive an adjusted likelihood ratio statistic that is approximately distributed as χ² with a high degree of accuracy. Although the adjusted statistic requires more computational effort than its unadjusted counterpart, it is shown that the adjustment term has a simple compact form that can be easily implemented in standard statistical software. Further, we compare the finite sample performance of the three classical tests (likelihood ratio, Wald, and score), the gradient test that has been recently proposed by Terrell (2002), and the adjusted likelihood ratio test obtained in this paper. Our simulations favor the latter. Applications of our results are presented. Key words: Extreme-value regression; Gradient test; Gumbel distribution; Likelihood ratio test; Nonlinear models; Score test; Small-sample adjustments; Wald test.
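
    The unadjusted statistic that such corrections start from is the classical likelihood ratio test; a minimal version is sketched below with hypothetical log-likelihood values and without the small-sample adjustment derived in the paper.

```python
from scipy.stats import chi2

def likelihood_ratio_test(loglik_full, loglik_restricted, df):
    """Classical LR statistic 2*(l_full - l_restricted) and its asymptotic chi-square p-value."""
    lr = 2.0 * (loglik_full - loglik_restricted)
    return lr, chi2.sf(lr, df)

# Hypothetical maximised log-likelihoods for two nested extreme-value regression fits (df = 1).
lr_stat, p_value = likelihood_ratio_test(-152.4, -155.1, df=1)
print(lr_stat, p_value)
```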

  12. Individual, team, and coach predictors of players' likelihood to aggress in youth soccer.

    Science.gov (United States)

    Chow, Graig M; Murray, Kristen E; Feltz, Deborah L

    2009-08-01

    The purpose of this study was to examine personal and socioenvironmental factors of players' likelihood to aggress. Participants were youth soccer players (N = 258) and their coaches (N = 23) from high school and club teams. Players completed the Judgments About Moral Behavior in Youth Sports Questionnaire (JAMBYSQ; Stephens, Bredemeier, & Shields, 1997), which assessed athletes' stage of moral development, team norm for aggression, and self-described likelihood to aggress against an opponent. Coaches were administered the Coaching Efficacy Scale (CES; Feltz, Chase, Moritz, & Sullivan, 1999). Using multilevel modeling, results demonstrated that the team norm for aggression at the athlete and team level were significant predictors of athletes' self likelihood to aggress scores. Further, coaches' game strategy efficacy emerged as a positive predictor of their players' self-described likelihood to aggress. The findings contribute to previous research examining the socioenvironmental predictors of athletic aggression in youth sport by demonstrating the importance of coaching efficacy beliefs.

  13. Likelihood analysis of the minimal AMSB model

    Energy Technology Data Exchange (ETDEWEB)

    Bagnaschi, E.; Weiglein, G. [DESY, Hamburg (Germany); Borsato, M.; Chobanova, V.; Lucio, M.; Santos, D.M. [Universidade de Santiago de Compostela, Santiago de Compostela (Spain); Sakurai, K. [Institute for Particle Physics Phenomenology, University of Durham, Science Laboratories, Department of Physics, Durham (United Kingdom); University of Warsaw, Faculty of Physics, Institute of Theoretical Physics, Warsaw (Poland); Buchmueller, O.; Citron, M.; Costa, J.C.; Richards, A. [Imperial College, High Energy Physics Group, Blackett Laboratory, London (United Kingdom); Cavanaugh, R. [Fermi National Accelerator Laboratory, Batavia, IL (United States); University of Illinois at Chicago, Physics Department, Chicago, IL (United States); De Roeck, A. [Experimental Physics Department, CERN, Geneva (Switzerland); Antwerp University, Wilrijk (Belgium); Dolan, M.J. [School of Physics, University of Melbourne, ARC Centre of Excellence for Particle Physics at the Terascale, Melbourne (Australia); Ellis, J.R. [King' s College London, Theoretical Particle Physics and Cosmology Group, Department of Physics, London (United Kingdom); CERN, Theoretical Physics Department, Geneva (Switzerland); Flaecher, H. [University of Bristol, H.H. Wills Physics Laboratory, Bristol (United Kingdom); Heinemeyer, S. [Campus of International Excellence UAM+CSIC, Madrid (Spain); Instituto de Fisica Teorica UAM-CSIC, Madrid (Spain); Instituto de Fisica de Cantabria (CSIC-UC), Cantabria (Spain); Isidori, G. [Physik-Institut, Universitaet Zuerich, Zurich (Switzerland); Luo, F. [Kavli IPMU (WPI), UTIAS, The University of Tokyo, Kashiwa, Chiba (Japan); Olive, K.A. [School of Physics and Astronomy, University of Minnesota, William I. Fine Theoretical Physics Institute, Minneapolis, MN (United States)

    2017-04-15

    We perform a likelihood analysis of the minimal anomaly-mediated supersymmetry-breaking (mAMSB) model using constraints from cosmology and accelerator experiments. We find that either a wino-like or a Higgsino-like neutralino LSP, χ^0_1, may provide the cold dark matter (DM), both with similar likelihoods. The upper limit on the DM density from Planck and other experiments enforces m_{χ^0_1} 0) but the scalar mass m_0 is poorly constrained. In the wino-LSP case, m_{3/2} is constrained to about 900 TeV and m_{χ^0_1} to 2.9 ± 0.1 TeV, whereas in the Higgsino-LSP case m_{3/2} has just a lower limit ≳ 650 TeV (≳ 480 TeV) and m_{χ^0_1} is constrained to 1.12 (1.13) ± 0.02 TeV in the μ > 0 (μ < 0) scenario. In neither case can the anomalous magnetic moment of the muon, (g-2)_μ, be improved significantly relative to its Standard Model (SM) value, nor do flavour measurements constrain the model significantly, and there are poor prospects for discovering supersymmetric particles at the LHC, though there are some prospects for direct DM detection. On the other hand, if the χ^0_1 contributes only a fraction of the cold DM density, future LHC E_T-based searches for gluinos, squarks and heavier chargino and neutralino states as well as disappearing track searches in the wino-like LSP region will be relevant, and interference effects enable BR(B_{s,d} → μ^+μ^-) to agree with the data better than in the SM in the case of wino-like DM with μ > 0. (orig.)

  14. The Multi-Mission Maximum Likelihood framework (3ML)

    CERN Document Server

    Vianello, Giacomo; Younk, Patrick; Tibaldo, Luigi; Burgess, James M; Ayala, Hugo; Harding, Patrick; Hui, Michelle; Omodei, Nicola; Zhou, Hao

    2015-01-01

    Astrophysical sources are now observed by many different instruments at different wavelengths, from radio to high-energy gamma-rays, with an unprecedented quality. Putting all these data together to form a coherent view, however, is a very difficult task. Each instrument has its own data format, software and analysis procedure, which are difficult to combine. It is for example very challenging to perform a broadband fit of the energy spectrum of the source. The Multi-Mission Maximum Likelihood framework (3ML) aims to solve this issue, providing a common framework which allows for a coherent modeling of sources using all the available data, independent of their origin. At the same time, thanks to its architecture based on plug-ins, 3ML uses the existing official software of each instrument for the corresponding data in a way which is transparent to the user. 3ML is based on the likelihood formalism, in which a model summarizing our knowledge about a particular region of the sky is convolved with the instrument...
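
    The joint-fit idea can be illustrated schematically: each instrument contributes its own log-likelihood term for a shared source model, and the terms are summed. The callables and toy Gaussian terms below are assumptions for illustration and are not the 3ML plug-in API.

```python
import numpy as np

def joint_log_likelihood(model_params, datasets):
    """Sum the per-instrument log-likelihoods for one shared source model.

    Each dataset supplies its own log_like(model_params) callable, standing in
    for an instrument-specific plug-in."""
    return sum(d(model_params) for d in datasets)

def make_gaussian_like(measured, sigma):
    """Toy instrument: a Gaussian log-likelihood term around its own measurement."""
    return lambda p: -0.5 * ((p - measured) / sigma) ** 2

# Two toy "instruments" observing the same spectral normalisation.
datasets = [make_gaussian_like(2.1, 0.3), make_gaussian_like(1.9, 0.2)]
grid = np.linspace(1.0, 3.0, 201)
best = grid[np.argmax([joint_log_likelihood(p, datasets) for p in grid])]
print("joint ML estimate:", best)
```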

  15. Likelihood-based CT reconstruction of objects containing known components

    Energy Technology Data Exchange (ETDEWEB)

    Stayman, J. Webster [Johns Hopkins Univ., Baltimore, MD (United States). Dept. of Biomedical Engineering; Otake, Yoshito; Uneri, Ali; Prince, Jerry L.; Siewerdsen, Jeffrey H.

    2011-07-01

    There are many situations in medical imaging where there are known components within the imaging volume. Such is the case in diagnostic X-ray CT imaging of patients with implants, in intraoperative CT imaging where there may be surgical tools in the field, or in situations where the patient support (table or frame) or other devices are outside the (truncated) reconstruction FOV. In such scenarios it is often of great interest to image the relation between the known component and the surrounding anatomy, or to provide high-quality images at the boundary of these objects, or simply to minimize artifacts arising from such components. We propose a framework for simultaneously estimating the position and orientation of a known component and the surrounding volume. Toward this end, we adopt a likelihood-based objective function with an image volume jointly parameterized by a known object, or objects, with unknown registration parameters and an unknown background attenuation volume. The objective is solved iteratively using an alternating minimization approach between the two parameter types. Because this model integrates a substantial amount of prior knowledge about the overall volume, we expect a number of advantages including the reduction of metal artifacts, potential for more sparse data acquisition (decreased time and dose), and/or improved image quality. We illustrate this approach using simulated spine CT data that contains pedicle screws placed in a vertebra, and demonstrate improved performance over traditional filtered-backprojection and penalized-likelihood reconstruction techniques. (orig.)

  16. Epilepsy and Intellectual Disability: Does Epilepsy Increase the Likelihood of Co-Morbid Psychopathology?

    Science.gov (United States)

    Arshad, Saadia; Winterhalder, Robert; Underwood, Lisa; Kelesidi, Katerina; Chaplin, Eddie; Kravariti, Eugenia; Anagnostopoulos, Dimitrios; Bouras, Nick; McCarthy, Jane; Tsakanikos, Elias

    2011-01-01

    Although epilepsy is particularly common among people with intellectual disability (ID) it remains unclear whether it is associated with an increased likelihood of co-morbid psychopathology. We therefore investigated rates of mental health problems and other clinical characteristics in patients with ID and epilepsy (N=156) as compared to patients…

  18. Likelihood Analysis of Supersymmetric SU(5) GUTs

    CERN Document Server

    Bagnaschi, E.

    2017-01-01

    We perform a likelihood analysis of the constraints from accelerator experiments and astrophysical observations on supersymmetric (SUSY) models with SU(5) boundary conditions on soft SUSY-breaking parameters at the GUT scale. The parameter space of the models studied has 7 parameters: a universal gaugino mass $m_{1/2}$, distinct masses for the scalar partners of matter fermions in five- and ten-dimensional representations of SU(5), $m_5$ and $m_{10}$, and for the $\\mathbf{5}$ and $\\mathbf{\\bar 5}$ Higgs representations $m_{H_u}$ and $m_{H_d}$, a universal trilinear soft SUSY-breaking parameter $A_0$, and the ratio of Higgs vevs $\\tan \\beta$. In addition to previous constraints from direct sparticle searches, low-energy and flavour observables, we incorporate constraints based on preliminary results from 13 TeV LHC searches for jets + MET events and long-lived particles, as well as the latest PandaX-II and LUX searches for direct Dark Matter detection. In addition to previously-identified mechanisms for bringi...

  19. Likelihood Analysis of Supersymmetric SU(5) GUTs

    CERN Document Server

    Bagnaschi, E.; Sakurai, K.; Borsato, M.; Buchmueller, O.; Cavanaugh, R.; Chobanova, V.; Citron, M.; De Roeck, A.; Dolan, M.J.; Ellis, J.R.; Flächer, H.; Heinemeyer, S.; Isidori, G.; Lucio, M.; Martínez Santos, D.; Olive, K.A.; Richards, A.; de Vries, K.J.; Weiglein, G.

    2016-01-01

    We perform a likelihood analysis of the constraints from accelerator experiments and astrophysical observations on supersymmetric (SUSY) models with SU(5) boundary conditions on soft SUSY-breaking parameters at the GUT scale. The parameter space of the models studied has 7 parameters: a universal gaugino mass $m_{1/2}$, distinct masses for the scalar partners of matter fermions in five- and ten-dimensional representations of SU(5), $m_5$ and $m_{10}$, and for the $\\mathbf{5}$ and $\\mathbf{\\bar 5}$ Higgs representations $m_{H_u}$ and $m_{H_d}$, a universal trilinear soft SUSY-breaking parameter $A_0$, and the ratio of Higgs vevs $\\tan \\beta$. In addition to previous constraints from direct sparticle searches, low-energy and flavour observables, we incorporate constraints based on preliminary results from 13 TeV LHC searches for jets + MET events and long-lived particles, as well as the latest PandaX-II and LUX searches for direct Dark Matter detection. In addition to previously-identified mechanisms for bringi...

  20. REDUCING THE LIKELIHOOD OF LONG TENNIS MATCHES

    Directory of Open Access Journals (Sweden)

    Tristan Barnett

    2006-12-01

    Full Text Available Long matches can cause problems for tournaments. For example, the starting times of subsequent matches can be substantially delayed causing inconvenience to players, spectators, officials and television scheduling. They can even be seen as unfair in the tournament setting when the winner of a very long match, who may have negative aftereffects from such a match, plays the winner of an average or shorter length match in the next round. Long matches can also lead to injuries to the participating players. One factor that can lead to long matches is the use of the advantage set as the fifth set, as in the Australian Open, the French Open and Wimbledon. Another factor is long rallies and a greater than average number of points per game. This tends to occur more frequently on the slower surfaces such as at the French Open. The mathematical method of generating functions is used to show that the likelihood of long matches can be substantially reduced by using the tiebreak game in the fifth set, or more effectively by using a new type of game, the 50-40 game, throughout the match

  1. Reducing the likelihood of long tennis matches.

    Science.gov (United States)

    Barnett, Tristan; Alan, Brown; Pollard, Graham

    2006-01-01

    Long matches can cause problems for tournaments. For example, the starting times of subsequent matches can be substantially delayed causing inconvenience to players, spectators, officials and television scheduling. They can even be seen as unfair in the tournament setting when the winner of a very long match, who may have negative aftereffects from such a match, plays the winner of an average or shorter length match in the next round. Long matches can also lead to injuries to the participating players. One factor that can lead to long matches is the use of the advantage set as the fifth set, as in the Australian Open, the French Open and Wimbledon. Another factor is long rallies and a greater than average number of points per game. This tends to occur more frequently on the slower surfaces such as at the French Open. The mathematical method of generating functions is used to show that the likelihood of long matches can be substantially reduced by using the tiebreak game in the fifth set, or more effectively by using a new type of game, the 50-40 game, throughout the match. Key points: the cumulant generating function has nice properties for calculating the parameters of distributions in a tennis match; a final tiebreaker set reduces the length of matches, as currently used in the US Open; a new 50-40 game reduces the length of matches whilst maintaining comparable probabilities for the better player to win the match.

  2. Likelihood Analysis of Supersymmetric SU(5) GUTs

    Energy Technology Data Exchange (ETDEWEB)

    Bagnaschi, E. [DESY; Costa, J. C. [Imperial Coll., London; Sakurai, K. [Warsaw U.; Borsato, M. [Santiago de Compostela U.; Buchmueller, O. [Imperial Coll., London; Cavanaugh, R. [Illinois U., Chicago; Chobanova, V. [Santiago de Compostela U.; Citron, M. [Imperial Coll., London; De Roeck, A. [Antwerp U.; Dolan, M. J. [Melbourne U.; Ellis, J. R. [King' s Coll. London; Flächer, H. [Bristol U.; Heinemeyer, S. [Madrid, IFT; Isidori, G. [Zurich U.; Lucio, M. [Santiago de Compostela U.; Martínez Santos, D. [Santiago de Compostela U.; Olive, K. A. [Minnesota U., Theor. Phys. Inst.; Richards, A. [Imperial Coll., London; de Vries, K. J. [Imperial Coll., London; Weiglein, G. [DESY

    2016-10-31

    We perform a likelihood analysis of the constraints from accelerator experiments and astrophysical observations on supersymmetric (SUSY) models with SU(5) boundary conditions on soft SUSY-breaking parameters at the GUT scale. The parameter space of the models studied has 7 parameters: a universal gaugino mass $m_{1/2}$, distinct masses for the scalar partners of matter fermions in five- and ten-dimensional representations of SU(5), $m_5$ and $m_{10}$, and for the $\mathbf{5}$ and $\mathbf{\bar 5}$ Higgs representations $m_{H_u}$ and $m_{H_d}$, a universal trilinear soft SUSY-breaking parameter $A_0$, and the ratio of Higgs vevs $\tan \beta$. In addition to previous constraints from direct sparticle searches, low-energy and flavour observables, we incorporate constraints based on preliminary results from 13 TeV LHC searches for jets + MET events and long-lived particles, as well as the latest PandaX-II and LUX searches for direct Dark Matter detection. In addition to previously-identified mechanisms for bringing the supersymmetric relic density into the range allowed by cosmology, we identify a novel ${\tilde u_R}/{\tilde c_R} - \tilde{\chi}^0_1$ coannihilation mechanism that appears in the supersymmetric SU(5) GUT model and discuss the role of ${\tilde \nu_\tau}$ coannihilation. We find complementarity between the prospects for direct Dark Matter detection and SUSY searches at the LHC.

  3. Maximum likelihood estimates of pairwise rearrangement distances.

    Science.gov (United States)

    Serdoz, Stuart; Egri-Nagy, Attila; Sumner, Jeremy; Holland, Barbara R; Jarvis, Peter D; Tanaka, Mark M; Francis, Andrew R

    2017-06-21

    Accurate estimation of evolutionary distances between taxa is important for many phylogenetic reconstruction methods. Distances can be estimated using a range of different evolutionary models, from single nucleotide polymorphisms to large-scale genome rearrangements. Corresponding corrections for genome rearrangement distances fall into 3 categories: Empirical computational studies, Bayesian/MCMC approaches, and combinatorial approaches. Here, we introduce a maximum likelihood estimator for the inversion distance between a pair of genomes, using a group-theoretic approach to modelling inversions introduced recently. This MLE functions as a corrected distance: in particular, we show that because of the way sequences of inversions interact with each other, it is quite possible for minimal distance and MLE distance to differently order the distances of two genomes from a third. The second aspect tackles the problem of accounting for the symmetries of circular arrangements. While, generally, a frame of reference is locked, and all computation made accordingly, this work incorporates the action of the dihedral group so that distance estimates are free from any a priori frame of reference. The philosophy of accounting for symmetries can be applied to any existing correction method, for which examples are offered. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Cyberbullying in those at clinical high risk for psychosis.

    Science.gov (United States)

    Magaud, Emilie; Nyman, Karissa; Addington, Jean

    2013-11-01

    Several studies suggest an association between experiences of childhood trauma including bullying and the development of psychotic symptoms. The use of communications technology has created a new media for bullying called 'cyberbullying'. Research has demonstrated associations between traditional bullying and cyberbullying. Negative effects of cyberbullying appear similar in nature and severity to the reported effects of traditional bullying. Our aim was to examine the prevalence and correlates of cyberbullying in those at clinical high risk (CHR) for psychosis. Fifty young people at CHR for psychosis were administered the Childhood Trauma Questionnaire with added questions about cyberbullying. Cyberbullying was reported in 38% of the sample. Those who experienced cyberbullying also reported experiencing previous trauma. It is possible that cyberbullying may be a problem for those at CHR of psychosis, and due to the vulnerable nature of these young people may have longitudinal implications. © 2013 Wiley Publishing Asia Pty Ltd.

  5. Cyberbullying in those at Clinical High Risk for psychosis

    Science.gov (United States)

    Magaud, Emilie; Nyman, Karissa; Addington, Jean

    2012-01-01

    Aim: Several studies suggest an association between experiences of childhood trauma including bullying and the development of psychotic symptoms. The use of communications technology has created a new media for bullying called ‘cyberbullying’. Research has demonstrated associations between traditional bullying and cyberbullying. Negative effects of cyberbullying appear similar in nature and severity to the reported effects of traditional bullying. Our aim was to examine the prevalence and correlates of cyberbullying in those at clinical high risk (CHR) for psychosis. Methods: Fifty young people at CHR for psychosis were administered the Childhood Trauma Questionnaire with added questions about cyberbullying. Results: Cyberbullying was reported in 38% of the sample. Those who experienced cyberbullying also reported experiencing previous trauma. Conclusion: It is possible that cyberbullying may be a problem for those at CHR of psychosis and due to the vulnerable nature of these young people, may have longitudinal implications. PMID:23343259

  6. Genetic risk and longitudinal disease activity in systemic lupus erythematosus using targeted maximum likelihood estimation.

    Science.gov (United States)

    Gianfrancesco, M A; Balzer, L; Taylor, K E; Trupin, L; Nititham, J; Seldin, M F; Singer, A W; Criswell, L A; Barcellos, L F

    2016-09-01

    Systemic lupus erythematosus (SLE) is a chronic autoimmune disease associated with genetic and environmental risk factors. However, the extent to which genetic risk is causally associated with disease activity is unknown. We utilized longitudinal targeted maximum likelihood estimation to estimate the causal association between a genetic risk score (GRS) comprising 41 established SLE variants and clinically important disease activity as measured by the validated Systemic Lupus Activity Questionnaire (SLAQ) in a multiethnic cohort of 942 individuals with SLE. We did not find evidence of a clinically important SLAQ score difference (>4.0) for individuals with a high GRS compared with those with a low GRS across nine time points after controlling for sex, ancestry, renal status, dialysis, disease duration, treatment, depression, smoking and education, as well as time-dependent confounding of missing visits. Individual single-nucleotide polymorphism (SNP) analyses revealed that 12 of the 41 variants were significantly associated with clinically relevant changes in SLAQ scores across time points eight and nine after controlling for multiple testing. Results based on sophisticated causal modeling of longitudinal data in a large patient cohort suggest that individual SLE risk variants may influence disease activity over time. Our findings also emphasize a role for other biological or environmental factors.

  7. Coronary CT angiography in clinical triage of patients at high risk of coronary artery disease

    DEFF Research Database (Denmark)

    Kühl, J Tobias; Hove, Jens D; Kristensen, Thomas S

    2017-01-01

    in patients with high likelihood of coronary artery disease and could, in theory, be used to triage high-risk patients. As many obstacles remain, including logistical and safety issues, our study does not support the use of CCTA as an additional diagnostic test before ICA in an all-comer NSTEMI population. ... (...%) coronary artery diameter stenosis with a sensitivity, specificity, and positive and negative predictive value of 99%, 81%, 96% and 95%, respectively. CCTA was used to triage patients into guideline-defined treatment groups of "no or medical treatment", "referral to percutaneous coronary intervention...

  8. Likelihood analysis of supersymmetric SU(5) GUTs

    Energy Technology Data Exchange (ETDEWEB)

    Bagnaschi, E. [DESY, Hamburg (Germany); Costa, J.C. [Imperial College, London (United Kingdom). Blackett Lab.; Sakurai, K. [Durham Univ. (United Kingdom). Inst. for Particle Physics Phenomenology; Warsaw Univ. (Poland). Inst. of Theoretical Physics; Collaboration: MasterCode Collaboration; and others

    2016-10-15

    We perform a likelihood analysis of the constraints from accelerator experiments and astrophysical observations on supersymmetric (SUSY) models with SU(5) boundary conditions on soft SUSY-breaking parameters at the GUT scale. The parameter space of the models studied has 7 parameters: a universal gaugino mass m_{1/2}, distinct masses for the scalar partners of matter fermions in five- and ten-dimensional representations of SU(5), m_5 and m_{10}, and for the 5 and anti-5 Higgs representations m_{H_u} and m_{H_d}, a universal trilinear soft SUSY-breaking parameter A_0, and the ratio of Higgs vevs tan β. In addition to previous constraints from direct sparticle searches, low-energy and flavour observables, we incorporate constraints based on preliminary results from 13 TeV LHC searches for jets + E_T events and long-lived particles, as well as the latest PandaX-II and LUX searches for direct Dark Matter detection. In addition to previously-identified mechanisms for bringing the supersymmetric relic density into the range allowed by cosmology, we identify a novel u_R/c_R - χ^0_1 coannihilation mechanism that appears in the supersymmetric SU(5) GUT model and discuss the role of ν_τ coannihilation. We find complementarity between the prospects for direct Dark Matter detection and SUSY searches at the LHC.

  9. Likelihood analysis of supersymmetric SU(5) GUTs

    Energy Technology Data Exchange (ETDEWEB)

    Bagnaschi, E.; Weiglein, G. [DESY, Hamburg (Germany); Costa, J.C.; Buchmueller, O.; Citron, M.; Richards, A.; De Vries, K.J. [Imperial College, High Energy Physics Group, Blackett Laboratory, London (United Kingdom); Sakurai, K. [University of Durham, Science Laboratories, Department of Physics, Institute for Particle Physics Phenomenology, Durham (United Kingdom); University of Warsaw, Faculty of Physics, Institute of Theoretical Physics, Warsaw (Poland); Borsato, M.; Chobanova, V.; Lucio, M.; Martinez Santos, D. [Universidade de Santiago de Compostela, Santiago de Compostela (Spain); Cavanaugh, R. [Fermi National Accelerator Laboratory, Batavia, IL (United States); University of Illinois at Chicago, Physics Department, Chicago, IL (United States); Roeck, A. de [CERN, Experimental Physics Department, Geneva (Switzerland); Antwerp University, Wilrijk (Belgium); Dolan, M.J. [University of Melbourne, ARC Centre of Excellence for Particle Physics at the Terascale, School of Physics, Parkville (Australia); Ellis, J.R. [King' s College London, Theoretical Particle Physics and Cosmology Group, Department of Physics, London (United Kingdom); Theoretical Physics Department, CERN, Geneva 23 (Switzerland); Flaecher, H. [University of Bristol, H.H. Wills Physics Laboratory, Bristol (United Kingdom); Heinemeyer, S. [Campus of International Excellence UAM+CSIC, Cantoblanco, Madrid (Spain); Instituto de Fisica Teorica UAM-CSIC, Madrid (Spain); Instituto de Fisica de Cantabria (CSIC-UC), Santander (Spain); Isidori, G. [Universitaet Zuerich, Physik-Institut, Zurich (Switzerland); Olive, K.A. [University of Minnesota, William I. Fine Theoretical Physics Institute, School of Physics and Astronomy, Minneapolis, MN (United States)

    2017-02-15

    We perform a likelihood analysis of the constraints from accelerator experiments and astrophysical observations on supersymmetric (SUSY) models with SU(5) boundary conditions on soft SUSY-breaking parameters at the GUT scale. The parameter space of the models studied has seven parameters: a universal gaugino mass m_{1/2}, distinct masses for the scalar partners of matter fermions in five- and ten-dimensional representations of SU(5), m_5 and m_{10}, and for the 5 and anti-5 Higgs representations m_{H_u} and m_{H_d}, a universal trilinear soft SUSY-breaking parameter A_0, and the ratio of Higgs vevs tan β. In addition to previous constraints from direct sparticle searches, low-energy and flavour observables, we incorporate constraints based on preliminary results from 13 TeV LHC searches for jets + E_T events and long-lived particles, as well as the latest PandaX-II and LUX searches for direct Dark Matter detection. In addition to previously identified mechanisms for bringing the supersymmetric relic density into the range allowed by cosmology, we identify a novel u_R/c_R - χ^0_1 coannihilation mechanism that appears in the supersymmetric SU(5) GUT model and discuss the role of ν_τ coannihilation. We find complementarity between the prospects for direct Dark Matter detection and SUSY searches at the LHC. (orig.)

  10. Maximum likelihood molecular clock comb: analytic solutions.

    Science.gov (United States)

    Chor, Benny; Khetan, Amit; Snir, Sagi

    2006-04-01

    Maximum likelihood (ML) is increasingly used as an optimality criterion for selecting evolutionary trees, but finding the global optimum is a hard computational task. Because no general analytic solution is known, numeric techniques such as hill climbing or expectation maximization (EM), are used in order to find optimal parameters for a given tree. So far, analytic solutions were derived only for the simplest model--three taxa, two state characters, under a molecular clock. Four taxa rooted trees have two topologies--the fork (two subtrees with two leaves each) and the comb (one subtree with three leaves, the other with a single leaf). In a previous work, we devised a closed form analytic solution for the ML molecular clock fork. In this work, we extend the state of the art in the area of analytic solutions ML trees to the family of all four taxa trees under the molecular clock assumption. The change from the fork topology to the comb incurs a major increase in the complexity of the underlying algebraic system and requires novel techniques and approaches. We combine the ultrametric properties of molecular clock trees with the Hadamard conjugation to derive a number of topology dependent identities. Employing these identities, we substantially simplify the system of polynomial equations. We finally use tools from algebraic geometry (e.g., Gröbner bases, ideal saturation, resultants) and employ symbolic algebra software to obtain analytic solutions for the comb. We show that in contrast to the fork, the comb has no closed form solutions (expressed by radicals in the input data). In general, four taxa trees can have multiple ML points. In contrast, we can now prove that under the molecular clock assumption, the comb has a unique (local and global) ML point. (Such uniqueness was previously shown for the fork.).

  11. Likelihood inference for a nonstationary fractional autoregressive model

    DEFF Research Database (Denmark)

    Johansen, Søren; Ørregård Nielsen, Morten

    2010-01-01

    the conditional Gaussian likelihood and for the probability analysis we also condition on initial values but assume that the errors in the autoregressive model are i.i.d. with suitable moment conditions. We analyze the conditional likelihood and its derivatives as stochastic processes in the parameters, including...... d and b, and prove that they converge in distribution. We use the results to prove consistency of the maximum likelihood estimator for d,b in a large compact subset of {1/2...

  12. Estimating nonlinear dynamic equilibrium economies: a likelihood approach

    OpenAIRE

    2004-01-01

    This paper presents a framework to undertake likelihood-based inference in nonlinear dynamic equilibrium economies. The authors develop a sequential Monte Carlo algorithm that delivers an estimate of the likelihood function of the model using simulation methods. This likelihood can be used for parameter estimation and for model comparison. The algorithm can deal both with nonlinearities of the economy and with the presence of non-normal shocks. The authors show consistency of the estimate and...
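
    The simulation-based likelihood evaluation described above can be sketched for a simple state-space model: a bootstrap particle filter returns an estimate of the log-likelihood that can then be handed to an optimiser or a posterior sampler. The linear-Gaussian toy model and all parameter values below are assumptions for illustration, not the authors' algorithm.

```python
import numpy as np

def bootstrap_particle_loglik(y, n_particles=2000, phi=0.9, sigma_x=0.5, sigma_y=1.0, seed=0):
    """Simulation-based log-likelihood of a linear-Gaussian state-space model using a
    bootstrap particle filter; the same recipe also handles nonlinear, non-normal models."""
    rng = np.random.default_rng(seed)
    particles = rng.normal(0.0, sigma_x / np.sqrt(1.0 - phi ** 2), n_particles)
    loglik = 0.0
    for obs in y:
        particles = phi * particles + rng.normal(0.0, sigma_x, n_particles)         # propagate
        logw = (-0.5 * np.log(2.0 * np.pi * sigma_y ** 2)
                - 0.5 * ((obs - particles) / sigma_y) ** 2)                          # observation weights
        m = logw.max()
        w = np.exp(logw - m)
        loglik += m + np.log(w.mean())                                               # incremental likelihood
        idx = rng.choice(n_particles, n_particles, p=w / w.sum())                    # multinomial resampling
        particles = particles[idx]
    return loglik

y_obs = np.random.default_rng(1).normal(size=100)    # toy observation series
print(bootstrap_particle_loglik(y_obs))
```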

  13. Rayleigh-maximum-likelihood bilateral filter for ultrasound image enhancement.

    Science.gov (United States)

    Li, Haiyan; Wu, Jun; Miao, Aimin; Yu, Pengfei; Chen, Jianhua; Zhang, Yufeng

    2017-04-17

    Ultrasound imaging plays an important role in computer diagnosis since it is non-invasive and cost-effective. However, ultrasound images are inevitably contaminated by noise and speckle during acquisition. Noise and speckle directly hinder the physician's interpretation of the images and decrease the accuracy of clinical diagnosis. Denoising is an important component in enhancing the quality of ultrasound images; however, current denoising methods are limited because they can remove noise while ignoring the statistical characteristics of speckle, thus undermining the effectiveness of despeckling, or vice versa. In addition, most existing algorithms do not identify noise, speckle or edge before removing noise or speckle, and thus they reduce noise and speckle while blurring edge details. Therefore, it is a challenging issue for the traditional methods to effectively remove noise and speckle in ultrasound images while preserving edge details. To overcome the above-mentioned limitations, a novel method, called the Rayleigh-maximum-likelihood switching bilateral filter (RSBF), is proposed to enhance ultrasound images by two steps: noise, speckle and edge detection followed by filtering. Firstly, a sorted quadrant median vector scheme is utilized to calculate the reference median in a filtering window in comparison with the central pixel to classify the target pixel as noise, speckle or noise-free. Subsequently, the noise is removed by a bilateral filter and the speckle is suppressed by a Rayleigh-maximum-likelihood filter while the noise-free pixels are kept unchanged. To quantitatively evaluate the performance of the proposed method, synthetic ultrasound images contaminated by speckle are simulated by using the speckle model that is subjected to Rayleigh distribution. Thereafter, the corrupted synthetic images are generated by the original image multiplied with the Rayleigh distributed speckle of various signal to noise ratio (SNR) levels and

  14. Empirical likelihood estimation of discretely sampled processes of OU type

    Institute of Scientific and Technical Information of China (English)

    SUN ShuGuang; ZHANG XinSheng

    2009-01-01

    This paper presents an empirical likelihood estimation procedure for parameters of the discretely sampled process of Ornstein-Uhlenbeck type. The proposed procedure is based on the conditional characteristic function, and the maximum empirical likelihood estimator is proved to be consistent and asymptotically normal. Moreover, this estimator is shown to be asymptotically efficient under some tensity parameter can be exactly recovered, and we study the maximum empirical likelihood estimator with the plug-in estimated intensity parameter. Testing procedures based on the empirical likelihood ratio statistic are developed for parameters and for estimating equations, respectively. Finally, Monte Carlo simulations are conducted to demonstrate the performance of the proposed estimators.

  15. Determinants of women's likelihood of vaginal self-sampling for human papillomavirus to screen for cervical cancer in Taiwan: a cross-sectional study.

    Science.gov (United States)

    Chen, Shu-Ling; Hsieh, Pao-Chun; Chou, Chia-Hui; Tzeng, Ya-Ling

    2014-11-25

    Many Taiwanese women (43.8%) did not participate in regular cervical screening in 2011. An alternative to cervical screening, self-sampling for human papillomavirus (HPV), has been available at no cost under Taiwan's National Health Insurance since 2010, but the extent and likelihood of HPV self-sampling were unknown. A cross-sectional study was performed to explore determinants of women's likelihood of HPV self-sampling. Data were collected by questionnaire from a convenience sample of 500 women attending hospital gynecologic clinics in central Taiwan from June to October 2012. Data were analyzed by descriptive statistics, chi-square test, and logistic regression. Of 500 respondents, 297 (59.4%) had heard of HPV; of these 297 women, 69 (23%) had self-sampled for HPV. Among the 297 women who had heard of HPV, 234 (78.8%) considered cost a priority for HPV self-sampling. Likelihood of HPV self-sampling was determined by previous Pap testing, high perceived risk of cervical cancer, willingness to self-sample for HPV, high HPV knowledge, and cost as a priority consideration. Outreach efforts to increase the acceptability of self-sampling for HPV testing rates should target women who have had a Pap test, perceive themselves at high risk for cervical cancer, are willing to self-sample for HPV, have a high level of HPV knowledge, and for whom the cost of self-sampling covered by health insurance is a priority.

  16. CONSTRUCTING A FLEXIBLE LIKELIHOOD FUNCTION FOR SPECTROSCOPIC INFERENCE

    Energy Technology Data Exchange (ETDEWEB)

    Czekala, Ian; Andrews, Sean M.; Mandel, Kaisey S.; Green, Gregory M. [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Hogg, David W., E-mail: iczekala@cfa.harvard.edu [Center for Cosmology and Particle Physics, Department of Physics, New York University, 4 Washington Place, New York, NY, 10003 (United States)

    2015-10-20

    We present a modular, extensible likelihood framework for spectroscopic inference based on synthetic model spectra. The subtraction of an imperfect model from a continuously sampled spectrum introduces covariance between adjacent datapoints (pixels) into the residual spectrum. For the high signal-to-noise data with large spectral range that is commonly employed in stellar astrophysics, that covariant structure can lead to dramatically underestimated parameter uncertainties (and, in some cases, biases). We construct a likelihood function that accounts for the structure of the covariance matrix, utilizing the machinery of Gaussian process kernels. This framework specifically addresses the common problem of mismatches in model spectral line strengths (with respect to data) due to intrinsic model imperfections (e.g., in the atomic/molecular databases or opacity prescriptions) by developing a novel local covariance kernel formalism that identifies and self-consistently downweights pathological spectral line “outliers.” By fitting many spectra in a hierarchical manner, these local kernels provide a mechanism to learn about and build data-driven corrections to synthetic spectral libraries. An open-source software implementation of this approach is available at http://iancze.github.io/Starfish, including a sophisticated probabilistic scheme for spectral interpolation when using model libraries that are sparsely sampled in the stellar parameters. We demonstrate some salient features of the framework by fitting the high-resolution V-band spectrum of WASP-14, an F5 dwarf with a transiting exoplanet, and the moderate-resolution K-band spectrum of Gliese 51, an M5 field dwarf.
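
    The core computation is a multivariate normal likelihood for the residual spectrum whose covariance matrix is built from a Gaussian process kernel plus per-pixel noise. A minimal sketch of that evaluation is given below; a single squared-exponential global kernel is assumed here for illustration, whereas the Starfish package adds local kernels and spectral emulation on top of this:

```python
import numpy as np

def gp_residual_loglike(wavelengths, residual, noise_sigma, amp, length_scale):
    """Gaussian log-likelihood of a residual spectrum (data minus model) under a
    covariance matrix that adds a squared-exponential kernel to the per-pixel
    noise, so correlated residual structure is not mistaken for independent noise."""
    dl = wavelengths[:, None] - wavelengths[None, :]
    K = amp ** 2 * np.exp(-0.5 * dl ** 2 / length_scale ** 2)   # global covariance kernel
    C = K + np.diag(noise_sigma ** 2)                           # add per-pixel noise variance
    L = np.linalg.cholesky(C)                                   # Cholesky factor for stable solves
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, residual))  # C^{-1} residual
    logdet = 2.0 * np.sum(np.log(np.diag(L)))
    n = residual.size
    return -0.5 * (residual @ alpha + logdet + n * np.log(2 * np.pi))

# toy usage with synthetic residuals; in practice the residual comes from data minus model
rng = np.random.default_rng(0)
wl = np.linspace(5000.0, 5100.0, 200)
resid = rng.normal(0.0, 0.01, wl.size)
print(gp_residual_loglike(wl, resid, np.full(wl.size, 0.01), amp=0.02, length_scale=2.0))
```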

  17. Constructing a Flexible Likelihood Function for Spectroscopic Inference

    Science.gov (United States)

    Czekala, Ian; Andrews, Sean M.; Mandel, Kaisey S.; Hogg, David W.; Green, Gregory M.

    2015-10-01

    We present a modular, extensible likelihood framework for spectroscopic inference based on synthetic model spectra. The subtraction of an imperfect model from a continuously sampled spectrum introduces covariance between adjacent datapoints (pixels) into the residual spectrum. For the high signal-to-noise data with large spectral range that is commonly employed in stellar astrophysics, that covariant structure can lead to dramatically underestimated parameter uncertainties (and, in some cases, biases). We construct a likelihood function that accounts for the structure of the covariance matrix, utilizing the machinery of Gaussian process kernels. This framework specifically addresses the common problem of mismatches in model spectral line strengths (with respect to data) due to intrinsic model imperfections (e.g., in the atomic/molecular databases or opacity prescriptions) by developing a novel local covariance kernel formalism that identifies and self-consistently downweights pathological spectral line “outliers.” By fitting many spectra in a hierarchical manner, these local kernels provide a mechanism to learn about and build data-driven corrections to synthetic spectral libraries. An open-source software implementation of this approach is available at http://iancze.github.io/Starfish, including a sophisticated probabilistic scheme for spectral interpolation when using model libraries that are sparsely sampled in the stellar parameters. We demonstrate some salient features of the framework by fitting the high-resolution V-band spectrum of WASP-14, an F5 dwarf with a transiting exoplanet, and the moderate-resolution K-band spectrum of Gliese 51, an M5 field dwarf.

  18. 21 CFR 862.2260 - High pressure liquid chromatography system for clinical use.

    Science.gov (United States)

    2010-04-01

    21 Food and Drugs; Clinical Laboratory Instruments; § 862.2260 High pressure liquid chromatography system for clinical use. (a) Identification. A high pressure liquid chromatography system for clinical use is a device intended to...

  19. Norplant's high cost may prohibit use in Title 10 clinics.

    Science.gov (United States)

    1991-04-01

    The article discusses the prohibitive cost of Norplant for the low-income population served under Title 10 in public family planning clinics in the U.S. It is argued that it is unfair for U.S. users to pay $350 to Wyeth-Ayerst when another pharmaceutical company provides developing countries with Norplant at a cost of $14-23. Although the public sector and private foundations funded the development, the company explained that it needs to recoup its investment in training and education. Medicaid and third-party payers such as insurance companies will reimburse at the higher price, but if the public-sector price were lowered, payers would argue for reimbursement at the lower price and the company would not make a profit. It was suggested that a boycott of American Home Products, Wyeth-Ayerst's parent company, be organized. One public family planning provider with particularly low funding noted that its budget of $30,000 would cover only 85 users, in this circumstance limited to drug abusers and women with multiple pregnancies, leaving the needs of teenagers unfulfilled. Another remarked that the client population served is 4,700, with $54,000 in funding that is already accounted for. The general trend of comments was that for low-income women the cost is too high.

  20. Planck 2013 results. XV. CMB power spectra and likelihood

    DEFF Research Database (Denmark)

    Tauber, Jan; Bartlett, J.G.; Bucher, M.;

    2014-01-01

    This paper presents the Planck 2013 likelihood, a complete statistical description of the two-point correlation function of the CMB temperature fluctuations that accounts for all known relevant uncertainties, both instrumental and astrophysical in nature. We use this likelihood to derive our best...

  1. EMPIRICAL LIKELIHOOD FOR LINEAR MODELS UNDER m-DEPENDENT ERRORS

    Institute of Scientific and Technical Information of China (English)

    Qin Yongsong; Jiang Bo; Li Yufang

    2005-01-01

    In this paper, the empirical likelihood confidence regions for the regression coefficient in a linear model are constructed under m-dependent errors. It is shown that the blockwise empirical likelihood is a good way to deal with dependent samples.

  2. Empirical likelihood inference for diffusion processes with jumps

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    In this paper, we consider the empirical likelihood inference for the jump-diffusion model. We construct the confidence intervals based on the empirical likelihood for the infinitesimal moments in the jump-diffusion models. They are better than the confidence intervals which are based on the asymptotic normality of point estimates.

  3. Expert elicitation on ultrafine particles: likelihood of health effects and causal pathways

    Directory of Open Access Journals (Sweden)

    Brunekreef Bert

    2009-07-01

    Background: Exposure to fine ambient particulate matter (PM) has consistently been associated with increased morbidity and mortality. The relationship between exposure to ultrafine particles (UFP) and health effects is less firmly established. If UFP cause health effects independently of coarser fractions, this could affect health impact assessment of air pollution and possibly lead to alternative policy options being considered to reduce the disease burden of PM. Therefore, we organized an expert elicitation workshop to assess the evidence for a causal relationship between exposure to UFP and health endpoints. Methods: An expert elicitation on the health effects of ambient ultrafine particle exposure was carried out, focusing on (1) the likelihood of causal relationships with key health endpoints, and (2) the likelihood of potential causal pathways for cardiac events. Based on a systematic peer-nomination procedure, fourteen European experts (epidemiologists, toxicologists and clinicians) were selected, of whom twelve attended. They were provided with a briefing book containing key literature. After a group discussion, individual expert judgments in the form of ratings of the likelihood of causal relationships and pathways were obtained using a confidence scheme adapted from the one used by the Intergovernmental Panel on Climate Change. Results: The likelihood of an independent causal relationship between increased short-term UFP exposure and increased all-cause mortality, hospital admissions for cardiovascular and respiratory diseases, aggravation of asthma symptoms and lung function decrements was rated medium to high by most experts. The likelihood for long-term UFP exposure to be causally related to all-cause mortality, cardiovascular and respiratory morbidity and lung cancer was rated slightly lower, mostly medium. The experts rated the likelihood of each of the six identified possible causal pathways separately. Out of these

  4. Mean likelihood estimation of target micro-motion parameters in laser detection

    Science.gov (United States)

    Guo, Liren; Hu, Yihua; Wang, Yunpeng

    2016-10-01

    Maximum Likelihood Estimation (MLE) is the optimal estimator for micro-Doppler feature extraction. However, the enormous computational burden of the grid search and the existence of many local maxima of the highly nonlinear cost function hinder accurate estimation. A new method combining Mean Likelihood Estimation (MELE) with the Monte Carlo (MC) method is proposed to solve this problem. A closed-form expression to evaluate the parameters that maximize the cost function is derived. Then the compressed likelihood function is designed to obtain the global maximum. Finally, the parameters are estimated by calculating the circular mean of the samples obtained from the MC method. The strong dependence on accurate initial values and the computational complexity of iterative algorithms are thereby avoided. Applied to simulated and experimental data, the proposed method achieves performance similar to MLE with a smaller computational load. Meanwhile, the method guarantees global convergence and joint parameter estimation.
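
    As a toy illustration of the estimator's two ingredients, Monte Carlo sampling of candidate parameters weighted by the likelihood followed by a circular mean, the sketch below estimates the phase of a noisy sinusoid; the signal model and settings are assumptions for illustration, not the micro-motion model or compressed likelihood of the paper:

```python
import numpy as np

def mean_likelihood_phase(signal, t, freq, n_samples=20000, seed=0):
    """Toy mean-likelihood estimate of the phase of a sinusoid in Gaussian noise:
    candidate phases are drawn by Monte Carlo, weighted by the likelihood, and
    combined with a circular mean instead of a grid search for the maximum."""
    rng = np.random.default_rng(seed)
    phases = rng.uniform(0.0, 2 * np.pi, n_samples)
    # log-likelihood of each candidate phase (known frequency, unit noise variance)
    preds = np.sin(2 * np.pi * freq * t[None, :] + phases[:, None])
    loglik = -0.5 * np.sum((signal[None, :] - preds) ** 2, axis=1)
    w = np.exp(loglik - loglik.max())                 # likelihood weights
    w /= w.sum()
    # weighted circular mean of the sampled phases
    return np.arctan2(np.sum(w * np.sin(phases)), np.sum(w * np.cos(phases))) % (2 * np.pi)

t = np.linspace(0.0, 1.0, 200)
sig = np.sin(2 * np.pi * 5.0 * t + 1.3) + np.random.default_rng(1).normal(0.0, 0.3, t.size)
print(mean_likelihood_phase(sig, t, freq=5.0))        # should be close to 1.3
```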

  5. INTERACTING MULTIPLE MODEL ALGORITHM BASED ON JOINT LIKELIHOOD ESTIMATION

    Institute of Scientific and Technical Information of China (English)

    Sun Jie; Jiang Chaoshu; Chen Zhuming; Zhang Wei

    2011-01-01

    A novel approach is proposed for the estimation of likelihood in the Interacting Multiple-Model (IMM) filter. In this approach, the actual innovation, based on a mismatched model, can be formulated as the sum of the theoretical innovation based on a matched model and the distance between the matched and mismatched models, whose probability distributions are known. The joint likelihood of the innovation sequence can be estimated by convolution of the two known probability density functions. The likelihood of the tracking models can be calculated by the conditional probability formula. Compared with the conventional likelihood estimation method, the proposed method improves the estimation accuracy of the likelihood and the robustness of IMM, especially when maneuvers occur.

  6. Eliciting information from experts on the likelihood of rapid climate change.

    Science.gov (United States)

    Arnell, Nigel W; Tompkins, Emma L; Adger, W Neil

    2005-12-01

    The threat of so-called rapid or abrupt climate change has generated considerable public interest because of its potentially significant impacts. The collapse of the North Atlantic Thermohaline Circulation or the West Antarctic Ice Sheet, for example, would have potentially catastrophic effects on temperatures and sea level, respectively. But how likely are such extreme climatic changes? Is it possible actually to estimate likelihoods? This article reviews the societal demand for the likelihoods of rapid or abrupt climate change, and different methods for estimating likelihoods: past experience, model simulation, or through the elicitation of expert judgments. The article describes a survey to estimate the likelihoods of two characterizations of rapid climate change, and explores the issues associated with such surveys and the value of information produced. The surveys were based on key scientists chosen for their expertise in the climate science of abrupt climate change. Most survey respondents ascribed low likelihoods to rapid climate change, due either to the collapse of the Thermohaline Circulation or increased positive feedbacks. In each case one assessment was an order of magnitude higher than the others. We explore a high rate of refusal to participate in this expert survey: many scientists prefer to rely on output from future climate model simulations.

  7. Maximum likelihood estimation for semiparametric density ratio model.

    Science.gov (United States)

    Diao, Guoqing; Ning, Jing; Qin, Jing

    2012-06-27

    In the statistical literature, the conditional density model specification is commonly used to study regression effects. One attractive model is the semiparametric density ratio model, under which the conditional density function is the product of an unknown baseline density function and a known parametric function containing the covariate information. This model has a natural connection with generalized linear models and is closely related to biased sampling problems. Despite the attractive features and importance of this model, most existing methods are too restrictive since they are based on multi-sample data or conditional likelihood functions. The conditional likelihood approach can eliminate the unknown baseline density but cannot estimate it. We propose efficient estimation procedures based on the nonparametric likelihood. The nonparametric likelihood approach allows for general forms of covariates and estimates the regression parameters and the baseline density simultaneously. Therefore, the nonparametric likelihood approach is more versatile than the conditional likelihood approach especially when estimation of the conditional mean or other quantities of the outcome is of interest. We show that the nonparametric maximum likelihood estimators are consistent, asymptotically normal, and asymptotically efficient. Simulation studies demonstrate that the proposed methods perform well in practical settings. A real example is used for illustration.

  8. Effect of formal and informal likelihood functions on uncertainty assessment in a single event rainfall-runoff model

    Science.gov (United States)

    Nourali, Mahrouz; Ghahraman, Bijan; Pourreza-Bilondi, Mohsen; Davary, Kamran

    2016-09-01

    In the present study, DREAM(ZS), Differential Evolution Adaptive Metropolis combined with both formal and informal likelihood functions, is used to investigate the uncertainty of parameters of the HEC-HMS model in the Tamar watershed, Golestan province, Iran. In order to assess the uncertainty of 24 parameters used in HMS, three flood events were used to calibrate and one flood event to validate the posterior distributions. Moreover, the performance of seven different likelihood functions (L1-L7) was assessed by means of the DREAM(ZS) approach. Four likelihood functions (L1-L4), Nash-Sutcliffe (NS) efficiency, normalized absolute error (NAE), index of agreement (IOA), and Chiew-McMahon efficiency (CM), are considered informal, whereas the remaining three (L5-L7) are formal. L5 focuses on the relationship between traditional least squares fitting and Bayesian inference, and L6 is a heteroscedastic maximum likelihood error (HMLE) estimator. Finally, in likelihood function L7, serial dependence of the residual errors is accounted for using a first-order autoregressive (AR) model of the residuals. According to the results, the sensitivities of the parameters strongly depend on the likelihood function and vary across likelihood functions. Most of the parameters were better defined by the formal likelihood functions L5 and L7 and showed a high sensitivity to model performance. Posterior cumulative distributions corresponding to the informal likelihood functions L1, L2, L3, L4 and the formal likelihood function L6 are approximately the same for most of the sub-basins, and these likelihood functions have an almost similar effect on the sensitivity of parameters. The 95% total prediction uncertainty bounds bracketed most of the observed data. Considering all the statistical indicators and criteria of uncertainty assessment, including RMSE, KGE, NS, P-factor and R-factor, results showed that DREAM(ZS) algorithm performed better under formal likelihood functions L5 and L7
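
    For concreteness, the sketch below computes one generic informal objective (Nash-Sutcliffe efficiency) and one generic formal objective (a Gaussian log-likelihood with AR(1) residual errors, in the spirit of L7); the exact definitions of L1-L7 used in the study may differ, and the toy data are invented:

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """Informal objective: Nash-Sutcliffe efficiency (higher is better, 1 is perfect)."""
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def gaussian_ar1_loglike(obs, sim, sigma, rho):
    """Formal objective: exact Gaussian log-likelihood with AR(1) residual errors,
    eps_t = rho * eps_{t-1} + nu_t, nu_t ~ N(0, sigma^2)."""
    eps = obs - sim
    nu = eps[1:] - rho * eps[:-1]                      # innovations after removing AR(1) part
    n = eps.size
    ll = -0.5 * np.log(2 * np.pi * sigma ** 2 / (1 - rho ** 2))
    ll += -0.5 * eps[0] ** 2 * (1 - rho ** 2) / sigma ** 2
    ll += -0.5 * (n - 1) * np.log(2 * np.pi * sigma ** 2) - 0.5 * np.sum(nu ** 2) / sigma ** 2
    return ll

obs = np.array([1.0, 1.4, 2.2, 3.1, 2.5, 1.8])         # invented observed discharges
sim = np.array([0.9, 1.5, 2.0, 3.3, 2.6, 1.7])         # invented simulated discharges
print(nash_sutcliffe(obs, sim), gaussian_ar1_loglike(obs, sim, sigma=0.2, rho=0.3))
```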

  9. Empirical Likelihood based Confidence Regions for first order parameters of a heavy tailed distribution

    CERN Document Server

    Worms, Julien

    2010-01-01

    Let $X_1, \\ldots, X_n$ be some i.i.d. observations from a heavy tailed distribution $F$, i.e. such that the common distribution of the excesses over a high threshold $u_n$ can be approximated by a Generalized Pareto Distribution $G_{\\gamma,\\sigma_n}$ with $\\gamma >0$. This work is devoted to the problem of finding confidence regions for the couple $(\\gamma,\\sigma_n)$ : combining the empirical likelihood methodology with estimation equations (close but not identical to the likelihood equations) introduced by J. Zhang (Australian and New Zealand J. Stat n.49(1), 2007), asymptotically valid confidence regions for $(\\gamma,\\sigma_n)$ are obtained and proved to perform better than Wald-type confidence regions (especially those derived from the asymptotic normality of the maximum likelihood estimators). By profiling out the scale parameter, confidence intervals for the tail index are also derived.

  10. Modified likelihood ratio tests in heteroskedastic multivariate regression models with measurement error

    CERN Document Server

    Melo, Tatiane F N; Patriota, Alexandre G

    2012-01-01

    In this paper, we develop a modified version of the likelihood ratio test for multivariate heteroskedastic errors-in-variables regression models. The error terms are allowed to follow a multivariate distribution in the elliptical class of distributions, which has the normal distribution as a special case. We derive the Skovgaard adjusted likelihood ratio statistic, which follows a chi-squared distribution with a high degree of accuracy. We conduct a simulation study and show that the proposed test displays superior finite sample behavior as compared to the standard likelihood ratio test. We illustrate the usefulness of our results in applied settings using a data set from the WHO MONICA Project on cardiovascular disease.

  11. Maximum Likelihood Estimation of the Identification Parameters and Its Correction

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    By taking a subsequence out of the input-output sequence of a system polluted by white noise, an independent observation sequence and its probability density are obtained, and then a maximum likelihood estimation of the identification parameters is given. In order to decrease the asymptotic error, a corrector of maximum likelihood (CML) estimation with its recursive algorithm is given. It has been proved that the corrector has smaller asymptotic error than the least squares methods. A simulation example shows that the corrector of maximum likelihood estimation approximates the true parameters with higher precision than the least squares methods.

  12. MAXIMUM LIKELIHOOD ESTIMATION IN GENERALIZED GAMMA TYPE MODEL

    Directory of Open Access Journals (Sweden)

    Vinod Kumar

    2010-01-01

    In the present paper, the maximum likelihood estimates of the two parameters of a generalized gamma type model have been obtained directly by solving the likelihood equations as well as by reparametrizing the model first and then solving the likelihood equations (as done by Prentice, 1974) for fixed values of the third parameter. It is found that reparametrization neither reduces the bulk nor the complexity of the calculations, as claimed by Prentice (1974). The procedure has been illustrated with the help of an example. The distribution of the MLE of q along with its properties has also been obtained.
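
    A minimal numerical counterpart of the direct approach, maximizing the generalized gamma log-likelihood without reparametrization, is sketched below using SciPy's gengamma distribution; the parametrization, data and starting values are assumptions for illustration and need not match the model of the paper:

```python
import numpy as np
from scipy import optimize, stats

# synthetic data from SciPy's generalized gamma distribution (shapes a and c, scale)
rng = np.random.default_rng(0)
data = stats.gengamma.rvs(a=2.0, c=1.5, scale=3.0, size=500, random_state=rng)

def negloglik(params):
    """Negative log-likelihood of the generalized gamma model for the data."""
    a, c, scale = params
    if a <= 0 or c <= 0 or scale <= 0:
        return np.inf
    return -np.sum(stats.gengamma.logpdf(data, a=a, c=c, scale=scale))

# direct maximization of the likelihood (no reparametrization), derivative-free search
res = optimize.minimize(negloglik, x0=[1.0, 1.0, 1.0], method="Nelder-Mead")
print("MLEs (a, c, scale):", res.x)
```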

  13. clinical versus molecular diagnosis of heterozygous familial ...

    African Journals Online (AJOL)

    on clinical findings and elevated low-density lipoprotein (LDL) cholesterol levels. The rare ... to the high death rate from coronary heart disease (CHD) in ... population whose chance before cholesterol testing may be only 1 in 500 0 in 70 ..... The likelihood of apo E allelic status as a contributing factor. December 2001, Vol.

  14. A Bivariate Pseudo-Likelihood for Incomplete Longitudinal Binary Data with Nonignorable Non-monotone Missingness

    Science.gov (United States)

    Sinha, Sanjoy K.; Troxel, Andrea B.; Lipsitz, Stuart R.; Sinha, Debajyoti; Fitzmaurice, Garrett M.; Molenberghs, Geert; Ibrahim, Joseph G.

    2010-01-01

    Summary For analyzing longitudinal binary data with nonignorable and non-monotone missing responses, a full likelihood method is complicated algebraically, and often requires intensive computation, especially when there are many follow-up times. As an alternative, a pseudo-likelihood approach has been proposed in the literature under minimal parametric assumptions. This formulation only requires specification of the marginal distributions of the responses and missing data mechanism, and uses an independence working assumption. However, this estimator can be inefficient for estimating both time-varying and time-stationary effects under moderate to strong within-subject associations among repeated responses. In this article, we propose an alternative estimator, based on a bivariate pseudo-likelihood, and demonstrate in simulations that the proposed method can be much more efficient than the previous pseudo-likelihood obtained under the assumption of independence. We illustrate the method using longitudinal data on CD4 counts from two clinical trials of HIV-infected patients. PMID:21155748

  15. ON THE LIKELIHOOD OF PLANET FORMATION IN CLOSE BINARIES

    Energy Technology Data Exchange (ETDEWEB)

    Jang-Condell, Hannah, E-mail: hjangcon@uwyo.edu [Department of Physics and Astronomy, University of Wyoming, 1000 East University, Department 3905, Laramie, WY 82071 (United States)

    2015-02-01

    To date, several exoplanets have been discovered orbiting stars with close binary companions (a ≲ 30 AU). The fact that planets can form in these dynamically challenging environments implies that planet formation must be a robust process. The initial protoplanetary disks in these systems from which planets must form should be tidally truncated to radii of a few AU, which indicates that the efficiency of planet formation must be high. Here, we examine the truncation of circumstellar protoplanetary disks in close binary systems, studying how the likelihood of planet formation is affected over a range of disk parameters. If the semimajor axis of the binary is too small or its eccentricity is too high, the disk will have too little mass for planet formation to occur. However, we find that the stars in the binary systems known to have planets should have once hosted circumstellar disks that were capable of supporting planet formation despite their truncation. We present a way to characterize the feasibility of planet formation based on binary orbital parameters such as stellar mass, companion mass, eccentricity, and semimajor axis. Using this measure, we can quantify the robustness of planet formation in close binaries and better understand the overall efficiency of planet formation in general.

  16. Cosmological Parameters from CMB Maps without Likelihood Approximation

    CERN Document Server

    Racine, Benjamin; Eriksen, Hans Kristian K; Wehus, Ingunn K

    2015-01-01

    We propose an efficient Bayesian MCMC algorithm for estimating cosmological parameters from CMB data without use of likelihood approximations. It builds on a previously developed Gibbs sampling framework that allows for exploration of the joint CMB sky signal and power spectrum posterior, P(s,Cl|d), and addresses a long-standing problem of efficient parameter estimation simultaneously in high and low signal-to-noise regimes. To achieve this, our new algorithm introduces a joint Markov Chain move in which both the signal map and power spectrum are synchronously modified, by rescaling the map according to the proposed power spectrum before evaluating the Metropolis-Hastings accept probability. Such a move was already introduced by Jewell et al. (2009), who used it to explore low signal-to-noise posteriors. However, they also found that the same algorithm is inefficient in the high signal-to-noise regime, since a brute-force rescaling operation does not account for phase information. This problem is mitigated in...

  17. Maximum likelihood pedigree reconstruction using integer linear programming.

    Science.gov (United States)

    Cussens, James; Bartlett, Mark; Jones, Elinor M; Sheehan, Nuala A

    2013-01-01

    Large population biobanks of unrelated individuals have been highly successful in detecting common genetic variants affecting diseases of public health concern. However, they lack the statistical power to detect more modest gene-gene and gene-environment interaction effects or the effects of rare variants for which related individuals are ideally required. In reality, most large population studies will undoubtedly contain sets of undeclared relatives, or pedigrees. Although a crude measure of relatedness might sometimes suffice, having a good estimate of the true pedigree would be much more informative if this could be obtained efficiently. Relatives are more likely to share longer haplotypes around disease susceptibility loci and are hence biologically more informative for rare variants than unrelated cases and controls. Distant relatives are arguably more useful for detecting variants with small effects because they are less likely to share masking environmental effects. Moreover, the identification of relatives enables appropriate adjustments of statistical analyses that typically assume unrelatedness. We propose to exploit an integer linear programming optimisation approach to pedigree learning, which is adapted to find valid pedigrees by imposing appropriate constraints. Our method is not restricted to small pedigrees and is guaranteed to return a maximum likelihood pedigree. With additional constraints, we can also search for multiple high-probability pedigrees and thus account for the inherent uncertainty in any particular pedigree reconstruction. The true pedigree is found very quickly by comparison with other methods when all individuals are observed. Extensions to more complex problems seem feasible.

  18. Maximum Likelihood Factor Structure of the Family Environment Scale.

    Science.gov (United States)

    Fowler, Patrick C.

    1981-01-01

    Presents the maximum likelihood factor structure of the Family Environment Scale. The first bipolar dimension, "cohesion v conflict," measures relationship-centered concerns, while the second unipolar dimension is an index of "organizational and control" activities. (Author)

  19. Likelihood Inference for a Fractionally Cointegrated Vector Autoregressive Model

    DEFF Research Database (Denmark)

    Johansen, Søren; Nielsen, Morten Ørregaard

    2012-01-01

    We consider model based inference in a fractionally cointegrated (or cofractional) vector autoregressive model with a restricted constant term, ¿, based on the Gaussian likelihood conditional on initial values. The model nests the I(d) VAR model. We give conditions on the parameters......likelihood estimators. To this end we prove weak convergence of the conditional likelihood as a continuous stochastic...... process in the parameters when errors are i.i.d. with suitable moment conditions and initial values are bounded. When the limit is deterministic this implies uniform convergence in probability of the conditional likelihood function. If the true value b0>1/2, we prove that the limit distribution of (ß...

  20. Likelihood Inference for a Nonstationary Fractional Autoregressive Model

    DEFF Research Database (Denmark)

    Johansen, Søren; Nielsen, Morten Ørregaard

    This paper discusses model based inference in an autoregressive model for fractional processes based on the Gaussian likelihood. The model allows for the process to be fractional of order d or d - b, where d ≥ b > 1/2 are parameters to be estimated. We model the data X1, ..., XT given the initial values X_{-n}, n = 0, 1, ..., under the assumption that the errors are i.i.d. Gaussian. We consider the likelihood and its derivatives as stochastic processes in the parameters, and prove that they converge in distribution when the errors are i.i.d. with suitable moment conditions and the initial values are bounded. We use this to prove existence and consistency of the local likelihood estimator, and to find the asymptotic distribution of the estimators and the likelihood ratio test of the associated fractional unit root hypothesis, which contains the fractional Brownian motion of type II.

  1. Likelihood inference for a nonstationary fractional autoregressive model

    DEFF Research Database (Denmark)

    Johansen, Søren; Nielsen, Morten Ørregaard

    This paper discusses model based inference in an autoregressive model for fractional processes based on the Gaussian likelihood. The model allows for the process to be fractional of order d or d - b, where d ≥ b > 1/2 are parameters to be estimated. We model the data X1, ..., XT given the initial values X_{-n}, n = 0, 1, ..., under the assumption that the errors are i.i.d. Gaussian. We consider the likelihood and its derivatives as stochastic processes in the parameters, and prove that they converge in distribution when the errors are i.i.d. with suitable moment conditions and the initial values are bounded. We use this to prove existence and consistency of the local likelihood estimator, and to find the asymptotic distribution of the estimators and the likelihood ratio test of the associated fractional unit root hypothesis, which contains the fractional Brownian motion of type II.

  2. Empirical likelihood estimation of discretely sampled processes of OU type

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    This paper presents an empirical likelihood estimation procedure for parameters of the discretely sampled process of Ornstein-Uhlenbeck type. The proposed procedure is based on the conditional characteristic function, and the maximum empirical likelihood estimator is proved to be consistent and asymptotically normal. Moreover, this estimator is shown to be asymptotically efficient under some mild conditions. When the background driving Lévy process is of type A or B, we show that the intensity parameter can be exactly recovered, and we study the maximum empirical likelihood estimator with the plug-in estimated intensity parameter. Testing procedures based on the empirical likelihood ratio statistic are developed for parameters and for estimating equations, respectively. Finally, Monte Carlo simulations are conducted to demonstrate the performance of proposed estimators.

  3. Practical likelihood analysis for spatial generalized linear mixed models

    DEFF Research Database (Denmark)

    Bonat, W. H.; Ribeiro, Paulo Justiniano

    2016-01-01

    We investigate an algorithm for maximum likelihood estimation of spatial generalized linear mixed models based on the Laplace approximation. We compare our algorithm with a set of alternative approaches for two datasets from the literature. The Rhizoctonia root rot and the Rongelap are, respectiv... Advantages of the Laplace approximation include the computation of the maximized log-likelihood value, which can be used for model selection and tests, and the possibility to obtain realistic confidence intervals for model parameters based on profile likelihoods. The Laplace approximation also avoids the tuning...
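
    A toy version of the central computation, the Laplace-approximated marginal log-likelihood of a (non-spatial) random-intercept Poisson model, is sketched below; the model, data and parameter values are illustrative assumptions, not the algorithm or datasets of the paper:

```python
import numpy as np
from scipy import optimize

def laplace_marginal_loglik(y_groups, beta, tau):
    """Laplace-approximate marginal log-likelihood of a random-intercept Poisson
    model y_ij ~ Poisson(exp(beta + u_j)), u_j ~ N(0, tau^2): the random effect
    is integrated out group by group around its conditional mode."""
    total = 0.0
    for y in y_groups:
        y = np.asarray(y, dtype=float)

        def neg_joint(u):
            # -log p(y, u), dropping the constant log(y!) terms
            eta = beta + u
            return -(np.sum(y * eta - np.exp(eta)) - 0.5 * u ** 2 / tau ** 2)

        u_hat = optimize.minimize_scalar(neg_joint).x                # conditional mode
        hess = len(y) * np.exp(beta + u_hat) + 1.0 / tau ** 2        # curvature at the mode
        log_joint = -neg_joint(u_hat) - 0.5 * np.log(2 * np.pi * tau ** 2)
        total += log_joint + 0.5 * np.log(2 * np.pi) - 0.5 * np.log(hess)
    return total

# toy usage on two small invented groups; in practice this is maximized over beta and tau
print(laplace_marginal_loglik([[1, 0, 2, 1], [4, 3, 5]], beta=0.5, tau=0.7))
```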

  4. Young adult consumers' media usage and online purchase likelihood

    African Journals Online (AJOL)

    Young adult consumers' media usage and online purchase likelihood. ... in new media applications such as the internet, email, blogging, twitter and social networks. ... Convenience sampling resulted in 1 298 completed questionnaires.

  5. Posterior distributions for likelihood ratios in forensic science.

    Science.gov (United States)

    van den Hout, Ardo; Alberink, Ivo

    2016-09-01

    Evaluation of evidence in forensic science is discussed using posterior distributions for likelihood ratios. Instead of eliminating the uncertainty by integrating (Bayes factor) or by conditioning on parameter values, uncertainty in the likelihood ratio is retained by parameter uncertainty derived from posterior distributions. A posterior distribution for a likelihood ratio can be summarised by the median and credible intervals. Using the posterior mean of the distribution is not recommended. An analysis of forensic data for body height estimation is undertaken. The posterior likelihood approach has been criticised both theoretically and with respect to applicability. This paper addresses the latter and illustrates an interesting application area. Copyright © 2016 The Chartered Society of Forensic Sciences. Published by Elsevier Ireland Ltd. All rights reserved.
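
    A minimal sketch of the idea of reporting a posterior distribution for the likelihood ratio rather than a single value is given below; the normal model, flat priors and reference data are invented for illustration and are not the body-height model of the paper:

```python
import numpy as np
from scipy import stats

def posterior_lr_summary(e, ref1, ref2, sigma=2.0, n_draws=10000, seed=0):
    """Posterior distribution of LR = p(e | H1) / p(e | H2) when the population
    means under the two hypotheses are uncertain and estimated from reference
    samples (flat priors, known sigma). Returns the median and a 95% credible
    interval for the LR instead of a single number."""
    rng = np.random.default_rng(seed)
    # posterior of each mean: normal around the sample mean (flat prior, known sigma)
    mu1 = rng.normal(np.mean(ref1), sigma / np.sqrt(len(ref1)), n_draws)
    mu2 = rng.normal(np.mean(ref2), sigma / np.sqrt(len(ref2)), n_draws)
    lr = stats.norm.pdf(e, mu1, sigma) / stats.norm.pdf(e, mu2, sigma)
    return np.median(lr), np.percentile(lr, [2.5, 97.5])

# toy usage: one observed measurement evaluated against two reference groups
print(posterior_lr_summary(e=181.0, ref1=[180, 182, 179, 183], ref2=[172, 170, 175, 173]))
```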

  6. Empirical Likelihood Ratio Confidence Interval for Positively Associated Series

    Institute of Scientific and Technical Information of China (English)

    Jun-jian Zhang

    2007-01-01

    Empirical likelihood is discussed by using the blockwise technique for strongly stationary, positively associated random variables. Our results show that the statistic is asymptotically chi-square distributed and the corresponding confidence interval can be constructed.

  7. Maximum likelihood estimation of finite mixture model for economic data

    Science.gov (United States)

    Phoong, Seuk-Yen; Ismail, Mohd Tahir

    2014-06-01

    A finite mixture model is a mixture model with a finite number of components. These models provide a natural representation of heterogeneity across a finite number of latent classes, and are also known as latent class models or unsupervised learning models. Recently, maximum likelihood estimation of finite mixture models has drawn great attention from statisticians, mainly because maximum likelihood estimation is a powerful statistical method that provides consistent estimates as the sample size increases to infinity. Thus, maximum likelihood estimation is used in the present paper to fit a finite mixture model in order to explore the relationship between nonlinear economic data. In this paper, a two-component normal mixture model is fitted by maximum likelihood estimation in order to investigate the relationship between stock market prices and rubber prices for the sampled countries. The results show a negative relationship between rubber prices and stock market prices for Malaysia, Thailand, the Philippines and Indonesia.
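
    The standard route to the maximum likelihood fit of a two-component normal mixture is the EM algorithm; a compact sketch is given below, run on synthetic data rather than the stock market and rubber price series of the study:

```python
import numpy as np

def em_two_normal(x, n_iter=200):
    """EM algorithm for the maximum likelihood fit of a two-component normal
    mixture pi * N(mu1, s1^2) + (1 - pi) * N(mu2, s2^2)."""
    x = np.asarray(x, dtype=float)
    pi, mu1, mu2 = 0.5, x.min(), x.max()
    s1 = s2 = x.std()
    for _ in range(n_iter):
        # E-step: posterior probability that each point belongs to component 1
        d1 = pi * np.exp(-0.5 * ((x - mu1) / s1) ** 2) / s1
        d2 = (1 - pi) * np.exp(-0.5 * ((x - mu2) / s2) ** 2) / s2
        r = d1 / (d1 + d2)
        # M-step: update the weight, means and standard deviations
        pi = r.mean()
        mu1, mu2 = np.sum(r * x) / r.sum(), np.sum((1 - r) * x) / (1 - r).sum()
        s1 = np.sqrt(np.sum(r * (x - mu1) ** 2) / r.sum())
        s2 = np.sqrt(np.sum((1 - r) * (x - mu2) ** 2) / (1 - r).sum())
    return pi, mu1, s1, mu2, s2

# toy usage on synthetic two-regime data (not the stock market or rubber price series)
rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(-0.01, 0.02, 300), rng.normal(0.02, 0.05, 200)])
print(em_two_normal(data))
```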

  8. Improved Clinical Outcomes With High-Dose Image Guided Radiotherapy Compared With Non-IGRT for the Treatment of Clinically Localized Prostate Cancer

    Energy Technology Data Exchange (ETDEWEB)

    Zelefsky, Michael J., E-mail: Zelefskm@mskcc.org [Department of Radiation Oncology, Memorial Sloan-Kettering Cancer Center, New York, NY (United States); Kollmeier, Marisa; Cox, Brett; Fidaleo, Anthony; Sperling, Dahlia; Pei, Xin [Department of Radiation Oncology, Memorial Sloan-Kettering Cancer Center, New York, NY (United States); Carver, Brett; Coleman, Jonathan [Department of Surgery, Memorial Sloan-Kettering Cancer Center, New York, NY (United States); Lovelock, Michael; Hunt, Margie [Department of Medical Physics, Memorial Sloan-Kettering Cancer Center, New York, NY (United States)

    2012-09-01

    Purpose: To compare toxicity profiles and biochemical tumor control outcomes between patients treated with high-dose image-guided radiotherapy (IGRT) and high-dose intensity-modulated radiotherapy (IMRT) for clinically localized prostate cancer. Materials and Methods: Between 2008 and 2009, 186 patients with prostate cancer were treated with IGRT to a dose of 86.4 Gy with daily correction of the target position based on kilovoltage imaging of implanted prostatic fiducial markers. This group of patients was retrospectively compared with a similar cohort of 190 patients who were treated between 2006 and 2007 with IMRT to the same prescription dose without, however, implanted fiducial markers in place (non-IGRT). The median follow-up time was 2.8 years (range, 2-6 years). Results: A significant reduction in late urinary toxicity was observed for IGRT patients compared with the non-IGRT patients. The 3-year likelihoods of grade 2 and higher urinary toxicity for the IGRT and non-IGRT cohorts were 10.4% and 20.0%, respectively (p = 0.02). Multivariate analysis identifying predictors for grade 2 or higher late urinary toxicity demonstrated that, in addition to the baseline International Prostate Symptom Score, IGRT was associated with significantly less late urinary toxicity compared with non-IGRT. The incidence of grade 2 and higher rectal toxicity was low for both treatment groups (1.0% and 1.6%, respectively; p = 0.81). No differences in prostate-specific antigen relapse-free survival outcomes were observed for low- and intermediate-risk patients when treated with IGRT and non-IGRT. For high-risk patients, a significant improvement was observed at 3 years for patients treated with IGRT compared with non-IGRT. Conclusions: IGRT is associated with an improvement in biochemical tumor control among high-risk patients and a lower rate of late urinary toxicity compared with high-dose IMRT. These data suggest that, for definitive radiotherapy, the placement of fiducial markers

  9. Conditional likelihood inference in generalized linear mixed models.

    OpenAIRE

    Sartori, Nicola; Severini, T.A.

    2002-01-01

    Consider a generalized linear model with a canonical link function, containing both fixed and random effects. In this paper, we consider inference about the fixed effects based on a conditional likelihood function. It is shown that this conditional likelihood function is valid for any distribution of the random effects and, hence, the resulting inferences about the fixed effects are insensitive to misspecification of the random effects distribution. Inferences based on the conditional likelih...

  10. Sieve likelihood ratio inference on general parameter space

    Institute of Scientific and Technical Information of China (English)

    SHEN Xiaotong; SHI Jian

    2005-01-01

    In this paper, a theory of sieve likelihood ratio inference on general parameter spaces (including infinite-dimensional ones) is studied. Under fairly general regularity conditions, the sieve log-likelihood ratio statistic is proved to be asymptotically χ2 distributed, which can be viewed as a generalization of the well-known Wilks' theorem. As an example, a semiparametric partial linear model is investigated.

  11. A notion of graph likelihood and an infinite monkey theorem

    CERN Document Server

    Banerji, Christopher R S; Severini, Simone

    2013-01-01

    We play with a graph-theoretic analogue of the folklore infinite monkey theorem. We define a notion of graph likelihood as the probability that a given graph is constructed by a monkey in a number of time steps equal to the number of vertices. We present an algorithm to compute this graph invariant and closed formulas for some infinite classes. We have to leave the computational complexity of the likelihood as an open problem.

  12. A notion of graph likelihood and an infinite monkey theorem

    Science.gov (United States)

    Banerji, Christopher R. S.; Mansour, Toufik; Severini, Simone

    2014-01-01

    We play with a graph-theoretic analogue of the folklore infinite monkey theorem. We define a notion of graph likelihood as the probability that a given graph is constructed by a monkey in a number of time steps equal to the number of vertices. We present an algorithm to compute this graph invariant and closed formulas for some infinite classes. We have to leave the computational complexity of the likelihood as an open problem.

  13. On the likelihood function of Gaussian max-stable processes

    KAUST Repository

    Genton, M. G.

    2011-05-24

    We derive a closed form expression for the likelihood function of a Gaussian max-stable process indexed by ℝd at p≤d+1 sites, d≥1. We demonstrate the gain in efficiency in the maximum composite likelihood estimators of the covariance matrix from p=2 to p=3 sites in ℝ2 by means of a Monte Carlo simulation study. © 2011 Biometrika Trust.

  14. Estimating dynamic equilibrium economies: linear versus nonlinear likelihood

    OpenAIRE

    2004-01-01

    This paper compares two methods for undertaking likelihood-based inference in dynamic equilibrium economies: a sequential Monte Carlo filter proposed by Fernández-Villaverde and Rubio-Ramírez (2004) and the Kalman filter. The sequential Monte Carlo filter exploits the nonlinear structure of the economy and evaluates the likelihood function of the model by simulation methods. The Kalman filter estimates a linearization of the economy around the steady state. The authors report two main results...

  15. Hybrid TOA/AOA Approximate Maximum Likelihood Mobile Localization

    OpenAIRE

    Mohamed Zhaounia; Mohamed Adnan Landolsi; Ridha Bouallegue

    2010-01-01

    This letter deals with a hybrid time-of-arrival/angle-of-arrival (TOA/AOA) approximate maximum likelihood (AML) wireless location algorithm. Thanks to the use of both TOA/AOA measurements, the proposed technique can rely on two base stations (BS) only and achieves better performance compared to the original approximate maximum likelihood (AML) method. The use of two BSs is an important advantage in wireless cellular communication systems because it avoids hearability problems and reduces netw...

  16. Tapered composite likelihood for spatial max-stable models

    KAUST Repository

    Sang, Huiyan

    2014-05-01

    Spatial extreme value analysis is useful to environmental studies, in which extreme value phenomena are of interest and meaningful spatial patterns can be discerned. Max-stable process models are able to describe such phenomena. This class of models is asymptotically justified to characterize the spatial dependence among extremes. However, likelihood inference is challenging for such models because their corresponding joint likelihood is unavailable and only bivariate or trivariate distributions are known. In this paper, we propose a tapered composite likelihood approach by utilizing lower dimensional marginal likelihoods for inference on parameters of various max-stable process models. We consider a weighting strategy based on a "taper range" to exclude distant pairs or triples. The "optimal taper range" is selected to maximize various measures of the Godambe information associated with the tapered composite likelihood function. This method substantially reduces the computational cost and improves the efficiency over equally weighted composite likelihood estimators. We illustrate its utility with simulation experiments and an analysis of rainfall data in Switzerland.
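
    A schematic of a tapered pairwise composite log-likelihood is sketched below; for simplicity, bivariate Gaussian densities stand in for the bivariate max-stable densities, and a binary taper (include a pair only within the taper range) replaces the paper's weighting strategy and Godambe-information-based range selection:

```python
import numpy as np
from scipy import stats

def tapered_pairwise_loglik(z, coords, cov_fn, taper_range):
    """Tapered pairwise composite log-likelihood: only pairs of sites within
    `taper_range` contribute, each through the log of a bivariate density.
    Bivariate Gaussian densities stand in for the bivariate max-stable densities."""
    n = len(z)
    total = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            h = np.linalg.norm(coords[i] - coords[j])
            if h > taper_range:                      # taper weight is zero beyond the range
                continue
            var, cov = cov_fn(0.0), cov_fn(h)
            C = np.array([[var, cov], [cov, var]])
            total += stats.multivariate_normal.logpdf([z[i], z[j]], mean=[0.0, 0.0], cov=C)
    return total

# toy usage: exponential covariance; pairs farther apart than 0.5 are excluded
rng = np.random.default_rng(0)
coords = rng.uniform(0.0, 1.0, size=(30, 2))
z = rng.normal(0.0, 1.0, 30)
print(tapered_pairwise_loglik(z, coords, lambda h: np.exp(-h / 0.3), taper_range=0.5))
```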

  17. Optimized Large-Scale CMB Likelihood And Quadratic Maximum Likelihood Power Spectrum Estimation

    CERN Document Server

    Gjerløw, E; Eriksen, H K; Górski, K M; Gruppuso, A; Jewell, J B; Plaszczynski, S; Wehus, I K

    2015-01-01

    We revisit the problem of exact CMB likelihood and power spectrum estimation with the goal of minimizing computational cost through linear compression. This idea was originally proposed for CMB purposes by Tegmark et al. (1997), and here we develop it into a fully working computational framework for large-scale polarization analysis, adopting WMAP as a worked example. We compare five different linear bases (pixel space, harmonic space, noise covariance eigenvectors, signal-to-noise covariance eigenvectors and signal-plus-noise covariance eigenvectors) in terms of compression efficiency, and find that the computationally most efficient basis is the signal-to-noise eigenvector basis, which is closely related to the Karhunen-Loeve and Principal Component transforms, in agreement with previous suggestions. For this basis, the information in 6836 unmasked WMAP sky map pixels can be compressed into a smaller set of 3102 modes, with a maximum error increase of any single multipole of 3.8% at $\ell \le 32$, and a...

  18. Likelihood ratio model for classification of forensic evidence

    Energy Technology Data Exchange (ETDEWEB)

    Zadora, G., E-mail: gzadora@ies.krakow.pl [Institute of Forensic Research, Westerplatte 9, 31-033 Krakow (Poland); Neocleous, T., E-mail: tereza@stats.gla.ac.uk [University of Glasgow, Department of Statistics, 15 University Gardens, Glasgow G12 8QW (United Kingdom)

    2009-05-29

    One of the problems of analysis of forensic evidence such as glass fragments is the determination of their use-type category, e.g. does a glass fragment originate from an unknown window or container? Very small glass fragments arise during various accidents and criminal offences, and could be carried on the clothes, shoes and hair of participants. It is therefore necessary to obtain information on their physicochemical composition in order to solve the classification problem. Scanning Electron Microscopy coupled with an Energy Dispersive X-ray Spectrometer and the Glass Refractive Index Measurement method are routinely used in many forensic institutes for the investigation of glass. A natural form of glass evidence evaluation for forensic purposes is the likelihood ratio, LR = p(E|H1)/p(E|H2). The main aim of this paper was to study the performance of LR models for glass object classification which considered one or two sources of data variability, i.e. between-glass-object variability and/or within-glass-object variability. Within the proposed model a multivariate kernel density approach was adopted for modelling the between-object distribution and a multivariate normal distribution was adopted for modelling within-object distributions. Moreover, a graphical method of estimating the dependence structure was employed to reduce the highly multivariate problem to several lower-dimensional problems. The performed analysis showed that the best likelihood model was the one which allows the inclusion of information about between- and within-object variability, with variables derived from elemental compositions measured by SEM-EDX, and refractive index values determined before (RIb) and after (RIa) the annealing process, in the form of dRI = log10|RIa - RIb|. This model gave better results than the model with only between-object variability considered. In addition, when dRI and variables derived from elemental compositions were used, this

  19. Rate of strong consistency of the maximum quasi-likelihood estimator in quasi-likelihood nonlinear models

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Quasi-likelihood nonlinear models (QLNM) include generalized linear models as a special case. Under some regularity conditions, the rate of the strong consistency of the maximum quasi-likelihood estimation (MQLE) is obtained in QLNM. In an important case, this rate is $O(n^{-1/2}(\log\log n)^{1/2})$, which is just the rate of the LIL of partial sums for i.i.d. variables, and thus cannot be improved anymore.

  20. The Likelihood of Experiencing Relative Poverty over the Life Course.

    Science.gov (United States)

    Rank, Mark R; Hirschl, Thomas A

    2015-01-01

    Research on poverty in the United States has largely consisted of examining cross-sectional levels of absolute poverty. In this analysis, we focus on understanding relative poverty within a life course context. Specifically, we analyze the likelihood of individuals falling below the 20th percentile and the 10th percentile of the income distribution between the ages of 25 and 60. A series of life tables are constructed using the nationally representative Panel Study of Income Dynamics data set. This includes panel data from 1968 through 2011. Results indicate that the prevalence of relative poverty is quite high. Consequently, between the ages of 25 to 60, 61.8 percent of the population will experience a year below the 20th percentile, and 42.1 percent will experience a year below the 10th percentile. Characteristics associated with experiencing these levels of poverty include those who are younger, nonwhite, female, not married, with 12 years or less of education, or who have a work disability.

  1. The Likelihood of Experiencing Relative Poverty over the Life Course.

    Directory of Open Access Journals (Sweden)

    Mark R Rank

    Research on poverty in the United States has largely consisted of examining cross-sectional levels of absolute poverty. In this analysis, we focus on understanding relative poverty within a life course context. Specifically, we analyze the likelihood of individuals falling below the 20th percentile and the 10th percentile of the income distribution between the ages of 25 and 60. A series of life tables are constructed using the nationally representative Panel Study of Income Dynamics data set. This includes panel data from 1968 through 2011. Results indicate that the prevalence of relative poverty is quite high. Consequently, between the ages of 25 to 60, 61.8 percent of the population will experience a year below the 20th percentile, and 42.1 percent will experience a year below the 10th percentile. Characteristics associated with experiencing these levels of poverty include those who are younger, nonwhite, female, not married, with 12 years or less of education, or who have a work disability.

  2. The Likelihood of Experiencing Relative Poverty over the Life Course

    Science.gov (United States)

    Rank, Mark R.; Hirschl, Thomas A.

    2015-01-01

    Research on poverty in the United States has largely consisted of examining cross-sectional levels of absolute poverty. In this analysis, we focus on understanding relative poverty within a life course context. Specifically, we analyze the likelihood of individuals falling below the 20th percentile and the 10th percentile of the income distribution between the ages of 25 and 60. A series of life tables are constructed using the nationally representative Panel Study of Income Dynamics data set. This includes panel data from 1968 through 2011. Results indicate that the prevalence of relative poverty is quite high. Consequently, between the ages of 25 to 60, 61.8 percent of the population will experience a year below the 20th percentile, and 42.1 percent will experience a year below the 10th percentile. Characteristics associated with experiencing these levels of poverty include those who are younger, nonwhite, female, not married, with 12 years or less of education, or who have a work disability. PMID:26200781

  3. Radiofrequency solutions in clinical high field magnetic resonance

    NARCIS (Netherlands)

    Andreychenko, A.|info:eu-repo/dai/nl/341697672

    2013-01-01

    Magnetic resonance imaging (MRI) and spectroscopy (MRS) benefit from the sensitivity gain at high field (≥7T). However, high field also brings certain challenges associated with the growing frequency and spectral dispersion. Frequency growth results in degraded performance of large volume radiofrequency

  4. Salvage of relapse of patients with Hodgkin's disease in clinical stages I or II who were staged with laparotomy and initially treated with radiotherapy alone. A report from the international database on Hodgkin's disease

    DEFF Research Database (Denmark)

    Specht, L; Horwich, A; Ashley, S

    1994-01-01

    To analyze presentation variables that might indicate a high or low likelihood of success of the treatment of patients relapsing after initial radiotherapy of Hodgkin's disease in clinical Stages I or II who were staged with laparotomy....

  5. Clinical Implications High Frequency Chest Wall Oscillation (HFCWO)

    OpenAIRE

    Mantellini E.; Perrero L.; Petrozzino S.; Gatta A.; Bona S.

    2012-01-01

    Purpose: patients with neuromuscular diseases present a high incidence of respiratory infections favoured by stagnation of deep bronchial secretions and deficit of cough. The aim of the study is to evaluate the correct treatment of this condition and the role of High Frequency Chest Wall Oscillation (HFCWO) in helping the removal of bronchial secretions and reducing the incidence of infections in patients with neuromuscular disease. Methods: analysis of the current bibliography related to resp...

  6. Clinical evaluation of a highly wear resistant composite.

    Science.gov (United States)

    Dickinson, G L; Gerbo, L R; Leinfelder, K F

    1993-04-01

    The purpose of this clinical study was to determine the long-term potential of a resin composite restorative material. A total of 62 restorations of a modified Herculite Incisal formulation were inserted into Class I and Class II preparations. A control group of the conventional Herculite formulation was also placed into Class I and Class II cavity preparations at an earlier date. The cavity preparations for both formulations were standardized to conform to that of conventional conservative amalgams. Deep portions of the preparations were lined with calcium hydroxide. The enamel margins were etched per manufacturers' directions followed by a dentin bonding agent. After application of the appropriate matrix, the restorations were placed incrementally. Each restoration was independently evaluated by two clinicians at baseline, 6-months, 1, 2 and 3 years in accordance with the USPHS criteria. In addition, all restorations were evaluated for wear using a series of optical standards (M-L). The color matching ability of the material never fell below 96%. The percent of restorations exhibiting a surface texture similar to enamel never fell below 90% Alfa. At the end of 3 years, the total average loss of material was only 28 microns. No clinical evidence of bulk fracture was detected with the modified Herculite formulation at 3 years. The wear rate of the modified formulation of Herculite was essentially one-half that of conventional Herculite XR. Marginal ditching, which is characteristic of most posterior resin composites in which the filler particle is 1 micron or less, was exhibited.(ABSTRACT TRUNCATED AT 250 WORDS)

  7. Exclusion probabilities and likelihood ratios with applications to kinship problems.

    Science.gov (United States)

    Slooten, Klaas-Jan; Egeland, Thore

    2014-05-01

    In forensic genetics, DNA profiles are compared in order to make inferences, paternity cases being a standard example. The statistical evidence can be summarized and reported in several ways. For example, in a paternity case, the likelihood ratio (LR) and the probability of not excluding a random man as father (RMNE) are two common summary statistics. There has been a long debate on the merits of the two statistics, also in the context of DNA mixture interpretation, and no general consensus has been reached. In this paper, we show that the RMNE is a certain weighted average of inverse likelihood ratios. This is true in any forensic context. We show that the likelihood ratio in favor of the correct hypothesis is, in expectation, bigger than the reciprocal of the RMNE probability. However, with the exception of pathological cases, it is also possible to obtain smaller likelihood ratios. We illustrate this result for paternity cases. Moreover, some theoretical properties of the likelihood ratio for a large class of general pairwise kinship cases, including expected value and variance, are derived. The practical implications of the findings are discussed and exemplified.

  8. Computer-Based versus High-Fidelity Mannequin Simulation in Developing Clinical Judgment in Nursing Education

    Science.gov (United States)

    Howard, Beverly J.

    2013-01-01

    The purpose of this study was to determine if students learn clinical judgment as effectively using computer-based simulations as when using high-fidelity mannequin simulations. There was a single research question for this study: What is the difference in clinical judgment between participants completing high-fidelity human simulator mannequin…

  9. Clinical Implications High Frequency Chest Wall Oscillation (HFCWO)

    Directory of Open Access Journals (Sweden)

    Mantellini E.

    2012-01-01

    Full Text Available Purpose: patients with neuromuscular diseases present a high incidence of respiratory infections, favoured by stagnation of deep bronchial secretions and a deficient cough. The aim of the study is to evaluate the correct treatment of this condition and the role of High Frequency Chest Wall Oscillation (HFCWO) in helping the removal of bronchial secretions and reducing the incidence of infections in patients with neuromuscular disease. Methods: analysis of the current bibliography related to respiratory infections and neuromuscular disease. PCEF (Peak Cough Expiratory Flow) is used as a standardized indicator of cough efficiency. Results: High Frequency Chest Wall Oscillation (HFCWO) is useful, in cases of increased production of mucus and impairment of muco-ciliary clearance, to remove tracheobronchial secretions and reduce the incidence of infections. Conclusions: the correct approach to patients with neuromuscular disease and frequent respiratory infections focuses on the treatment of ineffective cough and the management of bronchial secretions. High Frequency Chest Wall Oscillation (HFCWO) (VEST) has a central role in the treatment of ineffective cough and the management of bronchial secretions, reducing respiratory infections.

  10. Radical prostatectomy in clinically localized high-risk prostate cancer

    DEFF Research Database (Denmark)

    Røder, Martin Andreas; Berg, Kasper Drimer; Christensen, Ib Jarle;

    2013-01-01

    Abstract Objective. The optimal therapeutic strategy for high-risk localized prostate cancer (PCa) is controversial. Supported by randomized trials, the combination of external beam radiation therapy (EBRT) and endocrine therapy (ET) is advocated by many, while radical prostatectomy (RP) is regarded as primary therapy by others. This study examined the outcome for high-risk localized PCa patients treated with RP. Material and methods. Of 1300 patients who underwent RP, 231 were identified as high-risk. Patients were followed for biochemical recurrence (BCR) (defined as prostate-specific antigen ≥ 0.2 ng/ml), metastatic disease and survival. Excluding node-positive patients, none of the patients received adjuvant therapy before BCR was confirmed. Univariate and multivariate analysis was performed with Kaplan-Meier and Cox proportional hazard models. Results. Median follow-up was 4.4 years...

  11. CORRECTING LABIAL THICK AND HIGH ATTACHED FRENUM (CLINICAL OBSERVATION).

    Directory of Open Access Journals (Sweden)

    Silvia Krusteva

    2012-11-01

    Full Text Available A labial thick and high attached maxillary frenum is commonly regarded as a contributing etiology for maintaining midline diastema and delayed upper jaw development. The surgical modalities used to solve this problem are known to be quite stressful for children. Dental lasers have recently been increasingly used to treat a wide variety of problems in medicine. AIM: To use a high energy diode laser to remove a short, high attached frenum of the upper lip and present the results of the procedure. MATERIAL AND METHODS: We performed frenectomy in 10 randomly selected patients of both sexes aged 7-9 years with short, thick frena of the upper lip. A Picasso soft tissue diode laser, class IV, power output 7 W, λ = 810 nm, was used for the procedure. RESULTS AND DISCUSSION: The healing process was uneventful, painless and without edema developing in the soft tissues. No inflammation was found in the treated tissues. The children undergoing the procedure showed no fear. This was the reason why we preferred to use lasers as a modern therapeutic modality for soft tissue correction in the mouth. CONCLUSION: Using lasers to remove a short, high attached maxillary labial frenum has the benefit of inducing less stress in children than they experience if anaesthesia and surgery are administered. Anaesthesia with topical anaesthetics is used in the procedure. The postoperative period is free of pain and far from severe. This makes this technique particularly useful for children.

  12. Seasonal species interactions minimize the impact of species turnover on the likelihood of community persistence.

    Science.gov (United States)

    Saavedra, Serguei; Rohr, Rudolf P; Fortuna, Miguel A; Selva, Nuria; Bascompte, Jordi

    2016-04-01

    Many of the observed species interactions embedded in ecological communities are not permanent, but are characterized by temporal changes that are observed along with abiotic and biotic variations. While work has been done describing and quantifying these changes, little is known about their consequences for species coexistence. Here, we investigate the extent to which changes of species composition impact the likelihood of persistence of the predator-prey community in the highly seasonal Białowieża Primeval Forest (northeast Poland), and the extent to which seasonal changes of species interactions (predator diet) modulate the expected impact. This likelihood is estimated by extending recent developments on the study of structural stability in ecological communities. We find that the observed species turnover strongly changes the likelihood of community persistence between summer and winter. Importantly, we demonstrate that the observed seasonal interaction changes minimize the variation in the likelihood of persistence associated with species turnover across the year. We find that these community dynamics can be explained as the coupling of individual species to their environment by minimizing both the variation in persistence conditions and the interaction changes between seasons. Our results provide a homeostatic explanation for seasonal species interactions and suggest that monitoring the association of interaction changes with the level of variation in community dynamics can provide a good indicator of the response of species to environmental pressures.

  13. Likelihood Estimation of the Systemic Poison-Induced Morbidity in an Adult North Eastern Romanian Population

    Directory of Open Access Journals (Sweden)

    Cătălina Lionte

    2016-12-01

    Full Text Available Purpose: Acute exposure to a systemic poison represents an important segment of medical emergencies. We aimed to estimate the likelihood of systemic poison-induced morbidity in a population admitted to a tertiary referral center in North East Romania, based on the determinant factors. Methodology: This was a prospective observational cohort study of adult poisoned patients. Demographic, clinical and laboratory characteristics were recorded in all patients. We analyzed three groups of patients, based on the associated morbidity during hospitalization. We identified significant differences between groups and predictors with significant effects on morbidity using multiple multinomial logistic regressions. ROC analysis proved that a combination of tests could improve the diagnostic accuracy of poison-related morbidity. Main findings: Of the 180 patients included, aged 44.7 ± 17.2 years, 51.1% males, 49.4% had no poison-related morbidity, 28.9% developed a mild morbidity, and 21.7% had a severe morbidity, followed by death in 16 patients (8.9%). Multiple complications and deaths were recorded in patients aged 53.4 ± 17.6 years (p .001), with a lower Glasgow Coma Scale (GCS) score upon admission and a significantly higher heart rate (101 ± 32 beats/min, p .011). Routine laboratory tests were significantly higher in patients with a recorded morbidity. Multiple logistic regression analysis demonstrated that a GCS < 8, a high white blood cell count (WBC), alanine aminotransferase (ALAT), myoglobin, glycemia and brain natriuretic peptide (BNP) are strongly predictive of in-hospital severe morbidity. Originality: This is the first Romanian prospective study of adult poisoned patients which identifies the factors responsible for in-hospital morbidity using logistic regression analyses, with resulting receiver operating characteristic (ROC) curves. Conclusion: In acute intoxication with systemic poisons, we identified several clinical and laboratory variables

  14. Factors Influencing the Intended Likelihood of Exposing Sexual Infidelity.

    Science.gov (United States)

    Kruger, Daniel J; Fisher, Maryanne L; Fitzgerald, Carey J

    2015-08-01

    There is a considerable body of literature on infidelity within romantic relationships. However, there is a gap in the scientific literature on factors influencing the likelihood of uninvolved individuals exposing sexual infidelity. Therefore, we devised an exploratory study examining a wide range of potentially relevant factors. Based in part on evolutionary theory, we anticipated nine potential domains or types of influences on the likelihoods of exposing or protecting cheaters, including kinship, strong social alliances, financial support, previous relationship behaviors (including infidelity and abuse), potential relationship transitions, stronger sexual and emotional aspects of the extra-pair relationship, and disease risk. The pattern of results supported these predictions (N = 159 men, 328 women). In addition, there appeared to be a small positive bias for participants to report infidelity when provided with any additional information about the situation. Overall, this study contributes a broad initial description of factors influencing the predicted likelihood of exposing sexual infidelity and encourages further studies in this area.

  15. Joint analysis of prevalence and incidence data using conditional likelihood.

    Science.gov (United States)

    Saarela, Olli; Kulathinal, Sangita; Karvanen, Juha

    2009-07-01

    Disease prevalence is the combined result of duration, disease incidence, case fatality, and other mortality. If information is available on all these factors, and on fixed covariates such as genotypes, prevalence information can be utilized in the estimation of the effects of the covariates on disease incidence. Study cohorts that are recruited as cross-sectional samples and subsequently followed up for disease events of interest produce both prevalence and incidence information. In this paper, we make use of both types of information using a likelihood, which is conditioned on survival until the cross section. In a simulation study making use of real cohort data, we compare the proposed conditional likelihood method to a standard analysis where prevalent cases are omitted and the likelihood expression is conditioned on healthy status at the cross section.

  16. Penalized maximum likelihood estimation and variable selection in geostatistics

    CERN Document Server

    Chu, Tingjin; Wang, Haonan; 10.1214/11-AOS919

    2012-01-01

    We consider the problem of selecting covariates in spatial linear models with Gaussian process errors. Penalized maximum likelihood estimation (PMLE) that enables simultaneous variable selection and parameter estimation is developed and, for ease of computation, PMLE is approximated by one-step sparse estimation (OSE). To further improve computational efficiency, particularly with large sample sizes, we propose penalized maximum covariance-tapered likelihood estimation (PMLE$_{\\mathrm{T}}$) and its one-step sparse estimation (OSE$_{\\mathrm{T}}$). General forms of penalty functions with an emphasis on smoothly clipped absolute deviation are used for penalized maximum likelihood. Theoretical properties of PMLE and OSE, as well as their approximations PMLE$_{\\mathrm{T}}$ and OSE$_{\\mathrm{T}}$ using covariance tapering, are derived, including consistency, sparsity, asymptotic normality and the oracle properties. For covariance tapering, a by-product of our theoretical results is consistency and asymptotic normal...
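    The smoothly clipped absolute deviation (SCAD) penalty emphasized above has the closed piecewise form of Fan and Li (2001), and a penalized objective is then the negative log-likelihood plus the penalty summed over the regression coefficients. The sketch below (plain Python/NumPy) implements only the penalty; the Gaussian-process likelihood and the covariance tapering of the record are not reproduced, and the tuning values are illustrative.

```python
import numpy as np

def scad_penalty(beta, lam, a=3.7):
    """SCAD penalty of Fan and Li (2001), applied elementwise to coefficients."""
    b = np.abs(np.asarray(beta, dtype=float))
    small = lam * b                                             # |beta| <= lam
    middle = (2 * a * lam * b - b**2 - lam**2) / (2 * (a - 1))  # lam < |beta| <= a*lam
    large = lam**2 * (a + 1) / 2 * np.ones_like(b)              # |beta| > a*lam
    return np.where(b <= lam, small, np.where(b <= a * lam, middle, large))

# A penalized objective would then be, schematically:
#   neg_log_lik(beta) + scad_penalty(beta, lam).sum()
print(scad_penalty([0.0, 0.5, 2.0, 10.0], lam=1.0))
```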

  17. Generalized empirical likelihood methods for analyzing longitudinal data

    KAUST Repository

    Wang, S.

    2010-02-16

    Efficient estimation of parameters is a major objective in analyzing longitudinal data. We propose two generalized empirical likelihood based methods that take into consideration within-subject correlations. A nonparametric version of the Wilks theorem for the limiting distributions of the empirical likelihood ratios is derived. It is shown that one of the proposed methods is locally efficient among a class of within-subject variance-covariance matrices. A simulation study is conducted to investigate the finite sample properties of the proposed methods and compare them with the block empirical likelihood method by You et al. (2006) and the normal approximation with a correctly estimated variance-covariance. The results suggest that the proposed methods are generally more efficient than existing methods which ignore the correlation structure, and better in coverage compared to the normal approximation with correctly specified within-subject correlation. An application illustrating our methods and supporting the simulation study results is also presented.
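    The mechanics behind an empirical likelihood ratio are easiest to see in the one-sample case. The following sketch is Owen's basic construction for a mean, not the generalized empirical likelihood for longitudinal data proposed in the record above; the data are synthetic and the bracketing tolerance is illustrative.

```python
import numpy as np
from scipy.optimize import brentq

def el_log_ratio(x, mu):
    """-2 log empirical likelihood ratio for the mean mu (one-sample case)."""
    d = x - mu
    if d.max() <= 0 or d.min() >= 0:
        return np.inf                      # mu lies outside the convex hull of the data
    # The multiplier lam solves sum(d / (1 + lam * d)) = 0 on the interval where
    # all implied weights 1 + lam * d stay positive.
    eps = 1e-10
    lo, hi = -1.0 / d.max() + eps, -1.0 / d.min() - eps
    lam = brentq(lambda l: np.sum(d / (1.0 + l * d)), lo, hi)
    return 2.0 * np.sum(np.log1p(lam * d))

rng = np.random.default_rng(0)
x = rng.normal(loc=1.0, scale=2.0, size=50)    # hypothetical sample
for mu in (0.5, 1.0, 1.5):
    print(mu, el_log_ratio(x, mu))             # compare to chi-square(1) quantiles
```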

  18. IMPROVING VOICE ACTIVITY DETECTION VIA WEIGHTING LIKELIHOOD AND DIMENSION REDUCTION

    Institute of Scientific and Technical Information of China (English)

    Wang Huanliang; Han Jiqing; Li Haifeng; Zheng Tieran

    2008-01-01

    The performance of traditional Voice Activity Detection (VAD) algorithms declines sharply in lower Signal-to-Noise Ratio (SNR) environments. In this paper, a feature weighting likelihood method is proposed for noise-robust VAD. The contribution of dynamic features to the likelihood score can be increased via this method, which consequently improves the noise robustness of VAD. A divergence-based dimension reduction method is proposed to save computation; it removes the feature dimensions with smaller divergence values at the cost of slightly degrading performance. Experimental results on the Aurora II database show that the detection performance in noisy environments can be remarkably improved by the proposed method when a model trained on clean data is used to detect speech endpoints. Using the weighting likelihood on the dimension-reduced features obtains comparable, or even better, performance compared to the original full-dimensional features.

  19. Penalized maximum likelihood estimation for generalized linear point processes

    DEFF Research Database (Denmark)

    Hansen, Niels Richard

    2010-01-01

    A generalized linear point process is specified in terms of an intensity that depends upon a linear predictor process through a fixed non-linear function. We present a framework where the linear predictor is parametrized by a Banach space and give results on Gateaux differentiability of the log-likelihood. Of particular interest is when the intensity is expressed in terms of a linear filter parametrized by a Sobolev space. Using that the Sobolev spaces are reproducing kernel Hilbert spaces we derive results on the representation of the penalized maximum likelihood estimator in a special case and the gradient of the negative log-likelihood in general. The latter is used to develop a descent algorithm in the Sobolev space. We conclude the paper by extensions to multivariate and additive model specifications. The methods are implemented in the R-package ppstat.

  20. How to Maximize the Likelihood Function for a DSGE Model

    DEFF Research Database (Denmark)

    Andreasen, Martin Møller

    This paper extends two optimization routines to deal with objective functions for DSGE models. The optimization routines are i) a version of Simulated Annealing developed by Corana, Marchesi & Ridella (1987), and ii) the evolutionary algorithm CMA-ES developed by Hansen, Müller & Koumoutsakos (2003). Following these extensions, we examine the ability of the two routines to maximize the likelihood function for a sequence of test economies. Our results show that the CMA-ES routine clearly outperforms Simulated Annealing in its ability to find the global optimum and in efficiency. With 10 unknown structural parameters in the likelihood function, the CMA-ES routine finds the global optimum in 95% of our test economies compared to 89% for Simulated Annealing. When the number of unknown structural parameters in the likelihood function increases to 20 and 35, then the CMA-ES routine finds the global...
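    The specific Simulated Annealing and CMA-ES extensions of the paper are not reproduced here, but the overall workflow of handing a negative log-likelihood to a stochastic global optimizer can be sketched with SciPy's dual annealing on a deliberately multimodal toy objective; the objective and the ten-parameter bound box are stand-ins, not a DSGE likelihood.

```python
import numpy as np
from scipy.optimize import dual_annealing

def neg_log_likelihood(theta):
    # Toy multimodal surface (Rastrigin-like) standing in for a DSGE likelihood.
    return np.sum(theta**2) + 10.0 * np.sum(1.0 - np.cos(2.0 * np.pi * theta))

bounds = [(-5.0, 5.0)] * 10                  # 10 "structural parameters"
result = dual_annealing(neg_log_likelihood, bounds)
print(result.x, result.fun)                  # the global optimum is at the origin
```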

  1. Preparation for high-acuity clinical placement: confidence levels of final-year nursing students

    Directory of Open Access Journals (Sweden)

    Porter J

    2013-04-01

    Full Text Available Joanne Porter, Julia Morphet, Karen Missen, Anita Raymond School of Nursing and Midwifery, Monash University, Churchill, VIC, Australia Aim: To measure final-year nursing students’ preparation for high-acuity placement with emphasis on clinical skill performance confidence. Background: Self-confidence has been reported as being a key component for effective clinical performance, and confident students are more likely to be more effective nurses. Clinical skill performance is reported to be the most influential source of self-confidence. Student preparation and skill acquisition are therefore important aspects in ensuring students have successful clinical placements, especially in areas of high acuity. Curriculum development should aim to assist students with their theoretical and clinical preparedness for the clinical environment. Method: A modified pretest/posttest survey design was used to measure the confidence of third-year undergraduate nursing students (n = 318) for placement into a high-acuity clinical setting. The survey comprised four questions related to clinical placement and the prospect of participating in a cardiac arrest scenario, and confidence rating levels of skills related to practice in a high-acuity setting. Content and face validity were established by an expert panel (α = 0.90) and reliability was established by the pilot study in 2009. Comparisons were made between confidence levels at the beginning and end of semester. Results: Student confidence to perform individual clinical skills increased over the semester; however their feelings of preparedness for high-acuity clinical placement decreased over the same time period. Reported confidence levels improved with further exposure to clinical placement. Conclusion: There may be many external factors that influence students’ perceptions of confidence and preparedness for practice. Further research is recommended to identify causes of poor self-confidence in final-year nursing

  2. Empirical likelihood-based evaluations of Value at Risk models

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    Value at Risk (VaR) is a basic and very useful tool for measuring market risks. Numerous VaR models have been proposed in the literature. Therefore, it is of great interest to evaluate the efficiency of these models, and to select the most appropriate one. In this paper, we propose to use the empirical likelihood approach to evaluate these models. Simulation results and real life examples show that the empirical likelihood method is more powerful and more robust than some of the asymptotic methods available in the literature.

  3. LIKELIHOOD ESTIMATION OF PARAMETERS USING SIMULTANEOUSLY MONITORED PROCESSES

    DEFF Research Database (Denmark)

    Friis-Hansen, Peter; Ditlevsen, Ove Dalager

    2004-01-01

    The topic is maximum likelihood inference from several simultaneously monitored response processes of a structure to obtain knowledge about the parameters of other not monitored but important response processes when the structure is subject to some Gaussian load field in space and time. The considered example is a ship sailing with a given speed through a Gaussian wave field.

  4. Unbinned likelihood maximisation framework for neutrino clustering in Python

    Energy Technology Data Exchange (ETDEWEB)

    Coenders, Stefan [Technische Universitaet Muenchen, Boltzmannstr. 2, 85748 Garching (Germany)

    2016-07-01

    Although an astrophysical neutrino flux has been detected with IceCube, the sources of astrophysical neutrinos remain hidden up to now. A detection of a neutrino point source is a smoking gun for hadronic processes and the acceleration of cosmic rays. The search for neutrino sources has many degrees of freedom, for example steady versus transient, point-like versus extended sources, et cetera. Here, we introduce a Python framework designed for unbinned likelihood maximisations as used in searches for neutrino point sources by IceCube. By implementing source scenarios in a modular way, likelihood searches of various kinds can be implemented in a user-friendly way, without sacrificing speed and memory management.

  5. Semiparametric maximum likelihood for nonlinear regression with measurement errors.

    Science.gov (United States)

    Suh, Eun-Young; Schafer, Daniel W

    2002-06-01

    This article demonstrates semiparametric maximum likelihood estimation of a nonlinear growth model for fish lengths using imprecisely measured ages. Data on the species corvina reina, found in the Gulf of Nicoya, Costa Rica, consist of lengths and imprecise ages for 168 fish and precise ages for a subset of 16 fish. The statistical problem may therefore be classified as nonlinear errors-in-variables regression with internal validation data. Inferential techniques are based on ideas extracted from several previous works on semiparametric maximum likelihood for errors-in-variables problems. The example illustrates practical aspects of the associated computational, inferential, and data analytic techniques.

  6. Nearly Efficient Likelihood Ratio Tests of the Unit Root Hypothesis

    DEFF Research Database (Denmark)

    Jansson, Michael; Nielsen, Morten Ørregaard

    Seemingly absent from the arsenal of currently available "nearly efficient" testing procedures for the unit root hypothesis, i.e. tests whose local asymptotic power functions are indistinguishable from the Gaussian power envelope, is a test admitting a (quasi-)likelihood ratio interpretation. We show that the likelihood ratio unit root test derived in a Gaussian AR(1) model with standard normal innovations is nearly efficient in that model. Moreover, these desirable properties carry over to more complicated models allowing for serially correlated and/or non-Gaussian innovations.

  7. Modified maximum likelihood registration based on information fusion

    Institute of Scientific and Technical Information of China (English)

    Yongqing Qi; Zhongliang Jing; Shiqiang Hu

    2007-01-01

    The bias estimation of passive sensors is considered based on information fusion in a multi-platform multi-sensor tracking system. The unobservability problem of bearing-only tracking in the blind spot is analyzed. A modified maximum likelihood method, which uses the redundant information of the multi-sensor system to calculate the target position, is investigated to estimate the biases. Monte Carlo simulation results show that the modified method eliminates the effect of the unobservability problem in the blind spot and can estimate the biases more rapidly and accurately than the maximum likelihood method. It is statistically efficient since the standard deviation of the bias estimation errors meets the theoretical lower bounds.

  8. Likelihood-based inference for clustered line transect data

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus; Schweder, Tore

    2006-01-01

    The uncertainty in estimation of spatial animal density from line transect surveys depends on the degree of spatial clustering in the animal population. To quantify the clustering we model line transect data as independent thinnings of spatial shot-noise Cox processes. Likelihood-based inference is implemented using Markov chain Monte Carlo (MCMC) methods to obtain efficient estimates of spatial clustering parameters. Uncertainty is addressed using parametric bootstrap or by consideration of posterior distributions in a Bayesian setting. Maximum likelihood estimation and Bayesian inference are compared...

  9. Maximum likelihood estimation for life distributions with competing failure modes

    Science.gov (United States)

    Sidik, S. M.

    1979-01-01

    A general model for competing failure modes is presented, assuming that the location parameters for each mode are expressible as linear functions of the stress variables and that the failure modes act independently. The general form of the likelihood function and the likelihood equations are derived for the extreme value distributions, and solving these equations using nonlinear least squares techniques provides an estimate of the asymptotic covariance matrix of the estimators. Monte Carlo results indicate that, under appropriate conditions, the location parameters are nearly unbiased, the scale parameter is slightly biased, and the asymptotic covariances are rapidly approached.

  10. Approximated maximum likelihood estimation in multifractal random walks

    CERN Document Server

    Løvsletten, Ola

    2011-01-01

    We present an approximated maximum likelihood method for the multifractal random walk processes of [E. Bacry et al., Phys. Rev. E 64, 026103 (2001)]. The likelihood is computed using a Laplace approximation and a truncation in the dependency structure for the latent volatility. The procedure is implemented as a package in the R computer language. Its performance is tested on synthetic data and compared to an inference approach based on the generalized method of moments. The method is applied to estimate parameters for various financial stock indices.

  11. Parameter estimation in X-ray astronomy using maximum likelihood

    Science.gov (United States)

    Wachter, K.; Leach, R.; Kellogg, E.

    1979-01-01

    Methods of estimation of parameter values and confidence regions by maximum likelihood and Fisher efficient scores starting from Poisson probabilities are developed for the nonlinear spectral functions commonly encountered in X-ray astronomy. It is argued that these methods offer significant advantages over the commonly used alternatives called minimum chi-squared because they rely on less pervasive statistical approximations and so may be expected to remain valid for data of poorer quality. Extensive numerical simulations of the maximum likelihood method are reported which verify that the best-fit parameter value and confidence region calculations are correct over a wide range of input spectra.
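    The Poisson-based likelihood referred to above is commonly maximized by minimizing the Cash statistic, C(θ) = 2 Σ_i [m_i(θ) - n_i ln m_i(θ)], where n_i are the observed counts and m_i(θ) the model counts in channel i. A minimal sketch for a hypothetical power-law spectrum follows; the data, binning and model are invented for illustration.

```python
import numpy as np
from scipy.optimize import minimize

energies = np.linspace(1.0, 10.0, 30)                 # channel energies, hypothetical
counts = np.random.default_rng(2).poisson(100.0 * energies**-1.7)

def model_counts(theta):
    norm, index = theta
    return norm * energies**(-index)

def cash(theta):
    m = model_counts(theta)
    if np.any(m <= 0):
        return np.inf                                  # keep the model counts positive
    # Cash statistic: -2 log Poisson likelihood up to a data-only constant.
    return 2.0 * np.sum(m - counts * np.log(m))

fit = minimize(cash, x0=[50.0, 1.0], method="Nelder-Mead")
print("best-fit (norm, index):", fit.x)
```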

  12. Carrier Tracking Loop in High Dynamic Environment Aided by Fast Maximum Likelihood Estimation of Doppler Frequency Rate-of-change

    Institute of Scientific and Technical Information of China (English)

    郇浩; 陶选如; 陶然; 程小康; 董朝; 李鹏飞

    2014-01-01

    To reach a compromise between dynamic performance and high tracking accuracy of the carrier tracking loop in high-dynamic circumstances, which produce a large Doppler frequency and Doppler frequency rate-of-change, a fast maximum likelihood estimation method for the Doppler frequency rate-of-change is proposed in this paper, and the estimate is used to aid the carrier tracking loop. First, it is pointed out that the maximum likelihood estimation of the Doppler frequency and Doppler frequency rate-of-change is equivalent to the Fractional Fourier Transform (FrFT). Second, an estimation method for the Doppler frequency rate-of-change, which combines instantaneous self-correlation and segmental Discrete Fourier Transform (DFT), is proposed to reduce the large amount of computation in the two-dimensional search over Doppler frequency and its rate-of-change, and the resulting coarse estimate is applied to narrow the search range. Finally, the estimate is used in the carrier tracking loop to reduce the dynamic stress and improve the tracking accuracy. Theoretical analysis and computer simulation show that the search computation falls to 5.25 percent of the original amount at a Signal-to-Noise Ratio (SNR) of -30 dB, the Root Mean Square Error (RMSE) of the tracked frequency rate is only 8.46 Hz/s, and compared with the traditional carrier tracking method the tracking sensitivity can be improved by more than 3 dB.
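    For a complex exponential with linearly drifting frequency in white Gaussian noise, the joint maximum likelihood estimate of the Doppler frequency and its rate is obtained by maximizing the magnitude of a dechirped correlation, which is exactly the two-dimensional search the record accelerates with the FrFT and segmental DFT. The brute-force sketch below uses an invented signal and a coarse grid, without the coarse-to-fine refinement described above.

```python
import numpy as np

fs, n = 1000.0, 1024
t = np.arange(n) / fs
f0, fdot = 120.0, 40.0                         # true Doppler frequency (Hz) and rate (Hz/s)
rng = np.random.default_rng(3)
noise = 0.5 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
x = np.exp(2j * np.pi * (f0 * t + 0.5 * fdot * t**2)) + noise

def dechirped_correlation(f, a):
    ref = np.exp(-2j * np.pi * (f * t + 0.5 * a * t**2))
    return abs(np.sum(x * ref))

# Joint ML estimate = argmax of the dechirped correlation over a coarse 2-D grid.
freqs = np.arange(0.0, 500.0, 2.0)
rates = np.arange(-100.0, 105.0, 5.0)
scores = np.array([[dechirped_correlation(f, a) for a in rates] for f in freqs])
i, j = np.unravel_index(np.argmax(scores), scores.shape)
print("estimated frequency:", freqs[i], "Hz; estimated rate:", rates[j], "Hz/s")
```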

  13. Stealing among High School Students: Prevalence and Clinical Correlates

    Science.gov (United States)

    Grant, Jon E.; Potenza, Marc N.; Krishnan-Sarin, Suchitra; Cavallo, Dana A.; Desai, Rani A.

    2013-01-01

    Although stealing among adolescents appears to be fairly common, an assessment of adolescent stealing and its relationship to other behaviors and health issues is incompletely understood. A large sample of high school students (n=3999) was examined using a self-report survey with 153 questions concerning demographic characteristics, stealing behaviors, other health behaviors including substance use, and functioning variables such as grades and violent behavior. The overall prevalence of stealing was 15.2% (95%CI: 14.8–17.0). Twenty-nine (0.72%) students endorsed symptoms consistent with a diagnosis of DSM-IV kleptomania. Poor grades, alcohol and drug use, regular smoking, sadness and hopelessness, and other antisocial behaviors were all significantly (p<.05) associated with any stealing behavior. Stealing appears fairly common among high school students and is associated with a range of potentially addictive and antisocial behaviors. Significant distress and loss of control over this behavior suggests that stealing often has significant associated morbidity. PMID:21389165

  14. Estimating likelihood of future crashes for crash-prone drivers

    Directory of Open Access Journals (Sweden)

    Subasish Das

    2015-06-01

    Full Text Available At-fault crash-prone drivers are usually considered the high-risk group for possible future incidents or crashes. In Louisiana, 34% of crashes are repeatedly committed by at-fault crash-prone drivers, who represent only 5% of the total licensed drivers in the state. This research conducted an exploratory data analysis based on driver faultiness and crash proneness. The objective of this study is to develop a crash prediction model to estimate the likelihood of future crashes for at-fault drivers. The logistic regression method is used, employing eight years of traffic crash data (2004–2011) in Louisiana. Crash predictors such as the driver's crash involvement, crash and road characteristics, human factors, collision type, and environmental factors are considered in the model. The at-fault and not-at-fault status of the crashes is used as the response variable. The developed model has identified a few important variables, and is used to correctly classify up to 62.40% of at-fault crashes with a specificity of 77.25%. This model can identify as many as 62.40% of the crash incidences of at-fault drivers in the upcoming year. Traffic agencies can use the model for monitoring the performance of at-fault crash-prone drivers and making roadway improvements meant to reduce crash proneness. From the findings, it is recommended that crash-prone drivers be targeted regularly for special safety programs through education and regulations.
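    The modelling step described above is an ordinary binary logistic regression of at-fault status on crash and driver attributes. A minimal sketch with scikit-learn follows; the predictors and the simulated outcome are hypothetical and do not correspond to the Louisiana variables.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(4)
n = 2000
# Hypothetical predictors: prior at-fault crashes, night-time flag, young-driver flag.
X = np.column_stack([rng.poisson(0.5, n), rng.integers(0, 2, n), rng.integers(0, 2, n)])
logit = -1.0 + 0.8 * X[:, 0] + 0.4 * X[:, 1] + 0.6 * X[:, 2]
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))       # simulated at-fault outcome

model = LogisticRegression().fit(X, y)
pred = model.predict_proba(X)[:, 1] >= 0.5
tn, fp, fn, tp = confusion_matrix(y, pred).ravel()
print("sensitivity:", tp / (tp + fn), "specificity:", tn / (tn + fp))
```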

  15. Maximum-likelihood estimation of haplotype frequencies in nuclear families.

    Science.gov (United States)

    Becker, Tim; Knapp, Michael

    2004-07-01

    The importance of haplotype analysis in the context of association fine mapping of disease genes has grown steadily over the last years. Since experimental methods to determine haplotypes on a large scale are not available, phase has to be inferred statistically. For individual genotype data, several reconstruction techniques and many implementations of the expectation-maximization (EM) algorithm for haplotype frequency estimation exist. Recent research work has shown that incorporating available genotype information of related individuals largely increases the precision of haplotype frequency estimates. We, therefore, implemented a highly flexible program written in C, called FAMHAP, which calculates maximum likelihood estimates (MLEs) of haplotype frequencies from general nuclear families with an arbitrary number of children via the EM-algorithm for up to 20 SNPs. For more loci, we have implemented a locus-iterative mode of the EM-algorithm, which gives reliable approximations of the MLEs for up to 63 SNP loci, or less when multi-allelic markers are incorporated into the analysis. Missing genotypes can be handled as well. The program is able to distinguish cases (haplotypes transmitted to the first affected child of a family) from pseudo-controls (non-transmitted haplotypes with respect to the child). We tested the performance of FAMHAP and the accuracy of the obtained haplotype frequencies on a variety of simulated data sets. The implementation proved to work well when many markers were considered and no significant differences between the estimates obtained with the usual EM-algorithm and those obtained in its locus-iterative mode were observed. We conclude from the simulations that the accuracy of haplotype frequency estimation and reconstruction in nuclear families is very reliable in general and robust against missing genotypes.
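    At the core of such programs is the EM update for haplotype frequencies from unphased genotypes. The sketch below strips this down to two biallelic SNPs in unrelated individuals; FAMHAP's family-based weighting, missing-data handling and locus-iterative mode are not reproduced, and the genotype data are invented.

```python
from collections import Counter
from itertools import product

# Genotypes coded as 0/1/2 copies of the "1" allele at each of two SNPs (hypothetical data).
data = [(1, 1), (2, 0), (1, 1), (0, 1), (2, 2), (1, 2), (1, 1), (0, 0)]

haplotypes = list(product((0, 1), repeat=2))          # (0,0),(0,1),(1,0),(1,1)
freq = {h: 0.25 for h in haplotypes}                  # uniform starting frequencies

def compatible_pairs(geno):
    """All ordered haplotype pairs whose allele counts match the genotype."""
    return [(h1, h2) for h1 in haplotypes for h2 in haplotypes
            if all(h1[i] + h2[i] == geno[i] for i in range(2))]

for _ in range(100):                                   # EM iterations
    expected = Counter()
    for geno in data:
        pairs = compatible_pairs(geno)
        weights = [freq[h1] * freq[h2] for h1, h2 in pairs]
        total = sum(weights)
        for (h1, h2), w in zip(pairs, weights):
            expected[h1] += w / total                  # E-step: fractional haplotype counts
            expected[h2] += w / total
    n = 2 * len(data)
    freq = {h: expected[h] / n for h in haplotypes}    # M-step: renormalise

print({h: round(f, 3) for h, f in freq.items()})
```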

  16. Asperity-based earthquake likelihood models for Italy

    Directory of Open Access Journals (Sweden)

    Danijel Schorlemmer

    2010-11-01

    Full Text Available The Asperity Likelihood Model (ALM) hypothesizes that small-scale spatial variations in the b-value of the Gutenberg-Richter relationship have a central role in forecasting future seismicity. The physical basis of the ALM is the concept that the local b-value is inversely dependent on the applied shear stress. Thus low b-values (b < 0.7) characterize locked patches of faults, or asperities, from which future mainshocks are more likely to be generated, whereas high b-values (b > 1.1), which can be found, for example, in creeping sections of faults, suggest a lower probability of large events. To turn this hypothesis into a forecast model for Italy, we first determined the regional b-value (b = 0.93 ± 0.01) and compared it with the locally determined b-values at each node of the forecast grid, based on sampling radii ranging from 6 km to 20 km. We used the local b-values if their Akaike Information Criterion scores were lower than those of the regional b-values. We then explored two modifications to this model: in the ALM.IT, we declustered the input catalog for M ≥ 2 and smoothed the node-wise rates of the declustered catalog with a Gaussian filter. Completeness values for each node were determined using the probability-based magnitude of completeness method. In the second model, the hybrid ALM (HALM), as a «hybrid» between a grid-based and a zoning model, the Italian territory was divided into eight distinct regions depending on the main tectonic regimes, and the local b-value variability was thus mapped using the regional b-values for each tectonic zone.
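    The node-wise comparison in the ALM rests on the maximum likelihood b-value for magnitudes above the completeness threshold, b = log10(e) / (mean(M) - Mc) (Aki's estimator), and on comparing Akaike scores of the local fit against a fixed regional value. The sketch below illustrates that comparison on synthetic magnitudes; the regional value 0.93 is taken from the record, everything else is invented.

```python
import numpy as np

rng = np.random.default_rng(5)
mc = 2.0                                             # completeness magnitude
# Gutenberg-Richter magnitudes above Mc are exponential with rate beta = b * ln(10).
mags = mc + rng.exponential(scale=1.0 / (1.0 * np.log(10.0)), size=400)  # true b = 1.0

def log_likelihood(mags, mc, b):
    beta = b * np.log(10.0)
    return len(mags) * np.log(beta) - beta * np.sum(mags - mc)

b_local = np.log10(np.e) / (mags.mean() - mc)        # Aki's maximum likelihood estimate
b_regional = 0.93                                    # fixed regional value from the record

aic_local = 2 * 1 - 2 * log_likelihood(mags, mc, b_local)        # one free parameter
aic_regional = 2 * 0 - 2 * log_likelihood(mags, mc, b_regional)  # no free parameters
print(b_local, aic_local, aic_regional)
print("use local b" if aic_local < aic_regional else "use regional b")
```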

  17. Algorithms, data structures, and numerics for likelihood-based phylogenetic inference of huge trees

    Directory of Open Access Journals (Sweden)

    Izquierdo-Carrasco Fernando

    2011-12-01

    Full Text Available Abstract Background The rapid accumulation of molecular sequence data, driven by novel wet-lab sequencing technologies, poses new challenges for large-scale maximum likelihood-based phylogenetic analyses on trees with more than 30,000 taxa and several genes. The three main computational challenges are: numerical stability, the scalability of search algorithms, and the high memory requirements for computing the likelihood. Results We introduce methods for solving these three key problems and provide respective proof-of-concept implementations in RAxML. The mechanisms presented here are not RAxML-specific and can thus be applied to any likelihood-based (Bayesian or maximum likelihood) tree inference program. We develop a new search strategy that can reduce the time required for tree inferences by more than 50% while yielding equally good trees (in the statistical sense) for well-chosen starting trees. We present an adaptation of the Subtree Equality Vector technique for phylogenomic datasets with missing data (already available in RAxML v728) that can reduce execution times and memory requirements by up to 50%. Finally, we discuss issues pertaining to the numerical stability of the Γ model of rate heterogeneity on very large trees and argue in favor of rate heterogeneity models that use a single rate or rate category for each site to resolve these problems. Conclusions We address three major issues pertaining to large scale tree reconstruction under maximum likelihood and propose respective solutions. Respective proof-of-concept/production-level implementations of our ideas are made available as open-source code.

  18. High-dose intravenous vitamin C combined with cytotoxic chemotherapy in patients with advanced cancer: a phase I-II clinical trial.

    Directory of Open Access Journals (Sweden)

    L John Hoffer

    Full Text Available Biological and some clinical evidence suggest that high-dose intravenous vitamin C (IVC) could increase the effectiveness of cancer chemotherapy. IVC is widely used by integrative and complementary cancer therapists, but rigorous data are lacking as to its safety and which cancers and chemotherapy regimens would be the most promising to investigate in detail. We carried out a phase I-II safety, tolerability, pharmacokinetic and efficacy trial of IVC combined with chemotherapy in patients whose treating oncologist judged that standard-of-care or off-label chemotherapy offered less than a 33% likelihood of a meaningful response. We documented adverse events and toxicity associated with IVC infusions, determined pre- and post-chemotherapy vitamin C and oxalic acid pharmacokinetic profiles, and monitored objective clinical responses, mood and quality of life. Fourteen patients were enrolled. IVC was safe and generally well tolerated, although some patients experienced transient adverse events during or after IVC infusions. The pre- and post-chemotherapy pharmacokinetic profiles suggested that tissue uptake of vitamin C increases after chemotherapy, with no increase in urinary oxalic acid excretion. Three patients with different types of cancer experienced unexpected transient stable disease, increased energy and functional improvement. Despite IVC's biological and clinical plausibility, career cancer investigators currently ignore it while integrative cancer therapists use it widely but without reporting the kind of clinical data that is normally gathered in cancer drug development. The present study neither proves nor disproves IVC's value in cancer therapy, but it provides practical information, and indicates a feasible way to evaluate this plausible but unproven therapy in an academic environment that is currently uninterested in it. If carried out in sufficient numbers, simple studies like this one could identify specific clusters of cancer type

  19. FlowMax: A Computational Tool for Maximum Likelihood Deconvolution of CFSE Time Courses.

    Directory of Open Access Journals (Sweden)

    Maxim Nikolaievich Shokhirev

    Full Text Available The immune response is a concerted dynamic multi-cellular process. Upon infection, the dynamics of lymphocyte populations are an aggregate of molecular processes that determine the activation, division, and longevity of individual cells. The timing of these single-cell processes is remarkably widely distributed with some cells undergoing their third division while others undergo their first. High cell-to-cell variability and technical noise pose challenges for interpreting popular dye-dilution experiments objectively. It remains an unresolved challenge to avoid under- or over-interpretation of such data when phenotyping gene-targeted mouse models or patient samples. Here we develop and characterize a computational methodology to parameterize a cell population model in the context of noisy dye-dilution data. To enable objective interpretation of model fits, our method estimates fit sensitivity and redundancy by stochastically sampling the solution landscape, calculating parameter sensitivities, and clustering to determine the maximum-likelihood solution ranges. Our methodology accounts for both technical and biological variability by using a cell fluorescence model as an adaptor during population model fitting, resulting in improved fit accuracy without the need for ad hoc objective functions. We have incorporated our methodology into an integrated phenotyping tool, FlowMax, and used it to analyze B cells from two NFκB knockout mice with distinct phenotypes; we not only confirm previously published findings at a fraction of the expended effort and cost, but reveal a novel phenotype of nfkb1/p105/50 in limiting the proliferative capacity of B cells following B-cell receptor stimulation. In addition to complementing experimental work, FlowMax is suitable for high throughput analysis of dye dilution studies within clinical and pharmacological screens with objective and quantitative conclusions.

  20. The Multivariate Watson Distribution: Maximum-Likelihood Estimation and other Aspects

    CERN Document Server

    Sra, Suvrit

    2011-01-01

    This paper studies fundamental aspects of modelling data using multivariate Watson distributions. Although these distributions are natural for modelling axially symmetric data (i.e., unit vectors where ±x are equivalent), using them in high dimensions can be difficult. Why so? Largely because for Watson distributions even basic tasks such as maximum-likelihood estimation are numerically challenging. To tackle the numerical difficulties some approximations have been derived, but these are either grossly inaccurate in high dimensions (Directional Statistics, Mardia & Jupp, 2000) or, when reasonably accurate (J. Machine Learning Research, W. & C.P., v2, Bijral et al., 2007, pp. 35-42), they lack theoretical justification. We derive new approximations to the maximum-likelihood estimates; our approximations are theoretically well-defined, numerically accurate, and easy to compute. We build on our parameter estimation and discuss mixture-modelling with Watson distributions; here we uncover...
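    For the bipolar case (positive concentration) the maximum likelihood estimate of the Watson mean axis is the leading eigenvector of the sample scatter matrix; the numerically delicate part discussed above is the concentration parameter, whose estimate involves Kummer's confluent hypergeometric function and is omitted here. The sketch below estimates the axis only, on synthetic axial data.

```python
import numpy as np

rng = np.random.default_rng(6)
# Synthetic axial data: unit vectors scattered around the +/- e1 axis.
x = rng.normal(size=(500, 3)) * 0.3
x[:, 0] += np.where(rng.random(500) < 0.5, 1.0, -1.0)
x /= np.linalg.norm(x, axis=1, keepdims=True)

scatter = x.T @ x / len(x)                         # sample scatter matrix
eigvals, eigvecs = np.linalg.eigh(scatter)         # eigenvalues in ascending order
mu_hat = eigvecs[:, -1]                            # estimated mean axis (bipolar case)
print("estimated axis:", mu_hat, "largest eigenvalue:", eigvals[-1])
```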

  1. Efficient estimators for likelihood ratio sensitivity indices of complex stochastic dynamics

    Science.gov (United States)

    Arampatzis, Georgios; Katsoulakis, Markos A.; Rey-Bellet, Luc

    2016-03-01

    We demonstrate that centered likelihood ratio estimators for the sensitivity indices of complex stochastic dynamics are highly efficient with low, constant in time variance and consequently they are suitable for sensitivity analysis in long-time and steady-state regimes. These estimators rely on a new covariance formulation of the likelihood ratio that includes as a submatrix a Fisher information matrix for stochastic dynamics and can also be used for fast screening of insensitive parameters and parameter combinations. The proposed methods are applicable to broad classes of stochastic dynamics such as chemical reaction networks, Langevin-type equations and stochastic models in finance, including systems with a high dimensional parameter space and/or disparate decorrelation times between different observables. Furthermore, they are simple to implement as a standard observable in any existing simulation algorithm without additional modifications.

  2. A Unified Maximum Likelihood Approach to Document Retrieval.

    Science.gov (United States)

    Bodoff, David; Enache, Daniel; Kambil, Ajit; Simon, Gary; Yukhimets, Alex

    2001-01-01

    Addresses the query- versus document-oriented dichotomy in information retrieval. Introduces a maximum likelihood approach to utilizing feedback data that can be used to construct a concrete objective function that estimates both document and query parameters in accordance with all available feedback data. (AEF)

  3. Profile likelihood maps of a 15-dimensional MSSM

    NARCIS (Netherlands)

    Strege, C.; Bertone, G.; Besjes, G.J.; Caron, S.; Ruiz de Austri, R.; Strubig, A.; Trotta, R.

    2014-01-01

    We present statistically convergent profile likelihood maps obtained via global fits of a phenomenological Minimal Supersymmetric Standard Model with 15 free parameters (the MSSM-15), based on over 250M points. We derive constraints on the model parameters from direct detection limits on dark matter

  4. MAXIMUM-LIKELIHOOD-ESTIMATION OF THE ENTROPY OF AN ATTRACTOR

    NARCIS (Netherlands)

    SCHOUTEN, JC; TAKENS, F; VANDENBLEEK, CM

    1994-01-01

    In this paper, a maximum-likelihood estimate of the (Kolmogorov) entropy of an attractor is proposed that can be obtained directly from a time series. Also, the relative standard deviation of the entropy estimate is derived; it is dependent on the entropy and on the number of samples used in the est

  5. GPU Accelerated Likelihoods for Stereo-Based Articulated Tracking

    DEFF Research Database (Denmark)

    Friborg, Rune Møllegaard; Hauberg, Søren; Erleben, Kenny

    For many years articulated tracking has been an active research topic in the computer vision community. While working solutions have been suggested, computational time is still problematic. We present a GPU implementation of a ray-casting based likelihood model that is orders of magnitude faster...

  6. A KULLBACK-LEIBLER EMPIRICAL LIKELIHOOD INFERENCE FOR CENSORED DATA

    Institute of Scientific and Technical Information of China (English)

    SHI Jian; Tai-Shing Lau

    2002-01-01

    In this paper, two kinds of Kullback-Leibler criteria with appropriate constraints are proposed to construct empirical likelihood confidence intervals for the mean of right censored data. It is shown that one of the criteria is equivalent to Adimari's (1997) procedure, and the other shares the same asymptotic behavior.

  7. GPU accelerated likelihoods for stereo-based articulated tracking

    DEFF Research Database (Denmark)

    Friborg, Rune Møllegaard; Hauberg, Søren; Erleben, Kenny

    2010-01-01

    For many years articulated tracking has been an active research topic in the computer vision community. While working solutions have been suggested, computational time is still problematic. We present a GPU implementation of a ray-casting based likelihood model that is orders of magnitude faster...

  8. A KULLBACK-LEIBLER EMPIRICAL LIKELIHOOD INFERENCE FOR CENSORED DATA

    Institute of Scientific and Technical Information of China (English)

    SHI Jian; Tai-Shing Lau

    2002-01-01

    In this paper, two kinds of Kullback-Leibler criteria with appropriate constraints are proposed to construct empirical likelihood confidence intervals for the mean of right censored data. It is shown that one of the criteria is equivalent to Adimari's (1997) procedure, and the other shares the same asymptotic behavior.

  9. Community Level Disadvantage and the Likelihood of First Ischemic Stroke

    Directory of Open Access Journals (Sweden)

    Bernadette Boden-Albala

    2012-01-01

    Full Text Available Background and Purpose. Residing in “disadvantaged” communities may increase morbidity and mortality independent of individual social resources and biological factors. This study evaluates the impact of population-level disadvantage on incident ischemic stroke likelihood in a multiethnic urban population. Methods. A population based case-control study was conducted in an ethnically diverse community of New York. First ischemic stroke cases and community controls were enrolled and a stroke risk assessment performed. Data regarding population level economic indicators for each census tract were assembled using geocoding. Census variables were also grouped together to define a broader measure of collective disadvantage. We evaluated the likelihood of stroke for population-level variables controlling for individual social (education, social isolation, and insurance) and vascular risk factors. Results. We age-, sex-, and race-ethnicity-matched 687 incident ischemic stroke cases to 1153 community controls. The mean age was 69 years: 60% women; 22% white, 28% black, and 50% Hispanic. After adjustment, the index of community level disadvantage (OR 2.0, 95% CI 1.7–2.1) was associated with increased stroke likelihood overall and among all three race-ethnic groups. Conclusion. Social inequalities measured by census tract data, including indices of community disadvantage, confer a significant likelihood of ischemic stroke independent of conventional risk factors.

  10. Heteroscedastic one-factor models and marginal maximum likelihood estimation

    NARCIS (Netherlands)

    Hessen, D.J.; Dolan, C.V.

    2009-01-01

    In the present paper, a general class of heteroscedastic one-factor models is considered. In these models, the residual variances of the observed scores are explicitly modelled as parametric functions of the one-dimensional factor score. A marginal maximum likelihood procedure for parameter estimati

  11. Bias Correction for Alternating Iterative Maximum Likelihood Estimators

    Institute of Scientific and Technical Information of China (English)

    Gang YU; Wei GAO; Ningzhong SHI

    2013-01-01

    In this paper, we give a definition of the alternating iterative maximum likelihood estimator (AIMLE), which is a biased estimator. Furthermore, we adjust the AIMLE to obtain asymptotically unbiased and consistent estimators by using a bootstrap iterative bias correction method as in Kuk (1995). Two examples and simulation results are reported to illustrate the performance of the bias correction for the AIMLE.

  12. Maximum likelihood Jukes-Cantor triplets: analytic solutions.

    Science.gov (United States)

    Chor, Benny; Hendy, Michael D; Snir, Sagi

    2006-03-01

    Maximum likelihood (ML) is a popular method for inferring a phylogenetic tree of the evolutionary relationships of a set of taxa from observed homologous aligned genetic sequences of the taxa. Generally, the computation of the ML tree is based on numerical methods, which in a few cases are known to converge to a local maximum on a tree that is suboptimal. The extent of this problem is unknown; one approach is to attempt to derive algebraic equations for the likelihood and find the maximum points analytically. This approach has so far only been successful in the very simplest cases, of three or four taxa under the Neyman model of evolution of two-state characters. In this paper we extend this approach, for the first time, to four-state characters, the Jukes-Cantor model under a molecular clock, on a tree T on three taxa, a rooted triple. We employ spectral methods (Hadamard conjugation) to express the likelihood function parameterized by the path-length spectrum. Taking partial derivatives, we derive a set of polynomial equations whose simultaneous solution contains all critical points of the likelihood function. Using tools of algebraic geometry (the resultant of two polynomials) in the computer algebra package Maple, we are able to find all turning points analytically. We then employ this method on real sequence data and obtain realistic results on the primate-rodent divergence time.
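    The analytic solutions in this record concern a rooted triple under a molecular clock; the flavour of closed-form likelihood maximization under Jukes-Cantor is easiest to see in the pairwise case, where the ML distance is d = -(3/4) ln(1 - 4p/3) with p the observed proportion of differing sites. The sketch below implements only this standard pairwise formula, with made-up sequences.

```python
import math

def jc69_ml_distance(seq1, seq2):
    """Closed-form maximum likelihood distance under the Jukes-Cantor model."""
    assert len(seq1) == len(seq2)
    p = sum(a != b for a, b in zip(seq1, seq2)) / len(seq1)
    if p >= 0.75:
        return math.inf                      # likelihood has no finite maximum
    return -0.75 * math.log(1.0 - 4.0 * p / 3.0)

print(jc69_ml_distance("ACGTACGTACGTACGTACGT",
                       "ACGTACGAACGTACTTACGT"))   # 2 differences out of 20 sites
```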

  13. A Monte Carlo Evaluation of Maximum Likelihood Multidimensional Scaling Methods

    NARCIS (Netherlands)

    Bijmolt, T.H.A.; Wedel, M.

    1996-01-01

    We compare three alternative Maximum Likelihood Multidimensional Scaling methods for pairwise dissimilarity ratings, namely MULTISCALE, MAXSCAL, and PROSCAL, in a Monte Carlo study. The three MLMDS methods recover the true configurations very well. The recovery of the true dimensionality depends on the

  14. Maximum likelihood estimation of phase-type distributions

    DEFF Research Database (Denmark)

    Esparza, Luz Judith R

    This work is concerned with the statistical inference of phase-type distributions and the analysis of distributions with rational Laplace transform, known as matrix-exponential distributions. The thesis is focused on the estimation of the maximum likelihood parameters of phase-type distributions ...

  15. Likelihood Inference for a Nonstationary Fractional Autoregressive Model

    DEFF Research Database (Denmark)

    Johansen, Søren; Nielsen, Morten Ørregaard

    This paper discusses model based inference in an autoregressive model for fractional processes based on the Gaussian likelihood. The model allows for the process to be fractional of order d or d - b, where d ≥ b > 1/2 are parameters to be estimated. We model the data X(1), ..., X(T) given the initial...

  16. Composite likelihood and two-stage estimation in family studies

    DEFF Research Database (Denmark)

    Andersen, Elisabeth Anne Wreford

    2004-01-01

    In this paper register based family studies provide the motivation for linking a two-stage estimation procedure in copula models for multivariate failure time data with a composite likelihood approach. The asymptotic properties of the estimators in both parametric and semi-parametric models are d...

  17. Likelihood Inference for a Fractionally Cointegrated Vector Autoregressive Model

    DEFF Research Database (Denmark)

    Johansen, Søren; Nielsen, Morten Ørregaard

    We consider model based inference in a fractionally cointegrated (or cofractional) vector autoregressive model based on the conditional Gaussian likelihood. The model allows the process X(t) to be fractional of order d and cofractional of order d-b; that is, there exist vectors β for which β...

  18. Trimmed Likelihood-based Estimation in Binary Regression Models

    NARCIS (Netherlands)

    Cizek, P.

    2005-01-01

    The binary-choice regression models such as probit and logit are typically estimated by the maximum likelihood method. To improve its robustness, various M-estimation based procedures were proposed, which however require bias corrections to achieve consistency and their resistance to outliers is rela

  19. Maximum Likelihood Estimation of Nonlinear Structural Equation Models.

    Science.gov (United States)

    Lee, Sik-Yum; Zhu, Hong-Tu

    2002-01-01

    Developed an EM type algorithm for maximum likelihood estimation of a general nonlinear structural equation model in which the E-step is completed by a Metropolis-Hastings algorithm. Illustrated the methodology with results from a simulation study and two real examples using data from previous studies. (SLD)

  20. Reconceptualizing Social Influence in Counseling: The Elaboration Likelihood Model.

    Science.gov (United States)

    McNeill, Brian W.; Stoltenberg, Cal D.

    1989-01-01

    Presents Elaboration Likelihood Model (ELM) of persuasion (a reconceptualization of the social influence process) as alternative model of attitude change. Contends ELM unifies conflicting social psychology results and can potentially account for inconsistent research findings in counseling psychology. Provides guidelines on integrating…

  1. Maximum likelihood estimation of the attenuated ultrasound pulse

    DEFF Research Database (Denmark)

    Rasmussen, Klaus Bolding

    1994-01-01

    The attenuated ultrasound pulse is divided into two parts: a stationary basic pulse and a nonstationary attenuation pulse. A standard ARMA model is used for the basic pulse, and a nonstandard ARMA model is derived for the attenuation pulse. The maximum likelihood estimator of the attenuated...

  2. Microarray background correction: maximum likelihood estimation for the normal-exponential convolution

    DEFF Research Database (Denmark)

    Silver, Jeremy D; Ritchie, Matthew E; Smyth, Gordon K

    2009-01-01

    is developed for exact maximum likelihood estimation (MLE) using high-quality optimization software and using the saddle-point estimates as starting values. "MLE" is shown to outperform heuristic estimators proposed by other authors, both in terms of estimation accuracy and in terms of performance on real data....... The saddle-point approximation is an adequate replacement in most practical situations. The performance of normexp for assessing differential expression is improved by adding a small offset to the corrected intensities....
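
    As a rough illustration of what direct maximum likelihood estimation for this convolution model involves, the sketch below fits the normal-plus-exponential density to simulated intensities with a generic optimiser; it is a simplified stand-in for the saddle-point-initialised optimisation described in the record, and all numerical values are made up.

      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import norm

      def normexp_logpdf(x, mu, sigma, alpha):
          # log density of X = Normal(mu, sigma^2) background + Exponential(mean alpha) signal
          return (-np.log(alpha)
                  - (x - mu) / alpha
                  + sigma**2 / (2.0 * alpha**2)
                  + norm.logcdf((x - mu - sigma**2 / alpha) / sigma))

      def negative_log_likelihood(params, x):
          mu, log_sigma, log_alpha = params      # log-parameterise the scales to keep them positive
          return -np.sum(normexp_logpdf(x, mu, np.exp(log_sigma), np.exp(log_alpha)))

      rng = np.random.default_rng(3)
      data = rng.normal(100.0, 10.0, 5000) + rng.exponential(50.0, 5000)   # toy probe intensities

      fit = minimize(negative_log_likelihood, x0=[80.0, np.log(5.0), np.log(20.0)],
                     args=(data,), method="Nelder-Mead")
      mu_hat, sigma_hat, alpha_hat = fit.x[0], np.exp(fit.x[1]), np.exp(fit.x[2])
      print(mu_hat, sigma_hat, alpha_hat)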

  3. Seasonal species interactions minimize the impact of species turnover on the likelihood of community persistence

    OpenAIRE

    Saavedra, Serguei; Rohr, Rudolf P.; Fortuna, Miguel A.; Selva, Nuria; Bascompte, Jordi

    2015-01-01

    Many of the observed species interactions embedded in ecological communities are not permanent, but are characterized by temporal changes that are observed along with abiotic and biotic variations. While work has been done describing and quantifying these changes, little is known about their consequences for species coexistence. Here, we investigate the extent to which changes of species composition impact the likelihood of persistence of the predator–prey community in the highly seasonal Bia...

  4. Exclusion probabilities and likelihood ratios with applications to mixtures.

    Science.gov (United States)

    Slooten, Klaas-Jan; Egeland, Thore

    2016-01-01

    The statistical evidence obtained from mixed DNA profiles can be summarised in several ways in forensic casework including the likelihood ratio (LR) and the Random Man Not Excluded (RMNE) probability. The literature has seen a discussion of the advantages and disadvantages of likelihood ratios and exclusion probabilities, and part of our aim is to bring some clarification to this debate. In a previous paper, we proved that there is a general mathematical relationship between these statistics: RMNE can be expressed as a certain average of the LR, implying that the expected value of the LR, when applied to an actual contributor to the mixture, is at least equal to the inverse of the RMNE. While the mentioned paper presented applications for kinship problems, the current paper demonstrates the relevance for mixture cases, and for this purpose, we prove some new general properties. We also demonstrate how to use the distribution of the likelihood ratio for donors of a mixture, to obtain estimates for exceedance probabilities of the LR for non-donors, of which the RMNE is a special case corresponding to LR > 0. In order to derive these results, we need to view the likelihood ratio as a random variable. In this paper, we describe how such a randomization can be achieved. The RMNE is usually invoked only for mixtures without dropout. In mixtures, artefacts like dropout and drop-in are commonly encountered and we address this situation too, illustrating our results with a basic but widely implemented model, a so-called binary model. The precise definitions, modelling and interpretation of the required concepts of dropout and drop-in are not entirely obvious, and we attempt to clarify them here in a general likelihood framework for a binary model.

  5. A Clinical model to identify patients with high-risk coronary artery disease

    NARCIS (Netherlands)

    Y. Yang (Yelin); L. Chen (Li); Y. Yam (Yeung); S. Achenbach (Stephan); M. Al-Mallah (Mouaz); D.S. Berman (Daniel); M.J. Budoff (Matthew); F. Cademartiri (Filippo); T.Q. Callister (Tracy); H.-J. Chang (Hyuk-Jae); V.Y. Cheng (Victor); K. Chinnaiyan (Kavitha); R.C. Cury (Ricardo); A. Delago (Augustin); A. Dunning (Allison); G.M. Feuchtner (Gudrun); M. Hadamitzky (Martin); J. Hausleiter (Jörg); R.P. Karlsberg (Ronald); P.A. Kaufmann (Philipp); Y.-J. Kim (Yong-Jin); J. Leipsic (Jonathon); T.M. LaBounty (Troy); F.Y. Lin (Fay); E. Maffei (Erica); G.L. Raff (Gilbert); L.J. Shaw (Leslee); T.C. Villines (Todd); J.K. Min (James K.); B.J.W. Chow (Benjamin)

    2015-01-01

    textabstractObjectives This study sought to develop a clinical model that identifies patients with and without high-risk coronary artery disease (CAD). Background Although current clinical models help to estimate a patient's pre-test probability of obstructive CAD, they do not accurately identify th

  7. The Impact of Data Fragmentation on High-Throughput Clinical Phenotyping

    Science.gov (United States)

    Wei, Weiqi

    2012-01-01

    Subject selection is essential and has become the rate-limiting step for harvesting knowledge to advance healthcare through clinical research. Present manual approaches inhibit researchers from conducting deep and broad studies and drawing confident conclusions. High-throughput clinical phenotyping (HTCP), a recently proposed approach, leverages…

  8. Predicting rotator cuff tears using data mining and Bayesian likelihood ratios.

    Directory of Open Access Journals (Sweden)

    Hsueh-Yi Lu

    Full Text Available OBJECTIVES: Rotator cuff tear is a common cause of shoulder diseases. Correct diagnosis of rotator cuff tears can save patients from further invasive, costly and painful tests. This study used predictive data mining and Bayesian theory to improve the accuracy of diagnosing rotator cuff tears by clinical examination alone. METHODS: In this retrospective study, 169 patients who had a preliminary diagnosis of rotator cuff tear on the basis of clinical evaluation followed by confirmatory MRI between 2007 and 2011 were identified. MRI was used as a reference standard to classify rotator cuff tears. The predictor variable was the clinical assessment results, which consisted of 16 attributes. This study employed 2 data mining methods (ANN and the decision tree) and a statistical method (logistic regression) to classify the rotator cuff diagnosis into "tear" and "no tear" groups. Likelihood ratio and Bayesian theory were applied to estimate the probability of rotator cuff tears based on the results of the prediction models. RESULTS: Our proposed data mining procedures outperformed the classic statistical method. The correction rate, sensitivity, specificity and area under the ROC curve of predicting a rotator cuff tear were statistically better in the ANN and decision tree models compared to logistic regression. Based on likelihood ratios derived from our prediction models, Fagan's nomogram could be constructed to assess the probability of a patient who has a rotator cuff tear using a pretest probability and a prediction result (tear or no tear). CONCLUSIONS: Our predictive data mining models, combined with likelihood ratios and Bayesian theory, appear to be good tools to classify rotator cuff tears as well as determine the probability of the presence of the disease to enhance diagnostic decision making for rotator cuff tears.
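
    The Bayesian step described above is a small calculation that a Fagan nomogram performs graphically. The sketch below converts an assumed pretest probability and the likelihood ratios implied by a classifier's sensitivity and specificity into posttest probabilities; the numbers are illustrative assumptions, not the study's values.

      def likelihood_ratios(sensitivity, specificity):
          # Positive and negative likelihood ratios of a binary test or classifier.
          lr_pos = sensitivity / (1.0 - specificity)
          lr_neg = (1.0 - sensitivity) / specificity
          return lr_pos, lr_neg

      def posttest_probability(pretest_prob, lr):
          # Pretest probability -> pretest odds -> posttest odds -> posttest probability.
          pretest_odds = pretest_prob / (1.0 - pretest_prob)
          posttest_odds = pretest_odds * lr
          return posttest_odds / (1.0 + posttest_odds)

      # Assumed (illustrative) operating characteristics and pretest probability.
      lr_pos, lr_neg = likelihood_ratios(sensitivity=0.87, specificity=0.80)
      print(posttest_probability(0.60, lr_pos))   # model predicts "tear"
      print(posttest_probability(0.60, lr_neg))   # model predicts "no tear"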

  9. Resident satisfaction with continuity clinic and career choice in general internal medicine.

    Science.gov (United States)

    Peccoralo, Lauren A; Tackett, Sean; Ward, Lawrence; Federman, Alex; Helenius, Ira; Christmas, Colleen; Thomas, David C

    2013-08-01

    The quality of the continuity clinic experience for internal medicine (IM) residents may influence their choice to enter general internal medicine (GIM), yet few data exist to support this hypothesis. To assess the relationship between IM residents' satisfaction with continuity clinic and interest in GIM careers. Cross-sectional survey assessing satisfaction with elements of continuity clinic and residents' likelihood of career choice in GIM. IM residents at three urban medical centers. Bivariate and multivariate associations between satisfaction with 32 elements of outpatient clinic in 6 domains (clinical preceptors, educational environment, ancillary staff, time management, administrative, personal experience) and likelihood of considering a GIM career. Of the 225 (90 %) residents who completed surveys, 48 % planned to enter GIM before beginning their continuity clinic, whereas only 38 % did as a result of continuity clinic. Comparing residents' likelihood to enter GIM as a result of clinic to likelihood to enter a career in GIM before clinic showed that 59 % of residents had no difference in likelihood, 28 % reported a lower likelihood as a result of clinic, and 11 % reported higher likelihood as a result of clinic. Most residents were very satisfied or satisfied with all clinic elements. Significantly more residents (p ≤ 0.002) were likely vs. unlikely to enter GIM if they were very satisfied with faculty mentorship (76 % vs. 53 %), time for appointments (28 % vs. 11 %), number of patients seen (33 % vs. 15 %), personal reward from work (51 % vs. 23 %), relationship with patients (64 % vs. 42 %), and continuity with patients (57 % vs. 33 %). In the multivariate analysis, being likely to enter GIM before clinic (OR 29.0, 95 % CI 24.0-34.8) and being very satisfied with the continuity of relationships with patients (OR 4.08, 95 % CI 2.50-6.64) were the strongest independent predictors of likelihood to enter GIM as a result of clinic. Resident satisfaction

  10. Adaptive speckle reduction of ultrasound images based on maximum likelihood estimation

    Institute of Scientific and Technical Information of China (English)

    Xu Liu(刘旭); Yongfeng Huang(黄永锋); Wende Shou(寿文德); Tao Ying(应涛)

    2004-01-01

    A method has been developed in this paper to gain effective speckle reduction in medical ultrasound images. To exploit full knowledge of the speckle distribution, maximum likelihood was used here to estimate the speckle parameters corresponding to its statistical mode. The results were then incorporated into nonlinear anisotropic diffusion to achieve adaptive speckle reduction. Verified with simulated and ultrasound images, we show that this algorithm is capable of enhancing features of clinical interest and reduces speckle noise more efficiently than simply applying classical filters. To avoid edge contribution, changes of the contrast-to-noise ratio in different regions are also compared to investigate the performance of this approach.

  11. Probability calculus for quantitative HREM. Part II: entropy and likelihood concepts.

    Science.gov (United States)

    Möbus, G

    2000-12-01

    The technique of extracting atomic coordinates from HREM images by R-factor refinement via iterative simulation and global optimisation is described in the context of probability density estimations for unknown parameters. In the second part of this two-part paper we discuss in comparison maximum likelihood and maximum entropy techniques with respect to their suitability of application within HREM. We outline practical difficulties of likelihood estimation and present a synthesis of two point-cloud techniques as a recommendable solution. This R-factor refinement with independent Monte-Carlo error calibration is a highly versatile method which allows adaptation to the special needs of HREM. Unlike simple text-book estimation methods, there is no requirement here on the noise being additive, uncorrelated, or Gaussian. It also becomes possible to account for a subset of systematic errors.

  12. Analyzing weak lensing of the cosmic microwave background using the likelihood function

    CERN Document Server

    Hirata, C M; Hirata, Christopher M.; Seljak, Uros

    2003-01-01

    Future experiments will produce high-resolution temperature maps of the cosmic microwave background (CMB) and are expected to reveal the signature of gravitational lensing by intervening large-scale structures. We construct all-sky maximum-likelihood estimators that use the lensing effect to estimate the projected density (convergence) of these structures, its power spectrum, and cross-correlation with other observables. This contrasts with earlier quadratic-estimator approaches that Taylor-expanded the observed CMB temperature to linear order in the lensing deflection angle; these approaches gave estimators for the temperature-convergence correlation in terms of the CMB three-point correlation function and for the convergence power spectrum in terms of the CMB four-point correlation function, which can be biased and non-optimal due to terms beyond the linear order. We show that for sufficiently weak lensing, the maximum-likelihood estimator reduces to the computationally less demanding quadratic estimator. T...

  13. ABC of SV: Limited Information Likelihood Inference in Stochastic Volatility Jump-Diffusion Models

    DEFF Research Database (Denmark)

    Creel, Michael; Kristensen, Dennis

    We develop novel methods for estimation and filtering of continuous-time models with stochastic volatility and jumps using so-called Approximate Bayesian Computation which build likelihoods based on limited information. The proposed estimators and filters are computationally attractive relative...... to standard likelihood-based versions since they rely on low-dimensional auxiliary statistics and so avoid computation of high-dimensional integrals. Despite their computational simplicity, we find that estimators and filters perform well in practice and lead to precise estimates of model parameters...... stochastic volatility model for the dynamics of the S&P 500 equity index. We find evidence of the presence of a dynamic jump rate and in favor of a structural break in parameters at the time of the recent financial crisis. We find evidence that possible measurement error in log price is small and has little...

  14. Maximum-likelihood fits to histograms for improved parameter estimation

    CERN Document Server

    Fowler, Joseph W

    2013-01-01

    Straightforward methods for adapting the familiar chi^2 statistic to histograms of discrete events and other Poisson distributed data generally yield biased estimates of the parameters of a model. The bias can be important even when the total number of events is large. For the case of estimating a microcalorimeter's energy resolution at 6 keV from the observed shape of the Mn K-alpha fluorescence spectrum, a poor choice of chi^2 can lead to biases of at least 10% in the estimated resolution when up to thousands of photons are observed. The best remedy is a Poisson maximum-likelihood fit, through a simple modification of the standard Levenberg-Marquardt algorithm for chi^2 minimization. Where the modification is not possible, another approach allows iterative approximation of the maximum-likelihood fit.
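
    A minimal sketch of the remedy, assuming a toy Gaussian peak rather than the Mn K-alpha line shape: each histogram bin is treated as an independent Poisson count and the Poisson negative log-likelihood is minimised directly with a generic optimiser, instead of minimising a chi^2 statistic.

      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import norm

      rng = np.random.default_rng(0)
      events = rng.normal(loc=6.0, scale=0.005, size=3000)            # toy "photon energies" in keV
      counts, edges = np.histogram(events, bins=100, range=(5.97, 6.03))
      centers = 0.5 * (edges[:-1] + edges[1:])
      width = edges[1] - edges[0]

      def expected_counts(theta):
          amplitude, mu, sigma = theta
          return amplitude * width * norm.pdf(centers, mu, sigma)

      def poisson_nll(theta):
          lam = np.clip(expected_counts(theta), 1e-12, None)
          # Negative log-likelihood for independent Poisson bins,
          # dropping the data-only log(n!) term, which does not affect the minimiser.
          return np.sum(lam - counts * np.log(lam))

      fit = minimize(poisson_nll, x0=[3000.0, 6.0, 0.01], method="Nelder-Mead")
      print(fit.x)    # fitted amplitude, peak centre, and resolution (sigma)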

  15. Measures of family resemblance for binary traits: likelihood based inference.

    Science.gov (United States)

    Shoukri, Mohamed M; ElDali, Abdelmoneim; Donner, Allan

    2012-07-24

    Detection and estimation of measures of familial aggregation is considered the first step in establishing whether a certain disease has a genetic component. Such measures are usually estimated from observational studies on siblings, parent-offspring pairs, extended pedigrees or twins. When the trait of interest is quantitative (e.g., blood pressure, body mass index, blood glucose levels, etc.), efficient likelihood estimation of such measures is feasible under the assumption of multivariate normality of the distributions of the traits. In this case the intra-class and inter-class correlations are used to assess the similarities among family members. When the trait is measured on the binary scale, we establish full likelihood inference on such measures among siblings, parents, and parent-offspring pairs. We illustrate the methodology on nuclear family data where the trait is the presence or absence of hypertension.

  16. Applications of the Likelihood Theory in Finance: Modelling and Pricing

    CERN Document Server

    Janssen, Arnold

    2012-01-01

    This paper discusses the connection between mathematical finance and statistical modelling, which turns out to be more than a formal mathematical correspondence. We aim to show how common results and notions in statistics, and their meaning, can be translated to the world of mathematical finance and vice versa. Many of the similarities can be expressed in terms of Le Cam's theory for statistical experiments, which is the theory of the behaviour of likelihood processes. For positive prices the arbitrage-free financial assets fit into filtered experiments. It is shown that they are given by filtered likelihood ratio processes. From the statistical point of view, martingale measures, completeness and pricing formulas are revisited. The pricing formulas for various options are connected with the power functions of tests. For instance, the Black-Scholes price of a European option has an interpretation as the Bayes risk of a Neyman-Pearson test. Under contiguity the convergence of financial experiments and option prices ...

  17. A composite likelihood approach for spatially correlated survival data.

    Science.gov (United States)

    Paik, Jane; Ying, Zhiliang

    2013-01-01

    The aim of this paper is to provide a composite likelihood approach to handle spatially correlated survival data using pairwise joint distributions. With e-commerce data, a recent question of interest in marketing research has been to describe spatially clustered purchasing behavior and to assess whether geographic distance is the appropriate metric to describe purchasing dependence. We present a model for the dependence structure of time-to-event data subject to spatial dependence to characterize purchasing behavior from the motivating example from e-commerce data. We assume the Farlie-Gumbel-Morgenstern (FGM) distribution and then model the dependence parameter as a function of geographic and demographic pairwise distances. For estimation of the dependence parameters, we present pairwise composite likelihood equations. We prove that the resulting estimators exhibit key properties of consistency and asymptotic normality under certain regularity conditions in the increasing-domain framework of spatial asymptotic theory.

  18. Smoothed log-concave maximum likelihood estimation with applications

    CERN Document Server

    Chen, Yining

    2011-01-01

    We study the smoothed log-concave maximum likelihood estimator of a probability distribution on $\mathbb{R}^d$. This is a fully automatic nonparametric density estimator, obtained as a canonical smoothing of the log-concave maximum likelihood estimator. We demonstrate its attractive features both through an analysis of its theoretical properties and a simulation study. Moreover, we show how the estimator can be used as an intermediate stage of more involved procedures, such as constructing a classifier or estimating a functional of the density. Here again, the use of the estimator can be justified both on theoretical grounds and through its finite sample performance, and we illustrate its use in a breast cancer diagnosis (classification) problem.

  19. Gaussian maximum likelihood and contextual classification algorithms for multicrop classification

    Science.gov (United States)

    Di Zenzo, Silvano; Bernstein, Ralph; Kolsky, Harwood G.; Degloria, Stephen D.

    1987-01-01

    The paper reviews some of the ways in which context has been handled in the remote-sensing literature, and additional possibilities are introduced. The problem of computing exhaustive and normalized class-membership probabilities from the likelihoods provided by the Gaussian maximum likelihood classifier (to be used as initial probability estimates to start relaxation) is discussed. An efficient implementation of probabilistic relaxation is proposed, suiting the needs of actual remote-sensing applications. A modified fuzzy-relaxation algorithm using generalized operations between fuzzy sets is presented. Combined use of the two relaxation algorithms is proposed to exploit context in multispectral classification of remotely sensed data. Results on both one artificially created image and one MSS data set are reported.
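
    The normalisation step mentioned above is straightforward to sketch. The fragment below turns per-class Gaussian likelihoods for a single pixel into exhaustive, normalised class-membership probabilities of the kind used to initialise relaxation; the class statistics and priors are illustrative placeholders, not values from the paper.

      import numpy as np
      from scipy.stats import multivariate_normal

      # Illustrative per-class statistics for a two-band image and equal priors.
      class_means = [np.array([0.2, 0.4]), np.array([0.5, 0.3]), np.array([0.7, 0.8])]
      class_covs = [0.010 * np.eye(2), 0.020 * np.eye(2), 0.015 * np.eye(2)]
      priors = np.array([1.0, 1.0, 1.0]) / 3.0

      def class_membership_probabilities(pixel):
          log_lik = np.array([multivariate_normal.logpdf(pixel, mean, cov)
                              for mean, cov in zip(class_means, class_covs)])
          log_post = np.log(priors) + log_lik
          log_post -= log_post.max()              # numerical stabilisation before exponentiating
          post = np.exp(log_post)
          return post / post.sum()                # exhaustive probabilities summing to one

      print(class_membership_probabilities(np.array([0.45, 0.35])))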

  20. Secondary Analysis under Cohort Sampling Designs Using Conditional Likelihood

    Directory of Open Access Journals (Sweden)

    Olli Saarela

    2012-01-01

    Full Text Available Under cohort sampling designs, additional covariate data are collected on cases of a specific type and a randomly selected subset of noncases, primarily for the purpose of studying associations with a time-to-event response of interest. With such data available, an interest may arise to reuse them for studying associations between the additional covariate data and a secondary non-time-to-event response variable, usually collected for the whole study cohort at the outset of the study. Following earlier literature, we refer to such a situation as secondary analysis. We outline a general conditional likelihood approach for secondary analysis under cohort sampling designs and discuss the specific situations of case-cohort and nested case-control designs. We also review alternative methods based on full likelihood and inverse probability weighting. We compare the alternative methods for secondary analysis in two simulated settings and apply them in a real-data example.

  1. A Weighted Likelihood Ratio of Two Related Negative Hypergeomeric Distributions

    Institute of Scientific and Technical Information of China (English)

    Titi Obilade

    2004-01-01

    In this paper we consider some related negative hypergeometric distributions arising from the problem of sampling without replacement from an urn containing balls of different colours and in different proportions, but stopping only after some specific number of balls of different colours has been obtained. With the aid of some simple recurrence relations and identities we obtain, in the case of two colours, the moments for the maximum negative hypergeometric distribution, the minimum negative hypergeometric distribution, the likelihood ratio negative hypergeometric distribution and consequently the likelihood proportional negative hypergeometric distribution. To the extent that the sampling scheme is applicable to modelling data, as illustrated with a biological example, and in fact to many situations of estimating Bernoulli parameters for binary traits within a finite population, these are important first-step results.

  2. Bayesian experimental design for models with intractable likelihoods.

    Science.gov (United States)

    Drovandi, Christopher C; Pettitt, Anthony N

    2013-12-01

    In this paper we present a methodology for designing experiments for efficiently estimating the parameters of models with computationally intractable likelihoods. The approach combines a commonly used methodology for robust experimental design, based on Markov chain Monte Carlo sampling, with approximate Bayesian computation (ABC) to ensure that no likelihood evaluations are required. The utility function considered for precise parameter estimation is based upon the precision of the ABC posterior distribution, which we form efficiently via the ABC rejection algorithm based on pre-computed model simulations. Our focus is on stochastic models and, in particular, we investigate the methodology for Markov process models of epidemics and macroparasite population evolution. The macroparasite example involves a multivariate process and we assess the loss of information from not observing all variables.
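
    The ABC rejection step that the design utility builds on can be sketched compactly. The toy model below (Poisson counts with a Gamma prior on the rate) is an illustrative assumption, far simpler than the epidemic and macroparasite models in the record; it only shows the mechanics of pre-computing simulations and accepting draws whose summaries fall within a tolerance of the observed summary.

      import numpy as np

      rng = np.random.default_rng(1)
      observed = rng.poisson(lam=4.0, size=50)          # stand-in for the observed data
      obs_summary = observed.mean()

      n_sims, tolerance = 100_000, 0.1
      prior_draws = rng.gamma(shape=2.0, scale=2.0, size=n_sims)        # prior over the rate

      # Pre-computed model simulations, one synthetic dataset per prior draw.
      sim_summaries = rng.poisson(lam=prior_draws[:, None], size=(n_sims, 50)).mean(axis=1)

      # ABC rejection: keep draws whose summary statistic is close to the observed one.
      accepted = prior_draws[np.abs(sim_summaries - obs_summary) <= tolerance]
      print(accepted.mean(), accepted.std())            # ABC posterior mean and spread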

  3. A composite likelihood approach for spatially correlated survival data

    Science.gov (United States)

    Paik, Jane; Ying, Zhiliang

    2013-01-01

    The aim of this paper is to provide a composite likelihood approach to handle spatially correlated survival data using pairwise joint distributions. With e-commerce data, a recent question of interest in marketing research has been to describe spatially clustered purchasing behavior and to assess whether geographic distance is the appropriate metric to describe purchasing dependence. We present a model for the dependence structure of time-to-event data subject to spatial dependence to characterize purchasing behavior from the motivating example from e-commerce data. We assume the Farlie-Gumbel-Morgenstern (FGM) distribution and then model the dependence parameter as a function of geographic and demographic pairwise distances. For estimation of the dependence parameters, we present pairwise composite likelihood equations. We prove that the resulting estimators exhibit key properties of consistency and asymptotic normality under certain regularity conditions in the increasing-domain framework of spatial asymptotic theory. PMID:24223450

  4. Maximum likelihood method and Fisher's information in physics and econophysics

    CERN Document Server

    Syska, Jacek

    2012-01-01

    Three steps in the development of the maximum likelihood (ML) method are presented. First, the application of the ML method and Fisher information notion in the model selection analysis is described (Chapter 1). The fundamentals of differential geometry in the construction of the statistical space are introduced, illustrated also by examples of the estimation of the exponential models. Second, the notions of the relative entropy and the information channel capacity are introduced (Chapter 2). The observed and expected structural information principle (IP) and the variational IP of the modified extremal physical information (EPI) method of Frieden and Soffer are presented and discussed (Chapter 3). The derivation of the structural IP based on the analyticity of the logarithm of the likelihood function and on the metricity of the statistical space of the system is given. Third, the use of the EPI method is developed (Chapters 4-5). The information channel capacity is used for the field theory models cl...

  5. Maximum-likelihood estimation prevents unphysical Mueller matrices

    CERN Document Server

    Aiello, A; Voigt, D; Woerdman, J P

    2005-01-01

    We show that the method of maximum-likelihood estimation, recently introduced in the context of quantum process tomography, can be applied to the determination of Mueller matrices characterizing the polarization properties of classical optical systems. Contrary to linear reconstruction algorithms, the proposed method yields physically acceptable Mueller matrices even in the presence of uncontrolled experimental errors. We illustrate the method on the case of an unphysical measured Mueller matrix taken from the literature.

  6. Maximum Likelihood Under Response Biased Sampling

    OpenAIRE

    Chambers, Raymond; Dorfman, Alan; Wang, Suojin

    2003-01-01

    Informative sampling occurs when the probability of inclusion in the sample depends on the value of the survey response variable. Response or size biased sampling is a particular case of informative sampling where the inclusion probability is proportional to the value of this variable. In this paper we describe a general model for response biased sampling, which we call array sampling, and develop maximum likelihood and estimating equation theory appropriate to this situation. The ...

  7. Likelihood-based inference for clustered line transect data

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus Plenge; Schweder, Tore

    The uncertainty in estimation of spatial animal density from line transect surveys depends on the degree of spatial clustering in the animal population. To quantify the clustering we model line transect data as independent thinnings of spatial shot-noise Cox processes. Likelihood-based inference...... in an example concerning minke whales in the North Atlantic. Our modelling and computational approach is flexible but demanding in terms of computing time....

  8. Forecasting New Product Sales from Likelihood of Purchase Ratings

    OpenAIRE

    William J. Infosino

    1986-01-01

    This paper compares consumer likelihood of purchase ratings for a proposed new product to their actual purchase behavior after the product was introduced. The ratings were obtained from a mail survey a few weeks before the product was introduced. The analysis leads to a model for forecasting new product sales. The model is supported by both empirical evidence and a reasonable theoretical foundation. In addition to calibrating the relationship between questionnaire ratings and actual purchases...

  9. Improved Likelihood Function in Particle-based IR Eye Tracking

    DEFF Research Database (Denmark)

    Satria, R.; Sorensen, J.; Hammoud, R.

    2005-01-01

    In this paper we propose a log likelihood-ratio function of foreground and background models used in a particle filter to track the eye region in dark-bright pupil image sequences. This model fuses information from both dark and bright pupil images and their difference image into one model. Our...... performance in challenging sequences with test subjects showing large head movements and under significant light conditions....

  10. Australian food life style segments and elaboration likelihood differences

    DEFF Research Database (Denmark)

    Brunsø, Karen; Reid, Mike

    As the global food marketing environment becomes more competitive, the international and comparative perspective of consumers' attitudes and behaviours becomes more important for both practitioners and academics. This research employs the Food-Related Life Style (FRL) instrument in Australia...... insights into cross-cultural similarities and differences, into elaboration likelihood differences among consumer segments, and show how the involvement construct may be used as a basis for communication development.

  11. Maximizing Friend-Making Likelihood for Social Activity Organization

    Science.gov (United States)

    2015-05-22

    the interplay of the group size, the constraint on existing friendships and the objective function on the likelihood of friend making. We prove that...social networks (OSNs), e.g., Facebook, Meetup, and Skout1, more and more people initiate friend gatherings or group activities via these OSNs. For...example, more than 16 millions of events are created on Facebook each month to organize various kinds of activities2, and more than 500 thousands of face

  12. Penalized maximum likelihood estimation for generalized linear point processes

    OpenAIRE

    2010-01-01

    A generalized linear point process is specified in terms of an intensity that depends upon a linear predictor process through a fixed non-linear function. We present a framework where the linear predictor is parametrized by a Banach space and give results on Gateaux differentiability of the log-likelihood. Of particular interest is when the intensity is expressed in terms of a linear filter parametrized by a Sobolev space. Using that the Sobolev spaces are reproducing kernel Hilbert spaces we...

  13. Semidefinite Programming for Approximate Maximum Likelihood Sinusoidal Parameter Estimation

    OpenAIRE

    2009-01-01

    We study the convex optimization approach for parameter estimation of several sinusoidal models, namely, single complex/real tone, multiple complex sinusoids, and single two-dimensional complex tone, in the presence of additive Gaussian noise. The major difficulty for optimally determining the parameters is that the corresponding maximum likelihood (ML) estimators involve finding the global minimum or maximum of multimodal cost functions because the frequencies are nonlinear in the observed s...

  14. Maximum Likelihood Sequence Detection Receivers for Nonlinear Optical Channels

    OpenAIRE

    2015-01-01

    The space-time whitened matched filter (ST-WMF) maximum likelihood sequence detection (MLSD) architecture has been recently proposed (Maggio et al., 2014). Its objective is reducing implementation complexity in transmissions over nonlinear dispersive channels. The ST-WMF-MLSD receiver (i) drastically reduces the number of states of the Viterbi decoder (VD) and (ii) offers a smooth trade-off between performance and complexity. In this work the ST-WMF-MLSD receiver is investigated in detail. We...

  15. Influence functions of trimmed likelihood estimators for lifetime experiments

    OpenAIRE

    2015-01-01

    We provide a general approach for deriving the influence function for trimmed likelihood estimators using the implicit function theorem. The approach is applied to lifetime models with exponential or lognormal distributions possessing a linear or nonlinear link function. A side result is that the functional form of the trimmed estimator for location and linear regression used by Bednarski and Clarke (1993, 2002) and Bednarski et al. (2010) is not generally always the correct fu...

  16. Fertilization response likelihood for the interpretation of leaf analyses

    Directory of Open Access Journals (Sweden)

    Celsemy Eleutério Maia

    2012-04-01

    Full Text Available Leaf analysis is the chemical evaluation of the nutritional status where the nutrient concentrations found in the tissue reflect the nutritional status of the plants. Thus, a correct interpretation of the results of leaf analysis is fundamental for an effective use of this tool. The purpose of this study was to propose and compare the method of Fertilization Response Likelihood (FRL) for interpretation of leaf analysis with that of the Diagnosis and Recommendation Integrated System (DRIS). The database consisted of 157 analyses of the N, P, K, Ca, Mg, S, Cu, Fe, Mn, Zn, and B concentrations in coffee leaves, which were divided into two groups: low yield (< 30 bags ha-1) and high yield (> 30 bags ha-1). The DRIS indices were calculated using the method proposed by Jones (1981). The fertilization response likelihood was computed based on the normal approximation. It was found that the Fertilization Response Likelihood (FRL) allowed an evaluation of the nutritional status of coffee trees, coinciding with the DRIS-based diagnoses in 84.96 % of the crops.

  17. CMB likelihood approximation by a Gaussianized Blackwell-Rao estimator

    CERN Document Server

    Rudjord, Ø; Eriksen, H K; Huey, Greg; Górski, K M; Jewell, J B

    2008-01-01

    We introduce a new CMB temperature likelihood approximation called the Gaussianized Blackwell-Rao (GBR) estimator. This estimator is derived by transforming the observed marginal power spectrum distributions obtained by the CMB Gibbs sampler into standard univariate Gaussians, and then approximating their joint transformed distribution by a multivariate Gaussian. The method is exact for full-sky coverage and uniform noise, and an excellent approximation for sky cuts and scanning patterns relevant for modern satellite experiments such as WMAP and Planck. A single evaluation of this estimator between l=2 and 200 takes ~0.2 CPU milliseconds, while for comparison, a single pixel space likelihood evaluation between l=2 and 30 for a map with ~2500 pixels requires ~20 seconds. We apply this tool to the 5-year WMAP temperature data, and re-estimate the angular temperature power spectrum, $C_{\ell}$, and likelihood, L(C_l), for l<=200, and derive new cosmological parameters for the standard six-parameter LambdaCDM mo...

  18. Likelihood inference for a fractionally cointegrated vector autoregressive model

    DEFF Research Database (Denmark)

    Johansen, Søren; Nielsen, Morten Ørregaard

    We consider model based inference in a fractionally cointegrated (or cofractional) vector autoregressive model based on the conditional Gaussian likelihood. The model allows the process X_{t} to be fractional of order d and cofractional of order d-b; that is, there exist vectors β for which β......′X_{t} is fractional of order d-b. The parameters d and b satisfy either d≥b≥1/2, d=b≥1/2, or d=d_{0}≥b≥1/2. Our main technical contribution is the proof of consistency of the maximum likelihood estimators on the set 1/2≤b≤d≤d_{1} for any d_{1}≥d_{0}. To this end, we consider the conditional likelihood as a stochastic...... process in the parameters, and prove that it converges in distribution when errors are i.i.d. with suitable moment conditions and initial values are bounded. We then prove that the estimator of β is asymptotically mixed Gaussian and estimators of the remaining parameters are asymptotically Gaussian. We...

  19. Likelihood Inference for a Fractionally Cointegrated Vector Autoregressive Model

    DEFF Research Database (Denmark)

    Johansen, Søren; Nielsen, Morten Ørregaard

    We consider model based inference in a fractionally cointegrated (or cofractional) vector autoregressive model based on the conditional Gaussian likelihood. The model allows the process X(t) to be fractional of order d and cofractional of order d-b; that is, there exist vectors β for which β......'X(t) is fractional of order d-b. The parameters d and b satisfy either d≥b≥1/2, d=b≥1/2, or d=d0≥b≥1/2. Our main technical contribution is the proof of consistency of the maximum likelihood estimators on the set 1/2≤b≤d≤d1 for any d1≥d0. To this end, we consider the conditional likelihood as a stochastic process...... in the parameters, and prove that it converges in distribution when errors are i.i.d. with suitable moment conditions and initial values are bounded. We then prove that the estimator of β is asymptotically mixed Gaussian and estimators of the remaining parameters are asymptotically Gaussian. We also find...

  20. Moment Conditions Selection Based on Adaptive Penalized Empirical Likelihood

    Directory of Open Access Journals (Sweden)

    Yunquan Song

    2014-01-01

    Full Text Available Empirical likelihood is a very popular method and has been widely used in the fields of artificial intelligence (AI) and data mining as tablets, mobile applications and social media dominate the technology landscape. This paper proposes an empirical likelihood shrinkage method to efficiently estimate unknown parameters and select correct moment conditions simultaneously, when the model is defined by moment restrictions in which some are possibly misspecified. We show that our method enjoys oracle-like properties; that is, it consistently selects the correct moment conditions and at the same time its estimator is as efficient as the empirical likelihood estimator obtained by all correct moment conditions. Moreover, unlike the GMM, our proposed method allows us to construct confidence regions for the parameters included in the model without estimating the covariances of the estimators. For empirical implementation, we provide some data-driven procedures for selecting the tuning parameter of the penalty function. The simulation results show that the method works remarkably well in terms of correct moment selection and the finite sample properties of the estimators. Also, a real-life example is carried out to illustrate the new methodology.

  1. Local solutions of Maximum Likelihood Estimation in Quantum State Tomography

    CERN Document Server

    Gonçalves, Douglas S; Lavor, Carlile; Farías, Osvaldo Jiménez; Ribeiro, P H Souto

    2011-01-01

    Maximum likelihood estimation is one of the most used methods in quantum state tomography, where the aim is to find the best density matrix for the description of a physical system. Results of measurements on the system should match the expected values produced by the density matrix. In some cases however, if the matrix is parameterized to ensure positivity and unit trace, the negative log-likelihood function may have several local minima. In several papers in the field, authors associate a source of errors to the possibility that most of these local minima are not global, so that optimization methods can be trapped in the wrong minimum, leading to a wrong density matrix. Here we show that, for convex negative log-likelihood functions, all local minima are global. We also show that a practical source of errors is in fact the use of optimization methods that do not have global convergence property or present numerical instabilities. The clarification of this point has important repercussion on quantum informat...

  2. Accurate determination of phase arrival times using autoregressive likelihood estimation

    Directory of Open Access Journals (Sweden)

    G. Kvaerna

    1994-06-01

    Full Text Available We have investigated the potential automatic use of an onset picker based on autoregressive likelihood estimation. Both a single component version and a three component version of this method have been tested on data from events located in the Khibiny Massif of the Kola peninsula, recorded at the Apatity array, the Apatity three component station and the ARCESS array. Using this method, we have been able to estimate onset times to an accuracy (standard deviation) of about 0.05 s for P phases and 0.15-0.20 s for S phases. These accuracies are as good as for analyst picks, and are considerably better than the accuracies of the current onset procedure used for processing of regional array data at NORSAR. In another application, we have developed a generic procedure to reestimate the onsets of all types of first arriving P phases. By again applying the autoregressive likelihood technique, we have obtained automatic onset times of a quality such that 70% of the automatic picks are within 0.1 s of the best manual pick. For the onset time procedure currently used at NORSAR, the corresponding number is 28%. Clearly, automatic reestimation of first arriving P onsets using the autoregressive likelihood technique has the potential of significantly reducing the retiming efforts of the analyst.
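
    A much-simplified sketch of this family of pickers, using a variance-based AIC criterion rather than the full autoregressive likelihood of the record: every candidate split point is scored by the Gaussian log-likelihoods of the segments before and after it, and the onset is taken at the minimum of that score.

      import numpy as np

      def aic_onset_pick(trace):
          # Score every candidate split point by the Gaussian log-likelihoods of the
          # two segments (k*log(var1) + (n-k)*log(var2)) and return the best split.
          n = len(trace)
          aic = np.full(n, np.inf)
          for k in range(10, n - 10):                   # keep both segments non-trivial
              var1 = np.var(trace[:k]) + 1e-12
              var2 = np.var(trace[k:]) + 1e-12
              aic[k] = k * np.log(var1) + (n - k) * np.log(var2)
          return int(np.argmin(aic))                    # sample index of the estimated onset

      rng = np.random.default_rng(2)
      noise = rng.normal(0.0, 0.2, 400)
      signal = np.concatenate([np.zeros(250), np.sin(0.3 * np.arange(150))])
      print(aic_onset_pick(noise + signal))             # should be close to sample 250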

  3. Maximum likelihood tuning of a vehicle motion filter

    Science.gov (United States)

    Trankle, Thomas L.; Rabin, Uri H.

    1990-01-01

    This paper describes the use of maximum likelihood estimation of unknown parameters appearing in a nonlinear vehicle motion filter. The filter uses the kinematic equations of motion of a rigid body in motion over a spherical earth. The nine states of the filter represent vehicle velocity, attitude, and position. The inputs to the filter are three components of translational acceleration and three components of angular rate. Measurements used to update states include air data, altitude, position, and attitude. Expressions are derived for the elements of filter matrices needed to use air data in a body-fixed frame with filter states expressed in a geographic frame. An expression for the likelihood function of the data is given, along with accurate approximations for the function's gradient and Hessian with respect to unknown parameters. These are used by a numerical quasi-Newton algorithm for maximizing the likelihood function of the data in order to estimate the unknown parameters. The parameter estimation algorithm is useful for processing data from aircraft flight tests or for tuning inertial navigation systems.

  4. Empirical likelihood method for non-ignorable missing data problems.

    Science.gov (United States)

    Guan, Zhong; Qin, Jing

    2017-01-01

    The missing response problem is ubiquitous in survey sampling, medical, social science and epidemiology studies. It is well known that non-ignorable missingness is the most difficult missing data problem, where the missingness of a response depends on its own value. In the statistical literature, unlike the ignorable missing data problem, not many papers on non-ignorable missing data are available except for the fully parametric model based approach. In this paper we study a semiparametric model for non-ignorable missing data in which the missing probability is known up to some parameters, but the underlying distributions are not specified. By employing Owen (1988)'s empirical likelihood method we can obtain the constrained maximum empirical likelihood estimators of the parameters in the missing probability and the mean response, which are shown to be asymptotically normal. Moreover the likelihood ratio statistic can be used to test whether the missingness of the responses is non-ignorable or completely at random. The theoretical results are confirmed by a simulation study. As an illustration, the analysis of real AIDS trial data shows that the missingness of CD4 counts around two years is non-ignorable and the sample mean based on observed data only is biased.

  5. Brain potentials of conflict and error-likelihood following errorful and errorless learning in obsessive-compulsive disorder.

    Directory of Open Access Journals (Sweden)

    Anke Hammer

    Full Text Available BACKGROUND: The anterior cingulate cortex (ACC) is thought to be overactive in patients with Obsessive Compulsive Disorder (OCD), reflecting an enhanced action monitoring system. However, influences of conflict and error-likelihood have not been explored. Here, the error-related negativity (ERN), originating in the ACC, served as a measure of conflict and error-likelihood during memory recognition following different learning modes. Errorless learning prevents the generation of false memory candidates and has been shown to be superior to trial-and-error learning. The latter, errorful learning, introduces false memory candidates which interfere with correct information in later recognition, leading to enhanced conflict processing. METHODOLOGY/PRINCIPAL FINDINGS: Sixteen OCD patients according to DSM-IV criteria and 16 closely matched healthy controls participated voluntarily in the event-related potential study. Both the OCD and control groups showed enhanced memory performance following errorless compared to errorful learning. Nevertheless, response-locked data showed clear modulations of the ERN amplitude. OCD patients compared to controls showed an increased error-likelihood effect after errorless learning. However, with increased conflict after errorful learning, OCD patients showed a reduced error-likelihood effect, in contrast to controls who showed an increase. CONCLUSION/SIGNIFICANCE: The increase of the error-likelihood effect for OCD patients within low conflict situations (recognition after errorless learning) might be conceptualized as a hyperactive monitoring system. However, within high conflict situations (recognition after errorful learning) the opposite effect was observed: whereas the control group showed an increased error-likelihood effect, the OCD group showed a reduction of the error-likelihood effect based on altered ACC learning rates in response to errors. These findings support theoretical frameworks explaining differences in ACC activity on

  6. Communicating likelihoods and probabilities in forecasts of volcanic eruptions

    Science.gov (United States)

    Doyle, Emma E. H.; McClure, John; Johnston, David M.; Paton, Douglas

    2014-02-01

    The issuing of forecasts and warnings of natural hazard events, such as volcanic eruptions, earthquake aftershock sequences and extreme weather, often involves the use of probabilistic terms, particularly when communicated by scientific advisory groups to key decision-makers, who can differ greatly in relative expertise and function in the decision making process. Recipients may also differ in their perception of the relative importance of political and economic influences on interpretation. Consequently, the interpretation of these probabilistic terms can vary greatly due to the framing of the statements, and whether verbal or numerical terms are used. We present a review from the psychology literature on how the framing of information influences communication of these probability terms. It is also unclear as to how people rate their perception of an event's likelihood throughout a time frame when a forecast time window is stated. Previous research has identified that, when presented with a 10-year time window forecast, participants viewed the likelihood of an event occurring ‘today’ as being less than that in year 10. Here we show that this skew in perception also occurs for short-term time windows (under one week) that are of most relevance for emergency warnings. In addition, unlike the long-time window statements, the use of the phrasing “within the next…” instead of “in the next…” does not mitigate this skew, nor do we observe significant differences between the perceived likelihoods of scientists and non-scientists. This finding suggests that effects occurring due to the shorter time window may be ‘masking’ any differences in perception due to wording or career background observed for long-time window forecasts. These results have implications for scientific advice, warning forecasts, emergency management decision-making, and public information as any skew in perceived event likelihood towards the end of a forecast time window may result in

  7. PRE-MARKET CLINICAL EVALUATIONS OF INNOVATIVE HIGH-RISK MEDICAL DEVICES IN EUROPE

    DEFF Research Database (Denmark)

    Hulstaert, F.; Neyt, M.; Vinck, I.

    2012-01-01

    Objectives: High-quality clinical evidence is most often lacking when novel high-risk devices enter the European market. At the same time, a randomized controlled trial (RCT) is often initiated as a requirement for obtaining market access in the US. Should coverage in Europe be postponed until RCT...... Bodies, Ethics Committees, and HTA agencies were consulted. We also discuss patient safety and the transparency of information. Results: In contrast to the US, there is no requirement in Europe to demonstrate the clinical efficacy of high-risk devices in the premarket phase. Patients in Europe can thus...

  8. A Boolean Consistent Fuzzy Inference System for Diagnosing Diseases and Its Application for Determining Peritonitis Likelihood

    Science.gov (United States)

    Dragović, Ivana; Turajlić, Nina; Pilčević, Dejan; Petrović, Bratislav; Radojević, Dragan

    2015-01-01

    Fuzzy inference systems (FIS) enable automated assessment and reasoning in a logically consistent manner akin to the way in which humans reason. However, since no conventional fuzzy set theory is in the Boolean frame, it is proposed that Boolean consistent fuzzy logic should be used in the evaluation of rules. The main distinction of this approach is that it requires the execution of a set of structural transformations before the actual values can be introduced, which can, in certain cases, lead to different results. While a Boolean consistent FIS could be used for establishing the diagnostic criteria for any given disease, in this paper it is applied for determining the likelihood of peritonitis, as the leading complication of peritoneal dialysis (PD). Given that patients could be located far away from healthcare institutions (as peritoneal dialysis is a form of home dialysis) the proposed Boolean consistent FIS would enable patients to easily estimate the likelihood of them having peritonitis (where a high likelihood would suggest that prompt treatment is indicated), when medical experts are not close at hand. PMID:27069500

  9. A Boolean Consistent Fuzzy Inference System for Diagnosing Diseases and Its Application for Determining Peritonitis Likelihood

    Directory of Open Access Journals (Sweden)

    Ivana Dragović

    2015-01-01

    Full Text Available Fuzzy inference systems (FIS) enable automated assessment and reasoning in a logically consistent manner akin to the way in which humans reason. However, since no conventional fuzzy set theory is in the Boolean frame, it is proposed that Boolean consistent fuzzy logic should be used in the evaluation of rules. The main distinction of this approach is that it requires the execution of a set of structural transformations before the actual values can be introduced, which can, in certain cases, lead to different results. While a Boolean consistent FIS could be used for establishing the diagnostic criteria for any given disease, in this paper it is applied for determining the likelihood of peritonitis, as the leading complication of peritoneal dialysis (PD). Given that patients could be located far away from healthcare institutions (as peritoneal dialysis is a form of home dialysis) the proposed Boolean consistent FIS would enable patients to easily estimate the likelihood of them having peritonitis (where a high likelihood would suggest that prompt treatment is indicated), when medical experts are not close at hand.

  11. Computation of the Likelihood in Biallelic Diffusion Models Using Orthogonal Polynomials

    Directory of Open Access Journals (Sweden)

    Claus Vogl

    2014-11-01

    Full Text Available In population genetics, parameters describing forces such as mutation, migration and drift are generally inferred from molecular data. Lately, approximate methods based on simulations and summary statistics have been widely applied for such inference, even though these methods waste information. In contrast, probabilistic methods of inference can be shown to be optimal, if their assumptions are met. In genomic regions where recombination rates are high relative to mutation rates, polymorphic nucleotide sites can be assumed to evolve independently from each other. The distribution of allele frequencies at a large number of such sites has been called “allele-frequency spectrum” or “site-frequency spectrum” (SFS). Conditional on the allelic proportions, the likelihoods of such data can be modeled as binomial. A simple model representing the evolution of allelic proportions is the biallelic mutation-drift or mutation-directional selection-drift diffusion model. With series of orthogonal polynomials, specifically Jacobi and Gegenbauer polynomials, or the related spheroidal wave function, the diffusion equations can be solved efficiently. In the neutral case, the product of the binomial likelihoods with the sum of such polynomials leads to finite series of polynomials, i.e., relatively simple equations, from which the exact likelihoods can be calculated. In this article, the use of orthogonal polynomials for inferring population genetic parameters is investigated.

  12. Maximum Likelihood DOA Estimation of Multiple Wideband Sources in the Presence of Nonuniform Sensor Noise

    Directory of Open Access Journals (Sweden)

    K. Yao

    2007-12-01

    Full Text Available We investigate the maximum likelihood (ML) direction-of-arrival (DOA) estimation of multiple wideband sources in the presence of unknown nonuniform sensor noise. A new closed-form expression for the direction estimation Cramér-Rao-Bound (CRB) has been derived. The performance of the conventional wideband uniform ML estimator under nonuniform noise has been studied. In order to mitigate the performance degradation caused by the nonuniformity of the noise, a new deterministic wideband nonuniform ML DOA estimator is derived and two associated processing algorithms are proposed. The first algorithm is based on an iterative procedure which stepwise concentrates the log-likelihood function with respect to the DOAs and the noise nuisance parameters, while the second is a noniterative algorithm that maximizes the derived approximately concentrated log-likelihood function. The performance of the proposed algorithms is tested through extensive computer simulations. Simulation results show the stepwise-concentrated ML algorithm (SC-ML) requires only a few iterations to converge and both the SC-ML and the approximately-concentrated ML algorithm (AC-ML) attain a solution close to the derived CRB at high signal-to-noise ratio.

  13. Laser-Based Slam with Efficient Occupancy Likelihood Map Learning for Dynamic Indoor Scenes

    Science.gov (United States)

    Li, Li; Yao, Jian; Xie, Renping; Tu, Jinge; Feng, Chen

    2016-06-01

    Location-Based Services (LBS) have attracted growing attention in recent years, especially in indoor environments. The fundamental technique underlying LBS is map building for unknown environments, also known as simultaneous localization and mapping (SLAM) in the robotics community. In this paper, we propose a novel approach for SLAM in dynamic indoor scenes based on a 2D laser scanner mounted on a mobile Unmanned Ground Vehicle (UGV), with the help of a grid-based occupancy likelihood map. Instead of applying scan matching to two adjacent scans, we propose to match the current scan with the occupancy likelihood map learned from all previous scans at multiple scales, to avoid the accumulation of matching errors. Because the points in a scan are acquired sequentially rather than simultaneously, the scan is unavoidably distorted to varying extents. To compensate for the scan distortion caused by the motion of the UGV, we propose to integrate the velocity of the laser range finder (LRF) into the scan matching optimization framework. In addition, to reduce as much as possible the effect of dynamic objects, such as walking pedestrians, that often exist in indoor scenes, we propose a new occupancy likelihood map learning strategy that increases or decreases the probability of each occupancy grid cell after each scan matching. Experimental results in several challenging indoor scenes demonstrate that our proposed approach is capable of providing high-precision SLAM results.

  14. A real-time maximum-likelihood heart-rate estimator for wearable textile sensors.

    Science.gov (United States)

    Cheng, Mu-Huo; Chen, Li-Chung; Hung, Ying-Che; Yang, Chang Ming

    2008-01-01

    This paper presents a real-time maximum-likelihood heart-rate estimator for ECG data measured via wearable textile sensors. The ECG signals measured from wearable dry electrodes are notorious for their susceptibility to interference from respiration or the motion of the wearer, such that the signal quality may degrade dramatically. To overcome these obstacles, the proposed heart-rate estimator first employs the subspace approach to remove the wandering baseline, then uses a simple nonlinear absolute operation to reduce the high-frequency noise contamination, and finally applies the maximum likelihood estimation technique for estimating the interval of R-R peaks. A parameter derived as a byproduct of the maximum likelihood estimation is also proposed as an indicator of signal quality. To achieve real-time operation, we develop a simple adaptive algorithm from the numerical power method to realize the subspace filter and apply the fast Fourier transform (FFT) for realization of the correlation technique, such that the whole estimator can be implemented in an FPGA system. Experiments are performed to demonstrate the viability of the proposed system.
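
    A minimal sketch of the correlation-based R-R interval step described above, assuming a baseline-corrected ECG segment and a known sampling rate; the function name, parameters, and synthetic signal are illustrative assumptions, not the authors' FPGA pipeline.

      # Illustrative sketch (not the authors' FPGA implementation): estimate the
      # R-R interval of a baseline-corrected ECG segment by locating the peak of
      # its autocorrelation, computed via the FFT as described above.
      import numpy as np

      def estimate_heart_rate(ecg, fs, min_bpm=40, max_bpm=200):
          """Return an estimated heart rate in beats per minute."""
          x = ecg - np.mean(ecg)                  # remove DC offset
          x = np.abs(x)                           # simple nonlinear absolute operation
          n = len(x)
          # Autocorrelation via FFT (Wiener-Khinchin), zero-padded to avoid wrap-around.
          spec = np.fft.rfft(x, 2 * n)
          acf = np.fft.irfft(spec * np.conj(spec))[:n]
          # Restrict the search to physiologically plausible R-R lags.
          min_lag = int(fs * 60.0 / max_bpm)
          max_lag = int(fs * 60.0 / min_bpm)
          lag = min_lag + np.argmax(acf[min_lag:max_lag])
          return 60.0 * fs / lag

      # Example with a synthetic 72 bpm pulse train sampled at 250 Hz.
      fs = 250.0
      t = np.arange(0, 10, 1 / fs)
      ecg = np.clip(np.sin(2 * np.pi * 1.2 * t), 0, None) ** 20   # crude stand-in for R peaks
      print(round(estimate_heart_rate(ecg, fs)))                  # ~72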

  15. Comparison of sinogram- and image-domain penalized-likelihood image reconstruction estimators.

    Science.gov (United States)

    Vargas, Phillip A; La Rivière, Patrick J

    2011-08-01

    In recent years, the authors and others have been exploring the use of penalized-likelihood sinogram-domain smoothing and restoration approaches for emission and transmission tomography. The motivation for this strategy was initially pragmatic: to provide a more computationally feasible alternative to fully iterative penalized-likelihood image reconstruction involving expensive backprojections and reprojections, while still obtaining some of the benefits of the statistical modeling employed in penalized-likelihood approaches. In this work, the authors seek to compare the two approaches in greater detail. The sinogram-domain strategy entails estimating the "ideal" line integrals needed for reconstruction of an activity or attenuation distribution from the set of noisy, potentially degraded tomographic measurements by maximizing a penalized-likelihood objective function. The objective function models the data statistics as well as any degradation that can be represented in the sinogram domain. The estimated line integrals can then be input to analytic reconstruction algorithms such as filtered backprojection (FBP). The authors compare this to fully iterative approaches maximizing similar objective functions. The authors present mathematical analyses based on so-called equivalent optimization problems that establish that the approaches can be made precisely equivalent under certain restrictive conditions. More significantly, by use of resolution-variance tradeoff studies, the authors show that they can yield very similar performance under more relaxed, realistic conditions. The sinogram- and image-domain approaches are equivalent under certain restrictive conditions and can perform very similarly under more relaxed conditions. The match is particularly good for fully sampled, high-resolution CT geometries. One limitation of the sinogram-domain approach relative to the image-domain approach is the difficulty of imposing additional constraints, such as image non-negativity.
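
    In generic terms (an illustrative sketch, not the authors' exact notation), the sinogram-domain strategy described above estimates the ideal line integrals $\ell$ by maximizing a penalized-likelihood objective of the form

      $$\hat{\ell} = \arg\max_{\ell} \; \sum_i \log p\!\left(y_i \mid \ell_i\right) \;-\; \beta\, R(\ell),$$

    where $y_i$ are the noisy tomographic measurements (with a Poisson or Gaussian model assumed here for illustration), $R$ is a roughness penalty, and $\beta$ controls the smoothing strength; the estimate $\hat{\ell}$ is then passed to an analytic algorithm such as FBP.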

  16. Facial emotion perception differs in young persons at genetic and clinical high-risk for psychosis.

    Science.gov (United States)

    Kohler, Christian G; Richard, Jan A; Brensinger, Colleen M; Borgmann-Winter, Karin E; Conroy, Catherine G; Moberg, Paul J; Gur, Ruben C; Gur, Raquel E; Calkins, Monica E

    2014-05-15

    A large body of literature has documented facial emotion perception impairments in schizophrenia. More recently, emotion perception has been investigated in persons at genetic and clinical high-risk for psychosis. This study compared emotion perception abilities in groups of young persons with schizophrenia, clinical high-risk, genetic risk and healthy controls. Groups, ages 13-25, included 24 persons at clinical high-risk, 52 first-degree relatives at genetic risk, 91 persons with schizophrenia and 90 low risk persons who completed computerized testing of emotion recognition and differentiation. Groups differed by overall emotion recognition abilities and recognition of happy, sad, anger and fear expressions. Pairwise comparisons revealed comparable impairments in recognition of happy, angry, and fearful expressions for persons at clinical high-risk and schizophrenia, while genetic risk participants were less impaired, showing reduced recognition of fearful expressions. Groups also differed for differentiation of happy and sad expressions, but differences were mainly between schizophrenia and control groups. Emotion perception impairments are observable in young persons at-risk for psychosis. Preliminary results with clinical high-risk participants, when considered along findings in genetic risk relatives, suggest social cognition abilities to reflect pathophysiological processes involved in risk of schizophrenia. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  17. Prediction of transition from ultra-high risk to first-episode psychosis using a probabilistic model combining history, clinical assessment and fatty-acid biomarkers

    Science.gov (United States)

    Clark, S R; Baune, B T; Schubert, K O; Lavoie, S; Smesny, S; Rice, S M; Schäfer, M R; Benninger, F; Feucht, M; Klier, C M; McGorry, P D; Amminger, G P

    2016-01-01

    Current criteria identifying patients with ultra-high risk of psychosis (UHR) have low specificity, and less than one-third of UHR cases experience transition to psychosis within 3 years of initial assessment. We explored whether a Bayesian probabilistic multimodal model, combining baseline historical and clinical risk factors with biomarkers (oxidative stress, cell membrane fatty acids, resting quantitative electroencephalography (qEEG)), could improve this specificity. We analyzed data of a UHR cohort (n=40) with a 1-year transition rate of 28%. Positive and negative likelihood ratios were calculated for predictor variables with statistically significant receiver operating characteristic curves (ROCs), which excluded oxidative stress markers and qEEG parameters as significant predictors of transition. We clustered significant variables into historical (history of drug use), clinical (Positive and Negative Symptoms Scale positive, negative and general scores and Global Assessment of Function) and biomarker (total omega-3, nervonic acid) groups, and calculated the post-test probability of transition for each group and for group combinations using the odds ratio form of Bayes' rule. Combination of the three variable groups vastly improved the specificity of prediction (area under ROC=0.919, sensitivity=72.73%, specificity=96.43%). In this sample, our model identified over 70% of UHR patients who transitioned within 1 year, compared with 28% identified by standard UHR criteria. The model classified 77% of cases as very high or low risk (P>0.9, <0.1) based on history and clinical assessment, suggesting that a staged approach could be most efficient, reserving fatty-acid markers for 23% of cases remaining at intermediate probability following bedside interview. PMID:27648919
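
    A minimal sketch of the odds-ratio form of Bayes' rule used above to chain the history, clinical, and biomarker groups; the likelihood-ratio values in the example are placeholders, not estimates from the study.

      # Minimal sketch of the odds form of Bayes' rule used to chain groups of
      # predictors: post-test odds = pre-test odds x LR1 x LR2 x ...
      # The likelihood-ratio values below are placeholders, not study estimates.

      def post_test_probability(pretest_prob, likelihood_ratios):
          odds = pretest_prob / (1.0 - pretest_prob)      # probability -> odds
          for lr in likelihood_ratios:
              odds *= lr                                  # apply each group's LR
          return odds / (1.0 + odds)                      # odds -> probability

      # 28% one-year transition rate as the pre-test probability, then three
      # hypothetical positive likelihood ratios (history, clinical, biomarker).
      print(post_test_probability(0.28, [3.0, 2.5, 4.0]))   # ~0.92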

  18. Philosophy and phylogenetic inference: a comparison of likelihood and parsimony methods in the context of Karl Popper's writings on corroboration.

    Science.gov (United States)

    de Queiroz, K; Poe, S

    2001-06-01

    Advocates of cladistic parsimony methods have invoked the philosophy of Karl Popper in an attempt to argue for the superiority of those methods over phylogenetic methods based on Ronald Fisher's statistical principle of likelihood. We argue that the concept of likelihood in general, and its application to problems of phylogenetic inference in particular, are highly compatible with Popper's philosophy. Examination of Popper's writings reveals that his concept of corroboration is, in fact, based on likelihood. Moreover, because probabilistic assumptions are necessary for calculating the probabilities that define Popper's corroboration, likelihood methods of phylogenetic inference--with their explicit probabilistic basis--are easily reconciled with his concept. In contrast, cladistic parsimony methods, at least as described by certain advocates of those methods, are less easily reconciled with Popper's concept of corroboration. If those methods are interpreted as lacking probabilistic assumptions, then they are incompatible with corroboration. Conversely, if parsimony methods are to be considered compatible with corroboration, then they must be interpreted as carrying implicit probabilistic assumptions. Thus, the non-probabilistic interpretation of cladistic parsimony favored by some advocates of those methods is contradicted by an attempt by the same authors to justify parsimony methods in terms of Popper's concept of corroboration. In addition to being compatible with Popperian corroboration, the likelihood approach to phylogenetic inference permits researchers to test the assumptions of their analytical methods (models) in a way that is consistent with Popper's ideas about the provisional nature of background knowledge.

  19. Achieving organisational competence for clinical leadership: the role of high performance work systems.

    Science.gov (United States)

    Leggat, Sandra G; Balding, Cathy

    2013-01-01

    While there has been substantial discussion about the potential for clinical leadership in improving quality and safety in healthcare, there has been little robust study. The purpose of this paper is to present the results of a qualitative study with clinicians and clinician managers to gather opinions on the appropriate content of an educational initiative being planned to improve clinical leadership in quality and safety among medical, nursing and allied health professionals working in primary, community and secondary care. In total, 28 clinicians and clinician managers throughout the state of Victoria, Australia, participated in focus groups to provide advice on the development of a clinical leadership program in quality and safety. An inductive, thematic analysis was completed to enable the themes to emerge from the data. Overwhelmingly the participants conceptualised clinical leadership in relation to organisational factors. Only four individual factors, comprising emotional intelligence, resilience, self-awareness and understanding of other clinical disciplines, were identified as being important for clinical leaders. Conversely seven organisational factors, comprising role clarity and accountability, security and sustainability for clinical leaders, selective recruitment into clinical leadership positions, teamwork and decentralised decision making, training, information sharing, and transformational leadership, were seen as essential, but the participants indicated they were rarely addressed. The human resource management literature includes these seven components, with contingent reward, reduced status distinctions and measurement of management practices, as the essential organisational underpinnings of high performance work systems. The results of this study propose that clinical leadership is an organisational property, suggesting that capability frameworks and educational programs for clinical leadership need a broader organisation focus. The paper

  20. Paramedic clinical decision making during high acuity emergency calls: design and methodology of a Delphi study

    Directory of Open Access Journals (Sweden)

    Croskerry Pat

    2009-09-01

    Full Text Available Abstract Background The scope of practice of paramedics in Canada has steadily evolved to include increasingly complex interventions in the prehospital setting, which likely have repercussions on clinical outcome and patient safety. Clinical decision making has been evaluated in several health professions, but there is a paucity of work in this area on paramedics. This study will utilize the Delphi technique to establish consensus on the most important instances of paramedic clinical decision making during high acuity emergency calls, as they relate to clinical outcome and patient safety. Methods and design Participants in this multi-round survey study will be paramedic leaders and emergency medical services medical directors/physicians from across Canada. In the first round, participants will identify instances of clinical decision making they feel are important for patient outcome and safety. On the second round, the panel will rank each instance of clinical decision making in terms of its importance. On the third and potentially fourth round, participants will have the opportunity to revise the ranking they assigned to each instance of clinical decision making. Consensus will be considered achieved for the most important instances if 80% of the panel ranks it as important or extremely important. The most important instances of clinical decision making will be plotted on a process analysis map. Discussion The process analysis map that results from this Delphi study will enable the gaps in research, knowledge and practice to be identified.

  1. Predicting crash likelihood and severity on freeways with real-time loop detector data.

    Science.gov (United States)

    Xu, Chengcheng; Tarko, Andrew P; Wang, Wei; Liu, Pan

    2013-08-01

    Real-time crash risk prediction using traffic data collected from loop detector stations is useful in dynamic safety management systems aimed at improving traffic safety through application of proactive safety countermeasures. The major drawback of most of the existing studies is that they focus on the crash risk without consideration of crash severity. This paper presents an effort to develop a model that predicts the crash likelihood at different levels of severity with a particular focus on severe crashes. The crash data and traffic data used in this study were collected on the I-880 freeway in California, United States. This study considers three levels of crash severity: fatal/incapacitating injury crashes (KA), non-incapacitating/possible injury crashes (BC), and property-damage-only crashes (PDO). The sequential logit model was used to link the likelihood of crash occurrences at different severity levels to various traffic flow characteristics derived from detector data. The elasticity analysis was conducted to evaluate the effect of the traffic flow variables on the likelihood of crash and its severity. The results show that the traffic flow characteristics contributing to crash likelihood were quite different at different levels of severity. The PDO crashes were more likely to occur under congested traffic flow conditions with highly variable speed and frequent lane changes, while the KA and BC crashes were more likely to occur under less congested traffic flow conditions. High speed, coupled with a large speed difference between adjacent lanes under uncongested traffic conditions, was found to increase the likelihood of severe crashes (KA). This study applied the 20-fold cross-validation method to estimate the prediction performance of the developed models. The validation results show that the model's crash prediction performance at each severity level was satisfactory. The findings of this study can be used to predict the probabilities of crash at
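
    A hedged sketch of how a sequential (continuation-ratio) logit of the kind named above turns covariates into probabilities over the three severity levels; the coefficients and covariate values are invented for illustration and are not the study's estimates.

      # Illustrative sequential (continuation-ratio) logit over the three severity
      # levels used above: KA vs. (BC or PDO), then BC vs. PDO.  Coefficients and
      # covariate values are invented for illustration only.
      import numpy as np

      def sigmoid(z):
          return 1.0 / (1.0 + np.exp(-z))

      def severity_probabilities(x, beta_ka, beta_bc):
          p_ka = sigmoid(x @ beta_ka)                 # P(KA | crash)
          p_bc = (1.0 - p_ka) * sigmoid(x @ beta_bc)  # P(BC | crash, not KA)
          p_pdo = 1.0 - p_ka - p_bc                   # remainder
          return p_ka, p_bc, p_pdo

      # Covariates: [intercept, mean speed, speed difference between lanes].
      x = np.array([1.0, 95.0, 12.0])
      beta_ka = np.array([-9.0, 0.05, 0.10])
      beta_bc = np.array([-2.0, 0.01, 0.05])
      print(severity_probabilities(x, beta_ka, beta_bc))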

  2. The likelihood of Latino women to seek help in response to interpersonal victimization: An examination of individual, interpersonal and sociocultural influences

    Directory of Open Access Journals (Sweden)

    Chiara Sabina

    2014-07-01

    Full Text Available Help-seeking is a process that is influenced by individual, interpersonal, and sociocultural factors. The current study examined these influences on the likelihood of seeking help (police, pressing charges, medical services, social services, and informal help) for interpersonal violence among a national sample of Latino women. Women living in high-density Latino neighborhoods in the USA were interviewed by phone in their preferred language. Women reported being, on average, between "somewhat likely" and "very likely" to seek help should they experience interpersonal victimization. Sequential linear regression results indicated that individual (age, depression), interpersonal (having children, past victimization), and sociocultural factors (immigrant status, acculturation) were associated with the self-reported likelihood of seeking help for interpersonal violence. Having children was consistently related to a greater likelihood to seek all forms of help. Overall, women appear to respond to violence in ways that reflect their ecological context. Help-seeking is best understood within a multi-layered and dynamic context.

  3. Perioperative mortality in cats and dogs undergoing spay or castration at a high-volume clinic.

    Science.gov (United States)

    Levy, J K; Bard, K M; Tucker, S J; Diskant, P D; Dingman, P A

    2017-06-01

    High volume spay-neuter (spay-castration) clinics have been established to improve population control of cats and dogs to reduce the number of animals admitted to and euthanized in animal shelters. The rise in the number of spay-neuter clinics in the USA has been accompanied by concern about the quality of animal care provided in high volume facilities, which focus on minimally invasive, time-saving techniques, high throughput and simultaneous management of multiple animals under various stages of anesthesia. The aim of this study was to determine perioperative mortality for cats and dogs in a high volume spay-neuter clinic in the USA. Electronic medical records and a written mortality log were used to collect data for 71,557 cats and 42,349 dogs undergoing spay-neuter surgery from 2010 to 2016 at a single high volume clinic in Florida. Perioperative mortality was defined as deaths occurring in the 24h period starting with the administration of the first sedation or anesthetic drugs. Perioperative mortality was reported for 34 cats and four dogs for an overall mortality of 3.3 animals/10,000 surgeries (0.03%). The risk of mortality was more than twice as high for females (0.05%) as for males (0.02%) (P=0.008) and five times as high for cats (0.05%) as for dogs (0.009%) (P=0.0007). High volume spay-neuter surgery was associated with a lower mortality rate than that previously reported in low volume clinics, approaching that achieved in human surgery. This is likely to be due to the young, healthy population of dogs and cats, and the continuous refinement of techniques based on experience and the skills and proficiency of teams that specialize in a limited spectrum of procedures. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Early Course in Obstetrics Increases Likelihood of Practice Including Obstetrics.

    Science.gov (United States)

    Pearson, Jennifer; Westra, Ruth

    2016-10-01

    The Department of Family Medicine and Community Health Duluth has offered the Obstetrical Longitudinal Course (OBLC) as an elective for first-year medical students since 1999. The objective of the OBLC Impact Survey was to assess the effectiveness of the course over the past 15 years. A Qualtrics survey was emailed to participants enrolled in the course from 1999-2014. Data was compiled for the respondent group as a whole as well as four cohorts based on current level of training/practice. Cross-tabulations with Fisher's exact test were applied and odds ratios calculated for factors affecting likelihood of eventual practice including obstetrics. Participation in the OBLC was successful in increasing exposure, awareness, and comfort in caring for obstetrical patients and feeling more prepared for the OB-GYN Clerkship. A total of 50.5% of course participants felt the OBLC influenced their choice of specialty. For participants who are currently physicians, 51% are practicing family medicine with obstetrics or OB-GYN. Of the cohort of family physicians, 65.2% made the decision whether to include obstetrics in practice during medical school. Odds ratios show the likelihood of practicing obstetrics is higher when participants have completed the OBLC and also are practicing in a rural community. Early exposure to obstetrics, as provided by the OBLC, appears to increase the likelihood of including obstetrics in practice, especially if eventual practice is in a rural community. This course may be a tool to help create a pipeline for future rural family physicians providing obstetrical care.

  5. CUSUM control charts based on likelihood ratio for preliminary analysis

    Institute of Scientific and Technical Information of China (English)

    Yi DAI; Zhao-jun WANG; Chang-liang ZOU

    2007-01-01

    To detect and estimate a shift in either the mean or the deviation, or both, in preliminary analysis, the statistical process control (SPC) tool of choice is the control chart based on the likelihood ratio test (LRT). Sullivan and Woodall pointed out that the test statistic lrt(n1, n2) is approximately distributed as χ2(2) when the sample sizes n, n1 and n2 are very large, with n1 = 2, 3, ..., n - 2 and n2 = n - n1. So it is inevitable that n1 or n2 is not large. In this paper the limit distribution of lrt(n1, n2) for fixed n1 or n2 is derived, and exact analytic formulae for evaluating the expectation and the variance of the limit distribution are also obtained. In addition, the properties of the standardized likelihood ratio statistic slr(n1, n) are discussed. Although slr(n1, n) contains the most important information, slr(i, n) (i ≠ n1) also contains useful information. The cumulative sum (CUSUM) control chart can exploit this additional information. We therefore propose two CUSUM control charts based on the likelihood ratio statistics for preliminary analysis of individual observations. One focuses on detecting shifts in location in the historical data, and the other is more general, detecting a shift in either the location or the scale, or both. Moreover, simulation results show that the two proposed control charts are, respectively, superior to their competitors not only in the detection of sustained shifts but also in the detection of some other out-of-control situations considered in this paper.
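
    For orientation, a standard two-sided tabular CUSUM for a sustained mean shift is sketched below; it only illustrates the cumulative-sum idea behind the likelihood-ratio charts described above, and the reference value k and threshold h are conventional textbook choices rather than the paper's lrt/slr statistics.

      # A standard two-sided tabular CUSUM for a sustained mean shift (standardized
      # units, sigma = 1 assumed), not the paper's lrt/slr-based charts.
      import numpy as np

      def cusum(x, target, k=0.5, h=5.0):
          """Return the index of the first out-of-control signal, or None."""
          c_plus, c_minus = 0.0, 0.0
          for i, xi in enumerate(x):
              c_plus = max(0.0, c_plus + (xi - target) - k)
              c_minus = max(0.0, c_minus - (xi - target) - k)
              if c_plus > h or c_minus > h:
                  return i
          return None

      rng = np.random.default_rng(0)
      data = np.concatenate([rng.normal(0, 1, 50), rng.normal(1.5, 1, 50)])  # shift at t=50
      print(cusum(data, target=0.0))   # signals shortly after the shift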

  7. Molecular Profiling and Clinical Outcome of High-Grade Serous Ovarian Cancer Presenting with Low- versus High-Volume Ascites

    Directory of Open Access Journals (Sweden)

    Tomer Feigenberg

    2014-01-01

    Full Text Available Epithelial ovarian cancer consists of multiple histotypes differing in etiology and clinical course. The most prevalent histotype is high-grade serous ovarian cancer (HGSOC), which often presents at an advanced stage frequently accompanied with high-volume ascites. While some studies suggest that ascites is associated with poor clinical outcome, most reports have not differentiated between histological subtypes or tumor grade. We compared genome-wide gene expression profiles from a discovery cohort of ten patients diagnosed with stages III-IV HGSOC with high-volume ascites and nine patients with low-volume ascites. An upregulation of immune response genes was detected in tumors from patients presenting with low-volume ascites relative to those with high-volume ascites. Immunohistochemical studies performed on tissue microarrays confirmed higher expression of proteins encoded by immune response genes and increased tumor-infiltrating cells in tumors associated with low-volume ascites. Comparison of 149 advanced-stage HGSOC cases with differential ascites volume at time of primary surgery indicated low-volume ascites correlated with better surgical outcome and longer overall survival. These findings suggest that advanced stage HGSOC presenting with low-volume ascites reflects a unique subgroup of HGSOC, which is associated with upregulation of immune related genes, more abundant tumor-infiltrating cells and better clinical outcomes.

  8. Making sense of high sensitivity troponin assays and their role in clinical care.

    Science.gov (United States)

    Daniels, Lori B

    2014-04-01

    Cardiac troponin assays have an established and undisputed role in the diagnosis and risk stratification of patients with acute myocardial infarction. As troponin assays get more sensitive and more precise, the number of potential uses has rapidly expanded, but the use of this test has also become more complicated and controversial. Highly sensitive troponin assays can now detect troponin levels in most individuals, but accurate interpretation of these levels requires a clear understanding of the assay in the context of the clinical scenario. This paper provides a practical and up-to-date overview of the uses of highly sensitive troponin assays for diagnosis, prognosis, and risk stratification in clinical practice.

  9. A Non-standard Empirical Likelihood for Time Series

    DEFF Research Database (Denmark)

    Nordman, Daniel J.; Bunzel, Helle; Lahiri, Soumendra N.

    Standard blockwise empirical likelihood (BEL) for stationary, weakly dependent time series requires specifying a fixed block length as a tuning parameter for setting confidence regions. This aspect can be difficult and impacts coverage accuracy. As an alternative, this paper proposes a new version......-standard asymptotics and requires a significantly different development compared to standard BEL. We establish the large-sample distribution of log-ratio statistics from the new BEL method for calibrating confidence regions for mean or smooth function parameters of time series. This limit law is not the usual chi...

  10. Efficient maximum likelihood parameterization of continuous-time Markov processes

    CERN Document Server

    McGibbon, Robert T

    2015-01-01

    Continuous-time Markov processes over finite state-spaces are widely used to model dynamical processes in many fields of natural and social science. Here, we introduce a maximum likelihood estimator for constructing such models from data observed at a finite time interval. This estimator is drastically more efficient than prior approaches, enables the calculation of deterministic confidence intervals in all model parameters, and can easily enforce important physical constraints on the models such as detailed balance. We demonstrate and discuss the advantages of these models over existing discrete-time Markov models for the analysis of molecular dynamics simulations.
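
    To make the setting concrete, here is a generic brute-force maximum-likelihood fit of a two-state rate matrix from transition counts observed at a fixed interval; it is a numerical sketch under assumed toy data, not the efficient estimator introduced in the paper.

      # Generic numerical MLE of a two-state continuous-time Markov rate matrix
      # from transition counts observed at a fixed spacing tau (a brute-force
      # sketch with toy data, not the paper's efficient estimator).
      import numpy as np
      from scipy.linalg import expm
      from scipy.optimize import minimize

      tau = 0.1
      counts = np.array([[900, 100],      # counts[i, j]: observed transitions i -> j
                         [80, 920]])      # over intervals of length tau (toy data)

      def neg_log_likelihood(log_rates):
          a, b = np.exp(log_rates)                      # rates 0->1 and 1->0
          Q = np.array([[-a, a], [b, -b]])              # generator matrix
          P = expm(Q * tau)                             # transition matrix over tau
          return -np.sum(counts * np.log(P))

      result = minimize(neg_log_likelihood, x0=np.log([1.0, 1.0]), method="Nelder-Mead")
      print(np.exp(result.x))   # estimated rates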

  11. Bayesian and maximum likelihood estimation of genetic maps

    DEFF Research Database (Denmark)

    York, Thomas L.; Durrett, Richard T.; Tanksley, Steven;

    2005-01-01

    There has recently been increased interest in the use of Markov Chain Monte Carlo (MCMC)-based Bayesian methods for estimating genetic maps. The advantage of these methods is that they can deal accurately with missing data and genotyping errors. Here we present an extension of the previous methods...... that makes the Bayesian method applicable to large data sets. We present an extensive simulation study examining the statistical properties of the method and comparing it with the likelihood method implemented in Mapmaker. We show that the Maximum A Posteriori (MAP) estimator of the genetic distances...

  12. Maximum Likelihood Localization of Radiation Sources with unknown Source Intensity

    CERN Document Server

    Baidoo-Williams, Henry E

    2016-01-01

    In this paper, we consider a novel and robust maximum likelihood approach to localizing radiation sources with unknown statistics of the source signal strength. The result utilizes the smallest number of sensors theoretically required to localize the source. It is shown that, should the source lie in the open convex hull of the sensors, precisely $N+1$ sensors are required in $\mathbb{R}^N, ~N \in \{1,\cdots,3\}$. It is further shown that the region of interest, the open convex hull of the sensors, is entirely devoid of false stationary points. An augmented gradient ascent algorithm, with random projections applied should an estimate escape the convex hull, is presented.

  13. Similar tests and the standardized log likelihood ratio statistic

    DEFF Research Database (Denmark)

    Jensen, Jens Ledet

    1986-01-01

    When testing an affine hypothesis in an exponential family the 'ideal' procedure is to calculate the exact similar test, or an approximation to this, based on the conditional distribution given the minimal sufficient statistic under the null hypothesis. By contrast to this there is a 'primitive......' approach in which the marginal distribution of a test statistic is considered and any nuisance parameter appearing in the test statistic is replaced by an estimate. We show here that when using standardized likelihood ratio statistics the 'primitive' procedure is in fact an 'ideal' procedure to order O(n -3...

  14. Likelihood-Based Inference in Nonlinear Error-Correction Models

    DEFF Research Database (Denmark)

    Kristensen, Dennis; Rahbæk, Anders

    We consider a class of vector nonlinear error correction models where the transfer function (or loadings) of the stationary relationships is nonlinear. This includes in particular the smooth transition models. A general representation theorem is given which establishes the dynamic properties...... and a linear trend in general. Gaussian likelihood-based estimators are considered for the long-run cointegration parameters, and the short-run parameters. Asymptotic theory is provided for these and it is discussed to what extent asymptotic normality and mixed normality can be found. A simulation study

  15. MAXIMUM LIKELIHOOD ESTIMATION FOR PERIODIC AUTOREGRESSIVE MOVING AVERAGE MODELS.

    Science.gov (United States)

    Vecchia, A.V.

    1985-01-01

    A useful class of models for seasonal time series that cannot be filtered or standardized to achieve second-order stationarity is that of periodic autoregressive moving average (PARMA) models, which are extensions of ARMA models that allow periodic (seasonal) parameters. An approximation to the exact likelihood for Gaussian PARMA processes is developed, and a straightforward algorithm for its maximization is presented. The algorithm is tested on several periodic ARMA(1, 1) models through simulation studies and is compared to moment estimation via the seasonal Yule-Walker equations. Applicability of the technique is demonstrated through an analysis of a seasonal stream-flow series from the Rio Caroni River in Venezuela.

  16. AN EFFICIENT APPROXIMATE MAXIMUM LIKELIHOOD SIGNAL DETECTION FOR MIMO SYSTEMS

    Institute of Scientific and Technical Information of China (English)

    Cao Xuehong

    2007-01-01

    This paper proposes an efficient approximate Maximum Likelihood (ML) detection method for Multiple-Input Multiple-Output (MIMO) systems, which searches a local area instead of performing an exhaustive search and selects valid search points in each transmit antenna's signal constellation instead of the whole hyperplane. Both the selection and the search complexity can be reduced significantly. The method trades off computational complexity against system performance by adjusting the neighborhood size used to select the valid search points. Simulation results show that the performance is comparable to that of ML detection while the complexity is only a small fraction of that of ML.

  17. Maximum likelihood characterization of rotationally symmetric distributions on the sphere

    OpenAIRE

    Duerinckx, Mitia; Ley, Christophe

    2012-01-01

    A classical characterization result, which can be traced back to Gauss, states that the maximum likelihood estimator (MLE) of the location parameter equals the sample mean for any possible univariate samples of any possible sizes n if and only if the samples are drawn from a Gaussian population. A similar result, in the two-dimensional case, is given in von Mises (1918) for the Fisher-von Mises-Langevin (FVML) distribution, the equivalent of the Gaussian law on the unit circle. Half a century...

  18. Maximum-likelihood analysis of the COBE angular correlation function

    Science.gov (United States)

    Seljak, Uros; Bertschinger, Edmund

    1993-01-01

    We have used maximum-likelihood estimation to determine the quadrupole amplitude Q(sub rms-PS) and the spectral index n of the density fluctuation power spectrum at recombination from the COBE DMR data. We find a strong correlation between the two parameters of the form Q(sub rms-PS) = (15.7 +/- 2.6) exp (0.46(1 - n)) microK for fixed n. Our result is slightly smaller than and has a smaller statistical uncertainty than the 1992 estimate of Smoot et al.

  19. Maximum Likelihood Joint Tracking and Association in Strong Clutter

    Directory of Open Access Journals (Sweden)

    Leonid I. Perlovsky

    2013-01-01

    Full Text Available We have developed a maximum likelihood formulation for a joint detection, tracking and association problem. An efficient non-combinatorial algorithm for this problem is developed in the case of strong clutter for radar data. By using an iterative procedure of the dynamic logic process “from vague-to-crisp” explained in the paper, the new tracker overcomes the combinatorial complexity of tracking in highly-cluttered scenarios and results in an orders-of-magnitude improvement in signal-to-clutter ratio.

  20. Adaptive quasi-likelihood estimate in generalized linear models

    Institute of Scientific and Technical Information of China (English)

    CHEN Xia; CHEN Xiru

    2005-01-01

    This paper gives a thorough theoretical treatment of the adaptive quasi-likelihood estimate of the parameters in generalized linear models. The unknown covariance matrix of the response variable is estimated from the sample. It is shown that the adaptive estimator defined in this paper is asymptotically most efficient in the sense that it is asymptotically normal, and the covariance matrix of the limit distribution coincides with the one for the quasi-likelihood estimator in the case that the covariance matrix of the response variable is completely known.

  2. A review of factors associated with greater likelihood of suicide attempts and suicide deaths in bipolar disorder

    DEFF Research Database (Denmark)

    Schaffer, Ayal; Isometsä, Erkki T; Azorin, Jean-Michel

    2015-01-01

    OBJECTIVES: Many factors influence the likelihood of suicide attempts or deaths in persons with bipolar disorder. One key aim of the International Society for Bipolar Disorders Task Force on Suicide was to summarize the available literature on the presence and magnitude of effect of these factors....... METHODS: A systematic review of studies published from 1 January 1980 to 30 May 2014 identified using keywords 'bipolar disorder' and 'suicide attempts or suicide'. This specific paper examined all reports on factors putatively associated with suicide attempts or suicide deaths in bipolar disorder samples....... Factors were subcategorized into: (1) sociodemographics, (2) clinical characteristics of bipolar disorder, (3) comorbidities, and (4) other clinical variables. RESULTS: We identified 141 studies that examined how 20 specific factors influenced the likelihood of suicide attempts or deaths. While the level...

  3. Failed refutations: further comments on parsimony and likelihood methods and their relationship to Popper's degree of corroboration.

    Science.gov (United States)

    de Queiroz, Kevin; Poe, Steven

    2003-06-01

    Kluge's (2001, Syst. Biol. 50:322-330) continued arguments that phylogenetic methods based on the statistical principle of likelihood are incompatible with the philosophy of science described by Karl Popper are based on false premises related to Kluge's misrepresentations of Popper's philosophy. Contrary to Kluge's conjectures, likelihood methods are not inherently verificationist; they do not treat every instance of a hypothesis as confirmation of that hypothesis. The historical nature of phylogeny does not preclude phylogenetic hypotheses from being evaluated using the probability of evidence. The low absolute probabilities of hypotheses are irrelevant to the correct interpretation of Popper's concept termed degree of corroboration, which is defined entirely in terms of relative probabilities. Popper did not advocate minimizing background knowledge; in any case, the background knowledge of both parsimony and likelihood methods consists of the general assumption of descent with modification and additional assumptions that are deterministic, concerning which tree is considered most highly corroborated. Although parsimony methods do not assume (in the sense of entailing) that homoplasy is rare, they do assume (in the sense of requiring to obtain a correct phylogenetic inference) certain things about patterns of homoplasy. Both parsimony and likelihood methods assume (in the sense of implying by the manner in which they operate) various things about evolutionary processes, although violation of those assumptions does not always cause the methods to yield incorrect phylogenetic inferences. Test severity is increased by sampling additional relevant characters rather than by character reanalysis, although either interpretation is compatible with the use of phylogenetic likelihood methods. Neither parsimony nor likelihood methods assess test severity (critical evidence) when used to identify a most highly corroborated tree(s) based on a single method or model and a

  4. COMPET: High resolution high sensitivity MRI compatible pre-clinical PET scanner

    CERN Document Server

    Hines, Kim-Eigard; Skretting, Arne; Rohne, Ole; Bjaalie, Jan G; Volgyes, David; Rissi, Michael; Dorholt, Ole; Stapnes, Steinar

    2013-01-01

    COMPET is a pre-clinical MRI compatible PET scanner which decouples sensitivity and resolution by the use of a novel detector design. The detector has been built using 8 x 8 cm(2) square layers consisting of 30 LYSO crystals (2 x 3 x 80 mm(3)) interleaved with 24 Wavelength Shifting Fibers (WLS) (3 x 1 x 80 mm(3)). By stacking several layers into a module, the point-of-interaction (POI) can be measured in 3D. Four layers form a PET ring, and the sensitivity can be increased by stacking several layers. The layers can be stacked so that no inter-crystal or inter-module gap is formed. COMPET has used four assembled layers for module and scanner characterization. The modules are connected to the COMPET data-acquisition chain and the reconstructed images are produced with the novel geometry-independent COMPET image reconstruction algorithm. Time and energy resolution have been measured and found to be around 4 as and 14%, respectively. Tests for MRI interference and count rate performance have been carried out. The...

  5. LikeDM: likelihood calculator of dark matter detection

    CERN Document Server

    Huang, Xiaoyuan; Yuan, Qiang

    2016-01-01

    With the large progress in searching for dark matter (DM) particles with indirect and direct methods, we develop a numerical tool that enables fast calculation of the likelihood of specified DM particle models given a number of observational data, such as charged cosmic rays from space-borne experiments (e.g., PAMELA, AMS-02), $\gamma$-rays from the Fermi space telescope, and the underground direct detection experiments. The purpose of this tool, LikeDM --- likelihood calculator of dark matter detection, is to bridge the particle model of DM and the observational data. The intermediate steps between these two, including the astrophysical backgrounds, the propagation of charged particles, the analysis of Fermi $\gamma$-ray data, as well as the DM velocity distribution and the nuclear form factor, have been dealt with in the code. We release the first version (v1.0) focusing on the constraints from charged cosmic rays and gamma rays; the direct detection part will be implemented in the next version. This manual de...

  6. Likelihood Analysis of the Minimal AMSB Model arXiv

    CERN Document Server

    Bagnaschi, E.; Sakurai, K.; Buchmueller, O.; Cavanaugh, R.; Chobanova, V.; Citron, M.; Costa, J.C.; De Roeck, A.; Dolan, M.J.; Ellis, J.R.; Flächer, H.; Heinemeyer, S.; Isidori, G.; Lucio, M.; Luo, F.; Martínez Santos, D.; Olive, K.A.; Richards, A.; Weiglein, G.

    We perform a likelihood analysis of the minimal Anomaly-Mediated Supersymmetry Breaking (mAMSB) model using constraints from cosmology and accelerator experiments. We find that a wino-like or a Higgsino-like neutralino LSP, $m_{\\tilde \\chi^0_{1}}$, may provide the cold dark matter (DM) with similar likelihood. The upper limit on the DM density from Planck and other experiments enforces $m_{\\tilde \\chi^0_{1}} \\lesssim 3~TeV$ after the inclusion of Sommerfeld enhancement in its annihilations. If most of the cold DM density is provided by the $\\tilde \\chi_0^1$, the measured value of the Higgs mass favours a limited range of $\\tan \\beta \\sim 5$ (or for $\\mu > 0$, $\\tan \\beta \\sim 45$) but the scalar mass $m_0$ is poorly constrained. In the wino-LSP case, $m_{3/2}$ is constrained to about $900~TeV$ and ${m_{\\tilde \\chi^0_{1}}}$ to $2.9\\pm0.1~TeV$, whereas in the Higgsino-LSP case $m_{3/2}$ has just a lower limit $\\gtrsim 650TeV$ ($\\gtrsim 480TeV$) and $m_{\\tilde \\chi^0_{1}}$ is constrained to $1.12 ~(1.13) \\pm0.02...

  7. Hybrid pairwise likelihood analysis of animal behavior experiments.

    Science.gov (United States)

    Cattelan, Manuela; Varin, Cristiano

    2013-12-01

    The study of the determinants of fights between animals is an important issue in understanding animal behavior. For this purpose, tournament experiments among a set of animals are often used by zoologists. The results of these tournament experiments are naturally analyzed by paired comparison models. Proper statistical analysis of these models is complicated by the presence of dependence between the outcomes of fights because the same animal is involved in different contests. This paper discusses two different model specifications to account for between-fights dependence. Models are fitted through the hybrid pairwise likelihood method that iterates between optimal estimating equations for the regression parameters and pairwise likelihood inference for the association parameters. This approach requires the specification of means and covariances only. For this reason, the method can be applied also when the computation of the joint distribution is difficult or inconvenient. The proposed methodology is investigated by simulation studies and applied to real data about adult male Cape Dwarf Chameleons. © 2013, The International Biometric Society.

  8. Menyoal Elaboration Likelihood Model (ELM) dan Teori Retorika

    Directory of Open Access Journals (Sweden)

    Yudi Perbawaningsih

    2012-06-01

    Full Text Available Abstract: Persuasion is a communication process to establish or change attitudes, which can be understood through the theory of Rhetoric and the theory of the Elaboration Likelihood Model (ELM). This study elaborates these theories in a Public Lecture series intended to persuade students in choosing their concentration of study. The results show that in terms of persuasion effectiveness it is not quite relevant to separate the message and its source. The quality of the source is determined by the quality of the message, and vice versa. Separating the two routes of the persuasion process as described in ELM theory would not be relevant. Abstrak: Persuasion is a communication process for forming or changing attitudes, which can be understood through the theory of Rhetoric and the Elaboration Likelihood Model (ELM). This study elaborates these theories in a Public Lecture used as a means of persuading students to choose a concentration of study, grounded in the process of information processing. Using a survey method, the result obtained is that it is not sufficiently relevant to separate the message from its source when assessing the effectiveness of persuasion. The two are inseparable, meaning that the quality of the source is determined by the quality of the message it delivers, and vice versa. Separating the persuasion process into two routes, as described in ELM theory, becomes irrelevant.

  9. Likelihood free inference for Markov processes: a comparison.

    Science.gov (United States)

    Owen, Jamie; Wilkinson, Darren J; Gillespie, Colin S

    2015-04-01

    Approaches to Bayesian inference for problems with intractable likelihoods have become increasingly important in recent years. Approximate Bayesian computation (ABC) and "likelihood free" Markov chain Monte Carlo techniques are popular methods for tackling inference in these scenarios but such techniques are computationally expensive. In this paper we compare the two approaches to inference, with a particular focus on parameter inference for stochastic kinetic models, widely used in systems biology. Discrete time transition kernels for models of this type are intractable for all but the most trivial systems yet forward simulation is usually straightforward. We discuss the relative merits and drawbacks of each approach whilst considering the computational cost implications and efficiency of these techniques. In order to explore the properties of each approach we examine a range of observation regimes using two example models. We use a Lotka-Volterra predator-prey model to explore the impact of full or partial species observations using various time course observations under the assumption of known and unknown measurement error. Further investigation into the impact of observation error is then made using a Schlögl system, a test case which exhibits bi-modal state stability in some regions of parameter space.
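
    To make the "likelihood free" idea concrete, the sketch below shows ABC rejection for a toy model; the model, summary statistic, tolerance, and prior are illustrative assumptions, far simpler than the Lotka-Volterra and Schlögl systems analyzed in the paper.

      # Minimal ABC rejection sampler for a toy model (Poisson counts), illustrating
      # the "likelihood free" idea: accept parameter draws whose simulated summary
      # statistic lands close to the observed one.  All choices here are illustrative.
      import numpy as np

      rng = np.random.default_rng(1)
      observed = rng.poisson(lam=4.0, size=50)       # pretend this is the data
      s_obs = observed.mean()                         # summary statistic

      accepted = []
      for _ in range(20000):
          lam = rng.uniform(0.0, 10.0)                # draw from the prior
          simulated = rng.poisson(lam=lam, size=50)   # forward-simulate only
          if abs(simulated.mean() - s_obs) < 0.2:     # tolerance
              accepted.append(lam)

      print(len(accepted), np.mean(accepted))         # approximate posterior mean ~4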

  10. On the Likelihood of Supernova Enrichment of Protoplanetary Disks

    Science.gov (United States)

    Williams, Jonathan P.; Gaidos, Eric

    2007-07-01

    We estimate the likelihood of direct injection of supernova ejecta into protoplanetary disks using a model in which the number of stars with disks decreases linearly with time, and clusters expand linearly with time such that their surface density is independent of stellar number. The similarity of disk dissipation and main-sequence lifetimes implies that the typical supernova progenitor is very massive, ~75-100 Msolar. Such massive stars are found only in clusters with >~10^4 members. Moreover, there is only a small region around a supernova within which disks can survive the blast yet be enriched to the level observed in the solar system. These two factors limit the overall likelihood of supernova enrichment of a protoplanetary disk to <1%. If the presence of short-lived radionucleides in meteorites is to be explained in this way, however, the solar system most likely formed in one of the largest clusters in the Galaxy, more than 2 orders of magnitude greater than Orion, where multiple supernovae impacted many disks in a short period of time.

  11. On the likelihood of supernova enrichment of protoplanetary disks

    CERN Document Server

    Williams, Jonathan P

    2007-01-01

    We estimate the likelihood of direct injection of supernova ejecta into protoplanetary disks using a model in which the number of stars with disks decreases linearly with time, and clusters expand linearly with time such that their surface density is independent of stellar number. The similarity of disk dissipation and main sequence lifetimes implies that the typical supernova progenitor is very massive, ~ 75-100 Msun. Such massive stars are found only in clusters with > 10^4 members. Moreover, there is only a small region around a supernova within which disks can survive the blast yet be enriched to the level observed in the Solar System. These two factors limit the overall likelihood of supernova enrichment of a protoplanetary disk to < 1%. If the presence of short lived radionucleides in meteorites is to be explained in this way, however, the Solar System most likely formed in one of the largest clusters in the Galaxy, more than two orders of magnitude greater than Orion, where multiple supernovae impac...

  12. Empirical likelihood ratio tests for multivariate regression models

    Institute of Scientific and Technical Information of China (English)

    WU Jianhong; ZHU Lixing

    2007-01-01

    This paper proposes some diagnostic tools for checking the adequacy of multivariate regression models, including classical regression and time series autoregression. In statistical inference, the empirical likelihood ratio method is well known to be a powerful tool for constructing tests and confidence regions. For model checking, however, the naive empirical likelihood (EL) based tests do not exhibit the Wilks phenomenon. Hence, we make use of bias correction to construct EL-based score tests and derive a nonparametric version of Wilks' theorem. Moreover, by combining the advantages of both the EL and the score test method, the EL-based score tests share many desirable features: they are self-scale invariant and can detect alternatives that converge to the null at rate n^(-1/2), possibly the fastest rate for lack-of-fit testing; they involve weight functions, which provides the flexibility to choose scores for improving power performance, especially under directional alternatives. Furthermore, when the alternatives are not directional, we construct asymptotically distribution-free maximin tests for a large class of possible alternatives. A simulation study is carried out and an application to a real dataset is analyzed.

  13. Gauging the likelihood of stable cavitation from ultrasound contrast agents.

    Science.gov (United States)

    Bader, Kenneth B; Holland, Christy K

    2013-01-07

    The mechanical index (MI) was formulated to gauge the likelihood of adverse bioeffects from inertial cavitation. However, the MI formulation did not consider bubble activity from stable cavitation. This type of bubble activity can be readily nucleated from ultrasound contrast agents (UCAs) and has the potential to promote beneficial bioeffects. Here, the presence of stable cavitation is determined numerically by tracking the onset of subharmonic oscillations within a population of bubbles for frequencies up to 7 MHz and peak rarefactional pressures up to 3 MPa. In addition, the acoustic pressure rupture threshold of an UCA population was determined using the Marmottant model. The threshold for subharmonic emissions of optimally sized bubbles was found to be lower than the inertial cavitation threshold for all frequencies studied. The rupture thresholds of optimally sized UCAs were found to be lower than the threshold for subharmonic emissions for either single cycle or steady state acoustic excitations. Because the thresholds of both subharmonic emissions and UCA rupture are linearly dependent on frequency, an index of the form I(CAV) = P(r)/f (where P(r) is the peak rarefactional pressure in MPa and f is the frequency in MHz) was derived to gauge the likelihood of subharmonic emissions due to stable cavitation activity nucleated from UCAs.
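
    A two-line numerical illustration contrasting the mechanical index (MI = Pr/sqrt(f)) with the I(CAV) = P(r)/f index derived above; the pressure and frequency values are arbitrary illustrative inputs, not thresholds from the paper.

      # Contrast of the mechanical index (MI = Pr / sqrt(f)) with the stable-
      # cavitation index derived above (I_CAV = Pr / f); Pr in MPa, f in MHz.
      # The numerical values are arbitrary illustrative inputs.
      import math

      pr_mpa, f_mhz = 0.8, 2.0
      mi = pr_mpa / math.sqrt(f_mhz)
      i_cav = pr_mpa / f_mhz
      print(f"MI = {mi:.2f}, I_CAV = {i_cav:.2f}")   # MI = 0.57, I_CAV = 0.40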

  14. tmle: An R Package for Targeted Maximum Likelihood Estimation

    Directory of Open Access Journals (Sweden)

    Susan Gruber

    2012-11-01

    Targeted maximum likelihood estimation (TMLE) is a general approach for constructing an efficient double-robust semi-parametric substitution estimator of a causal effect parameter or statistical association measure. tmle is a recently developed R package that implements TMLE of the effect of a binary treatment at a single point in time on an outcome of interest, controlling for user-supplied covariates, including an additive treatment effect, relative risk, odds ratio, and the controlled direct effect of a binary treatment controlling for a binary intermediate variable on the pathway from treatment to the outcome. Estimation of the parameters of a marginal structural model is also available. The package allows outcome data with missingness, and experimental units that contribute repeated records of the point-treatment data structure, thereby allowing the analysis of longitudinal data structures. Relevant factors of the likelihood may be modeled or fit data-adaptively according to user specifications, or passed in from an external estimation procedure. Effect estimates, variances, p values, and 95% confidence intervals are provided by the software.

  15. Superfast maximum-likelihood reconstruction for quantum tomography

    Science.gov (United States)

    Shang, Jiangwei; Zhang, Zhengyun; Ng, Hui Khoon

    2017-06-01

    Conventional methods for computing maximum-likelihood estimators (MLE) often converge slowly in practical situations, leading to a search for simplifying methods that rely on additional assumptions for their validity. In this work, we provide a fast and reliable algorithm for maximum-likelihood reconstruction that avoids this slow convergence. Our method utilizes the state-of-the-art convex optimization scheme, an accelerated projected-gradient method, that allows one to accommodate the quantum nature of the problem in a different way than in the standard methods. We demonstrate the power of our approach by comparing its performance with other algorithms for n -qubit state tomography. In particular, an eight-qubit situation that purportedly took weeks of computation time in 2005 can now be completed in under a minute for a single set of data, with far higher accuracy than previously possible. This refutes the common claim that MLE reconstruction is slow and reduces the need for alternative methods that often come with difficult-to-verify assumptions. In fact, recent methods assuming Gaussian statistics or relying on compressed sensing ideas are demonstrably inapplicable for the situation under consideration here. Our algorithm can be applied to general optimization problems over the quantum state space; the philosophy of projected gradients can further be utilized for optimization contexts with general constraints.

  16. FPGA Acceleration of the phylogenetic likelihood function for Bayesian MCMC inference methods

    Directory of Open Access Journals (Sweden)

    Bakos Jason D

    2010-04-01

    Background: Likelihood (ML)-based phylogenetic inference has become a popular method for estimating the evolutionary relationships among species based on genomic sequence data. This method is used in applications such as RAxML, GARLI, MrBayes, PAML, and PAUP. The Phylogenetic Likelihood Function (PLF) is an important kernel computation for this method. The PLF consists of a loop with no conditional behavior or dependencies between iterations. As such it contains a high potential for exploiting parallelism using micro-architectural techniques. In this paper, we describe a technique for mapping the PLF and supporting logic onto a Field Programmable Gate Array (FPGA)-based co-processor. By leveraging the FPGA's on-chip DSP modules and the high-bandwidth local memory attached to the FPGA, the resultant co-processor can accelerate ML-based methods and outperform state-of-the-art multi-core processors. Results: We use the MrBayes 3 tool as a framework for designing our co-processor. For large datasets, we estimate that our accelerated MrBayes, if run on a current-generation FPGA, achieves a 10× speedup relative to software running on a state-of-the-art server-class microprocessor. The FPGA-based implementation achieves its performance by deeply pipelining the likelihood computations, performing multiple floating-point operations in parallel, and through a natural log approximation that is chosen specifically to leverage a deeply pipelined custom architecture. Conclusions: Heterogeneous computing, which combines general-purpose processors with special-purpose co-processors such as FPGAs and GPUs, is a promising approach for high-performance phylogeny inference, as shown by the growing body of literature in this field. FPGAs in particular are well suited for this task because of their low power consumption as compared to many-core processors and Graphics Processor Units (GPUs).
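
    The kernel described above is the per-site combination of child conditional likelihoods through branch transition matrices (Felsenstein pruning). A minimal NumPy sketch of that inner loop, with illustrative matrices rather than anything from the MrBayes 3 implementation:

```python
import numpy as np

def plf_combine(P_left, L_left, P_right, L_right):
    """Parent conditional likelihoods for every site and parent state s:
    L_parent[i, s] = (sum_x P_left[s, x] * L_left[i, x]) * (sum_y P_right[s, y] * L_right[i, y])."""
    return (L_left @ P_left.T) * (L_right @ P_right.T)

# Illustrative 4-state (DNA) transition matrix and two child likelihood arrays for 8 sites
P = np.full((4, 4), 0.05) + 0.80 * np.eye(4)       # rows sum to 1
rng = np.random.default_rng(0)
L_left, L_right = rng.random((8, 4)), rng.random((8, 4))
print(plf_combine(P, L_left, P, L_right).shape)    # (8, 4); sites are independent, so the loop parallelises
```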

  17. High lipoprotein(a) as a possible cause of clinical familial hypercholesterolaemia

    DEFF Research Database (Denmark)

    Langsted, Anne; Kamstrup, Pia Rørbæk; Benn, Marianne

    2016-01-01

    BACKGROUND: The reason why lipoprotein(a) concentrations are raised in individuals with clinical familial hypercholesterolaemia is unclear. We tested the hypotheses that high lipoprotein(a) cholesterol and LPA risk genotypes are a possible cause of clinical familial hypercholesterolaemia, and that individuals with both high lipoprotein(a) concentrations and clinical familial hypercholesterolaemia have the highest risk of myocardial infarction. METHODS: We did a prospective cohort study that included data from 46 200 individuals from the Copenhagen General Population Study who had lipoprotein(a) measurements and were genotyped for common familial hypercholesterolaemia mutations. Individuals receiving cholesterol-lowering drugs had their concentrations of LDL and total cholesterol multiplied by 1·43, corresponding to an estimated 30% reduction in LDL cholesterol from the treatment. In lipoprotein...

  18. Correlation of findings in clinical and high resolution ultrasonography examinations of the painful shoulder

    Directory of Open Access Journals (Sweden)

    Raphael Micheroli

    2015-03-01

    Objective: High resolution ultrasonography is a non-painful and non-invasive imaging technique which is useful for the assessment of shoulder pain causes, as clinical examination often does not allow an exact diagnosis. The aim of this study was to compare the findings of clinical examination and high resolution ultrasonography in patients presenting with painful shoulder. Methods: Non-interventional observational study of 100 adult patients suffering from unilateral shoulder pain. Exclusion criteria were shoulder fractures, prior shoulder joint surgery and shoulder injections in the past month. The physicians performing the most common clinical shoulder examinations were blinded to the results of the high resolution ultrasonography and vice versa. Results: In order to detect pathology of the m. supraspinatus tendon, the Hawkins and Kennedy impingement test showed the highest sensitivity (0.86), whereas the Jobe supraspinatus test showed the highest specificity (0.55). To identify m. subscapularis tendon pathology, the Gerber lift-off test showed a sensitivity of 1, whereas the belly press test showed the higher specificity (0.72). The infraspinatus test showed a high sensitivity (0.90) and specificity (0.74). All AC tests (painful arc II, AC joint tenderness, cross-body adduction stress test) showed high specificities (0.96, 0.99, and 0.96, respectively). Evaluating the long biceps tendon, the palm-up test showed the highest sensitivity (0.47) and the Yergason test the highest specificity (0.88). Conclusion: Knowledge of the sensitivity and specificity of various clinical tests is important for the interpretation of clinical examination test results. High resolution ultrasonography is needed in most cases to establish a clear diagnosis.

  19. Detection of High Frequency Oscillations by Hybrid Depth Electrodes in Standard Clinical Intracranial EEG Recordings

    Directory of Open Access Journals (Sweden)

    Efstathios D Kondylis

    2014-08-01

    High frequency oscillations (HFOs) have been proposed as a novel marker for epileptogenic tissue, spurring tremendous research interest into the characterization of these transient events. A wealth of continuously recorded intracranial electroencephalographic (iEEG) data is currently available from patients undergoing invasive monitoring for the surgical treatment of epilepsy. In contrast to data recorded on research-customized recording systems, data from clinical acquisition systems remain an underutilized resource for HFO detection in most centers. The effective and reliable use of this clinically obtained data would be an important advance in the ongoing study of HFOs and their relationship to ictogenesis. The diagnostic utility of HFOs ultimately will be limited by the ability of clinicians to detect these brief, sporadic, and low amplitude events in an electrically noisy clinical environment. Indeed, one of the most significant factors limiting the use of such clinical recordings for research purposes is their low signal to noise ratio, especially in the higher frequency bands. In order to investigate the presence of HFOs in clinical data, we first obtained continuous intracranial recordings in a typical clinical environment using a commercially available, commonly utilized data acquisition system and off-the-shelf hybrid macro/micro depth electrodes. This data was then inspected for the presence of HFOs using semi-automated methods and expert manual review. With targeted removal of noise frequency content, HFOs were detected on both macro- and micro-contacts, and preferentially localized to seizure onset zones. HFOs detected by the offline, semi-automated method were also validated in the clinical viewer, demonstrating that (1) this clinical system allows for the visualization of HFOs, and (2) with effective signal processing, clinical recordings can yield valuable information for offline analysis.

  20. High-throughput cell analysis and sorting technologies for clinical diagnostics and therapeutics

    Science.gov (United States)

    Leary, James F.; Reece, Lisa M.; Szaniszlo, Peter; Prow, Tarl W.; Wang, Nan

    2001-05-01

    A number of theoretical and practical limits of high-speed flow cytometry/cell sorting are important for clinical diagnostics and therapeutics. Three applications include: (1) stem cell isolation with tumor purging for minimal residual disease monitoring and treatment, (2) identification and isolation of human fetal cells from maternal blood for prenatal diagnostics and in-vitro therapeutics, and (3) high-speed library screening for recombinant vaccine production against unknown pathogens.

  1. EPA guidance on the early detection of clinical high risk states of psychoses

    DEFF Research Database (Denmark)

    Schultze-Lutter, F; Michel, C; Schmidt, S J

    2015-01-01

    The aim of this guidance paper of the European Psychiatric Association is to provide evidence-based recommendations on the early detection of a clinical high risk (CHR) for psychosis in patients with mental problems. To this aim, we conducted a meta-analysis of studies reporting on conversion rat...

  2. EPA guidance on the early intervention in clinical high risk states of psychoses

    DEFF Research Database (Denmark)

    Schmidt, S J; Schultze-Lutter, F; Schimmelmann, B G

    2015-01-01

    This guidance paper from the European Psychiatric Association (EPA) aims to provide evidence-based recommendations on early intervention in clinical high risk (CHR) states of psychosis, assessed according to the EPA guidance on early detection. The recommendations were derived from a meta-analysi...

  3. Clinical risk factors for gestational hypertensive disorders in pregnant women at high risk for developing preeclampsia

    NARCIS (Netherlands)

    Wong, Tsz Y.; Groen, Henk; Faas, Marijke M.; van Pampus, Maria G.

    2013-01-01

    Objectives: To evaluate clinical risk factors for the development of gestational hypertensive disorders in a group of pregnant women at high risk for developing preeclampsia. Secondly we evaluated the incidence and recurrence rate of preeclampsia and pregnancy-induced hypertension. Study design: A

  4. Evaluation of pulmonary embolism in a pediatric population with high clinical suspicion

    Energy Technology Data Exchange (ETDEWEB)

    Victoria, Teresa; Mong, Andrew; Altes, Talissa; Hernandez, Andrea; Gonzalez, Leonardo; Kramer, Sandra S. [Children's Hospital of Philadelphia, Department of Radiology, Philadelphia, PA (United States); Jawad, Abbas F. [Children's Hospital of Philadelphia, Department of Biostatistics and Epidemiology, Philadelphia, PA (United States); Raffini, Leslie [Children's Hospital of Philadelphia, Department of Pediatrics, Division of Hematology, Philadelphia, PA (United States)

    2009-01-15

    Pulmonary embolism (PE) is an underdiagnosed entity in the pediatric population, in part because of the low level of suspicion and awareness in the clinical world. To examine its relative prevalence, associated risk factors and imaging features in our pediatric population. A total of 92 patients aged 21 years and younger with a high clinical suspicion of PE and who had available radiographic studies were identified from January 2003 to September 2006. Patients with a positive CT scan or a high-probability ventilation/perfusion scan formed the case group; patients with a high clinical suspicion of PE and no radiographic evidence of PE or deep venous thrombosis (DVT), randomly matched in age and sex, became the matched control group. We reviewed the charts of both groups and analyzed the imaging studies. In our hospital, the prevalence of PE in patients with a strong suspicion of PE was 14%. The overall prevalence of thromboembolic disease (PE and/or DVT) was 25%. Recent surgery or orthopedic procedure, blood dyscrasias and contraceptive use were more common in patients with PE. No child died of PE in our study. The youngest child with PE in our study was 13 years old. Girls were twice as likely to develop PE as boys. PE is a relatively common diagnosis in our tertiary care pediatric population when the clinical suspicion is high. We suggest increased awareness and index of suspicion in order to initiate prompt diagnostic imaging and treatment. (orig.)

  6. Pharmacotherapy in Children and Adolescents at Clinical-High Risk for Psychosis and Bipolar Disorder.

    Science.gov (United States)

    Lambert, M; Niehaus, V; Correll, C

    2016-11-01

    This review aims to describe the importance of i) detecting individuals at clinical high-risk for psychosis (schizophrenia) or bipolar disorder, especially in children and adolescents, in order to enable early intervention, and ii) evaluating different intervention strategies, especially pharmacotherapy, during the subsyndromal or "prodromal" stages of these severe and often debilitating disorders. The different approaches regarding the psychotic and bipolar clinical high-risk state are discussed, including reasons and evidence for early (pharmacological) intervention and risks of treatment vs. non-treatment. Only 10 prospective studies of antipsychotics (randomized=4) and 6 prospective studies of non-antipsychotic pharmacologic agents (randomized=3, i. e., omega-3 fatty acids=2, glycine=1) for the psychotic clinical high-risk state and only 4 prospective studies of mood stabilizing medications for the bipolar clinical high-risk state (randomized=2, i. e., lithium=1, valproate=1) were detected. Based on the minimal efficacy data, adverse effect risks, especially in pediatric populations, nonspecific psychopathology, and unknown true risk for the development of either psychosis or bipolar disorder or of chronically disabling symptoms and disability, medication treatment currently remains second choice after psychosocial intervention. Additional research in this area is clearly needed in order to shed more light on the relevance and predictive value of potentially prodromal symptoms, their identification and most appropriate management options. © Georg Thieme Verlag KG Stuttgart · New York.

  7. Early Identification of High-Ability Students: Clinical Assessment of Behavior

    Science.gov (United States)

    Bracken, Bruce A.; Brown, Elissa F.

    2008-01-01

    This study investigated the ability of teachers to accurately rate the cognitive and academic functioning of 1,375 students in kindergarten through the third grade on the Clinical Assessment of Behavior (CAB), as compared to two objective cognitive ability tests. CAB teacher ratings were compared for high-ability students who were currently…

  8. Clinical profile of high-risk febrile neutropenia in a tertiary care hospital

    Directory of Open Access Journals (Sweden)

    Mohan V Bhojaraja

    2016-06-01

    Background: Infection in the immunocompromised host has been a reason of concern in the clinical setting and a topic of debate for decades. In this study, the aim was to analyse the clinical profile of high-risk febrile neutropenic patients. Aims: To study the clinical profile of high-risk febrile neutropenia patients with the objectives of identifying the most common associated malignancy, the most common associated pathogen, and the source of infection, correlating the treatment and management with the Infectious Diseases Society of America (IDSA) 2010 guidelines, and assessing the clinical outcome. Methods: A cross-sectional time-bound study was carried out and a total of 80 episodes of high-risk febrile neutropenia were recorded among patients with malignancies from September 2011 to July 2013, with each episode being taken as a new case. Results: Non-Hodgkin's lymphoma (30 per cent) was the most common associated malignancy, the commonest source of infection was central venous catheters, and the commonest pathogens were gram negative (52 per cent); the treatment and management of each episode of high-risk febrile neutropenia correlated with the IDSA 2010 guidelines, and the mortality rate was 13.75 per cent. Conclusion: Febrile neutropenia is one of the major complications and causes of mortality in patients with malignancy, and hence understanding its entire spectrum can help us reduce morbidity and mortality.

  9. A likelihood ratio test for species membership based on DNA sequence data

    DEFF Research Database (Denmark)

    Matz, Mikhail V.; Nielsen, Rasmus

    2005-01-01

    DNA barcoding as an approach for species identification is rapidly increasing in popularity. However, it remains unclear which statistical procedures should accompany the technique to provide a measure of uncertainty. Here we describe a likelihood ratio test which can be used to test if a sampled sequence is a member of an a priori specified species. We investigate the performance of the test using coalescence simulations, as well as using the real data from butterflies and frogs representing two kinds of challenge for DNA barcoding: extremely low and extremely high levels of sequence variability....

  10. Inference for the Sharpe Ratio Using a Likelihood-Based Approach

    Directory of Open Access Journals (Sweden)

    Ying Liu

    2012-01-01

    The Sharpe ratio is the prominent risk-adjusted performance measure used by practitioners. Statistical testing of this ratio using its asymptotic distribution has lagged behind its use. In this paper, highly accurate likelihood analysis is applied for inference on the Sharpe ratio. Both the one- and two-sample problems are considered. The methodology has O(n^{-3/2}) distributional accuracy and can be implemented using any parametric return distribution structure. Simulations are provided to demonstrate the method's superior accuracy over existing methods used for testing in the literature.
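
    The third-order accurate likelihood analysis itself is not reproduced here, but the first-order asymptotic test it improves upon is easy to sketch (assuming i.i.d. returns; names and data are illustrative):

```python
import numpy as np
from scipy import stats

def sharpe_ratio_test(returns, sr0=0.0):
    """First-order asymptotic z-test of H0: Sharpe ratio == sr0, assuming i.i.d. returns."""
    r = np.asarray(returns, dtype=float)
    n = r.size
    sr = r.mean() / r.std(ddof=1)
    se = np.sqrt((1.0 + 0.5 * sr**2) / n)   # standard asymptotic std. error of the Sharpe ratio
    z = (sr - sr0) / se
    return sr, z, 2 * stats.norm.sf(abs(z))

rng = np.random.default_rng(1)
print(sharpe_ratio_test(rng.normal(0.001, 0.02, size=250)))
```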

  11. Modified likelihood ratio test for homogeneity in bivariate normal mixtures with presence of a structural parameter

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    This paper investigates the asymptotic properties of the modified likelihood ratio statistic for testing homogeneity in bivariate normal mixture models with an unknown structural parameter. It is shown that the modified likelihood ratio statistic has a χ2(2) null limiting distribution.

  12. Modified likelihood ratio test for homogeneity in normal mixtures with two samples

    Institute of Scientific and Technical Information of China (English)

    QIN Yong-song; LEI Qing-zhu

    2008-01-01

    This paper investigates the modified likelihood ratio test (LRT) for homogeneity in normal mixtures of two samples with mixing proportions unknown. It is proved that the limit distribution of the modified likelihood ratio test is χ2(1).

  13. Estimation for Non-Gaussian Locally Stationary Processes with Empirical Likelihood Method

    Directory of Open Access Journals (Sweden)

    Hiroaki Ogata

    2012-01-01

    An application of the empirical likelihood method to non-Gaussian locally stationary processes is presented. Based on the central limit theorem for locally stationary processes, we give the asymptotic distributions of the maximum empirical likelihood estimator and the empirical likelihood ratio statistic, respectively. It is shown that the empirical likelihood method enables us to make inferences on various important indices in time series analysis. Furthermore, we present a numerical study and investigate finite-sample properties.

  14. Fusing visual and clinical information for lung tissue classification in high-resolution computed tomography.

    Science.gov (United States)

    Depeursinge, Adrien; Racoceanu, Daniel; Iavindrasana, Jimison; Cohen, Gilles; Platon, Alexandra; Poletti, Pierre-Alexandre; Müller, Henning

    2010-09-01

    We investigate the influence of the clinical context of high-resolution computed tomography (HRCT) images of the chest on tissue classification. 2D regions of interest in HRCT axial slices from patients affected with an interstitial lung disease are automatically classified into five classes of lung tissue. The relevance of the clinical parameters is studied before fusing them with visual attributes. Two multimedia fusion techniques are compared: early versus late fusion. Early fusion concatenates features in one single vector, yielding a true multimedia feature space. Late fusion consists of combining the probability outputs of two support vector machines. The late fusion scheme allowed a maximum of 84% correct predictions of testing instances among the five classes of lung tissue. This represents a significant improvement of 10% compared to a purely visual-based classification. Moreover, the late fusion scheme showed high robustness to the number of clinical parameters used, which suggests that it is appropriate for mining clinical attributes with missing values in clinical routine. Copyright (c) 2010 Elsevier B.V. All rights reserved.
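
    A schematic of the late-fusion idea on synthetic data, combining the class-probability outputs of two SVMs trained on "visual" and "clinical" features; this is a hedged sketch of the general technique, not the authors' pipeline or data:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 300
y = rng.integers(0, 5, size=n)                            # five lung-tissue classes
X_visual = rng.normal(size=(n, 20)) + y[:, None] * 0.3    # synthetic "visual" features
X_clinical = rng.normal(size=(n, 5)) + y[:, None] * 0.2   # synthetic "clinical" features

tr, te = np.arange(0, 200), np.arange(200, n)
svm_visual = SVC(probability=True).fit(X_visual[tr], y[tr])
svm_clinical = SVC(probability=True).fit(X_clinical[tr], y[tr])

# Late fusion: combine the per-class probability outputs (equal weights here)
proba = 0.5 * svm_visual.predict_proba(X_visual[te]) + 0.5 * svm_clinical.predict_proba(X_clinical[te])
pred = svm_visual.classes_[proba.argmax(axis=1)]
print("fused accuracy:", (pred == y[te]).mean())
```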

  15. Further Evaluation of Covariate Analysis using Empirical Bayes Estimates in Population Pharmacokinetics: the Perception of Shrinkage and Likelihood Ratio Test.

    Science.gov (United States)

    Xu, Xu Steven; Yuan, Min; Yang, Haitao; Feng, Yan; Xu, Jinfeng; Pinheiro, Jose

    2017-01-01

    Covariate analysis based on population pharmacokinetics (PPK) is used to identify clinically relevant factors. The likelihood ratio test (LRT) based on nonlinear mixed effect model fits is currently recommended for covariate identification, whereas individual empirical Bayesian estimates (EBEs) are considered unreliable due to the presence of shrinkage. The objectives of this research were to investigate the type I error for LRT and EBE approaches, to confirm the similarity of power between the LRT and EBE approaches from a previous report and to explore the influence of shrinkage on LRT and EBE inferences. Using an oral one-compartment PK model with a single covariate impacting on clearance, we conducted a wide range of simulations according to a two-way factorial design. The results revealed that the EBE-based regression not only provided almost identical power for detecting a covariate effect, but also controlled the false positive rate better than the LRT approach. Shrinkage of EBEs is likely not the root cause for decrease in power or inflated false positive rate although the size of the covariate effect tends to be underestimated at high shrinkage. In summary, contrary to the current recommendations, EBEs may be a better choice for statistical tests in PPK covariate analysis compared to LRT. We proposed a three-step covariate modeling approach for population PK analysis to utilize the advantages of EBEs while overcoming their shortcomings, which allows not only markedly reducing the run time for population PK analysis, but also providing more accurate covariate tests.
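
    For readers unfamiliar with the LRT step, its mechanics reduce to comparing the log-likelihoods of nested model fits against a χ2 cutoff. A minimal, generic sketch (the log-likelihood values are invented for illustration; the nonlinear mixed-effects fitting itself is not shown):

```python
from scipy import stats

def likelihood_ratio_test(loglik_reduced, loglik_full, df=1):
    """LRT for a nested covariate model: 2 * (LL_full - LL_reduced) ~ chi2(df) under H0."""
    lr = 2.0 * (loglik_full - loglik_reduced)
    return lr, stats.chi2.sf(lr, df)

# Hypothetical log-likelihoods from two model fits (illustrative numbers only)
lr, p = likelihood_ratio_test(loglik_reduced=-1523.4, loglik_full=-1518.9)
print(f"LR = {lr:.2f}, p = {p:.4f}")   # LR = 9.00, beyond the usual 3.84 cutoff for 1 df
```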

  16. Prolonged clinical benefit of everolimus therapy in the management of high-grade pancreatic neuroendocrine carcinoma.

    Science.gov (United States)

    Fonseca, Paula J; Uriol, Esther; Galván, José A; Alvarez, Carlos; Pérez, Quionia; Villanueva, Noemi; Berros, José P; Izquierdo, Marta; Viéitez, José M

    2013-01-01

    Treatment options for patients with high-grade pancreatic neuroendocrine tumors (pNET) are limited, especially for those with progressive disease and for those who experience treatment failure. Everolimus, an oral inhibitor of mammalian target of rapamycin (mTOR), has been approved for the treatment of patients with low- or intermediate-grade advanced pNET. In the randomized phase III RADIANT-3 study in patients with low- or intermediate-grade advanced pNET, everolimus significantly increased progression-free survival (PFS) and decreased the relative risk for disease progression by 65% over placebo. This case report describes a heavily pretreated patient with high-grade pNET and liver and peritoneal metastases who achieved prolonged PFS, clinically relevant partial radiologic tumor response, and resolution of constitutional symptoms with improvement in Karnofsky performance status while receiving a combination of everolimus and octreotide long-acting repeatable (LAR). Radiologic and clinical responses were maintained for 19 months, with minimal toxicity over the course of treatment. This case supports the findings that the combination of everolimus plus octreotide LAR may be considered for use in patients with high-grade pNET and progressive disease. Although behavior and aggressiveness are different between low- or intermediate-grade and high-grade pNET, some high-grade pNET may express mTOR; hence, everolimus should be considered in a clinical trial.

  17. Prolonged Clinical Benefit of Everolimus Therapy in the Management of High-Grade Pancreatic Neuroendocrine Carcinoma

    Directory of Open Access Journals (Sweden)

    Paula J. Fonseca

    2013-08-01

    Treatment options for patients with high-grade pancreatic neuroendocrine tumors (pNET) are limited, especially for those with progressive disease and for those who experience treatment failure. Everolimus, an oral inhibitor of mammalian target of rapamycin (mTOR), has been approved for the treatment of patients with low- or intermediate-grade advanced pNET. In the randomized phase III RADIANT-3 study in patients with low- or intermediate-grade advanced pNET, everolimus significantly increased progression-free survival (PFS) and decreased the relative risk for disease progression by 65% over placebo. This case report describes a heavily pretreated patient with high-grade pNET and liver and peritoneal metastases who achieved prolonged PFS, clinically relevant partial radiologic tumor response, and resolution of constitutional symptoms with improvement in Karnofsky performance status while receiving a combination of everolimus and octreotide long-acting repeatable (LAR). Radiologic and clinical responses were maintained for 19 months, with minimal toxicity over the course of treatment. This case supports the findings that the combination of everolimus plus octreotide LAR may be considered for use in patients with high-grade pNET and progressive disease. Although behavior and aggressiveness are different between low- or intermediate-grade and high-grade pNET, some high-grade pNET may express mTOR; hence, everolimus should be considered in a clinical trial.

  18. High frame rate photoacoustic imaging at 7000 frames per second using clinical ultrasound system.

    Science.gov (United States)

    Sivasubramanian, Kathyayini; Pramanik, Manojit

    2016-02-01

    Photoacoustic tomography, a hybrid imaging modality combining optical and ultrasound imaging, is gaining attention in the field of medical imaging. Typically, a Q-switched Nd:YAG laser is used to excite the tissue and generate photoacoustic signals. However, such photoacoustic imaging systems are difficult to translate into clinical applications owing to their high cost and bulky size, often requiring an optical table to house the laser. Moreover, the low pulse repetition rate of a few tens of hertz prevents them from being used in high-frame-rate photoacoustic imaging. In this work, we have demonstrated up to 7000 Hz photoacoustic imaging (B-mode) and measured the flow rate of a fast-moving object. We used a ~140 nanosecond pulsed laser diode as an excitation source and a clinical ultrasound imaging system to capture and display the photoacoustic images. The excitation laser is ~803 nm in wavelength with ~1.4 mJ energy per pulse. So far, 2-dimensional photoacoustic B-scan imaging with a clinical ultrasound system has been reported at only a few tens of frames per second. Therefore, this is the first report of 2-dimensional photoacoustic B-scan imaging at 7000 frames per second. We have demonstrated phantom imaging to view and measure the flow rate of ink solution inside a tube. This fast photoacoustic imaging can be useful for various clinical applications including cardiac-related problems, where the blood flow rate is quite high, and other dynamic studies.

  19. Youth-caregiver Agreement on Clinical High-risk Symptoms of Psychosis

    Science.gov (United States)

    Golembo-Smith, Shana; Bachman, Peter; Senturk, Damla; Cannon, Tyrone D.; Bearden, Carrie E.

    2014-01-01

    Early identification of individuals who will go on to develop schizophrenia is a difficult endeavor. The variety of symptoms experienced by clinical high-risk youth makes it difficult to identify who will eventually develop schizophrenia. Efforts are being made, therefore, to more accurately identify at-risk individuals and factors that predict conversion to psychosis. As in most assessments of children and adolescents, however, both youth and parental report of symptomatology and resulting dysfunction are important to assess. The goals of the current study were to assess the extent of cross-informant agreement on the Structured Interview for Prodromal Symptoms (SIPS), a widely used tool employed to determine clinical high-risk status. A total of 84 youth-caregiver pairs participated. Youth and caregiver raters displayed moderate overall agreement on SIPS-rated symptoms. Both youth and caregiver ratings of youth symptomatology contributed significantly to predicting conversion to psychosis. In addition, youth age and quality of youth-caregiver relationships appear to be related to cross-informant symptom ratings. Despite differences on individual SIPS domains, the majority of dyads agreed on youth clinical high-risk status. Results highlight the potential clinical utility of using caregiver informants to determine youth psychosis risk. PMID:24092494

  20. Is Primary Prostate Cancer Treatment Influenced by Likelihood of Extraprostatic Disease? A Surveillance, Epidemiology and End Results Patterns of Care Study

    Energy Technology Data Exchange (ETDEWEB)

    Holmes, Jordan A. [Department of Radiation Oncology, University of North Carolina at Chapel Hill, Chapel Hill, NC (United States); Wang, Andrew Z. [Department of Radiation Oncology, University of North Carolina at Chapel Hill, Chapel Hill, NC (United States); University of North Carolina-Lineberger Comprehensive Cancer Center, Chapel Hill, NC (United States); Hoffman, Karen E. [Department of Radiation Oncology, M. D. Anderson Cancer Center, Houston, TX (United States); Hendrix, Laura H. [Department of Radiation Oncology, University of North Carolina at Chapel Hill, Chapel Hill, NC (United States); Rosenman, Julian G. [Department of Radiation Oncology, University of North Carolina at Chapel Hill, Chapel Hill, NC (United States); University of North Carolina-Lineberger Comprehensive Cancer Center, Chapel Hill, NC (United States); Carpenter, William R. [University of North Carolina-Lineberger Comprehensive Cancer Center, Chapel Hill, NC (United States); Sheps Center for Health Services Research, University of North Carolina at Chapel Hill, Chapel Hill, NC (United States); Department of Health Policy and Management, University of North Carolina School of Public Health, Chapel Hill, NC (United States); Godley, Paul A. [University of North Carolina-Lineberger Comprehensive Cancer Center, Chapel Hill, NC (United States); Sheps Center for Health Services Research, University of North Carolina at Chapel Hill, Chapel Hill, NC (United States); Division of Hematology-Oncology, University of North Carolina at Chapel Hill, Chapel Hill, NC (United States); Chen, Ronald C., E-mail: ronald_chen@med.unc.edu [Department of Radiation Oncology, University of North Carolina at Chapel Hill, Chapel Hill, NC (United States); University of North Carolina-Lineberger Comprehensive Cancer Center, Chapel Hill, NC (United States); Sheps Center for Health Services Research, University of North Carolina at Chapel Hill, Chapel Hill, NC (United States)

    2012-09-01

    Purpose: To examine the patterns of primary treatment in a recent population-based cohort of prostate cancer patients, stratified by the likelihood of extraprostatic cancer as predicted by disease characteristics available at diagnosis. Methods and Materials: A total of 157,371 patients diagnosed from 2004 to 2008 with clinically localized and potentially curable (node-negative, nonmetastatic) prostate cancer, who have complete information on prostate-specific antigen, Gleason score, and clinical stage, were included. Patients with clinical T1/T2 disease were grouped into categories of <25%, 25%-50%, and >50% likelihood of having extraprostatic disease using the Partin nomogram. Clinical T3/T4 patients were examined separately as the highest-risk group. Logistic regression was used to examine the association between patient group and receipt of each primary treatment, adjusting for age, race, year of diagnosis, marital status, Surveillance, Epidemiology and End Results database region, and county-level education. Separate models were constructed for primary surgery, external-beam radiotherapy (RT), and conservative management. Results: On multivariable analysis, increasing likelihood of extraprostatic disease was significantly associated with increasing use of RT and decreased conservative management. Use of surgery also increased. Patients with >50% likelihood of extraprostatic cancer had almost twice the odds of receiving prostatectomy as those with <25% likelihood, and T3-T4 patients had 18% higher odds. Prostatectomy use increased in recent years. Patients aged 76-80 years were likely to be managed conservatively, even those with a >50% likelihood of extraprostatic cancer (34%) and clinical T3-T4 disease (24%). The proportion of patients who received prostatectomy or conservative management was approximately 50% or slightly higher in all groups. Conclusions: There may be underutilization of RT in older prostate cancer patients and those with likely

  1. Transfer Entropy as a Log-likelihood Ratio

    CERN Document Server

    Barnett, Lionel

    2012-01-01

    Transfer entropy, an information-theoretic measure of time-directed information transfer between joint processes, has steadily gained popularity in the neurosciences, econometrics and the analysis of complex system dynamics in diverse fields. We show that for a class of parametrised partial Markov models for jointly stochastic processes in discrete time, the log-likelihood ratio test statistic for the null hypothesis of zero transfer entropy is a consistent estimator for the transfer entropy itself. The result generalises the equivalence in the Gaussian case of transfer entropy and Granger causality, a statistical notion of causal influence based on prediction via vector autoregression. In the general case, an asymptotic $\chi^2$ distribution for the model transfer entropy estimator is established.

  2. H.264 SVC Complexity Reduction Based on Likelihood Mode Decision

    Directory of Open Access Journals (Sweden)

    L. Balaji

    2015-01-01

    H.264 Advanced Video Coding (AVC) was extended to Scalable Video Coding (SVC). SVC runs on different electronic devices such as personal computers, HDTV, SDTV, IPTV, and full-HDTV, in which users demand various scalings of the same content. The scaling dimensions include resolution, frame rate, quality, heterogeneous networks, bandwidth, and so forth. Scaling consumes more encoding time and computational complexity during mode selection. In this paper, to reduce encoding time and computational complexity, a fast mode decision algorithm based on likelihood mode decision (LMD) is proposed. LMD is evaluated in both temporal and spatial scaling. From the results, we conclude that LMD performs well when compared to previous fast mode decision algorithms. The comparison parameters are time, PSNR, and bit rate. LMD achieves a time saving of 66.65% with a 0.05% loss in PSNR and a 0.17% increase in bit rate compared with the full search method.

  3. On the Performance of Maximum Likelihood Inverse Reinforcement Learning

    CERN Document Server

    Ratia, Héctor; Martinez-Cantin, Ruben

    2012-01-01

    Inverse reinforcement learning (IRL) addresses the problem of recovering a task description given a demonstration of the optimal policy used to solve such a task. The optimal policy is usually provided by an expert or teacher, making IRL especially suitable for the problem of apprenticeship learning. The task description is encoded in the form of a reward function of a Markov decision process (MDP). Several algorithms have been proposed to find the reward function corresponding to a set of demonstrations. One of the algorithms that has provided the best results in different applications is a gradient method to optimize a policy squared error criterion. On a parallel line of research, other authors have recently presented a gradient approximation of the maximum likelihood estimate of the reward signal. In general, both approaches approximate the gradient estimate and the criteria at different stages to make the algorithm tractable and efficient. In this work, we provide a detailed description of the different metho...

  4. Maximum Likelihood Analysis of Low Energy CDMS II Germanium Data

    CERN Document Server

    Agnese, R; Balakishiyeva, D; Thakur, R Basu; Bauer, D A; Billard, J; Borgland, A; Bowles, M A; Brandt, D; Brink, P L; Bunker, R; Cabrera, B; Caldwell, D O; Cerdeno, D G; Chagani, H; Chen, Y; Cooley, J; Cornell, B; Crewdson, C H; Cushman, P; Daal, M; Di Stefano, P C F; Doughty, T; Esteban, L; Fallows, S; Figueroa-Feliciano, E; Fritts, M; Godfrey, G L; Golwala, S R; Graham, M; Hall, J; Harris, H R; Hertel, S A; Hofer, T; Holmgren, D; Hsu, L; Huber, M E; Jastram, A; Kamaev, O; Kara, B; Kelsey, M H; Kennedy, A; Kiveni, M; Koch, K; Leder, A; Loer, B; Asamar, E Lopez; Mahapatra, R; Mandic, V; Martinez, C; McCarthy, K A; Mirabolfathi, N; Moffatt, R A; Moore, D C; Nelson, R H; Oser, S M; Page, K; Page, W A; Partridge, R; Pepin, M; Phipps, A; Prasad, K; Pyle, M; Qiu, H; Rau, W; Redl, P; Reisetter, A; Ricci, Y; Rogers, H E; Saab, T; Sadoulet, B; Sander, J; Schneck, K; Schnee, R W; Scorza, S; Serfass, B; Shank, B; Speller, D; Upadhyayula, S; Villano, A N; Welliver, B; Wright, D H; Yellin, S; Yen, J J; Young, B A; Zhang, J

    2014-01-01

    We report on the results of a search for a Weakly Interacting Massive Particle (WIMP) signal in low-energy data of the Cryogenic Dark Matter Search (CDMS II) experiment using a maximum likelihood analysis. A background model is constructed using GEANT4 to simulate the surface-event background from $^{210}$Pb decay-chain events, while using independent calibration data to model the gamma background. Fitting this background model to the data results in no statistically significant WIMP component. In addition, we perform fits using an analytic ad hoc background model proposed by Collar and Fields, who claimed to find a large excess of signal-like events in our data. We confirm the strong preference for a signal hypothesis in their analysis under these assumptions, but excesses are observed in both single- and multiple-scatter events, which implies the signal is not caused by WIMPs, but rather reflects the inadequacy of their background model.

  5. Maximum Likelihood Position Location with a Limited Number of References

    Directory of Open Access Journals (Sweden)

    D. Munoz-Rodriguez

    2011-04-01

    A Position Location (PL) scheme for mobile users on the outskirts of coverage areas is presented. The proposed methodology makes it possible to obtain location information with only two land-fixed references. We introduce a general formulation and show that maximum-likelihood estimation can provide adequate PL information in this scenario. The Root Mean Square (RMS) error and error-distribution characterization are obtained for different propagation scenarios. In addition, simulation results and comparisons to another method are provided, showing the accuracy and the robustness of the method proposed. We study accuracy limits of the proposed methodology for different propagation environments and show that even in the case of mismatch in the error variances, good PL estimation is feasible.
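
    Under Gaussian range errors, the maximum-likelihood position estimate with two references reduces to nonlinear least squares on the range residuals. A hedged sketch with invented geometry (not the paper's formulation); note that two ranges leave a mirror ambiguity across the baseline, so the initial guess or side information selects the solution:

```python
import numpy as np
from scipy.optimize import minimize

refs = np.array([[0.0, 0.0], [1000.0, 0.0]])       # two land-fixed references (metres)
true_pos = np.array([400.0, 700.0])
sigma = 15.0                                        # range-error standard deviation
rng = np.random.default_rng(3)
ranges = np.linalg.norm(true_pos - refs, axis=1) + rng.normal(0, sigma, size=2)

def neg_log_likelihood(p):
    # Gaussian range errors -> the ML estimate minimises the sum of squared range residuals
    resid = ranges - np.linalg.norm(p - refs, axis=1)
    return 0.5 * np.sum((resid / sigma) ** 2)

est = minimize(neg_log_likelihood, x0=np.array([500.0, 500.0])).x
print("estimate:", est, " error:", np.linalg.norm(est - true_pos))
```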

  6. Does induction really reduce the likelihood of caesarean section?

    Science.gov (United States)

    Wickham, Sara

    2014-09-01

    Two recent systematic reviews have arrived at the same, rather surprising and somewhat counter-intuitive result. That is, contrary to the belief and experience of many people who work on labour wards every day, induction of labour doesn't increase the chance of caesarean section at all. In fact, the reviewers argue, their results demonstrate that induction of labour reduces the likelihood of caesarean section. It might be that our instincts are wrong, and that we need to reconsider what we think we know. But before we rush to recommend induction as the latest tool to promote normal birth, we might want to look a bit more closely at the evidence, as I am not at all certain that this apparently straightforward conclusion is quite as cut-and-dried as it sounds.

  7. Transfer Entropy as a Log-Likelihood Ratio

    Science.gov (United States)

    Barnett, Lionel; Bossomaier, Terry

    2012-09-01

    Transfer entropy, an information-theoretic measure of time-directed information transfer between joint processes, has steadily gained popularity in the analysis of complex stochastic dynamics in diverse fields, including the neurosciences, ecology, climatology, and econometrics. We show that for a broad class of predictive models, the log-likelihood ratio test statistic for the null hypothesis of zero transfer entropy is a consistent estimator for the transfer entropy itself. For finite Markov chains, furthermore, no explicit model is required. In the general case, an asymptotic χ2 distribution is established for the transfer entropy estimator. The result generalizes the equivalence in the Gaussian case of transfer entropy and Granger causality, a statistical notion of causal influence based on prediction via vector autoregression, and establishes a fundamental connection between directed information transfer and causality in the Wiener-Granger sense.
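
    In the Gaussian (vector-autoregressive) case, the estimator discussed above reduces to half the log-ratio of restricted-to-full residual variances, and 2n·TE can be referred to a χ2 distribution. A minimal sketch for two scalar series with lag-1 models (illustrative, not the paper's general formulation):

```python
import numpy as np
from scipy import stats

def gaussian_te(x, y, lag=1):
    """Estimate TE(x -> y) for Gaussian lag-1 models as 0.5*log(RSS_restricted / RSS_full);
    under H0 (zero TE), 2*n*TE is asymptotically chi2 with 1 degree of freedom."""
    y_t, y_l, x_l = y[lag:], y[:-lag], x[:-lag]
    n = y_t.size
    A_r = np.column_stack([np.ones(n), y_l])        # restricted: y_t ~ y_{t-1}
    A_f = np.column_stack([np.ones(n), y_l, x_l])   # full: y_t ~ y_{t-1} + x_{t-1}
    rss_r = np.sum((y_t - A_r @ np.linalg.lstsq(A_r, y_t, rcond=None)[0]) ** 2)
    rss_f = np.sum((y_t - A_f @ np.linalg.lstsq(A_f, y_t, rcond=None)[0]) ** 2)
    te = 0.5 * np.log(rss_r / rss_f)
    return te, stats.chi2.sf(2 * n * te, df=1)

rng = np.random.default_rng(0)
x = rng.normal(size=2000)
y = np.zeros(2000)
for t in range(1, 2000):
    y[t] = 0.4 * y[t - 1] + 0.3 * x[t - 1] + rng.normal()
print(gaussian_te(x, y))   # positive TE, tiny p-value
```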

  8. The elaboration likelihood model and communication about food risks.

    Science.gov (United States)

    Frewer, L J; Howard, C; Hedderley, D; Shepherd, R

    1997-12-01

    Factors such as hazard type and source credibility have been identified as important in the establishment of effective strategies for risk communication. The elaboration likelihood model was adapted to investigate the potential impact of hazard type, information source, and persuasive content of information on individual engagement in elaborative, or thoughtful, cognitions about risk messages. One hundred sixty respondents were allocated to one of eight experimental groups, and the effects of source credibility, persuasive content of information and hazard type were systematically varied. The impact of the different factors on beliefs about the information and on elaborative processing was examined. Low credibility was particularly important in reducing risk perceptions, although persuasive content and hazard type were also influential in determining whether elaborative processing occurred.

  9. Sparse-posterior Gaussian Processes for general likelihoods

    CERN Document Server

    Qi, Yuan; Abdel-Gawad, Ahmed H; Minka, Thomas P

    2012-01-01

    Gaussian processes (GPs) provide a probabilistic nonparametric representation of functions in regression, classification, and other problems. Unfortunately, exact learning with GPs is intractable for large datasets. A variety of approximate GP methods have been proposed that essentially map the large dataset into a small set of basis points. Among them, two state-of-the-art methods are the sparse pseudo-input Gaussian process (SPGP) (Snelson and Ghahramani, 2006) and the variable-sigma GP (VSGP) (Walder et al., 2008), which generalizes SPGP and allows each basis point to have its own length scale. However, VSGP was only derived for regression. In this paper, we propose a new sparse GP framework that uses expectation propagation to directly approximate general GP likelihoods using a sparse and smooth basis. It includes both SPGP and VSGP for regression as special cases. Moreover, as an EP algorithm, it inherits the ability to process data online. As a particular choice of approximating family, we blur each basis point with a...

  10. Evaluating maximum likelihood estimation methods to determine the Hurst coefficients

    Science.gov (United States)

    Kendziorski, C. M.; Bassingthwaighte, J. B.; Tonellato, P. J.

    1999-12-01

    A maximum likelihood estimation method implemented in S-PLUS (S-MLE) to estimate the Hurst coefficient (H) is evaluated. The Hurst coefficient, with 0.5 < H < 1, characterizes long-memory time series by quantifying the rate of decay of the autocorrelation function. S-MLE was developed to estimate H for fractionally differenced (fd) processes. However, in practice it is difficult to distinguish between fd processes and fractional Gaussian noise (fGn) processes. Thus, the method is evaluated for estimating H for both fd and fGn processes. S-MLE gave biased results of H for fGn processes of any length and for fd processes of lengths less than 2^10. A modified method is proposed to correct for this bias. It gives reliable estimates of H for both fd and fGn processes of length greater than or equal to 2^11.

  11. Empirical likelihood for balanced ranked-set sampled data

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    Ranked-set sampling (RSS) often provides more efficient inference than simple random sampling (SRS). In this article, we propose a systematic nonparametric technique, RSS-EL, for hypothesis testing and interval estimation with balanced RSS data using empirical likelihood (EL). We detail the approach for interval estimation and hypothesis testing in one-sample and two-sample problems and general estimating equations. In all three cases, RSS is shown to provide more efficient inference than SRS of the same size. Moreover, the RSS-EL method does not require any easily violated assumptions needed by existing rank-based nonparametric methods for RSS data, such as perfect ranking, identical ranking schemes in two groups, and location shift between two population distributions. The merit of the RSS-EL method is also demonstrated through simulation studies.

  12. Marginal Maximum Likelihood Estimation of Item Response Models in R

    Directory of Open Access Journals (Sweden)

    Matthew S. Johnson

    2007-02-01

    Item response theory (IRT) models are a class of statistical models used by researchers to describe the response behaviors of individuals to a set of categorically scored items. The most common IRT models can be classified as generalized linear fixed- and/or mixed-effect models. Although IRT models appear most often in the psychological testing literature, researchers in other fields have successfully utilized IRT-like models in a wide variety of applications. This paper discusses the three major methods of estimation in IRT and develops R functions utilizing the built-in capabilities of the R environment to find the marginal maximum likelihood estimates of the generalized partial credit model. The currently available R package ltm is also discussed.
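
    The marginal maximum likelihood idea, integrating the latent ability out with Gauss-Hermite quadrature before maximizing over item parameters, can be sketched for the simpler Rasch model (the entry's R code targets the generalized partial credit model; this Python analogue is ours and purely illustrative):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

# Gauss-Hermite (probabilists') nodes/weights for a standard-normal ability distribution
nodes, weights = np.polynomial.hermite_e.hermegauss(21)
weights = weights / weights.sum()

def neg_marginal_loglik(b, responses):
    """Negative marginal log-likelihood of Rasch item difficulties b, ability integrated out.
    responses: (n_persons, n_items) 0/1 matrix."""
    p = expit(nodes[:, None] - b[None, :])                                   # (n_nodes, n_items)
    lik = np.prod(np.where(responses[:, None, :] == 1, p, 1 - p), axis=2)   # (n_persons, n_nodes)
    return -np.sum(np.log(lik @ weights))

# Simulate a small illustrative dataset and recover the difficulties
rng = np.random.default_rng(0)
true_b = np.array([-1.0, 0.0, 1.0, 0.5])
theta = rng.normal(size=500)
resp = (rng.random((500, 4)) < expit(theta[:, None] - true_b[None, :])).astype(int)

fit = minimize(neg_marginal_loglik, x0=np.zeros(4), args=(resp,))
print(fit.x)   # should be close to true_b
```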

  13. Maximum likelihood identification of aircraft stability and control derivatives

    Science.gov (United States)

    Mehra, R. K.; Stepner, D. E.; Tyler, J. S.

    1974-01-01

    Application of a generalized identification method to flight test data analysis. The method is based on the maximum likelihood (ML) criterion and includes output error and equation error methods as special cases. Both the linear and nonlinear models with and without process noise are considered. The flight test data from lateral maneuvers of HL-10 and M2/F3 lifting bodies are processed to determine the lateral stability and control derivatives, instrumentation accuracies, and biases. A comparison is made between the results of the output error method and the ML method for M2/F3 data containing gusts. It is shown that better fits to time histories are obtained by using the ML method. The nonlinear model considered corresponds to the longitudinal equations of the X-22 VTOL aircraft. The data are obtained from a computer simulation and contain both process and measurement noise. The applicability of the ML method to nonlinear models with both process and measurement noise is demonstrated.

  14. Likelihood-Based Inference in Nonlinear Error-Correction Models

    DEFF Research Database (Denmark)

    Kristensen, Dennis; Rahbæk, Anders

    We consider a class of vector nonlinear error correction models where the transfer function (or loadings) of the stationary relationships is nonlinear. This includes in particular the smooth transition models. A general representation theorem is given which establishes the dynamic properties of the process in terms of stochastic and deterministic trends as well as stationary components. In particular, the behaviour of the cointegrating relations is described in terms of geometric ergodicity. Despite the fact that no deterministic terms are included, the process will have both stochastic trends and a linear trend in general. Gaussian likelihood-based estimators are considered for the long-run cointegration parameters and the short-run parameters. Asymptotic theory is provided for these, and it is discussed to what extent asymptotic normality and mixed normality can be found. A simulation study...

  15. Music genre classification via likelihood fusion from multiple feature models

    Science.gov (United States)

    Shiu, Yu; Kuo, C.-C. J.

    2005-01-01

    Music genre provides an efficient way to index songs in a music database, and can be used as an effective means to retrieve music of a similar type, i.e. content-based music retrieval. A new two-stage scheme for music genre classification is proposed in this work. At the first stage, we examine a couple of different features, construct their corresponding parametric models (e.g. GMM and HMM) and compute their likelihood functions to yield soft classification results. In particular, the timbre, rhythm and temporal variation features are considered. Then, at the second stage, these soft classification results are integrated to result in a hard decision for final music genre classification. Experimental results are given to demonstrate the performance of the proposed scheme.
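
    A toy sketch of the two-stage scheme with scikit-learn Gaussian mixtures on synthetic "timbre" and "rhythm" frame features (feature extraction and the HMM variant are omitted; all names and data here are illustrative):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
genres = ["classical", "rock", "jazz"]

def make_features(shift, n=400, d=6):
    """Synthetic per-frame feature vectors, shifted per genre."""
    return rng.normal(loc=shift, size=(n, d))

train = {g: {"timbre": make_features(i), "rhythm": make_features(2 * i)} for i, g in enumerate(genres)}

# Stage 1: one GMM per genre and per feature model
models = {ft: {g: GaussianMixture(n_components=3, random_state=0).fit(train[g][ft])
               for g in genres} for ft in ("timbre", "rhythm")}

# Stage 2: fuse the per-feature average log-likelihoods (equal weights) into a hard decision
def classify(song_feats):
    scores = {g: sum(models[ft][g].score(song_feats[ft]) for ft in song_feats) for g in genres}
    return max(scores, key=scores.get)

test_song = {"timbre": make_features(1, n=120), "rhythm": make_features(2, n=120)}
print(classify(test_song))   # expected: "rock"
```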

  16. Analytical maximum likelihood estimation of stellar magnetic fields

    CERN Document Server

    González, M J Martínez; Ramos, A Asensio; Belluzzi, L

    2011-01-01

    The polarised spectrum of stellar radiation encodes valuable information on the conditions of stellar atmospheres and the magnetic fields that permeate them. In this paper, we give explicit expressions to estimate the magnetic field vector and its associated error from the observed Stokes parameters. We study the solar case where specific intensities are observed and then the stellar case, where we receive the polarised flux. In this second case, we concentrate on the explicit expression for the case of a slow rotator with a dipolar magnetic field geometry. Moreover, we also give explicit formulae to retrieve the magnetic field vector from the LSD profiles without assuming mean values for the LSD artificial spectral line. The formulae have been obtained assuming that the spectral lines can be described in the weak field regime and using a maximum likelihood approach. The errors are recovered by means of the Hermitian matrix. The biases of the estimators are analysed in depth.
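
    In the weak-field regime mentioned above, Stokes V is proportional to the wavelength derivative of Stokes I, and the maximum-likelihood line-of-sight field under Gaussian noise is a simple ratio of sums. A sketch with a synthetic line profile (the constant and all numbers are our assumptions, not taken from the paper):

```python
import numpy as np

C = 4.6686e-13   # Å^-1 G^-1, constant of the weak-field approximation (assumed value)

def weak_field_blos(wavelength, stokes_i, stokes_v, lambda0, geff):
    """Least-squares / ML estimate of the line-of-sight field under the weak-field relation
    V(lambda) = -C * lambda0**2 * geff * B_los * dI/dlambda, assuming Gaussian noise."""
    didl = np.gradient(stokes_i, wavelength)
    k = C * lambda0**2 * geff
    return -np.sum(stokes_v * didl) / (k * np.sum(didl**2))

# Illustrative synthetic line: Gaussian absorption profile plus a little noise
lam0, geff, b_true = 6302.5, 2.5, 800.0            # Å, dimensionless, Gauss
lam = np.linspace(lam0 - 0.4, lam0 + 0.4, 200)
I = 1.0 - 0.6 * np.exp(-((lam - lam0) / 0.08) ** 2)
V = -C * lam0**2 * geff * b_true * np.gradient(I, lam) + 1e-5 * np.random.default_rng(0).normal(size=lam.size)
print(weak_field_blos(lam, I, V, lam0, geff))      # ~800 G
```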

  17. Narrow band interference cancelation in OFDM: A structured maximum likelihood approach

    KAUST Repository

    Sohail, Muhammad Sadiq

    2012-06-01

    This paper presents a maximum likelihood (ML) approach to mitigate the effect of narrow band interference (NBI) in a zero padded orthogonal frequency division multiplexing (ZP-OFDM) system. The NBI is assumed to be time variant and asynchronous with the frequency grid of the ZP-OFDM system. The proposed structure based technique uses the fact that the NBI signal is sparse as compared to the ZP-OFDM signal in the frequency domain. The structure is also useful in reducing the computational complexity of the proposed method. The paper also presents a data aided approach for improved NBI estimation. The suitability of the proposed method is demonstrated through simulations. © 2012 IEEE.

  18. Likelihood Approximation With Hierarchical Matrices For Large Spatial Datasets

    KAUST Repository

    Litvinenko, Alexander

    2017-09-03

    We use available measurements to estimate the unknown parameters (variance, smoothness parameter, and covariance length) of a covariance function by maximizing the joint Gaussian log-likelihood function. To overcome cubic complexity in the linear algebra, we approximate the discretized covariance function in the hierarchical (H-) matrix format. The H-matrix format has a log-linear computational cost and storage O(kn log n), where the rank k is a small integer and n is the number of locations. The H-matrix technique allows us to work with general covariance matrices in an efficient way, since H-matrices can approximate inhomogeneous covariance functions, with a fairly general mesh that is not necessarily axes-parallel, and neither the covariance matrix itself nor its inverse have to be sparse. We demonstrate our method with Monte Carlo simulations and an application to soil moisture data. The C, C++ codes and data are freely available.
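
    For orientation, here is the dense O(n^3) computation that the hierarchical-matrix approximation is designed to replace, written for an exponential covariance on a small synthetic dataset (a sketch; the actual H-matrix machinery and the Matérn smoothness parameter are not shown):

```python
import numpy as np
from scipy.spatial.distance import cdist
from scipy.linalg import cho_factor, cho_solve

def gaussian_loglik(params, locs, z):
    """Exact Gaussian log-likelihood with covariance C(h) = sigma2 * exp(-h / ell) + nugget * I.
    This dense Cholesky-based baseline is what the H-matrix approach approximates for large n."""
    sigma2, ell, nugget = params
    C = sigma2 * np.exp(-cdist(locs, locs) / ell) + nugget * np.eye(len(z))
    cf = cho_factor(C, lower=True)
    logdet = 2.0 * np.sum(np.log(np.diag(cf[0])))
    quad = z @ cho_solve(cf, z)
    return -0.5 * (len(z) * np.log(2 * np.pi) + logdet + quad)

rng = np.random.default_rng(0)
locs = rng.random((300, 2))                        # 300 spatial locations
C_true = 1.0 * np.exp(-cdist(locs, locs) / 0.2) + 0.05 * np.eye(300)
z = np.linalg.cholesky(C_true) @ rng.normal(size=300)
print(gaussian_loglik((1.0, 0.2, 0.05), locs, z))
```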

  19. Accelerated maximum likelihood parameter estimation for stochastic biochemical systems

    Directory of Open Access Journals (Sweden)

    Daigle Bernie J

    2012-05-01

    Full Text Available Abstract Background A prerequisite for the mechanistic simulation of a biochemical system is detailed knowledge of its kinetic parameters. Despite recent experimental advances, the estimation of unknown parameter values from observed data is still a bottleneck for obtaining accurate simulation results. Many methods exist for parameter estimation in deterministic biochemical systems; methods for discrete stochastic systems are less well developed. Given the probabilistic nature of stochastic biochemical models, a natural approach is to choose parameter values that maximize the probability of the observed data with respect to the unknown parameters, a.k.a. the maximum likelihood parameter estimates (MLEs). MLE computation for all but the simplest models requires the simulation of many system trajectories that are consistent with experimental data. For models with unknown parameters, this presents a computational challenge, as the generation of consistent trajectories can be an extremely rare occurrence. Results We have developed Monte Carlo Expectation-Maximization with Modified Cross-Entropy Method (MCEM2): an accelerated method for calculating MLEs that combines advances in rare event simulation with a computationally efficient version of the Monte Carlo expectation-maximization (MCEM) algorithm. Our method requires no prior knowledge regarding parameter values, and it automatically provides a multivariate parameter uncertainty estimate. We applied the method to five stochastic systems of increasing complexity, progressing from an analytically tractable pure-birth model to a computationally demanding model of yeast polarization. Our results demonstrate that MCEM2 substantially accelerates MLE computation on all tested models when compared to a stand-alone version of MCEM. Additionally, we show how our method identifies parameter values for certain classes of models more accurately than two recently proposed computationally efficient methods.

  20. Likelihood of tree topologies with fossils and diversification rate estimation.

    Science.gov (United States)

    Didier, Gilles; Fau, Marine; Laurin, Michel

    2017-04-18

    Since the diversification process cannot be directly observed at the human scale, it has to be studied from the information available, namely the extant taxa and the fossil record. In this sense, phylogenetic trees including both extant taxa and fossils are the most complete representations of the diversification process that one can get. Such phylogenetic trees can be reconstructed from molecular and morphological data, to some extent. Among the temporal information of such phylogenetic trees, fossil ages are by far the most precisely known (divergence times are inferences calibrated mostly with fossils). We propose here a method to compute the likelihood of a phylogenetic tree with fossils in which the only considered time information is the fossil ages, and apply it to the estimation of the diversification rates from such data. Since it is required in our computation, we provide a method for determining the probability of a tree topology under the standard diversification model. Testing our approach on simulated data shows that the maximum likelihood rate estimates from the phylogenetic tree topology and the fossil dates are almost as accurate as those obtained by taking into account all the data, including the divergence times. Moreover, they are substantially more accurate than the estimates obtained only from the exact divergence times (without taking into account the fossil record). We also provide an empirical example composed of 50 Permo-Carboniferous eupelycosaur (early synapsid) taxa ranging in age from about 315 Ma (Late Carboniferous) to 270 Ma (shortly after the end of the Early Permian). Our analyses suggest a speciation (cladogenesis, or birth) rate of about 0.1 per lineage and per My, a marginally lower extinction rate, and a considerable hidden paleobiodiversity of early synapsids. © The Author(s) 2017. Published by Oxford University Press, on behalf of the Society of Systematic Biologists. All rights reserved.

  1. Molecular clock fork phylogenies: closed form analytic maximum likelihood solutions.

    Science.gov (United States)

    Chor, Benny; Snir, Sagi

    2004-12-01

    Maximum likelihood (ML) is increasingly used as an optimality criterion for selecting evolutionary trees, but finding the global optimum is a hard computational task. Because no general analytic solution is known, numeric techniques such as hill climbing or expectation maximization (EM) are used in order to find optimal parameters for a given tree. So far, analytic solutions were derived only for the simplest model: three-taxa, two-state characters, under a molecular clock. Quoting Ziheng Yang, who initiated the analytic approach, "this seems to be the simplest case, but has many of the conceptual and statistical complexities involved in phylogenetic estimation." In this work, we give general analytic solutions for a family of trees with four-taxa, two-state characters, under a molecular clock. The change from three to four taxa incurs a major increase in the complexity of the underlying algebraic system, and requires novel techniques and approaches. We start by presenting the general maximum likelihood problem on phylogenetic trees as a constrained optimization problem, and the resulting system of polynomial equations. In full generality, it is infeasible to solve this system, therefore specialized tools for the molecular clock case are developed. Four-taxa rooted trees have two topologies: the fork (two subtrees with two leaves each) and the comb (one subtree with three leaves, the other with a single leaf). We combine the ultrametric properties of molecular clock fork trees with the Hadamard conjugation to derive a number of topology dependent identities. Employing these identities, we substantially simplify the system of polynomial equations for the fork. We finally employ symbolic algebra software to obtain closed form analytic solutions (expressed parametrically in the input data). In general, four-taxa trees can have multiple ML points. In contrast, we can now prove that each fork topology has a unique (local and global) ML point.

  2. Effects of deceptive packaging and product involvement on purchase intention: an elaboration likelihood model perspective.

    Science.gov (United States)

    Lammers, H B

    2000-04-01

    From an Elaboration Likelihood Model perspective, it was hypothesized that postexposure awareness of deceptive packaging claims would have a greater negative effect on scores for purchase intention by consumers lowly involved rather than highly involved with a product (n = 40). Undergraduates who were classified as either highly or lowly (ns = 20 and 20) involved with M&Ms examined either a deceptive or non-deceptive package design for M&Ms candy and were subsequently informed of the deception employed in the packaging before finally rating their intention to purchase. As anticipated, highly deceived subjects who were low in involvement rated intention to purchase lower than their highly involved peers. Overall, the results attest to the robustness of the model and suggest that the model has implications beyond advertising effects and into packaging effects.

  3. Planck 2015 results. XI. CMB power spectra, likelihoods, and robustness of parameters

    Science.gov (United States)

    Planck Collaboration; Aghanim, N.; Arnaud, M.; Ashdown, M.; Aumont, J.; Baccigalupi, C.; Banday, A. J.; Barreiro, R. B.; Bartlett, J. G.; Bartolo, N.; Battaner, E.; Benabed, K.; Benoît, A.; Benoit-Lévy, A.; Bernard, J.-P.; Bersanelli, M.; Bielewicz, P.; Bock, J. J.; Bonaldi, A.; Bonavera, L.; Bond, J. R.; Borrill, J.; Bouchet, F. R.; Boulanger, F.; Bucher, M.; Burigana, C.; Butler, R. C.; Calabrese, E.; Cardoso, J.-F.; Catalano, A.; Challinor, A.; Chiang, H. C.; Christensen, P. R.; Clements, D. L.; Colombo, L. P. L.; Combet, C.; Coulais, A.; Crill, B. P.; Curto, A.; Cuttaia, F.; Danese, L.; Davies, R. D.; Davis, R. J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Désert, F.-X.; Di Valentino, E.; Dickinson, C.; Diego, J. M.; Dolag, K.; Dole, H.; Donzelli, S.; Doré, O.; Douspis, M.; Ducout, A.; Dunkley, J.; Dupac, X.; Efstathiou, G.; Elsner, F.; Enßlin, T. A.; Eriksen, H. K.; Fergusson, J.; Finelli, F.; Forni, O.; Frailis, M.; Fraisse, A. A.; Franceschi, E.; Frejsel, A.; Galeotta, S.; Galli, S.; Ganga, K.; Gauthier, C.; Gerbino, M.; Giard, M.; Gjerløw, E.; González-Nuevo, J.; Górski, K. M.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Gudmundsson, J. E.; Hamann, J.; Hansen, F. K.; Harrison, D. L.; Helou, G.; Henrot-Versillé, S.; Hernández-Monteagudo, C.; Herranz, D.; Hildebrandt, S. R.; Hivon, E.; Holmes, W. A.; Hornstrup, A.; Huffenberger, K. M.; Hurier, G.; Jaffe, A. H.; Jones, W. C.; Juvela, M.; Keihänen, E.; Keskitalo, R.; Kiiveri, K.; Knoche, J.; Knox, L.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lähteenmäki, A.; Lamarre, J.-M.; Lasenby, A.; Lattanzi, M.; Lawrence, C. R.; Le Jeune, M.; Leonardi, R.; Lesgourgues, J.; Levrier, F.; Lewis, A.; Liguori, M.; Lilje, P. B.; Lilley, M.; Linden-Vørnle, M.; Lindholm, V.; López-Caniego, M.; Macías-Pérez, J. F.; Maffei, B.; Maggio, G.; Maino, D.; Mandolesi, N.; Mangilli, A.; Maris, M.; Martin, P. G.; Martínez-González, E.; Masi, S.; Matarrese, S.; Meinhold, P. R.; Melchiorri, A.; Migliaccio, M.; Millea, M.; Mitra, S.; Miville-Deschênes, M.-A.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Mottet, S.; Munshi, D.; Murphy, J. A.; Narimani, A.; Naselsky, P.; Nati, F.; Natoli, P.; Noviello, F.; Novikov, D.; Novikov, I.; Oxborrow, C. A.; Paci, F.; Pagano, L.; Pajot, F.; Paoletti, D.; Partridge, B.; Pasian, F.; Patanchon, G.; Pearson, T. J.; Perdereau, O.; Perotto, L.; Pettorino, V.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Ponthieu, N.; Pratt, G. W.; Prunet, S.; Puget, J.-L.; Rachen, J. P.; Reinecke, M.; Remazeilles, M.; Renault, C.; Renzi, A.; Ristorcelli, I.; Rocha, G.; Rossetti, M.; Roudier, G.; Rouillé d'Orfeuil, B.; Rubiño-Martín, J. A.; Rusholme, B.; Salvati, L.; Sandri, M.; Santos, D.; Savelainen, M.; Savini, G.; Scott, D.; Serra, P.; Spencer, L. D.; Spinelli, M.; Stolyarov, V.; Stompor, R.; Sunyaev, R.; Sutton, D.; Suur-Uski, A.-S.; Sygnet, J.-F.; Tauber, J. A.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Trombetti, T.; Tucci, M.; Tuovinen, J.; Umana, G.; Valenziano, L.; Valiviita, J.; Van Tent, F.; Vielva, P.; Villa, F.; Wade, L. A.; Wandelt, B. D.; Wehus, I. K.; Yvon, D.; Zacchei, A.; Zonca, A.

    2016-09-01

    This paper presents the Planck 2015 likelihoods, statistical descriptions of the 2-point correlation functions of the cosmic microwave background (CMB) temperature and polarization fluctuations that account for relevant uncertainties, both instrumental and astrophysical in nature. They are based on the same hybrid approach used for the previous release, i.e., a pixel-based likelihood at low multipoles (ℓ < 30) and a Gaussian approximation to the distribution of cross-power spectra at higher multipoles. The main improvements are the use of more and better processed data and of Planck polarization information, along with more detailed models of foregrounds and instrumental uncertainties. The increased redundancy brought by more than doubling the amount of data analysed enables further consistency checks and enhanced immunity to systematic effects. It also improves the constraining power of Planck, in particular with regard to small-scale foreground properties. Progress in the modelling of foreground emission enables the retention of a larger fraction of the sky to determine the properties of the CMB, which also contributes to the enhanced precision of the spectra. Improvements in data processing and instrumental modelling further reduce uncertainties. Extensive tests establish the robustness and accuracy of the likelihood results, from temperature alone, from polarization alone, and from their combination. For temperature, we also perform a full likelihood analysis of realistic end-to-end simulations of the instrumental response to the sky, which were fed into the actual data processing pipeline; this does not reveal biases from residual low-level instrumental systematics. Even with the increase in precision and robustness, the ΛCDM cosmological model continues to offer a very good fit to the Planck data. The slope of the primordial scalar fluctuations, ns, is confirmed smaller than unity at more than 5σ from Planck alone. We further validate the robustness of the likelihood results against specific extensions to the baseline cosmology.

  4. Transferring Aviation Practices into Clinical Medicine for the Promotion of High Reliability.

    Science.gov (United States)

    Powell-Dunford, Nicole; McPherson, Mark K; Pina, Joseph S; Gaydos, Steven J

    2017-05-01

    Aviation is a classic example of a high reliability organization (HRO)-an organization in which catastrophic events are expected to occur without control measures. As health care systems transition toward high reliability, aviation practices are increasingly transferred for clinical implementation. A PubMed search using the terms aviation, crew resource management, and patient safety was undertaken. Manuscripts authored by physician pilots and accident investigation regulations were analyzed. Subject matter experts involved in adoption of aviation practices into the medical field were interviewed. A PubMed search yielded 621 results with 22 relevant for inclusion. Improved clinical outcomes were noted in five research trials in which aviation practices were adopted, particularly with regard to checklist usage and crew resource-management training. Effectiveness of interventions was influenced by intensity of application, leadership involvement, and provision of staff training. The usefulness of incorporating mishap investigation techniques has not been established. Whereas aviation accident investigation is highly standardized, the investigation of medical error is characterized by variation. The adoption of aviation practices into clinical medicine facilitates an evolution toward high reliability. Evidence for the efficacy of the checklist and crew resource-management training is robust. Transference of aviation accident investigation practices is preliminary. A standardized, independent investigation process could facilitate the development of a safety culture commensurate with that achieved in the aviation industry.Powell-Dunford N, McPherson MK, Pina JS, Gaydos SJ. Transferring aviation practices into clinical medicine for the promotion of high reliability. Aerosp Med Hum Perform. 2017; 88(5):487-491.

  5. Empirical likelihood-based inference in a partially linear model for longitudinal data

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    A partially linear model with longitudinal data is considered; empirical likelihood inference for the regression coefficients and the baseline function is investigated, the empirical log-likelihood ratios are proven to be asymptotically chi-squared, and the corresponding confidence regions for the parameters of interest are then constructed. Also, by means of the empirical likelihood ratio functions, we can obtain the maximum empirical likelihood estimates of the regression coefficients and the baseline function, and prove their asymptotic normality. Numerical results are presented to compare the performance of the empirical likelihood and the normal approximation-based method, and a real example is analysed.

  6. Empirical likelihood-based inference in a partially linear model for longitudinal data

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    A partially linear model with longitudinal data is considered; empirical likelihood inference for the regression coefficients and the baseline function is investigated, the empirical log-likelihood ratios are proven to be asymptotically chi-squared, and the corresponding confidence regions for the parameters of interest are then constructed. Also, by means of the empirical likelihood ratio functions, we can obtain the maximum empirical likelihood estimates of the regression coefficients and the baseline function, and prove their asymptotic normality. Numerical results are presented to compare the performance of the empirical likelihood and the normal approximation-based method, and a real example is analysed.

  7. Clinical implementation of a novel applicator in high-dose-rate brachytherapy treatment of esophageal cancer

    Directory of Open Access Journals (Sweden)

    Ivan M. Buzurovic

    2016-08-01

    Full Text Available Purpose: In this study, we present the clinical implementation of a novel transoral balloon centering esophageal applicator (BCEA) and the initial clinical experience in high-dose-rate (HDR) brachytherapy treatment of esophageal cancer using this applicator. Material and methods: Acceptance testing and commissioning of the BCEA were performed prior to clinical use. Full performance testing was conducted, including measurements of the dimensions and the catheter diameter, evaluation of the inflatable balloon consistency, visibility of the radio-opaque markers, congruence of the markers, absolute and relative accuracy of the HDR source in the applicator using radiochromic film and a source position simulator, visibility and digitization of the applicator on computed tomography (CT) images under clinical conditions, and reproducibility of the offset. Clinical placement of the applicator, treatment planning, treatment delivery, and the patient's response to the treatment were elaborated as well. Results: The experiments showed sub-millimeter accuracy in the source positioning, with the distal position at 1270 mm. The digitization (catheter reconstruction) was uncomplicated due to the good visibility of the markers. The treatment planning resulted in a favorable dose distribution. This finding was most pronounced when treating lesions with curved anatomy, owing to the improved repeatability and consistency of the delivered fractional dose: the five inflatable balloons kept the radioactive source centered within the lumen with respect to the clinical target. Conclusions: The consistency of the BCEA positioning made it possible to deliver an optimized non-uniform dose along the catheter, which increased the dose to the cancerous tissue and lowered the dose to healthy tissue. A larger number of patients and long-term follow-up will be required to investigate if the delivered optimized treatment can

  8. Regression analysis based on conditional likelihood approach under semi-competing risks data.

    Science.gov (United States)

    Hsieh, Jin-Jian; Huang, Yu-Ting

    2012-07-01

    Medical studies often involve semi-competing risks data, which consist of two types of events, namely a terminal event and a non-terminal event. Because the non-terminal event may be dependently censored by the terminal event, it is not possible to make inference on the non-terminal event without extra assumptions. Therefore, this study assumes that the dependence structure of the non-terminal event and the terminal event follows a copula model, and lets the marginal regression models of the non-terminal event and the terminal event both follow time-varying effect models. This study uses a conditional likelihood approach to estimate the time-varying coefficient of the non-terminal event, and proves the large sample properties of the proposed estimator. Simulation studies show that the proposed estimator performs well. This study also uses the proposed method to analyze data from the AIDS Clinical Trials Group study (ACTG 320).

  9. Clinical trial end points for high-grade glioma: the evolving landscape*

    Science.gov (United States)

    Reardon, David A.; Galanis, Evanthia; DeGroot, John F.; Cloughesy, Timothy F.; Wefel, Jeffrey S.; Lamborn, Kathleen R.; Lassman, Andrew B.; Gilbert, Mark R.; Sampson, John H.; Wick, Wolfgang; Chamberlain, Marc C.; Macdonald, David R.; Mehta, Minesh P.; Vogelbaum, Michael A.; Chang, Susan M.; Van den Bent, Martin J.; Wen, Patrick Y.

    2011-01-01

    To review the strengths and weaknesses of primary and auxiliary end points for clinical trials among patients with high-grade glioma (HGG). Recent advances in outcome for patients with newly diagnosed and recurrent HGG, coupled with the development of multiple promising therapeutics with myriad antitumor actions, have led to significant growth in the number of clinical trials for patients with HGG. Appropriate clinical trial design and the incorporation of optimal end points are imperative to efficiently and effectively evaluate such agents and continue to advance outcome. Growing recognition of limitations weakening the reliability of traditional clinical trial primary end points has generated increasing uncertainty of how best to evaluate promising therapeutics for patients with HGG. The phenomena of pseudoprogression and pseudoresponse have made imaging-based end points, including overall radiographic response and progression-free survival, problematic. Although overall survival is considered the “gold-standard” end point, recently identified active salvage therapies such as bevacizumab may diminish the association between presalvage therapy and overall survival. Finally, advances in imaging as well as the assessment of patient function and well being have strengthened interest in auxiliary end points assessing these aspects of patient care and outcome. Better appreciation of the strengths and limitations of primary end points will lead to more effective clinical trial strategies. Technical advances in imaging as well as improved survival for patients with HGG support the further development of auxiliary end points evaluating novel imaging approaches as well as measures of patient function and well being. PMID:21310734

  10. High-fidelity nursing simulation: impact on student self-confidence and clinical competence.

    Science.gov (United States)

    Blum, Cynthia A; Borglund, Susan; Parcells, Dax

    2010-01-01

    Development of safe nursing practice in entry-level nursing students requires special consideration from nurse educators. The paucity of data supporting high-fidelity patient simulation effectiveness in this population informed the development of a quasi-experimental, quantitative study of the relationship between simulation and student self-confidence and clinical competence. Moreover, the study reports a novel approach to measuring self-confidence and competence of entry-level nursing students. Fifty-three baccalaureate students, enrolled in either a traditional or simulation-enhanced laboratory, participated during their first clinical rotation. Student self-confidence and faculty perception of student clinical competence were measured using selected scale items of the Lasater Clinical Judgment Rubric. The results indicated an overall improvement in self-confidence and competence across the semester, however, simulation did not significantly enhance these caring attributes. The study highlights the need for further examination of teaching strategies developed to promote the transfer of self-confidence and competence from the laboratory to the clinical setting.

  11. Human Cytomegalovirus UL138 Open Reading Frame Is Highly Conserved in Clinical Strains

    Institute of Scientific and Technical Information of China (English)

    Ying Qi; Rong He; Yan-ping Ma; Zheng-rong Sun; Yao-hua Ji; Qiang Ruan

    2009-01-01

    To investigate the variability of the human cytomegalovirus (HCMV) UL138 open reading frame (ORF) in clinical strains. Methods: HCMV UL138 ORF was amplified by polymerase chain reaction (PCR), the PCR amplification products were sequenced directly, and the data were analyzed in 19 clinical strains. Results: UL138 ORF was amplified successfully in all 30 clinical strains. Compared with that of the Toledo strain, the nucleotide and amino acid sequence identities of UL138 ORF in all strains were 97.41% to 99.41% and 98.24% to 99.42%, respectively. All of the nucleotide mutations were substitutions. The spatial structure and post-translational modification sites of the UL138-encoded protein were conserved. The phylogenetic tree showed that HCMV UL138 sequence variations were not definitely related to different clinical symptoms. Conclusion: HCMV UL138 ORF is highly conserved in clinical strains, which might be helpful for the UL138-encoded protein to play a role in the latent infection of HCMV.

  12. Estimating epidemiological parameters for bovine tuberculosis in British cattle using a Bayesian partial-likelihood approach.

    Science.gov (United States)

    O'Hare, A; Orton, R J; Bessell, P R; Kao, R R

    2014-05-22

    Fitting models with Bayesian likelihood-based parameter inference is becoming increasingly important in infectious disease epidemiology. Detailed datasets present the opportunity to identify subsets of these data that capture important characteristics of the underlying epidemiology. One such dataset describes the epidemic of bovine tuberculosis (bTB) in British cattle, which is also an important exemplar of a disease with a wildlife reservoir (the Eurasian badger). Here, we evaluate a set of nested dynamic models of bTB transmission, including individual- and herd-level transmission heterogeneity and assuming minimal prior knowledge of the transmission and diagnostic test parameters. We performed a likelihood-based bootstrapping operation on the model to infer parameters based only on the recorded numbers of cattle testing positive for bTB at the start of each herd outbreak considering high- and low-risk areas separately. Models without herd heterogeneity are preferred in both areas though there is some evidence for super-spreading cattle. Similar to previous studies, we found low test sensitivities and high within-herd basic reproduction numbers (R0), suggesting that there may be many unobserved infections in cattle, even though the current testing regime is sufficient to control within-herd epidemics in most cases. Compared with other, more data-heavy approaches, the summary data used in our approach are easily collected, making our approach attractive for other systems.

  13. Human cytomegalovirus UL145 gene is highly conserved among clinical strains

    Indian Academy of Sciences (India)

    Zhengrong Sun; Ying Lu; Qiang Ruan; Yaohua Ji; Rong He; Ying Qi; Yanping Ma; Yujing Huang

    2007-09-01

    Human cytomegalovirus (HCMV), a ubiquitous human pathogen, is the leading cause of birth defects in newborns. A region (referred to as UL/b′) present in the Toledo strain of HCMV and in low-passage clinical isolates contains 22 additional genes, which are absent in the highly passaged laboratory strain AD169. One of these genes, the UL145 open reading frame (ORF), is located between the highly variable genes UL144 and UL146. To assess the structure of the UL145 gene, the UL145 ORF was amplified by PCR and sequenced from 16 low-passage clinical isolates and 15 non-passaged strains from suspected congenitally infected infants. Nine UL145 sequences previously published in GenBank were used for sequence comparison. The identities of the gene and the similarities of its putative protein among all strains were 95.9–100% and 96.6–100%, respectively. The post-translational modification motifs of the UL145 putative protein in clinical strains were conserved, comprising the protein kinase C phosphorylation motif (PKC) and the casein kinase II phosphorylation site (CK-II). We conclude that the structure of the UL145 gene and its putative protein are relatively conserved among clinical strains, irrespective of whether the strains come from patients with different manifestations, from different areas of the world, or were passaged or not in human embryonic lung fibroblast (HELF) cells.

  14. Development and clinical performance of high throughput loop-mediated isothermal amplification for detection of malaria

    OpenAIRE

    Perera, Rushini S.; Ding, Xavier C; Tully, Frank; Oliver, James; Bright, Nigel; Bell, David; Chiodini, Peter L; Gonzalez, Iveth J.; Spencer D Polley

    2017-01-01

    Background Accurate and efficient detection of sub-microscopic malaria infections is crucial for enabling rapid treatment and interruption of transmission. Commercially available malaria LAMP kits have excellent diagnostic performance, though throughput is limited by the need to prepare samples individually. Here, we evaluate the clinical performance of a newly developed high throughput (HTP) sample processing system for use in conjunction with the Eiken malaria LAMP kit. Methods The HTP syst...

  15. Emergence of a Streptococcus pneumoniae clinical isolate highly resistant to telithromycin and fluoroquinolones.

    Science.gov (United States)

    Faccone, Diego; Andres, Patricia; Galas, Marcelo; Tokumoto, Marta; Rosato, Adriana; Corso, Alejandra

    2005-11-01

    Streptococcus pneumoniae is a major pathogen causing community-acquired pneumonia and acute bronchitis. Macrolides, fluoroquinolones (FQs), and, recently, telithromycin (TEL) constitute primary therapeutic options, and rare cases of resistance have been reported. In this report, we describe the emergence of an S. pneumoniae clinical isolate with high-level TEL resistance (MIC, 256 microg/ml) and simultaneous resistance to FQs. Ongoing studies are oriented to elucidate the precise mechanism of resistance to TEL.

  16. Emergence of a Streptococcus pneumoniae Clinical Isolate Highly Resistant to Telithromycin and Fluoroquinolones

    OpenAIRE

    Faccone, Diego; Andres, Patricia; Galas, Marcelo; Tokumoto, Marta; Rosato, Adriana; Corso, Alejandra

    2005-01-01

    Streptococcus pneumoniae is a major pathogen causing community-acquired pneumonia and acute bronchitis. Macrolides, fluoroquinolones (FQs), and, recently, telithromycin (TEL) constitute primary therapeutic options, and rare cases of resistance have been reported. In this report, we describe the emergence of an S. pneumoniae clinical isolate with high-level TEL resistance (MIC, 256 μg/ml) and simultaneous resistance to FQs. Ongoing studies are oriented to elucidate the precise mechanism of res...

  17. Development and clinical performance of high throughput loop-mediated isothermal amplification for detection of malaria.

    OpenAIRE

    Perera, RS; Ding, XC; Tully, F.; Oliver, J.; Bright, N; Bell, D.; Chiodini, PL; Gonzalez, IJ; Polley, SD

    2017-01-01

    Background Accurate and efficient detection of sub-microscopic malaria infections is crucial for enabling rapid treatment and interruption of transmission. Commercially available malaria LAMP kits have excellent diagnostic performance, though throughput is limited by the need to prepare samples individually. Here, we evaluate the clinical performance of a newly developed high throughput (HTP) sample processing system for use in conjunction with the Eiken malaria LAMP kit. Methods The HTP syst...

  18. Clinical analysis of high serum IgE in autoimmune pancreatitis

    Institute of Scientific and Technical Information of China (English)

    Kenji Hirano; Minoru Tada; Hiroyuki Isayama; Kazumichi Kawakubo; Hiroshi Yagioka; Takashi Sasaki; Hirofumi Kogure; Yousuke Nakai; Naoki Sasahira; Takeshi Tsujino; Nobuo Toda; Kazuhiko Koike

    2010-01-01

    AIM: To clarify the clinical significance of high serum IgE in autoimmune pancreatitis (AIP). METHODS: Forty-two AIP patients, whose IgE was measured before steroid treatment, were analyzed. To evaluate the relationship between IgE levels and the disease activity of AIP, we examined (1) Frequency of high IgE (> 170 IU/mL) and concomitant allergic diseases requiring treatment; (2) Correlations between IgG, IgG4, and IgE; (3) Relationship between the presence of extrapancreatic lesions and IgE; (4) Relation...

  19. Clinical evaluation of low vision and central foveal thickness in highly myopic cataract eyes after phacoemulsification

    Directory of Open Access Journals (Sweden)

    Ji-Li Chen

    2015-07-01

    Full Text Available AIM: To retrospectively evaluate central foveal thickness in highly myopic cataract eyes with low best corrected visual acuity (BCVA) after phacoemulsification. METHODS: In this retrospective clinical study, we consecutively recruited 70 highly myopic cataract subjects with low vision (70 eyes) who underwent phacoemulsification. Postoperative visits were performed at 1wk, 1 and 3mo. Postoperative BCVA was recorded, and the eyes were further divided into 2 groups by BCVA. RESULTS: Postoperative BCVA was significantly correlated with central foveal thickness (r = -0.716). CONCLUSION: In this study, BCVA improved over the 3mo follow-up, and there was a significant correlation between postoperative BCVA and central foveal thickness.

  20. A high-throughput clinical assay for testing drug facilitation of exposure therapy.

    Science.gov (United States)

    Rodebaugh, Thomas L; Levinson, Cheri A; Lenze, Eric J

    2013-07-01

    Several studies have demonstrated that D-cycloserine (DCS) facilitates exposure therapy. We developed a standardized test of this facilitation (i.e., a clinical assay), with the goal of testing for facilitation more quickly and inexpensively than a full clinical trial. We developed a standardized brief exposure in which participants with social anxiety disorder gave a videotaped speech. Participants were randomized to receive a single capsule of 250 mg DCS or a matching placebo prior to preparation for the speech. Distress levels were rated during the speech and again, approximately 1 week later, during a speech in an identical situation. Our primary measure of DCS's exposure-facilitating effect was between-session habituation: whether or not the participants showed less distress during the second speech compared to the first. We also measured levels of subjective anxiety and fear of scrutiny. Subjects randomized to receive DCS prior to their first speech were more likely to show between-session habituation than those who received placebo. We also found greater reduction of performance-related fear overall in the DCS group. Our clinical assay was able to detect exposure facilitation effects rapidly and in a highly standardized way, and is estimated to take a fraction of the time and costs of a clinical trial. Given the increasing interest in using medications to enhance learning-based psychotherapy, this high-throughput clinical assay approach may be a favorable method for testing novel mechanisms of action, and clarifying optimal parameters, for therapy facilitation. © 2013 Wiley Periodicals, Inc.

  1. High resolution genotyping of clinical Aspergillus flavus isolates from India using microsatellites.

    Directory of Open Access Journals (Sweden)

    Shivaprakash M Rudramurthy

    Full Text Available BACKGROUND: Worldwide, Aspergillus flavus is the second leading cause of allergic, invasive and colonizing fungal diseases in humans. However, it is the most common species causing fungal rhinosinusitis and eye infections in tropical countries. Despite the growing challenges due to A. flavus, the molecular epidemiology of this fungus has not been well studied. We evaluated the use of microsatellites for high resolution genotyping of A. flavus from India and a possible connection between clinical presentation and genotype of the involved isolate. METHODOLOGY/PRINCIPAL FINDINGS: A panel of nine microsatellite markers were selected from the genome of A. flavus NRRL 3357. These markers were used to type 162 clinical isolates of A. flavus. All nine markers proved to be polymorphic displaying up to 33 alleles per marker. Thirteen isolates proved to be a mixture of different genotypes. Among the 149 pure isolates, 124 different genotypes could be recognized. The discriminatory power (D for the individual markers ranged from 0.657 to 0.954. The D value of the panel of nine markers combined was 0.997. The multiplex multicolor approach was instrumental in rapid typing of a large number of isolates. There was no correlation between genotype and the clinical presentation of the infection. CONCLUSIONS/SIGNIFICANCE: There is a large genotypic diversity in clinical A. flavus isolates from India. The presence of more than one genotype in clinical samples illustrates the possibility that persons may be colonized by multiple genotypes and that any isolate from a clinical specimen is not necessarily the one actually causing infection. Microsatellites are excellent typing targets for discriminating between A. flavus isolates from various origins.

  2. [High-resolution distortion-product otoacoustic emissions: method and clinical applications].

    Science.gov (United States)

    Janssen, T; Lodwig, A; Müller, J; Oswald, H

    2014-10-01

    Unlike pure tone thresholds that assess both peripheral and central sound processing, distortion-product otoacoustic emissions (DPOAEs) selectively mirror the functioning of the cochlear amplifier. High resolution DPOAEs are missing in the toolbox of routine audiometry due to the fact that high resolution DPOAE measurements are more time-consuming when compared to normal clinical DP grams with rough frequency resolution. Measurements of high resolution DPOAEs allow an early assessment of beginning sensory cell damage due to sound overexposure or administration of ototoxic drugs. When using a rough grid, sensory cell damage would be overlooked as in the early state damage only appears at some distinct cochlear sites. A review is given on the method and application of high resolution DPOAEs.

  3. DNA Methylation-Guided Prediction of Clinical Failure in High-Risk Prostate Cancer.

    Directory of Open Access Journals (Sweden)

    Kirill Litovkin

    Full Text Available Prostate cancer (PCa) is a very heterogeneous disease with respect to clinical outcome. This study explored differential DNA methylation in a priori selected genes to diagnose PCa and predict clinical failure (CF) in high-risk patients. A quantitative multiplex, methylation-specific PCR assay was developed to assess promoter methylation of the APC, CCND2, GSTP1, PTGS2 and RARB genes in formalin-fixed, paraffin-embedded tissue samples from 42 patients with benign prostatic hyperplasia and radical prostatectomy specimens of patients with high-risk PCa, encompassing training and validation cohorts of 147 and 71 patients, respectively. Log-rank tests, univariate and multivariate Cox models were used to investigate the prognostic value of the DNA methylation. Hypermethylation of APC, CCND2, GSTP1, PTGS2 and RARB was highly cancer-specific. However, only GSTP1 methylation was significantly associated with CF in both independent high-risk PCa cohorts. Importantly, trichotomization into low, moderate and high GSTP1 methylation level subgroups was highly predictive for CF. Patients with either a low or high GSTP1 methylation level, as compared to the moderate methylation groups, were at a higher risk for CF in both the training (Hazard ratio [HR], 3.65; 95% CI, 1.65 to 8.07) and validation sets (HR, 4.27; 95% CI, 1.03 to 17.72), as well as in the combined cohort (HR, 2.74; 95% CI, 1.42 to 5.27) in multivariate analysis. Classification of primary high-risk tumors into three subtypes based on DNA methylation can be combined with clinico-pathological parameters for a more informative risk-stratification of these PCa patients.
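
    A compact sketch of the trichotomization-plus-Cox idea using the lifelines package (column names, covariates and tertile cut-points are hypothetical placeholders; the paper's cohort definitions are not reproduced here):

        import pandas as pd
        from lifelines import CoxPHFitter

        def risk_model(df):
            """Trichotomize GSTP1 methylation and fit a multivariate Cox model for clinical failure.

            df is assumed to contain 'gstp1_methylation', follow-up time 'time', event flag 'cf',
            and clinico-pathological covariates such as 'gleason' and 'psa' (illustrative names)."""
            df = df.copy()
            df["meth_group"] = pd.qcut(df["gstp1_methylation"], q=3,
                                       labels=["low", "moderate", "high"])
            # Dummy-code with 'moderate' as reference, mirroring the low/high vs moderate contrast.
            dummies = (pd.get_dummies(df["meth_group"], prefix="meth")
                         .drop(columns=["meth_moderate"]).astype(float))
            model_df = pd.concat([df[["time", "cf", "gleason", "psa"]], dummies], axis=1)
            cph = CoxPHFitter()
            cph.fit(model_df, duration_col="time", event_col="cf")
            return cph   # cph.summary holds hazard ratios and confidence intervals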

  4. Maximum-Likelihood Semiblind Equalization of Doubly Selective Channels Using the EM Algorithm

    Directory of Open Access Journals (Sweden)

    Gideon Kutz

    2010-01-01

    Full Text Available Maximum-likelihood semi-blind joint channel estimation and equalization for doubly selective channels and single-carrier systems is proposed. We model the doubly selective channel as an FIR filter where each filter tap is modeled as a linear combination of basis functions. This channel description is then integrated in an iterative scheme based on the expectation-maximization (EM) principle that converges to an estimate of the channel description vector. We discuss the selection of the basis functions and compare various function sets. To alleviate the problem of convergence to a local maximum, we propose an initialization scheme for the EM iterations based on a small number of pilot symbols. We further derive a pilot positioning scheme targeted to reduce the probability of convergence to a local maximum. Our pilot positioning analysis reveals that for high Doppler rates it is better to spread the pilots evenly throughout the data block (and not to group them), even for frequency-selective channels. The resulting equalization algorithm is shown to be superior to previously proposed equalization schemes and to perform in many cases close to the maximum-likelihood equalizer with perfect channel knowledge. Our proposed method is also suitable for coded systems and as a building block for Turbo equalization algorithms.

  5. Taming outliers in pulsar-timing datasets with hierarchical likelihoods and Hamiltonian sampling

    Science.gov (United States)

    Vallisneri, Michele; van Haasteren, Rutger

    2017-01-01

    Pulsar-timing datasets have been analyzed with great success using probabilistic treatments based on Gaussian distributions, with applications ranging from studies of neutron-star structure to tests of general relativity and searches for nanosecond gravitational waves. As for other applications of Gaussian distributions, outliers in timing measurements pose a significant challenge to statistical inference, since they can bias the estimation of timing and noise parameters, and affect reported parameter uncertainties. We describe and demonstrate a practical end-to-end approach to perform Bayesian inference of timing and noise parameters robustly in the presence of outliers, and to identify these probabilistically. The method is fully consistent (i.e., outlier-ness probabilities vary in tune with the posterior distributions of the timing and noise parameters), and it relies on the efficient sampling of the hierarchical form of the pulsar-timing likelihood. Such sampling has recently become possible with a "no-U-turn" Hamiltonian sampler coupled to a highly customized reparametrization of the likelihood; this code is described elsewhere, but it is already available online. We recommend our method as a standard step in the preparation of pulsar-timing-array datasets: even if statistical inference is not affected, follow-up studies of outlier candidates can reveal unseen problems in radio observations and timing measurements; furthermore, confidence in the results of gravitational-wave searches will only benefit from stringent statistical evidence that datasets are clean and outlier-free.
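
    The robustness comes from giving every residual a small prior probability of originating from a broad outlier component rather than the Gaussian noise model; the same mixture yields the per-point outlier probabilities. A toy sketch (the uniform outlier component and fixed fraction theta are illustrative choices, not the paper's exact hierarchical parametrization):

        import numpy as np

        def _components(residuals, sigma, theta, width):
            gauss = (1.0 - theta) * np.exp(-0.5 * (residuals / sigma) ** 2) / (np.sqrt(2 * np.pi) * sigma)
            unif = theta / width * (np.abs(residuals) < width / 2)
            return gauss, unif

        def log_likelihood(residuals, sigma, theta, width):
            """Total log-likelihood under the Gaussian + uniform-outlier mixture."""
            gauss, unif = _components(residuals, sigma, theta, width)
            return float(np.sum(np.log(gauss + unif)))

        def outlier_probability(residuals, sigma, theta, width):
            """Posterior probability that each point is an outlier ('probabilistic identification')."""
            gauss, unif = _components(residuals, sigma, theta, width)
            return unif / (gauss + unif)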

  6. Observation Likelihood Model Design and Failure Recovery Scheme toward Reliable Localization of Mobile Robots

    Directory of Open Access Journals (Sweden)

    Chang-bae Moon

    2011-01-01

    Full Text Available Although there has been much research on mobile robot localization, it is still difficult to obtain reliable localization performance in a human co-existing real environment. Reliability of localization is highly dependent upon the developer's experience because uncertainty is caused by a variety of reasons. We have developed a range-sensor-based integrated localization scheme for various indoor service robots. Through this experience, we found that there are several significant practical issues. In this paper, we provide useful solutions for the following questions, which are frequently faced in practical applications: 1) How to design an observation likelihood model? 2) How to detect localization failure? 3) How to recover from localization failure? We present design guidelines for the observation likelihood model. Localization failure detection and recovery schemes are presented, focusing on abrupt wheel slippage. Experiments were carried out in a typical office building environment. The proposed scheme to identify the localizer status is useful in practical environments. Moreover, the semi-global localization is a computationally efficient recovery scheme from localization failure. The results of the experiments and analysis clearly demonstrate the usefulness of the proposed solutions.
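
    One standard answer to the first question is the beam mixture model for a range sensor: a Gaussian around the expected range, an exponential term for unexpected obstacles, a max-range spike, and a uniform floor. A sketch with placeholder weights (the values and the simplified max-range handling are assumptions, not the tuned design discussed in the paper):

        import numpy as np

        def beam_likelihood(z, z_expected, z_max, sigma_hit=0.1, lam_short=1.0,
                            w=(0.7, 0.1, 0.1, 0.1)):
            """p(z | expected range) as a 4-component mixture: hit, short, max, random."""
            w_hit, w_short, w_max, w_rand = w
            p_hit = np.exp(-0.5 * ((z - z_expected) / sigma_hit) ** 2) / (np.sqrt(2 * np.pi) * sigma_hit)
            if 0.0 <= z < z_expected:
                p_short = lam_short * np.exp(-lam_short * z) / (1.0 - np.exp(-lam_short * z_expected))
            else:
                p_short = 0.0
            p_max = 1.0 if np.isclose(z, z_max) else 0.0          # point mass at the max range
            p_rand = 1.0 / z_max if 0.0 <= z < z_max else 0.0
            return w_hit * p_hit + w_short * p_short + w_max * p_max + w_rand * p_rand

        def pose_log_likelihood(scan, expected_ranges, z_max):
            """Particle weight: product over beams, computed as a sum of logs."""
            return float(sum(np.log(beam_likelihood(z, ze, z_max) + 1e-300)
                             for z, ze in zip(scan, expected_ranges)))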

  7. A probabilistic assessment of the likelihood of vegetation drought under varying climate conditions across China.

    Science.gov (United States)

    Liu, Zhiyong; Li, Chao; Zhou, Ping; Chen, Xiuzhi

    2016-10-07

    Climate change significantly impacts vegetation growth and terrestrial ecosystems. Using satellite remote sensing observations, here we focus on investigating vegetation dynamics and the likelihood of vegetation-related drought under varying climate conditions across China. We first compare temporal trends of the Normalized Difference Vegetation Index (NDVI) and climatic variables over China. We find that in fact there is no significant change in vegetation over the cold regions where warming is significant. Then, we propose a joint probability model to estimate the likelihood of vegetation-related drought conditioned on different precipitation/temperature scenarios in the growing season across China. To the best of our knowledge, this study is the first to examine the vegetation-related drought risk over China from a perspective based on joint probability. Our results demonstrate risk patterns of vegetation-related drought under both low and high precipitation/temperature conditions. We further identify the variations in vegetation-related drought risk under different climate conditions and the sensitivity of drought risk to climate variability. These findings provide insights for decision makers to evaluate vegetation-related drought risk and develop drought mitigation strategies over China in a warming world. The proposed methodology also has great potential to be applied to vegetation-related drought risk assessment in other regions worldwide.
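
    One way to read "the likelihood of vegetation drought conditioned on a precipitation scenario" is as a conditional probability computed from a fitted joint distribution of a vegetation index and a climate variable. A toy sketch using a Gaussian copula on rank-transformed NDVI and precipitation (the copula family, drought threshold and precipitation band are illustrative assumptions, not the paper's model):

        import numpy as np
        from scipy import stats

        def conditional_drought_prob(ndvi, precip, ndvi_drought_q=0.2, precip_band=(0.0, 0.3),
                                     n_draws=200_000):
            """P(NDVI below its q-th quantile | precipitation within a given quantile band)."""
            u = stats.rankdata(ndvi) / (len(ndvi) + 1.0)       # pseudo-observations in (0, 1)
            v = stats.rankdata(precip) / (len(precip) + 1.0)
            rho = np.corrcoef(stats.norm.ppf(u), stats.norm.ppf(v))[0, 1]
            copula = stats.multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, rho], [rho, 1.0]])

            # Monte Carlo evaluation of the conditional probability under the fitted copula.
            z = copula.rvs(size=n_draws, random_state=0)
            uu, vv = stats.norm.cdf(z[:, 0]), stats.norm.cdf(z[:, 1])
            in_band = (vv >= precip_band[0]) & (vv <= precip_band[1])
            return float(np.mean(uu[in_band] < ndvi_drought_q))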

  8. Application of Artificial Bee Colony Algorithm to Maximum Likelihood DOA Estimation

    Institute of Scientific and Technical Information of China (English)

    Zhicheng Zhang; Jun Lin; Yaowu Shi

    2013-01-01

    The Maximum Likelihood (ML) method has excellent performance for Direction-Of-Arrival (DOA) estimation, but a multidimensional nonlinear solution search is required, which complicates the computation and prevents the method from practical use. To reduce the high computational burden of the ML method and make it more suitable for engineering applications, we apply the Artificial Bee Colony (ABC) algorithm to maximize the likelihood function for DOA estimation. As a recently proposed bio-inspired computing algorithm, the ABC algorithm was originally used to optimize multivariable functions by imitating the behavior of a bee colony finding excellent nectar sources in the natural environment. It offers an excellent alternative to the conventional methods in ML-DOA estimation. The performance of ABC-based ML and other popular meta-heuristic-based ML methods for DOA estimation is compared for various scenarios of convergence, Signal-to-Noise Ratio (SNR), and number of iterations. The computation loads of ABC-based ML and the conventional ML methods for DOA estimation are also investigated. Simulation results demonstrate that the proposed ABC-based method is more efficient in computation and statistical performance than other ML-based DOA estimation methods.
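
    For reference, the quantity being maximized in deterministic ML-DOA is the concentrated likelihood tr(P_A(theta) R_hat), where P_A is the projector onto the steering subspace and R_hat is the sample covariance; any global optimizer can stand in for the bee-colony search. A sketch using SciPy's differential evolution as that stand-in (uniform linear array with half-wavelength spacing assumed; this is not the ABC implementation from the paper):

        import numpy as np
        from scipy.optimize import differential_evolution

        def steering_matrix(thetas_deg, n_sensors, spacing=0.5):
            """Steering vectors of a uniform linear array (spacing in wavelengths)."""
            thetas = np.deg2rad(np.atleast_1d(thetas_deg))
            n = np.arange(n_sensors)[:, None]
            return np.exp(2j * np.pi * spacing * n * np.sin(thetas)[None, :])

        def neg_concentrated_ml(thetas_deg, R_hat, n_sensors):
            """Negative concentrated ML criterion, -tr(P_A R_hat)."""
            A = steering_matrix(thetas_deg, n_sensors)
            P = A @ np.linalg.pinv(A)                  # orthogonal projector onto span(A)
            return -float(np.real(np.trace(P @ R_hat)))

        def ml_doa(snapshots, n_sources):
            """snapshots: complex array of shape (n_sensors, n_snapshots)."""
            n_sensors = snapshots.shape[0]
            R_hat = snapshots @ snapshots.conj().T / snapshots.shape[1]
            bounds = [(-90.0, 90.0)] * n_sources
            res = differential_evolution(neg_concentrated_ml, bounds,
                                         args=(R_hat, n_sensors), seed=0, tol=1e-8)
            return np.sort(res.x)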

  9. Taming outliers in pulsar-timing data sets with hierarchical likelihoods and Hamiltonian sampling

    Science.gov (United States)

    Vallisneri, Michele; van Haasteren, Rutger

    2017-04-01

    Pulsar-timing data sets have been analysed with great success using probabilistic treatments based on Gaussian distributions, with applications ranging from studies of neutron-star structure to tests of general relativity and searches for nanosecond gravitational waves. As for other applications of Gaussian distributions, outliers in timing measurements pose a significant challenge to statistical inference, since they can bias the estimation of timing and noise parameters, and affect reported parameter uncertainties. We describe and demonstrate a practical end-to-end approach to perform Bayesian inference of timing and noise parameters robustly in the presence of outliers, and to identify these probabilistically. The method is fully consistent (i.e. outlier-ness probabilities vary in tune with the posterior distributions of the timing and noise parameters), and it relies on the efficient sampling of the hierarchical form of the pulsar-timing likelihood. Such sampling has recently become possible with a 'no-U-turn' Hamiltonian sampler coupled to a highly customized reparametrization of the likelihood; this code is described elsewhere, but it is already available online. We recommend our method as a standard step in the preparation of pulsar-timing-array data sets: even if statistical inference is not affected, follow-up studies of outlier candidates can reveal unseen problems in radio observations and timing measurements; furthermore, confidence in the results of gravitational-wave searches will only benefit from stringent statistical evidence that data sets are clean and outlier-free.

  10. A probabilistic assessment of the likelihood of vegetation drought under varying climate conditions across China

    Science.gov (United States)

    Liu, Zhiyong; Li, Chao; Zhou, Ping; Chen, Xiuzhi

    2016-10-01

    Climate change significantly impacts vegetation growth and terrestrial ecosystems. Using satellite remote sensing observations, here we focus on investigating vegetation dynamics and the likelihood of vegetation-related drought under varying climate conditions across China. We first compare temporal trends of the Normalized Difference Vegetation Index (NDVI) and climatic variables over China. We find that in fact there is no significant change in vegetation over the cold regions where warming is significant. Then, we propose a joint probability model to estimate the likelihood of vegetation-related drought conditioned on different precipitation/temperature scenarios in the growing season across China. To the best of our knowledge, this study is the first to examine the vegetation-related drought risk over China from a perspective based on joint probability. Our results demonstrate risk patterns of vegetation-related drought under both low and high precipitation/temperature conditions. We further identify the variations in vegetation-related drought risk under different climate conditions and the sensitivity of drought risk to climate variability. These findings provide insights for decision makers to evaluate vegetation-related drought risk and develop drought mitigation strategies over China in a warming world. The proposed methodology also has great potential to be applied to vegetation-related drought risk assessment in other regions worldwide.

  11. The Intersection of Care Seeking and Clinical Capacity for Patients With Highly Pathogenic Avian Influenza A (H5N1) Virus in Indonesia: Knowledge and Treatment Practices of the Public and Physicians.

    Science.gov (United States)

    Kreslake, Jennifer M; Wahyuningrum, Yunita; Iuliano, Angela D; Storms, Aaron D; Lafond, Kathryn E; Mangiri, Amalya; Praptiningsih, Catharina Y; Safi, Basil; Uyeki, Timothy M; Storey, J Douglas

    2016-12-01

    Indonesia has the highest human mortality from highly pathogenic avian influenza (HPAI) A (H5N1) virus infection in the world. A survey of households (N=2520) measured treatment sources and beliefs among symptomatic household members. A survey of physicians (N=554) in various types of health care facilities measured knowledge, assessment and testing behaviors, and perceived clinical capacity. Households reported confidence in health care system capacity but infrequently sought treatment for potential HPAI H5N1 signs/symptoms. More clinicians were confident in their knowledge of diagnosis and treatment than in the adequacy of related equipment and resources at their facilities. Physicians expressed awareness of the HPAI H5N1 suspect case definition, yet expressed only moderate knowledge in questioning symptomatic patients about exposures. Self-reported likelihood of testing for HPAI H5N1 virus was high after learning of certain exposures. Knowledge of antiviral treatment was moderate, but it was higher among clinicians in puskesmas. Physicians in private outpatient clinics, the most heavily used facilities, reported the lowest confidence in their diagnostic and treatment capabilities. Educational campaigns can encourage recall of possible poultry exposure when patients are experiencing signs/symptoms and can raise awareness of the effectiveness of antivirals to drive people to seek health care. Clinicians may benefit from training regarding exposure assessment and referral procedures, particularly in private clinics. (Disaster Med Public Health Preparedness. 2016;10:838-847).

  12. High-dose intravenous immunoglobulin in inflammatory myopathies: experience based on controlled clinical trials.

    Science.gov (United States)

    Dalakas, M C

    2003-10-01

    Controlled clinical trials with high-dose intravenous immunoglobulin (IVIg) have been conducted in patients with DM and IBM, but not PM. A double-blind placebo-controlled study in DM patients, resistant or partially responsive to conventional therapies, showed that IVIg is very effective in improving both the muscle strength and the skin rash. The clinical benefit, which was impressive in patients with early disease, was associated with improvement in the muscle cytoarchitecture. Quantitative histological studies in repeated muscle biopsies showed a statistically significant increase in the size of muscle fibers and the number of capillaries, with normalization of the capillary diameter. Resolution of the aberrant immunopathological parameters, including interception of complement activation products and downregulation of T cells, ICAM-1, VCAM, TGF-beta and MHC-I molecules, was also noted. In IBM, IVIg showed marginal, and not statistically significant, improvements in muscle strength. Up to 20% of patients, however, demonstrated clinical improvement with increased activities of daily living, while certain muscle groups, such as the muscles of swallowing, showed significant improvements compared to placebo, implying mild regional benefits. In PM, small uncontrolled series have shown improvements in muscle strength in up to 70% of the IVIg-treated patients. Because PM, as a stand-alone clinical entity, is a very rare disease, completion of controlled trials will be very difficult.

  13. Comparative behaviour of the Dynamically Penalized Likelihood algorithm in inverse radiation therapy planning

    Energy Technology Data Exchange (ETDEWEB)

    Llacer, Jorge [EC Engineering Consultants, LLC, Los Gatos, CA (United States)]. E-mail: jllacer@home.com; Solberg, Timothy D. [Department of Radiation Oncology, University of California, Los Angeles, CA (United States)]. E-mail: Solberg@radonc.ucla.edu; Promberger, Claus [BrainLAB AG, Heimstetten (Germany)]. E-mail: promberg@brainlab.com

    2001-10-01

    This paper presents a description of tests carried out to compare the behaviour of five algorithms in inverse radiation therapy planning: (1) The Dynamically Penalized Likelihood (DPL), an algorithm based on statistical estimation theory; (2) an accelerated version of the same algorithm; (3) a new fast adaptive simulated annealing (ASA) algorithm; (4) a conjugate gradient method; and (5) a Newton gradient method. A three-dimensional mathematical phantom and two clinical cases have been studied in detail. The phantom consisted of a U-shaped tumour with a partially enclosed 'spinal cord'. The clinical examples were a cavernous sinus meningioma and a prostate case. The algorithms have been tested in carefully selected and controlled conditions so as to ensure fairness in the assessment of results. It has been found that all five methods can yield relatively similar optimizations, except when a very demanding optimization is carried out. For the easier cases, the differences are principally in robustness, ease of use and optimization speed. In the more demanding case, there are significant differences in the resulting dose distributions. The accelerated DPL emerges as possibly the algorithm of choice for clinical practice. An appendix describes the differences in behaviour between the new ASA method and the one based on a patent by the Nomos Corporation. (author)

  14. Comparative behaviour of the Dynamically Penalized Likelihood algorithm in inverse radiation therapy planning

    Science.gov (United States)

    Llacer, Jorge; Solberg, Timothy D.; Promberger, Claus

    2001-10-01

    This paper presents a description of tests carried out to compare the behaviour of five algorithms in inverse radiation therapy planning: (1) The Dynamically Penalized Likelihood (DPL), an algorithm based on statistical estimation theory; (2) an accelerated version of the same algorithm; (3) a new fast adaptive simulated annealing (ASA) algorithm; (4) a conjugate gradient method; and (5) a Newton gradient method. A three-dimensional mathematical phantom and two clinical cases have been studied in detail. The phantom consisted of a U-shaped tumour with a partially enclosed 'spinal cord'. The clinical examples were a cavernous sinus meningioma and a prostate case. The algorithms have been tested in carefully selected and controlled conditions so as to ensure fairness in the assessment of results. It has been found that all five methods can yield relatively similar optimizations, except when a very demanding optimization is carried out. For the easier cases, the differences are principally in robustness, ease of use and optimization speed. In the more demanding case, there are significant differences in the resulting dose distributions. The accelerated DPL emerges as possibly the algorithm of choice for clinical practice. An appendix describes the differences in behaviour between the new ASA method and the one based on a patent by the Nomos Corporation.

  15. Evidence for extra radiation? Profile likelihood versus Bayesian posterior

    CERN Document Server

    Hamann, Jan

    2011-01-01

    A number of recent analyses of cosmological data have reported hints for the presence of extra radiation beyond the standard model expectation. In order to test the robustness of these claims under different methods of constructing parameter constraints, we perform a Bayesian posterior-based and a likelihood profile-based analysis of current data. We confirm the presence of a slight discrepancy between posterior- and profile-based constraints, with the marginalised posterior preferring higher values of the effective number of neutrino species N_eff. This can be traced back to a volume effect occurring during the marginalisation process, and we demonstrate that the effect is related to the fact that cosmic microwave background (CMB) data constrain N_eff only indirectly via the redshift of matter-radiation equality. Once present CMB data are combined with external information about, e.g., the Hubble parameter, the difference between the methods becomes small compared to the uncertainty of N_eff. We conclude tha...

  16. Scale invariant for one-sided multivariate likelihood ratio tests

    Directory of Open Access Journals (Sweden)

    Samruam Chongcharoen

    2010-07-01

Full Text Available Suppose X1, X2, ..., Xn is a random sample from a Np(μ, V) distribution. Consider H0: μ1 = μ2 = ... = μp = 0 and H1: μi ≥ 0 for i = 1, 2, ..., p; let H1 − H0 denote the hypothesis that H1 holds but H0 does not, and let ~H0 denote the hypothesis that H0 does not hold. Because the likelihood ratio test (LRT) of H0 versus H1 − H0 is complicated, several ad hoc tests have been proposed. Tang, Gnecco and Geller (1989) proposed an approximate LRT, Follmann (1996) suggested rejecting H0 if the usual test of H0 versus ~H0 rejects H0 with significance level 2α and a weighted sum of the sample means is positive, and Chongcharoen, Singh and Wright (2002) modified Follmann's test to include information about the correlation structure in the sum of the sample means. Chongcharoen and Wright (2007, 2006) give versions of the Tang-Gnecco-Geller tests and Follmann-type tests, respectively, with invariance properties. With the LRT's desired scale-invariance property, we investigate its power by using Monte Carlo techniques and compare it with the tests recommended in Chongcharoen and Wright (2007, 2006).

  17. Maximum likelihood based classification of electron tomographic data.

    Science.gov (United States)

    Stölken, Michael; Beck, Florian; Haller, Thomas; Hegerl, Reiner; Gutsche, Irina; Carazo, Jose-Maria; Baumeister, Wolfgang; Scheres, Sjors H W; Nickell, Stephan

    2011-01-01

    Classification and averaging of sub-tomograms can improve the fidelity and resolution of structures obtained by electron tomography. Here we present a three-dimensional (3D) maximum likelihood algorithm--MLTOMO--which is characterized by integrating 3D alignment and classification into a single, unified processing step. The novelty of our approach lies in the way we calculate the probability of observing an individual sub-tomogram for a given reference structure. We assume that the reference structure is affected by a 'compound wedge', resulting from the summation of many individual missing wedges in distinct orientations. The distance metric underlying our probability calculations effectively down-weights Fourier components that are observed less frequently. Simulations demonstrate that MLTOMO clearly outperforms the 'constrained correlation' approach and has advantages over existing approaches in cases where the sub-tomograms adopt preferred orientations. Application of our approach to cryo-electron tomographic data of ice-embedded thermosomes revealed distinct conformations that are in good agreement with results obtained by previous single particle studies.

  18. On the shape and likelihood of oceanic rogue waves.

    Science.gov (United States)

    Benetazzo, Alvise; Ardhuin, Fabrice; Bergamasco, Filippo; Cavaleri, Luigi; Guimarães, Pedro Veras; Schwendeman, Michael; Sclavo, Mauro; Thomson, Jim; Torsello, Andrea

    2017-08-15

We consider the observation and analysis of oceanic rogue waves collected within spatio-temporal (ST) records of 3D wave fields. This class of records, allowing a sea surface region to be retrieved, is appropriate for the observation of rogue waves, which come up as a random phenomenon that can occur at any time and location of the sea surface. To verify this aspect, we used three stereo wave imaging systems to gather ST records of the sea surface elevation, which were collected in different sea conditions. The wave with the ST maximum elevation (happening to be larger than the rogue threshold 1.25Hs) was then isolated within each record, along with its temporal profile. The rogue waves show similar profiles, in agreement with the theory of extreme wave groups. We analyze the rogue wave probability of occurrence, also in the context of ST extreme value distributions, and we conclude that rogue waves are more likely than previously reported; the key point is coming across them, in space as well as in time. The dependence of the rogue wave profile and likelihood on the sea state conditions is also investigated. Results may prove useful in predicting extreme wave occurrence probability and strength during oceanic storms.

  19. Maximum Likelihood Sequence Detection Receivers for Nonlinear Optical Channels

    Directory of Open Access Journals (Sweden)

    Gabriel N. Maggio

    2015-01-01

Full Text Available The space-time whitened matched filter (ST-WMF) maximum likelihood sequence detection (MLSD) architecture has been recently proposed (Maggio et al., 2014). Its objective is reducing implementation complexity in transmissions over nonlinear dispersive channels. The ST-WMF-MLSD receiver (i) drastically reduces the number of states of the Viterbi decoder (VD) and (ii) offers a smooth trade-off between performance and complexity. In this work the ST-WMF-MLSD receiver is investigated in detail. We show that the space compression of the nonlinear channel is an instrumental property of the ST-WMF-MLSD which results in a major reduction of the implementation complexity in intensity modulation and direct detection (IM/DD) fiber optic systems. Moreover, we assess the performance of ST-WMF-MLSD in IM/DD optical systems with chromatic dispersion (CD) and polarization mode dispersion (PMD). Numerical results for a 10 Gb/s, 700 km, and IM/DD fiber-optic link with 50 ps differential group delay (DGD) show that the number of states of the VD in ST-WMF-MLSD can be reduced ~4 times compared to an oversampled MLSD. Finally, we analyze the impact of the imperfect channel estimation on the performance of the ST-WMF-MLSD. Our results show that the performance degradation caused by channel estimation inaccuracies is low and similar to that achieved by existing MLSD schemes (~0.2 dB).

  20. Covariance of maximum likelihood evolutionary distances between sequences aligned pairwise.

    Science.gov (United States)

    Dessimoz, Christophe; Gil, Manuel

    2008-06-23

    The estimation of a distance between two biological sequences is a fundamental process in molecular evolution. It is usually performed by maximum likelihood (ML) on characters aligned either pairwise or jointly in a multiple sequence alignment (MSA). Estimators for the covariance of pairs from an MSA are known, but we are not aware of any solution for cases of pairs aligned independently. In large-scale analyses, it may be too costly to compute MSAs every time distances must be compared, and therefore a covariance estimator for distances estimated from pairs aligned independently is desirable. Knowledge of covariances improves any process that compares or combines distances, such as in generalized least-squares phylogenetic tree building, orthology inference, or lateral gene transfer detection. In this paper, we introduce an estimator for the covariance of distances from sequences aligned pairwise. Its performance is analyzed through extensive Monte Carlo simulations, and compared to the well-known variance estimator of ML distances. Our covariance estimator can be used together with the ML variance estimator to form covariance matrices. The estimator performs similarly to the ML variance estimator. In particular, it shows no sign of bias when sequence divergence is below 150 PAM units (i.e. above ~29% expected sequence identity). Above that distance, the covariances tend to be underestimated, but then ML variances are also underestimated.

  1. Likelihood analysis of the Local Group acceleration revisited

    CERN Document Server

    Ciecielag, P

    2004-01-01

We reexamine likelihood analyses of the Local Group (LG) acceleration, paying particular attention to nonlinear effects. Under the approximation that the joint distribution of the LG acceleration and velocity is Gaussian, two quantities describing nonlinear effects enter these analyses. The first one is the coherence function, i.e. the cross-correlation coefficient of the Fourier modes of gravity and velocity fields. The second one is the ratio of velocity power spectrum to gravity power spectrum. To date, in all analyses of the LG acceleration the second quantity was not accounted for. Extending our previous work, we study both the coherence function and the ratio of the power spectra. With the aid of numerical simulations we obtain expressions for the two as functions of wavevector and \sigma_8. Adopting WMAP's best determination of \sigma_8, we estimate the most likely value of the parameter \beta and its errors. As the observed values of the LG velocity and gravity, we adopt respectively a CMB-based estim...

  2. A Maximum Likelihood Approach to Least Absolute Deviation Regression

    Directory of Open Access Journals (Sweden)

    Yinbo Li

    2004-09-01

    Full Text Available Least absolute deviation (LAD regression is an important tool used in numerous applications throughout science and engineering, mainly due to the intrinsic robust characteristics of LAD. In this paper, we show that the optimization needed to solve the LAD regression problem can be viewed as a sequence of maximum likelihood estimates (MLE of location. The derived algorithm reduces to an iterative procedure where a simple coordinate transformation is applied during each iteration to direct the optimization procedure along edge lines of the cost surface, followed by an MLE of location which is executed by a weighted median operation. Requiring weighted medians only, the new algorithm can be easily modularized for hardware implementation, as opposed to most of the other existing LAD methods which require complicated operations such as matrix entry manipulations. One exception is Wesolowsky's direct descent algorithm, which among the top algorithms is also based on weighted median operations. Simulation shows that the new algorithm is superior in speed to Wesolowsky's algorithm, which is simple in structure as well. The new algorithm provides a better tradeoff solution between convergence speed and implementation complexity.
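
    The iterative structure described above lends itself to a compact illustration. Below is a minimal sketch (my own illustration, not the authors' general algorithm, which handles multi-parameter regression via coordinate transformations along edge lines): for simple linear regression, each coordinate update is an MLE of location computed by a plain or weighted median, on synthetic data.

```python
# Minimal LAD-by-weighted-median sketch for y ≈ a + b*x (synthetic data, illustrative only).
import numpy as np

def weighted_median(values, weights):
    # smallest value whose cumulative weight reaches half of the total weight
    order = np.argsort(values)
    values, weights = values[order], weights[order]
    cum = np.cumsum(weights)
    return values[np.searchsorted(cum, 0.5 * cum[-1])]

rng = np.random.default_rng(1)
x = rng.uniform(-5, 5, 200)
y = 2.0 + 0.7 * x + rng.standard_cauchy(200) * 0.3   # heavy-tailed noise favours LAD

a, b = 0.0, 0.0
for _ in range(50):                                   # simple coordinate descent
    a = np.median(y - b * x)                          # location MLE for the intercept: plain median
    mask = x != 0
    b = weighted_median((y[mask] - a) / x[mask], np.abs(x[mask]))  # weighted median for the slope
print(f"LAD fit: intercept ≈ {a:.3f}, slope ≈ {b:.3f}")
```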

  3. Maximum-Likelihood Continuity Mapping (MALCOM): An Alternative to HMMs

    Energy Technology Data Exchange (ETDEWEB)

    Nix, D.A.; Hogden, J.E.

    1998-12-01

The authors describe Maximum-Likelihood Continuity Mapping (MALCOM) as an alternative to hidden Markov models (HMMs) for processing sequence data such as speech. While HMMs have a discrete "hidden" space constrained by a fixed finite-automata architecture, MALCOM has a continuous hidden space (a continuity map) that is constrained only by a smoothness requirement on paths through the space. MALCOM fits into the same probabilistic framework for speech recognition as HMMs, but it represents a far more realistic model of the speech production process. The authors support this claim by generating continuity maps for three speakers and using the resulting MALCOM paths to predict measured speech articulator data. The correlations between the MALCOM paths (obtained from only the speech acoustics) and the actual articulator movements average 0.77 on an independent test set used to train neither MALCOM nor the predictor. On average, this unsupervised model achieves 92% of the performance obtained using the corresponding supervised method.

  4. Maximum likelihood estimation for cytogenetic dose-response curves

    Energy Technology Data Exchange (ETDEWEB)

Frome, E.L.; DuFrain, R.J.

    1983-10-01

In vitro dose-response curves are used to describe the relation between the yield of dicentric chromosome aberrations and radiation dose for human lymphocytes. The dicentric yields follow the Poisson distribution, and the expected yield depends on both the magnitude and the temporal distribution of the dose for low LET radiation. A general dose-response model that describes this relation has been obtained by Kellerer and Rossi using the theory of dual radiation action. The yield of elementary lesions is κ(γd + g(t, τ)d²), where t is the time and d is the dose. The coefficient of the d² term is determined by the recovery function and the temporal mode of irradiation. Two special cases of practical interest are split-dose and continuous exposure experiments, and the resulting models are intrinsically nonlinear in the parameters. A general purpose maximum likelihood estimation procedure is described and illustrated with numerical examples from both experimental designs. Poisson regression analysis is used for estimation, hypothesis testing, and regression diagnostics. Results are discussed in the context of exposure assessment procedures for both acute and chronic human radiation exposure.
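
    As a rough illustration of this kind of fit, the sketch below (not the authors' procedure; the doses, cell counts and dicentric counts are hypothetical) maximizes a Poisson likelihood for a linear-quadratic yield model with scipy.

```python
# Poisson ML fit of Y_i ~ Poisson(n_i * (c + a*d_i + b*d_i^2)), with n_i cells scored at dose d_i.
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

dose = np.array([0.0, 0.5, 1.0, 2.0, 3.0, 4.0])        # Gy (hypothetical)
cells = np.array([5000, 5000, 3000, 2000, 1000, 800])    # cells scored (hypothetical)
dicentrics = np.array([4, 18, 35, 95, 160, 230])          # observed counts (hypothetical)

def neg_log_lik(theta):
    c, a, b = theta
    lam = cells * (c + a * dose + b * dose**2)            # expected dicentric counts
    if np.any(lam <= 0):
        return np.inf
    return -np.sum(dicentrics * np.log(lam) - lam - gammaln(dicentrics + 1))

fit = minimize(neg_log_lik, x0=[1e-3, 1e-2, 1e-2], method="Nelder-Mead")
print("MLE (c, alpha, beta):", fit.x)
```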

  5. Maximum likelihood sequence estimation for optical complex direct modulation.

    Science.gov (United States)

    Che, Di; Yuan, Feng; Shieh, William

    2017-04-17

    Semiconductor lasers are versatile optical transmitters in nature. Through the direct modulation (DM), the intensity modulation is realized by the linear mapping between the injection current and the light power, while various angle modulations are enabled by the frequency chirp. Limited by the direct detection, DM lasers used to be exploited only as 1-D (intensity or angle) transmitters by suppressing or simply ignoring the other modulation. Nevertheless, through the digital coherent detection, simultaneous intensity and angle modulations (namely, 2-D complex DM, CDM) can be realized by a single laser diode. The crucial technique of CDM is the joint demodulation of intensity and differential phase with the maximum likelihood sequence estimation (MLSE), supported by a closed-form discrete signal approximation of frequency chirp to characterize the MLSE transition probability. This paper proposes a statistical method for the transition probability to significantly enhance the accuracy of the chirp model. Using the statistical estimation, we demonstrate the first single-channel 100-Gb/s PAM-4 transmission over 1600-km fiber with only 10G-class DM lasers.

  6. Empirical Likelihood-Based ANOVA for Trimmed Means

    Science.gov (United States)

    Velina, Mara; Valeinis, Janis; Greco, Luca; Luta, George

    2016-01-01

    In this paper, we introduce an alternative to Yuen’s test for the comparison of several population trimmed means. This nonparametric ANOVA type test is based on the empirical likelihood (EL) approach and extends the results for one population trimmed mean from Qin and Tsao (2002). The results of our simulation study indicate that for skewed distributions, with and without variance heterogeneity, Yuen’s test performs better than the new EL ANOVA test for trimmed means with respect to control over the probability of a type I error. This finding is in contrast with our simulation results for the comparison of means, where the EL ANOVA test for means performs better than Welch’s heteroscedastic F test. The analysis of a real data example illustrates the use of Yuen’s test and the new EL ANOVA test for trimmed means for different trimming levels. Based on the results of our study, we recommend the use of Yuen’s test for situations involving the comparison of population trimmed means between groups of interest. PMID:27690063
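
    For readers unfamiliar with the comparator test, here is a minimal sketch (my own illustration on synthetic data) of the two-sample form of Yuen's trimmed-means test; the paper itself addresses the several-sample (ANOVA) setting.

```python
# Two-sample Yuen test for trimmed means; gamma is the trimming proportion per tail.
import numpy as np
from scipy.stats import t as t_dist

def yuen(x, y, gamma=0.2):
    def pieces(a):
        a = np.sort(a)
        n = a.size
        g = int(np.floor(gamma * n))
        h = n - 2 * g                             # effective sample size after trimming
        trimmed_mean = a[g:n - g].mean()
        w = a.copy()
        w[:g], w[n - g:] = a[g], a[n - g - 1]     # winsorize the tails
        d = (n - 1) * w.var(ddof=1) / (h * (h - 1))
        return trimmed_mean, d, h
    tm_x, d_x, h_x = pieces(np.asarray(x, float))
    tm_y, d_y, h_y = pieces(np.asarray(y, float))
    T = (tm_x - tm_y) / np.sqrt(d_x + d_y)
    df = (d_x + d_y) ** 2 / (d_x**2 / (h_x - 1) + d_y**2 / (h_y - 1))
    return T, df, 2 * t_dist.sf(abs(T), df)

rng = np.random.default_rng(2)
group1 = rng.lognormal(0.0, 0.8, 40)              # skewed samples
group2 = rng.lognormal(0.3, 0.8, 45)
print("Yuen T, df, p-value:", yuen(group1, group2))
```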

  7. Approximate Maximum Likelihood Commercial Bank Loan Management Model

    Directory of Open Access Journals (Sweden)

Godwin N.O. Asemota

    2009-01-01

Full Text Available Problem statement: Loan management is a very complex and yet vitally important aspect of any commercial bank's operations. The balance sheet position shows the main sources of funds as deposits and shareholders' contributions. Approach: In order to operate profitably, remain solvent and consequently grow, a commercial bank needs to properly manage its excess cash to yield returns in the form of loans. Results: The above are achieved if the bank can honor depositors' withdrawals at all times and also grant loans to credible borrowers. This is so because loans are the main portfolios of a commercial bank that yield the highest rate of return. Commercial banks and the environment in which they operate are dynamic, so any attempt to model their behavior without including some elements of uncertainty would be less than desirable. The inclusion of an uncertainty factor is now possible with the advent of stochastic optimal control theories. Thus, an approximate maximum likelihood algorithm with a variable forgetting factor was used to model the loan management behavior of a commercial bank in this study. Conclusion: The results showed that the uncertainty factor employed in the stochastic modeling enables us to adaptively control loan demand as well as fluctuating cash balances in the bank. Moreover, this loan model can also visually aid commercial bank managers' planning decisions by allowing them to competently determine excess cash and invest it as loans to earn more assets without jeopardizing public confidence.

  8. Maximum likelihood reconstruction for Ising models with asynchronous updates

    CERN Document Server

    Zeng, Hong-Li; Aurell, Erik; Hertz, John; Roudi, Yasser

    2012-01-01

    We describe how the couplings in a non-equilibrium Ising model can be inferred from observing the model history. Two cases of an asynchronous update scheme are considered: one in which we know both the spin history and the update times (times at which an attempt was made to flip a spin) and one in which we only know the spin history (i.e., the times at which spins were actually flipped). In both cases, maximizing the likelihood of the data leads to exact learning rules for the couplings in the model. For the first case, we show that one can average over all possible choices of update times to obtain a learning rule that depends only on spin correlations and not on the specific spin history. For the second case, the same rule can be derived within a further decoupling approximation. We study all methods numerically for fully asymmetric Sherrington-Kirkpatrick models, varying the data length, system size, temperature, and external field. Good convergence is observed in accordance with the theoretical expectatio...

  9. Quantifying uncertainty, variability and likelihood for ordinary differential equation models

    LENUS (Irish Health Repository)

    Weisse, Andrea Y

    2010-10-28

    Abstract Background In many applications, ordinary differential equation (ODE) models are subject to uncertainty or variability in initial conditions and parameters. Both, uncertainty and variability can be quantified in terms of a probability density function on the state and parameter space. Results The partial differential equation that describes the evolution of this probability density function has a form that is particularly amenable to application of the well-known method of characteristics. The value of the density at some point in time is directly accessible by the solution of the original ODE extended by a single extra dimension (for the value of the density). This leads to simple methods for studying uncertainty, variability and likelihood, with significant advantages over more traditional Monte Carlo and related approaches especially when studying regions with low probability. Conclusions While such approaches based on the method of characteristics are common practice in other disciplines, their advantages for the study of biological systems have so far remained unrecognized. Several examples illustrate performance and accuracy of the approach and its limitations.
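
    A minimal sketch of this idea, assuming a one-dimensional toy ODE and carrying the log-density along each characteristic (a convenient variant of extending the ODE by one extra dimension, not necessarily the authors' exact parameterization):

```python
# Along a characteristic of dx/dt = f(x), the density obeys d log(rho)/dt = -div f(x).
import numpy as np
from scipy.integrate import solve_ivp

def f(x):            # toy vector field
    return -x + 0.1 * x**3

def div_f(x):        # its divergence (here a scalar derivative)
    return -1.0 + 0.3 * x**2

def extended_rhs(t, state):
    x, log_rho = state
    return [f(x), -div_f(x)]          # original ODE plus one extra dimension for log-density

x0, log_rho0 = 0.8, np.log(1.0)       # initial state and initial density value
sol = solve_ivp(extended_rhs, (0.0, 5.0), [x0, log_rho0], dense_output=True)
x_T, log_rho_T = sol.y[:, -1]
print(f"state at T: {x_T:.4f}, density along this characteristic: {np.exp(log_rho_T):.4f}")
```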

  10. Maximum-likelihood estimation of circle parameters via convolution.

    Science.gov (United States)

    Zelniker, Emanuel E; Clarkson, I Vaughan L

    2006-04-01

    The accurate fitting of a circle to noisy measurements of circumferential points is a much studied problem in the literature. In this paper, we present an interpretation of the maximum-likelihood estimator (MLE) and the Delogne-Kåsa estimator (DKE) for circle-center and radius estimation in terms of convolution on an image which is ideal in a certain sense. We use our convolution-based MLE approach to find good estimates for the parameters of a circle in digital images. In digital images, it is then possible to treat these estimates as preliminary estimates into various other numerical techniques which further refine them to achieve subpixel accuracy. We also investigate the relationship between the convolution of an ideal image with a "phase-coded kernel" (PCK) and the MLE. This is related to the "phase-coded annulus" which was introduced by Atherton and Kerbyson who proposed it as one of a number of new convolution kernels for estimating circle center and radius. We show that the PCK is an approximate MLE (AMLE). We compare our AMLE method to the MLE and the DKE as well as the Cramér-Rao Lower Bound in ideal images and in both real and synthetic digital images.
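
    As a point of reference for the estimators discussed above, the sketch below (synthetic points; one common algebraic reading of the Delogne-Kåsa-style fit, not the convolution-based MLE of the paper) recovers centre and radius by linear least squares.

```python
# Algebraic circle fit: solve x^2 + y^2 + D*x + E*y + F = 0 for (D, E, F) in least squares.
import numpy as np

rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 200)
x = 3.0 + 2.5 * np.cos(theta) + rng.normal(0, 0.05, theta.size)   # true centre (3, -1), r = 2.5
y = -1.0 + 2.5 * np.sin(theta) + rng.normal(0, 0.05, theta.size)

A = np.column_stack([x, y, np.ones_like(x)])
b = -(x**2 + y**2)
(D, E, F), *_ = np.linalg.lstsq(A, b, rcond=None)

cx, cy = -D / 2.0, -E / 2.0
r = np.sqrt(cx**2 + cy**2 - F)
print(f"centre ≈ ({cx:.3f}, {cy:.3f}), radius ≈ {r:.3f}")
```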

  11. Is your system calibrated? MRI gradient system calibration for pre-clinical, high-resolution imaging.

    Directory of Open Access Journals (Sweden)

    James O'Callaghan

    Full Text Available High-field, pre-clinical MRI systems are widely used to characterise tissue structure and volume in small animals, using high resolution imaging. Both applications rely heavily on the consistent, accurate calibration of imaging gradients, yet such calibrations are typically only performed during maintenance sessions by equipment manufacturers, and potentially with acceptance limits that are inadequate for phenotyping. To overcome this difficulty, we present a protocol for gradient calibration quality assurance testing, based on a 3D-printed, open source, structural phantom that can be customised to the dimensions of individual scanners and RF coils. In trials on a 9.4 T system, the gradient scaling errors were reduced by an order of magnitude, and displacements of greater than 100 µm, caused by gradient non-linearity, were corrected using a post-processing technique. The step-by-step protocol can be integrated into routine pre-clinical MRI quality assurance to measure and correct for these errors. We suggest that this type of quality assurance is essential for robust pre-clinical MRI experiments that rely on accurate imaging gradients, including small animal phenotyping and diffusion MR.

  12. High heart rate: more than a risk factor. Lessons from a clinical practice survey.

    Science.gov (United States)

    Barrios, Vivencio; Escobar, Carlos; Bertomeu, Vicente; Murga, Nekane; de Pablo, Carmen; Asín, Enrique

    2009-11-12

Several epidemiological studies have reported that an elevated heart rate (HR) is associated with coronary atherosclerosis independently of other risk factors. Nevertheless, it is still unclear whether HR is itself the cause or whether there is merely an association between HR and mortality in this population. A total of 1686 patients with hypertension and chronic ischemic heart disease were included in this study. According to the resting HR, the patients were distributed into 3 groups (group 1: HR82 bpm). 580 patients (34.4%) belonged to group 1, 936 (55.5%) to group 2 and 170 (10.1%) to group 3. Patients with high HR exhibited a poorer prognosis, not only because of a worse clinical profile (more concomitant cardiovascular risk factors and organ damage) but also, suggestively, because patients with higher HR achieved lower risk-control rates in daily clinical practice despite the use of a similar number of drugs. Although current guidelines still do not recognize HR as a cardiovascular risk factor, physicians should pay more attention to it in clinical practice, since a high HR warns of an increased risk.

  13. Cannabidiol is a partial agonist at dopamine D2High receptors, predicting its antipsychotic clinical dose

    Science.gov (United States)

    Seeman, P

    2016-01-01

Although all current antipsychotics act by interfering with the action of dopamine at dopamine D2 receptors, two recent reports showed that 800 to 1000 mg of cannabidiol per day alleviated the signs and symptoms of schizophrenia, although cannabidiol is not known to act on dopamine receptors. Because these recent clinical findings may indicate an important exception to the general rule that all antipsychotics interfere with dopamine at dopamine D2 receptors, the present study examined whether cannabidiol acted directly on D2 receptors, using tritiated domperidone to label rat brain striatal D2 receptors. It was found that cannabidiol inhibited the binding of radio-domperidone with dissociation constants of 11 nM at dopamine D2High receptors and 2800 nM at dopamine D2Low receptors, in the same biphasic manner as a dopamine partial agonist antipsychotic drug such as aripiprazole. The clinical doses of cannabidiol are sufficient to occupy the functional D2High sites. It is concluded that the dopamine partial agonist action of cannabidiol may account for its clinical antipsychotic effects. PMID:27754480

  14. Clinical significance of high anti-entamoeba histolytica antibody titer in asymptomatic HIV-1-infected individuals.

    Science.gov (United States)

    Watanabe, Koji; Aoki, Takahiro; Nagata, Naoyoshi; Tanuma, Junko; Kikuchi, Yoshimi; Oka, Shinichi; Gatanaga, Hiroyuki

    2014-06-01

Anti-Entamoeba histolytica antibody (anti-E. histolytica) is widely used in seroprevalence studies, though its clinical significance has not been assessed previously. Anti-E. histolytica titer was measured at first visit to our clinic (baseline) in 1303 patients infected with human immunodeficiency virus type 1 (HIV-1). The time to diagnosis of invasive amebiasis was assessed by the Kaplan-Meier method, and risk factors for the development of invasive amebiasis were assessed by Cox proportional-hazards regression analysis. For patients who developed invasive amebiasis, anti-E. histolytica titers at onset were compared with those at baseline and after treatment. The anti-E. histolytica seroprevalence in the study population was 21.3% (277/1303). Eighteen patients developed invasive amebiasis during the treatment-free period among 1207 patients who had no history of previous treatment with nitroimidazole. Patients with high anti-E. histolytica titer at baseline developed invasive amebiasis more frequently than those with low anti-E. histolytica titer. Most cases of invasive amebiasis in patients with high anti-E. histolytica titer at baseline developed within 1 year. High anti-E. histolytica titer was the only independent predictor of future invasive amebiasis. Anti-E. histolytica titer was elevated at the onset of invasive amebiasis in patients with low anti-E. histolytica titer at baseline. Asymptomatic HIV-1-infected individuals with high anti-E. histolytica titer are at risk of invasive amebiasis, probably due to exacerbation of subclinical amebiasis.

  15. Assessment of high blood pressure patients in the third year’s Surgical Clinic of the Dentistry course at Cesumar

    OpenAIRE

    Menin, Cristiane; Bortoloto, Flávia Gongora; Gustavo Jacobucci FARAH; Filho, Liogi Iwaki; Iwaki, Lílian Cristina Vessoni; Leite, Pablo C. Comelli; Gentini, Raquel Forlani

    2007-01-01

With the increase of arterial hypertension in the Brazilian population, it has become essential to point out to undergraduate students the need for a thorough clinical examination of patients, and the special care required for high blood pressure patients, especially in a surgical clinic where complications may be severe. The objective of this work has been to assess the number of high blood pressure patients that come to the Surgical Clinic of the Dentistry course of CESUMAR, and find out if these pati...

  16. Fourier ptychographic reconstruction using Poisson maximum likelihood and truncated Wirtinger gradient

    CERN Document Server

    Bian, Liheng; Chung, Jaebum; Ou, Xiaoze; Yang, Changhuei; Chen, Feng; Dai, Qionghai

    2016-01-01

    Fourier ptychographic microscopy (FPM) is a novel computational coherent imaging technique for high space-bandwidth product imaging. Mathematically, Fourier ptychographic (FP) reconstruction can be implemented as a phase retrieval optimization process, in which we only obtain low resolution intensity images corresponding to the sub-bands of the sample's high resolution (HR) spatial spectrum, and aim to retrieve the complex HR spectrum. In real setups, the measurements always suffer from various degenerations such as Gaussian noise, Poisson noise, speckle noise and pupil location error, which would largely degrade the reconstruction. To efficiently address these degenerations, we propose a novel FP reconstruction method under a gradient descent optimization framework in this paper. The technique utilizes Poisson maximum likelihood for better signal modeling, and truncated Wirtinger gradient for error removal. Results on both simulated data and real data captured using our laser FPM setup show that the proposed...

  17. Maximum likelihood q-estimator reveals nonextensivity regulated by extracellular potassium in the mammalian neuromuscular junction

    CERN Document Server

    da Silva, A J; Santos, D O C; Lima, R F

    2013-01-01

    Recently, we demonstrated the existence of nonextensivity in neuromuscular transmission [Phys. Rev. E 84, 041925 (2011)]. In the present letter, we propose a general criterion based on the q-calculus foundations and nonextensive statistics to estimate the values for both scale factor and q-index using the maximum likelihood q-estimation method (MLqE). We next applied our theoretical findings to electrophysiological recordings from neuromuscular junction (NMJ) where spontaneous miniature end plate potentials (MEPP) were analyzed. These calculations were performed in both normal and high extracellular potassium concentration, [K+]o. This protocol was assumed to test the validity of the q-index in electrophysiological conditions closely resembling physiological stimuli. Surprisingly, the analysis showed a significant difference between the q-index in high and normal [K+]o, where the magnitude of nonextensivity was increased. Our letter provides a general way to obtain the best q-index from the q-Gaussian distrib...

  18. Likelihood based observability analysis and confidence intervals for predictions of dynamic models

    CERN Document Server

    Kreutz, Clemens; Timmer, Jens

    2011-01-01

Mechanistic dynamic models of biochemical networks such as Ordinary Differential Equations (ODEs) contain unknown parameters like the reaction rate constants and the initial concentrations of the compounds. The large number of parameters, as well as their nonlinear impact on the model responses, hampers the determination of confidence regions for parameter estimates. At the same time, classical approaches translating the uncertainty of the parameters into confidence intervals for model predictions are hardly feasible. In this article it is shown that a so-called prediction profile likelihood yields reliable confidence intervals for model predictions, despite arbitrarily complex and high-dimensional shapes of the confidence regions for the estimated parameters. Prediction confidence intervals of the dynamic states allow a data-based observability analysis. The approach reduces the issue of sampling a high-dimensional parameter space to that of evaluating one-dimensional prediction spaces. The method is also applicable ...

  19. Analysis of allergen immunotherapy studies shows increased clinical efficacy in highly symptomatic patients

    DEFF Research Database (Denmark)

    Howarth, P; Malling, Hans-Jørgen; Molimard, M;

    2011-01-01

To cite this article: Howarth P, Malling H-J, Molimard M, Devillier P. Analysis of allergen immunotherapy studies shows increased clinical efficacy in highly symptomatic patients. Allergy 2012; 67: 321-327. ABSTRACT: Background: The assessment of allergen immunotherapy (AIT) efficacy in the treatment for seasonal allergic rhinoconjunctivitis (SAR) symptoms is challenging. Allergen immunotherapy differs from symptomatic therapy in that while symptomatic therapy treats patients after symptoms appear and aims to reduce symptoms, AIT is administered before symptoms are present and aims to prevent them. Thus, clinical studies of AIT can neither establish baseline symptom levels nor limit the enrolment of patients to those with the most severe symptoms. Allergen immunotherapy treatment effects are therefore diluted by patients with low symptoms for a particular pollen season. The objective

  20. High Prevalence and Clinical Relevance of Genes Affected by Chromosomal Breaks in Colorectal Cancer.

    Directory of Open Access Journals (Sweden)

    Evert van den Broek

Full Text Available Cancer is caused by somatic DNA alterations such as gene point mutations, DNA copy number aberrations (CNA) and structural variants (SVs). Genome-wide analyses of SVs in large sample series with well-documented clinical information are still scarce. Consequently, the impact of SVs on carcinogenesis and patient outcome remains poorly understood. This study aimed to perform a systematic analysis of genes that are affected by CNA-associated chromosomal breaks in colorectal cancer (CRC) and to determine the clinical relevance of recurrent breakpoint genes. Primary CRC samples of patients with metastatic disease from the CAIRO and CAIRO2 clinical trials were previously characterized by array-comparative genomic hybridization. These data were now used to determine the prevalence of CNA-associated chromosomal breaks within genes across 352 CRC samples. In addition, mutation status of the commonly affected APC, TP53, KRAS, PIK3CA, FBXW7, SMAD4, BRAF and NRAS genes was determined for 204 CRC samples by targeted massively parallel sequencing. Clinical relevance was assessed upon stratification of patients based on gene mutations and gene breakpoints that were observed in >3% of CRC cases. In total, 748 genes were identified that were recurrently affected by chromosomal breaks (FDR 3% of cases, indicating that the prevalence of gene breakpoints is comparable to the prevalence of well-known gene point mutations. Patient stratification based on gene breakpoints and point mutations revealed one CRC subtype with very poor prognosis. We conclude that CNA-associated chromosomal breaks within genes represent a highly prevalent and clinically relevant subset of SVs in CRC.

  1. Scoring clinical signs can help diagnose canine visceral leishmaniasis in a highly endemic area in Brazil

    Science.gov (United States)

    da Silva, Kleverton Ribeiro; de Mendonça, Vitor Rosa Ramos; Silva, Kellen Matuzzy; do Nascimento, Leopoldo Fabrício Marçal; Mendes-Sousa, Antonio Ferreira; de Pinho, Flaviane Alves; Barral-Netto, Manoel; Barral, Aldina Maria Prado; Cruz, Maria do Socorro Pires e

    2017-01-01

    Canine visceral leishmaniasis (CVL) diagnosis is still a challenge in endemic areas with limited diagnostic resources. This study proposes a score with the potential to distinguish positive CVL cases from negative ones. We studied 265 dogs that tested positive for CVL on ELISA and parasitological tests. A score ranging between 0 and 19 was recorded on the basis of clinical signs. Dogs with CVL had an overall higher positivity of the majority of clinical signs than did dogs without CVL or with ehrlichiosis. Clinical signs such as enlarged lymph nodes (83.93%), muzzle/ear lesions (55.36%), nutritional status (51.79%), bristle condition (57.14%), pale mucosal colour (48.21%), onychogryphosis (58.93%), skin lesion (39.28%), bleeding (12.50%), muzzle depigmentation (41.07%), alopecia (39.29%), blepharitis (21.43%), and keratoconjunctivitis (42.86%) were more frequent in dogs with CVL than in dogs with ehrlichiosis or without CVL. Moreover, the clinical score increased according to the positivity of all diagnostic tests (ELISA, p < 0.001; parasite culture, p = 0.0021; and smear, p = 0.0003). Onychogryphosis (long nails) [odds ratio (OR): 3.529; 95% confidence interval (CI): 1.832-6.796; p < 0.001], muzzle depigmentation (OR: 4.651; 95% CI: 2.218-9.750; p < 0.001), and keratoconjunctivitis (OR: 5.400; 95% CI: 2.549-11.441; p < 0.001) were highly associated with CVL. Interestingly, a score cut-off value ≥ 6 had an area under the curve of 0.717 (p < 0.0001), sensitivity of 60.71%, and specificity of 73.64% for CVL diagnosis. The clinical sign-based score for CVL diagnosis suggested herein can help veterinarians reliably identify dogs with CVL in endemic areas with limited diagnostic resources. PMID:28076469
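
    As an illustration of how the reported cut-off performance could be used in practice, the sketch below (my own calculation; the pre-test probability is an assumed value, not from the paper) converts the sensitivity and specificity at a score of 6 or more into likelihood ratios and a post-test probability.

```python
# Likelihood ratios and post-test probability from the reported cut-off performance.
sensitivity = 0.6071      # reported for score >= 6
specificity = 0.7364      # reported for score >= 6

lr_pos = sensitivity / (1.0 - specificity)           # likelihood ratio of a positive score
lr_neg = (1.0 - sensitivity) / specificity           # likelihood ratio of a negative score

pretest_prob = 0.30                                   # assumed pre-test probability (illustrative)
pretest_odds = pretest_prob / (1.0 - pretest_prob)
posttest_odds = pretest_odds * lr_pos
posttest_prob = posttest_odds / (1.0 + posttest_odds)

print(f"LR+ = {lr_pos:.2f}, LR- = {lr_neg:.2f}")
print(f"post-test probability after a positive score: {posttest_prob:.2f}")
```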

  2. HIGH VARIABILITY OF HUMAN CYTOMEGALOVIRUS UL150 OPEN READING FRAME IN LOW-PASSAGED CLINICAL ISOLATES

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

Objective To investigate the polymorphism of the human cytomegalovirus (HCMV) UL150 open reading frame (ORF) in low-passaged clinical isolates, and to study the relationship between the polymorphism and the different pathogenesis of congenital HCMV infection. Methods PCR was performed to amplify the entire HCMV UL150 ORF region of 29 clinical isolates, which had been proven to contain detectable HCMV-DNA using fluorescence quantitative PCR. PCR amplification products were sequenced directly, and the data were analyzed. Results In total, 25 of the 29 isolates were amplified, and 18 isolates were sequenced successfully. HCMV UL150 ORF sequences derived from congenitally infected infants were highly variable. The UL150 ORF in all 18 clinical isolates shifted backward by 8 nucleotides, leading to a frame shift, and contained a single nucleotide deletion at nucleotide position 226 compared with that of the Toledo strain. The nucleotide diversity was 0.1% to 6.8% and the amino acid diversity was 0.2% to 19.2% relative to the Toledo strain, whereas the nucleotide diversity was 0.1% to 6.4% and the amino acid diversity was 0.2% to 8.3% when compared with the Merlin strain. Compared with Toledo, 4 new cysteine residues and 13 additional posttranslational modification sites were observed in the UL150 putative proteins of the clinical isolates. Moreover, the UL150 putative protein contained an additional transmembrane helix at amino acid positions 4-17 relative to Toledo. Conclusion The HCMV UL150 ORF and deduced amino acid sequences of clinical strains are hypervariable. No obvious linkage between the polymorphism and the different pathogenesis of congenital HCMV infection was found.

  3. Recurrence or rebound of clinical relapses after discontinuation of natalizumab therapy in highly active MS patients

    DEFF Research Database (Denmark)

    Sorensen, Per Soelberg; Koch-Henriksen, Nils; Petersen, Thor;

    2014-01-01

A number of studies have reported flare-up of multiple sclerosis (MS) disease activity after cessation of natalizumab, increasing to a level beyond the pre-natalizumab treatment level. Our aim was to describe the development in clinical disease activity following cessation of natalizumab therapy in a large unselected cohort of highly active patients. We studied 375 highly active patients who had suffered at least two significant relapses within 1 year or three relapses within 2 years, or had been treated with mitoxantrone for highly active disease. All patients had discontinued therapy with natalizumab after at least 24 weeks on therapy, and had been followed 3-12 months (mean 8.9 months) after cessation of natalizumab therapy. The annualised relapse rate before start of natalizumab therapy was 0.94 (95 % confidence interval [CI] 0.88-1.00), 0.47 (95 % CI 0.43-0.52) during natalizumab therapy, 0...

  4. A weighted combination of pseudo-likelihood estimators for longitudinal binary data subject to nonignorable non-monotone missingness

    Science.gov (United States)

    Troxel, Andrea B.; Lipsitz, Stuart R.; Fitzmaurice, Garrett M.; Ibrahim, Joseph G.; Sinha, Debajyoti; Molenberghs, Geert

    2010-01-01

    SUMMARY For longitudinal binary data with non-monotone non-ignorably missing outcomes over time, a full likelihood approach is complicated algebraically, and with many follow-up times, maximum likelihood estimation can be computationally prohibitive. As alternatives, two pseudo-likelihood approaches have been proposed that use minimal parametric assumptions. One formulation requires specification of the marginal distributions of the outcome and missing data mechanism at each time point, but uses an “independence working assumption,” i.e., an assumption that observations are independent over time. Another method avoids having to estimate the missing data mechanism by formulating a “protective estimator.” In simulations, these two estimators can be very inefficient, both for estimating time trends in the first case and for estimating both time-varying and time-stationary effects in the second. In this paper, we propose use of the optimal weighted combination of these two estimators, and in simulations we show that the optimal weighted combination can be much more efficient than either estimator alone. Finally, the proposed method is used to analyze data from two longitudinal clinical trials of HIV-infected patients. PMID:20205269
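
    The sketch below illustrates the generic minimum-variance weighting of two correlated estimators of the same parameter (all numbers are hypothetical; the paper's actual weights are derived for its specific pseudo-likelihood estimators and missing-data setting).

```python
# Combine two estimates with weight w* = (v2 - c12) / (v1 + v2 - 2*c12), which minimizes the
# variance of w*theta1 + (1-w)*theta2 when v1, v2 are the variances and c12 the covariance.
theta1, theta2 = 0.42, 0.55          # two estimates of the same parameter (hypothetical)
v1, v2, c12 = 0.010, 0.025, 0.004    # variances and covariance (e.g. from a bootstrap; hypothetical)

w = (v2 - c12) / (v1 + v2 - 2.0 * c12)
theta_comb = w * theta1 + (1.0 - w) * theta2
var_comb = (w**2) * v1 + ((1.0 - w)**2) * v2 + 2.0 * w * (1.0 - w) * c12

print(f"weight on estimator 1: {w:.3f}")
print(f"combined estimate: {theta_comb:.3f} (variance {var_comb:.4f} vs best single {min(v1, v2):.4f})")
```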

  5. Hail the impossible: p-values, evidence, and likelihood.

    Science.gov (United States)

    Johansson, Tobias

    2011-04-01

    Significance testing based on p-values is standard in psychological research and teaching. Typically, research articles and textbooks present and use p as a measure of statistical evidence against the null hypothesis (the Fisherian interpretation), although using concepts and tools based on a completely different usage of p as a tool for controlling long-term decision errors (the Neyman-Pearson interpretation). There are four major problems with using p as a measure of evidence and these problems are often overlooked in the domain of psychology. First, p is uniformly distributed under the null hypothesis and can therefore never indicate evidence for the null. Second, p is conditioned solely on the null hypothesis and is therefore unsuited to quantify evidence, because evidence is always relative in the sense of being evidence for or against a hypothesis relative to another hypothesis. Third, p designates probability of obtaining evidence (given the null), rather than strength of evidence. Fourth, p depends on unobserved data and subjective intentions and therefore implies, given the evidential interpretation, that the evidential strength of observed data depends on things that did not happen and subjective intentions. In sum, using p in the Fisherian sense as a measure of statistical evidence is deeply problematic, both statistically and conceptually, while the Neyman-Pearson interpretation is not about evidence at all. In contrast, the likelihood ratio escapes the above problems and is recommended as a tool for psychologists to represent the statistical evidence conveyed by obtained data relative to two hypotheses. © 2010 The Author. Scandinavian Journal of Psychology © 2010 The Scandinavian Psychological Associations.
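
    A minimal sketch (illustrative only, not from the article) of the recommended likelihood-ratio summary, here for a simple binomial example with two competing hypotheses about the success probability:

```python
# The same data are weighed as evidence for one hypothesis relative to another.
from scipy.stats import binom

k, n = 32, 50                 # observed successes out of n trials (hypothetical)
p_null, p_alt = 0.5, 0.7      # two competing hypotheses about the success probability

lik_null = binom.pmf(k, n, p_null)
lik_alt = binom.pmf(k, n, p_alt)
likelihood_ratio = lik_alt / lik_null

print(f"L(p=0.7)/L(p=0.5) = {likelihood_ratio:.2f}")   # >1 favours p=0.7, <1 favours p=0.5
```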

  6. Verifying likelihoods for low template DNA profiles using multiple replicates

    Science.gov (United States)

    Steele, Christopher D.; Greenhalgh, Matthew; Balding, David J.

    2014-01-01

    To date there is no generally accepted method to test the validity of algorithms used to compute likelihood ratios (LR) evaluating forensic DNA profiles from low-template and/or degraded samples. An upper bound on the LR is provided by the inverse of the match probability, which is the usual measure of weight of evidence for standard DNA profiles not subject to the stochastic effects that are the hallmark of low-template profiles. However, even for low-template profiles the LR in favour of a true prosecution hypothesis should approach this bound as the number of profiling replicates increases, provided that the queried contributor is the major contributor. Moreover, for sufficiently many replicates the standard LR for mixtures is often surpassed by the low-template LR. It follows that multiple LTDNA replicates can provide stronger evidence for a contributor to a mixture than a standard analysis of a good-quality profile. Here, we examine the performance of the likeLTD software for up to eight replicate profiling runs. We consider simulated and laboratory-generated replicates as well as resampling replicates from a real crime case. We show that LRs generated by likeLTD usually do exceed the mixture LR given sufficient replicates, are bounded above by the inverse match probability and do approach this bound closely when this is expected. We also show good performance of likeLTD even when a large majority of alleles are designated as uncertain, and suggest that there can be advantages to using different profiling sensitivities for different replicates. Overall, our results support both the validity of the underlying mathematical model and its correct implementation in the likeLTD software. PMID:25082140

  7. Theory of Mind in Patients at Clinical High Risk for Psychosis

    Science.gov (United States)

    Stanford, Arielle D.; Messinger, Julie; Malaspina, Dolores; Corcoran, Cheryl M.

    2011-01-01

Background Patients with schizophrenia have a decreased ability to interpret the intentions of other individuals, called Theory of Mind (ToM). As capacity for ToM normally advances with brain maturation, research on ToM in individuals at heightened clinical risk for psychosis may reveal developmental differences independent of disease-based differences. Methods We examined ToM in patients at clinical high risk, patients with schizophrenia, and healthy controls: 1) 63 clinical high risk (CHR) patients and 24 normal youths ascertained by a CHR program; and 2) 13 schizophrenia cases and 14 normal adults recruited through a schizophrenia program. ToM measures included first- and second-order false belief cartoon tasks (FBT) and two “higher order” tasks (the “Strange Stories Task” (SST) and the “Reading the Mind in the Eyes” task). In the first study, CHR patients and normal youths were also assessed for cognition, “prodromal” symptoms and social function. Results Errors on first- and second-order false belief tasks were made primarily by patients. CHR patients and their young comparison group had equivalent performance on higher order ToM, which was not significantly different from the worse ToM performance of schizophrenia patients and the higher performance of normal adult controls. In the combined dataset from both studies, all levels of ToM were associated with IQ, controlling for age and sex. ToM bore no relation to explicit memory, prodromal symptoms, social function, or later transition to psychosis. Conclusions Higher order ToM capacity was equally undeveloped in high risk cases and younger controls, suggesting performance on these tasks is not fully achieved until adulthood. This study also replicates the association of IQ with ToM performance described in previous studies of schizophrenia. PMID:21757324

  8. Conditional Likelihood Estimators for Hidden Markov Models and Stochastic Volatility Models

    OpenAIRE

    Genon-Catalot, Valentine; Jeantheau, Thierry; Laredo, Catherine

    2003-01-01

    ABSTRACT. This paper develops a new contrast process for parametric inference of general hidden Markov models, when the hidden chain has a non-compact state space. This contrast is based on the conditional likelihood approach, often used for ARCH-type models. We prove the strong consistency of the conditional likelihood estimators under appropriate conditions. The method is applied to the Kalman filter (for which this contrast and the exact likelihood lead to asymptotically equivalent estimat...

  9. Asymptotic behavior of the likelihood function of covariance matrices of spatial Gaussian processes

    DEFF Research Database (Denmark)

    Zimmermann, Ralf

    2010-01-01

    The covariance structure of spatial Gaussian predictors (aka Kriging predictors) is generally modeled by parameterized covariance functions; the associated hyperparameters in turn are estimated via the method of maximum likelihood. In this work, the asymptotic behavior of the maximum likelihood......: optimally trained nondegenerate spatial Gaussian processes cannot feature arbitrary ill-conditioned correlation matrices. The implication of this theorem on Kriging hyperparameter optimization is exposed. A nonartificial example is presented, where maximum likelihood-based Kriging model training...

  10. Discriminative likelihood score weighting based on acoustic-phonetic classification for speaker identification

    Science.gov (United States)

    Suh, Youngjoo; Kim, Hoirin

    2014-12-01

    In this paper, a new discriminative likelihood score weighting technique is proposed for speaker identification. The proposed method employs a discriminative weighting of frame-level log-likelihood scores with acoustic-phonetic classification in the Gaussian mixture model (GMM)-based speaker identification. Experiments performed on the Aurora noise-corrupted TIMIT database showed that the proposed approach provides meaningful performance improvement with an overall relative error reduction of 15.8% over the maximum likelihood-based baseline GMM approach.
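
    A minimal sketch of weighted frame-level log-likelihood scoring (assumptions: scikit-learn GaussianMixture speaker models, synthetic MFCC-like features, and random stand-in weights in place of the paper's acoustic-phonetic weighting):

```python
# Weighted sum of per-frame GMM log-likelihoods decides the identified speaker.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
train_a = rng.normal(0.0, 1.0, size=(500, 13))     # toy MFCC-like features, speaker A
train_b = rng.normal(0.5, 1.2, size=(500, 13))     # toy features, speaker B
test = rng.normal(0.1, 1.0, size=(200, 13))        # test utterance frames

gmm_a = GaussianMixture(n_components=8, random_state=0).fit(train_a)
gmm_b = GaussianMixture(n_components=8, random_state=0).fit(train_b)

frame_weights = rng.uniform(0.5, 1.5, size=test.shape[0])   # stand-in per-frame weights

score_a = np.sum(frame_weights * gmm_a.score_samples(test))  # score_samples: per-frame log-likelihood
score_b = np.sum(frame_weights * gmm_b.score_samples(test))
print("identified speaker:", "A" if score_a > score_b else "B")
```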

  11. Clinical potentials of methylator phenotype in stage 4 high-risk neuroblastoma: an open challenge.

    Directory of Open Access Journals (Sweden)

    Barbara Banelli

Full Text Available Approximately 20% of stage 4 high-risk neuroblastoma patients are alive and disease-free 5 years after disease onset, while the remaining experience rapid and fatal progression. Numerous findings underline the prognostic role of methylation of defined target genes in neuroblastoma without taking into account the clinical and biological heterogeneity of this disease. In this report we have investigated the methylation of the PCDHB cluster, the most informative member of the "Methylator Phenotype" in neuroblastoma, hypothesizing that if this epigenetic mark can predict overall and progression free survival in high-risk stage 4 neuroblastoma, it could be utilized to improve the risk stratification of the patients, alone or in conjunction with the previously identified methylation of the SFN gene (14.3.3sigma) that can accurately predict outcome in these patients. We have utilized univariate and multivariate models to compare the prognostic power of PCDHB methylation, quantitatively determined by pyrosequencing, in terms of overall and progression free survival with that of other markers utilized for the patients' stratification, utilizing methylation thresholds calculated on neuroblastoma at stages 1-4 and only on stage 4, high-risk patients. Our results indicate that PCDHB accurately distinguishes between high- and intermediate/low-risk stage 4 neuroblastoma in agreement with the established risk stratification criteria. However, PCDHB cannot predict outcome in the subgroup of stage 4 patients at high risk, whereas methylation levels of SFN are suggestive of a "methylation gradient" associated with tumor aggressiveness, as suggested by the finding of a higher threshold that defines a subset of patients with an extremely severe disease (OS < 24 months). Because of the heterogeneity of neuroblastoma we believe that clinically relevant methylation markers should be selected and tested on homogeneous groups of patients rather than on patients at all stages.

  12. dPIRPLE: a joint estimation framework for deformable registration and penalized-likelihood CT image reconstruction using prior images

    Science.gov (United States)

    Dang, H.; Wang, A. S.; Sussman, Marc S.; Siewerdsen, J. H.; Stayman, J. W.

    2014-09-01

    Sequential imaging studies are conducted in many clinical scenarios. Prior images from previous studies contain a great deal of patient-specific anatomical information and can be used in conjunction with subsequent imaging acquisitions to maintain image quality while enabling radiation dose reduction (e.g., through sparse angular sampling, reduction in fluence, etc). However, patient motion between images in such sequences results in misregistration between the prior image and current anatomy. Existing prior-image-based approaches often include only a simple rigid registration step that can be insufficient for capturing complex anatomical motion, introducing detrimental effects in subsequent image reconstruction. In this work, we propose a joint framework that estimates the 3D deformation between an unregistered prior image and the current anatomy (based on a subsequent data acquisition) and reconstructs the current anatomical image using a model-based reconstruction approach that includes regularization based on the deformed prior image. This framework is referred to as deformable prior image registration, penalized-likelihood estimation (dPIRPLE). Central to this framework is the inclusion of a 3D B-spline-based free-form-deformation model into the joint registration-reconstruction objective function. The proposed framework is solved using a maximization strategy whereby alternating updates to the registration parameters and image estimates are applied allowing for improvements in both the registration and reconstruction throughout the optimization process. Cadaver experiments were conducted on a cone-beam CT testbench emulating a lung nodule surveillance scenario. Superior reconstruction accuracy and image quality were demonstrated using the dPIRPLE algorithm as compared to more traditional reconstruction methods including filtered backprojection, penalized-likelihood estimation (PLE), prior image penalized-likelihood estimation (PIPLE) without registration, and

  13. Refractory coeliac disease in a country with a high prevalence of clinically-diagnosed coeliac disease.

    Science.gov (United States)

    Ilus, T; Kaukinen, K; Virta, L J; Huhtala, H; Mäki, M; Kurppa, K; Heikkinen, M; Heikura, M; Hirsi, E; Jantunen, K; Moilanen, V; Nielsen, C; Puhto, M; Pölkki, H; Vihriälä, I; Collin, P

    2014-02-01

    Refractory coeliac disease (RCD) is thought to be a rare disorder, but its true prevalence is unknown. We aimed to identify the prevalence of, and the risk factors for developing, RCD in a Finnish population where the clinical detection rate of coeliac disease is high. The study involved 11 hospital districts in Finland where the number of treated RCD patients (n = 44), clinically diagnosed coeliac disease patients (n = 12 243) and adult inhabitants (n = 1.7 million) was known. Clinical characteristics at the diagnosis of coeliac disease were compared between the RCD patients and patients with uncomplicated disease. The prevalence of RCD was 0.31% among diagnosed coeliac disease patients and 0.002% in the general population. Of the 44 enrolled RCD patients, 68% had type I and 23% type II; in 9% the type was undetermined. Comparing 886 patients with uncomplicated coeliac disease with the 44 patients who later developed RCD, the latter were significantly older at the diagnosis of coeliac disease (median 56 vs 44 years). Patients with evolving RCD had more severe symptoms at the diagnosis of coeliac disease, including weight loss in 36% (vs. 16%, P = 0.001) and diarrhoea in 54% (vs. 38%, P = 0.050). Refractory coeliac disease is very rare in the general population. Patients of male gender, older age, severe symptoms or seronegativity at the diagnosis of coeliac disease are at risk of future refractory coeliac disease and should be followed up carefully. © 2014 John Wiley & Sons Ltd.

  14. Pilot Clinical Trial of Indocyanine Green Fluorescence-Augmented Colonoscopy in High Risk Patients

    Directory of Open Access Journals (Sweden)

    Rahul A. Sheth

    2016-01-01

    Full Text Available White light colonoscopy is the current gold standard for early detection and treatment of colorectal cancer, but emerging data suggest that this approach is inherently limited. Even the most experienced colonoscopists, under optimal conditions, miss at least 15–25% of adenomas. There is an unmet clinical need for an adjunctive modality to white light colonoscopy with improved lesion detection and characterization. Optical molecular imaging with exogenously administered organic fluorochromes is a burgeoning imaging modality poised to advance the capabilities of colonoscopy. In this proof-of-principle clinical trial, we investigated the ability of a custom-designed fluorescent colonoscope and indocyanine green, a clinically approved fluorescent blood pool imaging agent, to visualize polyps in high risk patients with polyposis syndromes or known distal colonic masses. We demonstrate (1) the successful performance of real-time, wide-field fluorescence endoscopy using off-the-shelf equipment, (2) the ability of this system to identify polyps as small as 1 mm, and (3) the potential for fluorescence imaging signal intensity to differentiate between neoplastic and benign polyps.

  15. High diagnostic yield of clinical exome sequencing in Middle Eastern patients with Mendelian disorders.

    Science.gov (United States)

    Yavarna, Tarunashree; Al-Dewik, Nader; Al-Mureikhi, Mariam; Ali, Rehab; Al-Mesaifri, Fatma; Mahmoud, Laila; Shahbeck, Noora; Lakhani, Shenela; AlMulla, Mariam; Nawaz, Zafar; Vitazka, Patrik; Alkuraya, Fowzan S; Ben-Omran, Tawfeg

    2015-09-01

    Clinical exome sequencing (CES) has become an increasingly popular diagnostic tool in patients with heterogeneous genetic disorders, especially in those with neurocognitive phenotypes. The utility of CES in consanguineous populations has not yet been determined on a large scale. A clinical cohort of 149 probands from Qatar with suspected Mendelian, mainly neurocognitive, phenotypes underwent CES from July 2012 to June 2014. Intellectual disability and global developmental delay were the most common clinical presentations, but our cohort also displayed other phenotypes such as epilepsy, dysmorphism, microcephaly and other structural brain anomalies, and autism. A pathogenic or likely pathogenic mutation, including pathogenic CNVs, was identified in 89 probands, for a diagnostic yield of 60%. Consanguinity and a positive family history predicted a higher diagnostic yield. In 5% (7/149) of cases, CES implicated novel candidate disease genes (MANF, GJA9, GLG1, COL15A1, SLC35F5, MAGE4, NEUROG1). CES uncovered two coexisting genetic disorders in 4% (6/149) and actionable incidental findings in 2% (3/149) of cases. The average time to diagnosis was reduced from 27 to 5 months. CES, which already has the highest diagnostic yield among all available diagnostic tools in the setting of Mendelian disorders, appears to be particularly helpful diagnostically in the highly consanguineous Middle Eastern population.

  16. Highly reliable heterologous system for evaluating resistance of clinical herpes simplex virus isolates to nucleoside analogues.

    Science.gov (United States)

    Bestman-Smith, J; Schmit, I; Papadopoulou, B; Boivin, G

    2001-04-01

    Clinical resistance of herpes simplex virus (HSV) types 1 and 2 to acyclovir (ACV) is usually caused by the presence of point mutations within the coding region of the viral thymidine kinase (TK) gene. Distinguishing viral TK mutations involved in ACV resistance from those that are part of viral polymorphism can be difficult with current methodologies based on transfection and homologous recombination. We have developed and validated a new heterologous system based on the expression of the viral TK gene by the protozoan parasite Leishmania, which is normally devoid of TK activity. The viral TK genes from 5 ACV-susceptible and 13 ACV-resistant clinical HSV isolates and from the reference strains MS2 (type 2) and KOS (type 1) were transfected into Leishmania as part of an episomal expression vector. The susceptibility of TK-recombinant parasites to ganciclovir (GCV), a closely related nucleoside analogue, was evaluated by a simple measurement of the absorbance of Leishmania cultures grown in the presence of the drug. Expression of the TK gene from ACV-susceptible clinical isolates resulted in Leishmania susceptibility to GCV, whereas expression of a TK gene with frameshift mutations or nucleotide substitutions from ACV-resistant isolates gave rise to parasites with high levels of GCV resistance. The expression of the HSV TK gene in Leishmania provides an easy, reliable, and sensitive assay for evaluating HSV susceptibility to nucleoside analogues and for assessing the role of specific viral TK mutations.
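
    A minimal sketch of the absorbance-based readout described above: growth of TK-recombinant parasites is computed relative to a drug-free control and a crude IC50 is interpolated from the resulting curve. The GCV concentrations, absorbance values and the resulting IC50 below are invented for illustration only.

    import numpy as np

    # Hypothetical absorbance readings of TK-recombinant Leishmania cultures
    # grown at increasing GCV concentrations (first well is drug-free).
    gcv_uM     = np.array([0.0, 1.0, 5.0, 25.0, 125.0])
    absorbance = np.array([1.20, 1.05, 0.60, 0.22, 0.10])  # susceptible-like profile

    # Growth relative to the drug-free control well.
    rel_growth = absorbance / absorbance[0]

    # Crude IC50 estimate by linear interpolation on the relative-growth curve.
    ic50 = np.interp(0.5, rel_growth[::-1], gcv_uM[::-1])
    print(f"approximate GCV IC50: {ic50:.1f} uM")

    # A recombinant expressing a resistance-conferring TK mutation would keep
    # rel_growth near 1 over this range, giving a much higher apparent IC50.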

  17. Fast inference in generalized linear models via expected log-likelihoods.

    Science.gov (United States)

    Ramirez, Alexandro D; Paninski, Liam

    2014-04-01

    Generalized linear models play an essential role in a wide variety of statistical applications. This paper discusses an approximation of the likelihood in these models that can greatly facilitate computation. The basic idea is to replace a sum that appears in the exact log-likelihood by an expectation over the model covariates; the resulting "expected log-likelihood" (EL) can in many cases be computed significantly faster than the exact log-likelihood. In many neuroscience experiments the distribution over model covariates is controlled by the experimenter and the expected log-likelihood approximation becomes particularly useful; for example, estimators based on maximizing this expected log-likelihood (or a penalized version thereof) can often be obtained with orders of magnitude computational savings compared to the exact maximum likelihood estimators. A risk analysis establishes that these maximum EL estimators often come with little cost in accuracy (and in some cases even improved accuracy) compared to standard maximum likelihood estimates. Finally, we find that these methods can significantly decrease the computation time of marginal likelihood calculations for model selection and of Markov chain Monte Carlo methods for sampling from the posterior parameter distribution. We illustrate our results by applying these methods to a computationally challenging dataset of neural spike trains obtained via large-scale multi-electrode recordings in the primate retina.
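
    As a concrete illustration of the idea (not the authors' code): for a Poisson GLM with exponential link, the exact log-likelihood contains the sum of exp(x_i' theta) over all N observations, whereas if the covariates are drawn from a known Gaussian distribution that sum can be replaced by N times its expectation, exp(mu' theta + theta' Sigma theta / 2). The data then enter only through X'y, so each evaluation costs O(p^2) instead of O(Np). A sketch under those assumptions:

    import numpy as np

    rng = np.random.default_rng(0)
    N, p = 200_000, 20
    mu_x = np.zeros(p)
    Sigma_x = 0.1 * np.eye(p)

    X = rng.multivariate_normal(mu_x, Sigma_x, size=N)  # covariates with a known distribution
    theta_true = rng.normal(scale=0.2, size=p)
    y = rng.poisson(np.exp(X @ theta_true))             # Poisson responses, exponential link

    def exact_loglik(theta):
        # Exact Poisson log-likelihood (dropping the log y! constant): touches all N rows.
        eta = X @ theta
        return y @ eta - np.exp(eta).sum()

    def expected_loglik(theta):
        # Expected log-likelihood: the sum of exp(eta) is replaced by its expectation
        # over the covariate distribution; the data enter only through X.T @ y.
        Eexp = np.exp(mu_x @ theta + 0.5 * theta @ Sigma_x @ theta)
        return (X.T @ y) @ theta - N * Eexp

    # X.T @ y can be precomputed once, making each expected_loglik call O(p^2).
    print(exact_loglik(theta_true), expected_loglik(theta_true))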

  18. Improved Likelihood Ratio Tests for Cointegration Rank in the VAR Model

    DEFF Research Database (Denmark)

    Boswijk, H. Peter; Jansson, Michael; Nielsen, Morten Ørregaard

    We suggest improved tests for cointegration rank in the vector autoregressive (VAR) model and develop asymptotic distribution theory and local power results. The tests are (quasi-)likelihood ratio tests based on a Gaussian likelihood, but of course the asymptotic results apply more generally... The power gains relative to existing tests are due to two factors. First, instead of basing our tests on the conditional (with respect to the initial observations) likelihood, we follow the recent unit root literature and base our tests on the full likelihood as in, e.g., Elliott, Rothenberg, and Stock...
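
    For orientation, the conventional Gaussian likelihood-ratio (Johansen trace) test that such procedures build on can be run with statsmodels; the simulated two-variable VAR below (one common stochastic trend, hence cointegration rank 1) is purely illustrative and is not the improved test proposed in the record above.

    import numpy as np
    from statsmodels.tsa.vector_ar.vecm import coint_johansen

    rng = np.random.default_rng(1)
    T = 500
    # Two I(1) series sharing one common stochastic trend => cointegration rank 1.
    trend = np.cumsum(rng.normal(size=T))
    y1 = trend + rng.normal(scale=0.5, size=T)
    y2 = 0.8 * trend + rng.normal(scale=0.5, size=T)
    data = np.column_stack([y1, y2])

    # det_order=0: constant term; k_ar_diff=1: one lagged difference in the VECM.
    res = coint_johansen(data, det_order=0, k_ar_diff=1)
    print("trace statistics:", res.lr1)           # LR trace statistic for each rank r
    print("95% critical values:", res.cvt[:, 1])  # columns hold 90/95/99% critical values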

  19. Precise Estimation of Cosmological Parameters Using a More Accurate Likelihood Function

    Science.gov (United States)

    Sato, Masanori; Ichiki, Kiyotomo; Takeuchi, Tsutomu T.

    2010-12-01

    The estimation of cosmological parameters from a given data set requires the construction of a likelihood function which, in general, has a complicated functional form. We adopt a Gaussian copula and construct a copula likelihood function for the convergence power spectrum from a weak lensing survey. We show that parameter estimation based on the Gaussian likelihood erroneously introduces a systematic shift in the confidence region, in particular for the dark energy equation-of-state parameter w. Thus, the copula likelihood should be used in future cosmological observations.
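
    A minimal sketch of how a Gaussian copula combines fixed marginal distributions with a correlation structure into a joint log-likelihood. The marginals here are taken to be Gaussian and the three-bin correlation matrix is invented purely for illustration; the paper instead builds the copula for binned convergence power spectrum estimates.

    import numpy as np
    from scipy.stats import norm, multivariate_normal

    def gaussian_copula_loglik(x, marg_means, marg_sds, corr):
        # Log-likelihood of one observation vector x under a Gaussian copula
        # with (illustrative) Gaussian marginals and correlation matrix corr.
        log_f = norm.logpdf(x, loc=marg_means, scale=marg_sds)   # marginal log-densities
        u = norm.cdf(x, loc=marg_means, scale=marg_sds)          # marginal CDF values
        z = norm.ppf(u)                                          # standard-normal scores
        log_copula = (multivariate_normal.logpdf(z, mean=np.zeros(len(x)), cov=corr)
                      - norm.logpdf(z).sum())
        return log_copula + log_f.sum()

    # Toy 3-bin example with a hand-picked correlation structure.
    corr = np.array([[1.0, 0.5, 0.2],
                     [0.5, 1.0, 0.5],
                     [0.2, 0.5, 1.0]])
    x = np.array([0.9, 1.1, 1.3])
    print(gaussian_copula_loglik(x,
                                 marg_means=np.array([1.0, 1.0, 1.2]),
                                 marg_sds=np.array([0.2, 0.2, 0.25]),
                                 corr=corr))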

  20. Maximum Likelihood Estimation and Inference With Examples in R, SAS and ADMB

    CERN Document Server

    Millar, Russell B

    2011-01-01

    This book takes a fresh look at the popular and well-established method of maximum likelihood for statistical estimation and inference. It begins with an intuitive introduction to the concepts and background of likelihood, and moves through to the latest developments in maximum likelihood methodology, including general latent variable models and new material for the practical implementation of integrated likelihood using the free ADMB software. Fundamental issues of statistical inference are also examined, with a presentation of some of the philosophical debates underlying the choice of statis