WorldWideScience

Sample records for high clinical likelihood

  1. Constructing diagnostic likelihood: clinical decisions using subjective versus statistical probability.

    Science.gov (United States)

    Kinnear, John; Jackson, Ruth

    2017-07-01

    Although physicians are highly trained in the application of evidence-based medicine, and are assumed to make rational decisions, there is evidence that their decision making is prone to biases. One of the biases that has been shown to affect accuracy of judgements is that of representativeness and base-rate neglect, where the saliency of a person's features leads to overestimation of their likelihood of belonging to a group. This results in the substitution of 'subjective' probability for statistical probability. This study examines clinicians' propensity to make estimations of subjective probability when presented with clinical information that is considered typical of a medical condition. The strength of the representativeness bias is tested by presenting choices in textual and graphic form. Understanding of statistical probability is also tested by omitting all clinical information. For the questions that included clinical information, 46.7% and 45.5% of clinicians made judgements of statistical probability, respectively. Where the question omitted clinical information, 79.9% of clinicians made a judgement consistent with statistical probability. There was a statistically significant difference in responses to the questions with and without representativeness information (χ2 (1, n=254)=54.45, p<0.001), with clinicians more often substituting subjective for statistical probability when clinical information was present. One of the causes for this representativeness bias may be the way clinical medicine is taught, where stereotypic presentations are emphasised in diagnostic decision making. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
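    As a worked illustration of the base-rate arithmetic at stake (the numbers below are hypothetical, not taken from the study), even a highly "typical" presentation can imply a low statistical probability of disease when the base rate is low:

    ```python
    # Base-rate illustration with hypothetical numbers: a "classic"
    # presentation is seen in 90% of patients with a rare disease
    # (prevalence 1%) but also in 20% of patients without it.
    prevalence = 0.01          # P(disease)
    p_typical_given_d = 0.90   # P(typical presentation | disease)
    p_typical_given_nd = 0.20  # P(typical presentation | no disease)

    # Bayes' rule: P(disease | typical presentation)
    posterior = (p_typical_given_d * prevalence) / (
        p_typical_given_d * prevalence + p_typical_given_nd * (1 - prevalence)
    )
    print(f"P(disease | typical presentation) = {posterior:.3f}")  # ~0.043
    ```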

  2. Maximum likelihood positioning algorithm for high-resolution PET scanners

    International Nuclear Information System (INIS)

    Gross-Weege, Nicolas; Schug, David; Hallen, Patrick; Schulz, Volkmar

    2016-01-01

    Purpose: In high-resolution positron emission tomography (PET), lightsharing elements are incorporated into typical detector stacks to read out scintillator arrays in which one scintillator element (crystal) is smaller than the size of the readout channel. In order to identify the hit crystal by means of the measured light distribution, a positioning algorithm is required. One commonly applied positioning algorithm uses the center of gravity (COG) of the measured light distribution. The COG algorithm is limited in spatial resolution by noise and intercrystal Compton scatter. The purpose of this work is to develop a positioning algorithm which overcomes this limitation. Methods: The authors present a maximum likelihood (ML) algorithm which compares a set of expected light distributions given by probability density functions (PDFs) with the measured light distribution. Instead of modeling the PDFs by using an analytical model, the PDFs of the proposed ML algorithm are generated assuming a single-gamma-interaction model from measured data. The algorithm was evaluated with a hot-rod phantom measurement acquired with the preclinical HYPERION II D PET scanner. In order to assess the performance with respect to sensitivity, energy resolution, and image quality, the ML algorithm was compared to a COG algorithm which calculates the COG from a restricted set of channels. The authors studied the energy resolution of the ML and the COG algorithm regarding incomplete light distributions (missing channel information caused by detector dead time). Furthermore, the authors investigated the effects of using a filter based on the likelihood values on sensitivity, energy resolution, and image quality. Results: A sensitivity gain of up to 19% was demonstrated in comparison to the COG algorithm for the selected operation parameters. Energy resolution and image quality were on a similar level for both algorithms. Additionally, the authors demonstrated that the performance of the ML ...
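    A minimal sketch of the positioning step, assuming per-channel Gaussian PDFs (the authors instead derive the PDFs from measured data under a single-gamma-interaction model); the function name, array shapes, and the NaN convention for dead channels are illustrative assumptions:

    ```python
    import numpy as np

    def ml_position(light, mu, sigma):
        """Pick the crystal whose expected light distribution best explains
        the measured one.  mu[c, ch] and sigma[c, ch] are the mean and spread
        of channel ch for a gamma absorbed in crystal c (modeled here as
        per-channel Gaussians).  light[ch] is the measured signal; NaN marks
        missing channels (e.g. detector dead time), which are simply skipped,
        illustrating how an ML formulation copes with incomplete distributions.
        """
        valid = ~np.isnan(light)
        # Per-crystal log-likelihood: sum of log N(light | mu, sigma) over channels.
        ll = -0.5 * (((light[valid] - mu[:, valid]) / sigma[:, valid]) ** 2
                     + 2 * np.log(sigma[:, valid])).sum(axis=1)
        # The best score can also drive an event filter: events with a poor
        # best likelihood (e.g. inter-crystal Compton scatter) can be rejected.
        return int(np.argmax(ll)), ll.max()

    rng = np.random.default_rng(9)
    mu = rng.random((4, 8)) * 100          # 4 crystals, 8 readout channels (toy)
    sigma = np.full((4, 8), 5.0)
    light = mu[2] + rng.normal(0, 5.0, 8)  # event from crystal 2
    light[3] = np.nan                      # one dead channel
    print(ml_position(light, mu, sigma))   # -> (2, score)
    ```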

  3. High-order Composite Likelihood Inference for Max-Stable Distributions and Processes

    KAUST Repository

    Castruccio, Stefano; Huser, Raphaël; Genton, Marc G.

    2015-01-01

    In multivariate or spatial extremes, inference for max-stable processes observed at a large collection of locations is a very challenging problem in computational statistics, and current approaches typically rely on less expensive composite likelihoods constructed from small subsets of data. In this work, we explore the limits of modern state-of-the-art computational facilities to perform full likelihood inference and to efficiently evaluate high-order composite likelihoods. With extensive simulations, we assess the loss of information of composite likelihood estimators with respect to a full likelihood approach for some widely-used multivariate or spatial extreme models, we discuss how to choose composite likelihood truncation to improve the efficiency, and we also provide recommendations for practitioners. This article has supplementary material online.
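    For intuition, a toy pairwise composite log-likelihood with distance-based truncation, mirroring the truncation choice the article studies; a bivariate Gaussian density with exponential correlation stands in for the far costlier bivariate max-stable density, and all names and parameters are hypothetical:

    ```python
    import numpy as np
    from scipy.stats import multivariate_normal

    def pairwise_cl(theta, data, coords, max_dist):
        """Pairwise composite log-likelihood, truncated to pairs of sites
        within max_dist.  theta = (sill, range) parameterizes a toy
        exponential correlation model; data: (n_reps, n_sites),
        coords: (n_sites, 2).
        """
        sill, rng_par = theta
        cl = 0.0
        n_sites = coords.shape[0]
        for i in range(n_sites):
            for j in range(i + 1, n_sites):
                d = np.linalg.norm(coords[i] - coords[j])
                if d > max_dist:           # composite-likelihood truncation
                    continue
                rho = np.exp(-d / rng_par)
                cov = sill * np.array([[1.0, rho], [rho, 1.0]])
                cl += multivariate_normal(cov=cov).logpdf(data[:, [i, j]]).sum()
        return cl
    ```

    Raising the truncation distance includes higher-order information at a higher cost per evaluation, which is exactly the efficiency trade-off the simulations quantify.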

  4. High-order Composite Likelihood Inference for Max-Stable Distributions and Processes

    KAUST Repository

    Castruccio, Stefano

    2015-09-29

    In multivariate or spatial extremes, inference for max-stable processes observed at a large collection of locations is a very challenging problem in computational statistics, and current approaches typically rely on less expensive composite likelihoods constructed from small subsets of data. In this work, we explore the limits of modern state-of-the-art computational facilities to perform full likelihood inference and to efficiently evaluate high-order composite likelihoods. With extensive simulations, we assess the loss of information of composite likelihood estimators with respect to a full likelihood approach for some widely-used multivariate or spatial extreme models, we discuss how to choose composite likelihood truncation to improve the efficiency, and we also provide recommendations for practitioners. This article has supplementary material online.

  5. Supplementary Material for: High-Order Composite Likelihood Inference for Max-Stable Distributions and Processes

    KAUST Repository

    Castruccio, Stefano; Huser, Raphaël; Genton, Marc G.

    2016-01-01

    In multivariate or spatial extremes, inference for max-stable processes observed at a large collection of points is a very challenging problem and current approaches typically rely on less expensive composite likelihoods constructed from small subsets of data. In this work, we explore the limits of modern state-of-the-art computational facilities to perform full likelihood inference and to efficiently evaluate high-order composite likelihoods. With extensive simulations, we assess the loss of information of composite likelihood estimators with respect to a full likelihood approach for some widely used multivariate or spatial extreme models, we discuss how to choose composite likelihood truncation to improve the efficiency, and we also provide recommendations for practitioners. This article has supplementary material online.

  6. Approximate Likelihood

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Most physics results at the LHC end in a likelihood ratio test. This includes discovery and exclusion for searches as well as mass, cross-section, and coupling measurements. The use of Machine Learning (multivariate) algorithms in HEP is mainly restricted to searches, which can be reduced to classification between two fixed distributions: signal vs. background. I will show how we can extend the use of ML classifiers to distributions parameterized by physical quantities like masses and couplings, as well as nuisance parameters associated with systematic uncertainties. This allows one to approximate the likelihood ratio while still using a high-dimensional feature vector for the data. Both the MEM and ABC approaches mentioned above aim to provide inference on model parameters (like cross-sections, masses, couplings, etc.). ABC is fundamentally tied to Bayesian inference and focuses on the “likelihood free” setting where only a simulator is available and one cannot directly compute the likelihood for the dat...
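    A hedged sketch of the classifier-to-likelihood-ratio trick on a 1-D toy problem where the true ratio is known in closed form; the logistic-regression model and the balanced-class setup are illustrative choices, not the speaker's actual pipeline:

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Toy problem: signal ~ N(1, 1), background ~ N(0, 1).
    rng = np.random.default_rng(0)
    x_sig = rng.normal(1.0, 1.0, (5000, 1))
    x_bkg = rng.normal(0.0, 1.0, (5000, 1))
    X = np.vstack([x_sig, x_bkg])
    y = np.r_[np.ones(5000), np.zeros(5000)]

    clf = LogisticRegression().fit(X, y)

    # With balanced classes, s/(1-s) estimates the likelihood ratio
    # p(x|signal)/p(x|background) without ever evaluating either density.
    s = clf.predict_proba(X[:5])[:, 1]
    lr_hat = s / (1.0 - s)
    lr_true = np.exp(X[:5, 0] - 0.5)  # exact ratio for this Gaussian toy
    print(np.c_[lr_hat, lr_true])
    ```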

  7. Quantitative comparison of OSEM and penalized likelihood image reconstruction using relative difference penalties for clinical PET

    International Nuclear Information System (INIS)

    Ahn, Sangtae; Asma, Evren; Cheng, Lishui; Manjeshwar, Ravindra M; Ross, Steven G; Miao, Jun; Jin, Xiao; Wollenweber, Scott D

    2015-01-01

    Ordered subset expectation maximization (OSEM) is the most widely used algorithm for clinical PET image reconstruction. OSEM is usually stopped early and post-filtered to control image noise and does not necessarily achieve optimal quantitation accuracy. As an alternative to OSEM, we have recently implemented a penalized likelihood (PL) image reconstruction algorithm for clinical PET using the relative difference penalty with the aim of improving quantitation accuracy without compromising visual image quality. Preliminary clinical studies have demonstrated that visual image quality, including lesion conspicuity, in images reconstructed by the PL algorithm is better than or at least as good as that in OSEM images. In this paper we evaluate the lesion quantitation accuracy of the PL algorithm with the relative difference penalty compared to OSEM by using various data sets including phantom data acquired with an anthropomorphic torso phantom, an extended oval phantom and the NEMA image quality phantom; clinical data; and hybrid clinical data generated by adding simulated lesion data to clinical data. We focus on mean standardized uptake values and compare them for PL and OSEM using both time-of-flight (TOF) and non-TOF data. The results demonstrate improvements of PL in lesion quantitation accuracy compared to OSEM with a particular improvement in cold background regions such as lungs. (paper)
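    For orientation, a minimal MLEM sketch: the multiplicative update that OSEM accelerates by cycling over subsets of the data, and to whose objective PL reconstruction adds a penalty term (such as the relative difference penalty). The toy 1-D system matrix is an assumption for illustration:

    ```python
    import numpy as np

    def mlem(A, y, n_iter=50, eps=1e-12):
        """Plain MLEM for Poisson data y ~ Poisson(A @ x).
        A: (n_bins, n_voxels) system matrix, y: (n_bins,) measured counts."""
        x = np.ones(A.shape[1])
        sens = A.sum(axis=0) + eps          # sensitivity image A^T 1
        for _ in range(n_iter):
            proj = A @ x + eps              # forward projection
            x *= (A.T @ (y / proj)) / sens  # backproject the ratio, normalize
        return x

    # Toy usage: a 1-D "scanner" whose system matrix is a Gaussian blur.
    n = 64
    A = np.exp(-0.5 * ((np.arange(n)[:, None] - np.arange(n)[None, :]) / 2.0) ** 2)
    x_true = np.zeros(n); x_true[20] = 100; x_true[40] = 60
    y = np.random.default_rng(8).poisson(A @ x_true)
    print(np.round(mlem(A, y)[[20, 40]], 1))  # recovers the two hot spots
    ```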

  8. Diagonal Likelihood Ratio Test for Equality of Mean Vectors in High-Dimensional Data

    KAUST Repository

    Hu, Zongliang; Tong, Tiejun; Genton, Marc G.

    2017-01-01

    We propose a likelihood ratio test framework for testing normal mean vectors in high-dimensional data under two common scenarios: the one-sample test and the two-sample test with equal covariance matrices. We derive the test statistics under the assumption that the covariance matrices follow a diagonal matrix structure. In comparison with the diagonal Hotelling's tests, our proposed test statistics display some interesting characteristics. In particular, they are a summation of the log-transformed squared t-statistics rather than a direct summation of those components. More importantly, to derive the asymptotic normality of our test statistics under the null and local alternative hypotheses, we do not require the assumption that the covariance matrix follows a diagonal matrix structure. As a consequence, our proposed test methods are very flexible and can be widely applied in practice. Finally, simulation studies and a real data analysis are also conducted to demonstrate the advantages of our likelihood ratio test method.
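    A rough sketch of the statistic's building blocks as the abstract describes them, summing per-coordinate log-transformed squared t-statistics; the per-coordinate form below is one plausible normal-mean LRT expression, and the null centering and scaling constants that give asymptotic normality are omitted here (they are given in the paper):

    ```python
    import numpy as np

    def diag_lrt_sum(X):
        """One-sample building block of a diagonal likelihood ratio test:
        the per-coordinate LRT statistic for a normal mean, n*log(1 + t^2/(n-1)),
        summed over coordinates.  X: (n, p) data matrix, H0: mean vector = 0.
        (Standardizing this sum under the null is what the paper's test does.)
        """
        n, p = X.shape
        t = X.mean(axis=0) / (X.std(axis=0, ddof=1) / np.sqrt(n))
        return (n * np.log1p(t**2 / (n - 1))).sum()

    X = np.random.default_rng(10).normal(0.0, 1.0, (50, 200))
    print(diag_lrt_sum(X))  # large values would indicate a nonzero mean vector
    ```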

  9. A Scoring Tool to Identify East African HIV-1 Serodiscordant Partnerships with a High Likelihood of Pregnancy.

    Directory of Open Access Journals (Sweden)

    Renee Heffron

    Full Text Available HIV-1 prevention programs targeting HIV-1 serodiscordant couples need to identify couples that are likely to become pregnant to facilitate discussions about methods to minimize HIV-1 risk during pregnancy attempts (i.e. safer conception) or effective contraception when pregnancy is unintended. A clinical prediction tool could be used to identify HIV-1 serodiscordant couples with a high likelihood of pregnancy within one year. Using standardized clinical prediction methods, we developed and validated a tool to identify heterosexual East African HIV-1 serodiscordant couples with an increased likelihood of becoming pregnant in the next year. Datasets were from three prospectively followed cohorts, including nearly 7,000 couples from Kenya and Uganda participating in HIV-1 prevention trials and delivery projects. The final score encompassed the age of the woman, woman's number of children living, partnership duration, having had condomless sex in the past month, and non-use of an effective contraceptive. The area under the curve (AUC) for the probability of the score to correctly predict pregnancy was 0.74 (95% CI 0.72-0.76). Scores ≥ 7 predicted a pregnancy incidence of >17% per year and captured 78% of the pregnancies. Internal and external validation confirmed the predictive ability of the score. A pregnancy likelihood score encompassing basic demographic, clinical and behavioral factors defined African HIV-1 serodiscordant couples with high one-year pregnancy incidence rates. This tool could be used to engage African HIV-1 serodiscordant couples in counseling discussions about fertility intentions in order to offer services for safer conception or contraception that align with their reproductive goals.

  10. A Scoring Tool to Identify East African HIV-1 Serodiscordant Partnerships with a High Likelihood of Pregnancy.

    Science.gov (United States)

    Heffron, Renee; Cohen, Craig R; Ngure, Kenneth; Bukusi, Elizabeth; Were, Edwin; Kiarie, James; Mugo, Nelly; Celum, Connie; Baeten, Jared M

    2015-01-01

    HIV-1 prevention programs targeting HIV-1 serodiscordant couples need to identify couples that are likely to become pregnant to facilitate discussions about methods to minimize HIV-1 risk during pregnancy attempts (i.e. safer conception) or effective contraception when pregnancy is unintended. A clinical prediction tool could be used to identify HIV-1 serodiscordant couples with a high likelihood of pregnancy within one year. Using standardized clinical prediction methods, we developed and validated a tool to identify heterosexual East African HIV-1 serodiscordant couples with an increased likelihood of becoming pregnant in the next year. Datasets were from three prospectively followed cohorts, including nearly 7,000 couples from Kenya and Uganda participating in HIV-1 prevention trials and delivery projects. The final score encompassed the age of the woman, woman's number of children living, partnership duration, having had condomless sex in the past month, and non-use of an effective contraceptive. The area under the curve (AUC) for the probability of the score to correctly predict pregnancy was 0.74 (95% CI 0.72-0.76). Scores ≥ 7 predicted a pregnancy incidence of >17% per year and captured 78% of the pregnancies. Internal and external validation confirmed the predictive ability of the score. A pregnancy likelihood score encompassing basic demographic, clinical and behavioral factors defined African HIV-1 serodiscordant couples with high one-year pregnancy incidence rates. This tool could be used to engage African HIV-1 serodiscordant couples in counseling discussions about fertility intentions in order to offer services for safer conception or contraception that align with their reproductive goals.
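    A hypothetical sketch of how such a point score might be assembled and checked; the point weights below are invented for illustration (the published tool defines the actual values), and the AUC computation on synthetic data mirrors the paper's discrimination analysis:

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score

    def pregnancy_score(age, n_children, partnership_years,
                        condomless_sex, effective_contraception):
        """Hypothetical point assignments for the score's five factors;
        higher values indicate a higher likelihood of pregnancy."""
        score = 0
        score += 3 if age < 25 else (1 if age < 35 else 0)
        score += 2 if n_children <= 2 else 0
        score += 1 if partnership_years < 5 else 0
        score += 2 if condomless_sex else 0
        score += 2 if not effective_contraception else 0
        return score

    # Synthetic couples and outcomes, for illustration only.
    rng = np.random.default_rng(1)
    n = 500
    scores = np.array([pregnancy_score(a, k, yrs, c, e)
                       for a, k, yrs, c, e in zip(
                           rng.integers(18, 50, n), rng.integers(0, 6, n),
                           rng.integers(0, 20, n), rng.random(n) < 0.5,
                           rng.random(n) < 0.4)])
    pregnant = rng.random(n) < 0.02 + 0.02 * scores   # hypothetical outcomes
    print("AUC:", round(roc_auc_score(pregnant, scores), 2))
    print("flagged (score >= 7):", round((scores >= 7).mean(), 2))
    ```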

  11. High-dose regions versus likelihood of cure after prostate brachytherapy

    International Nuclear Information System (INIS)

    Wallner, Kent; Merrick, Gregory; Sutlief, Steven; True, Laurence; Butler, Wayne

    2005-01-01

    Purpose: To analyze the effect of high-dose regions on biochemical cancer control rates after prostate brachytherapy. Methods and Materials: Patients with 1997 American Joint Committee on Cancer clinical Stage T1c-T2a prostate carcinoma (Gleason grade 5-6, prostate-specific antigen level 4-10 ng/mL) were randomized to implantation with 125I (144 Gy) vs. 103Pd (125 Gy, National Institute of Standards and Technology 1999). Isotope implantation was performed by standard techniques, using a modified peripheral loading pattern. Of the 313 patients entered in the protocol, 270 were included in this analysis. The 125I source strength ranged from 0.4 to 0.89 mCi (median, 0.55 mCi), and the 103Pd source strength ranged from 1.3 to 1.6 mCi (median, 1.5 mCi). CT was performed within 4 h after implantation. The dosimetric parameters analyzed included the percentage of the postimplant prostate volume covered by the 100%, 150%, 200%, and 300% prescription dose (V100, V150, V200, and V300, respectively). The median time to the last follow-up for patients without failure was 2.7 years. Freedom from biochemical failure was defined as a serum prostate-specific antigen level of ≤0.5 ng/mL at last follow-up. Patients were censored at last follow-up if their serum prostate-specific antigen level was still decreasing. Results: The mean V100, V150, V200, and V300 values were 90% (±8%), 63% (±14%), 35% (±13%), and 14% (±7%), respectively. Patients with a V100 of ≥90% had a 3-year freedom from biochemical failure rate of 96% vs. 87% for those with a V100 of <90%. When only patients with a V100 of ≥90% were analyzed, no relationship was found between higher dose regions and the likelihood of cancer control. This lack of effect on biochemical control was apparent for both isotopes. Conclusion: High-dose regions do not appear to affect cancer control rates, as long as >90% of the prostate volume is covered by the prescription dose
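    For concreteness, a short sketch of computing Vx coverage metrics from a post-implant dose grid; the dose array, mask, and prescription dose below are illustrative stand-ins for the CT-based dosimetry described above:

    ```python
    import numpy as np

    def v_x(dose, mask, rx_dose, x_percent):
        """Percentage of the masked (prostate) volume receiving at least
        x% of the prescription dose, e.g. V100, V150, V200, V300."""
        return 100.0 * (dose[mask] >= rx_dose * x_percent / 100.0).mean()

    # Hypothetical post-implant dose grid and prostate mask (illustrative only;
    # in practice the mask comes from CT contours).
    rng = np.random.default_rng(2)
    dose = rng.gamma(shape=4.0, scale=50.0, size=(64, 64, 32))  # Gy
    prostate = np.ones(dose.shape, dtype=bool)
    for x in (100, 150, 200, 300):
        print(f"V{x} = {v_x(dose, prostate, rx_dose=144.0, x_percent=x):.1f}%")
    ```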

  12. Clinical Paresthesia Atlas Illustrates Likelihood of Coverage Based on Spinal Cord Stimulator Electrode Location.

    Science.gov (United States)

    Taghva, Alexander; Karst, Edward; Underwood, Paul

    2017-08-01

    Concordant paresthesia coverage is an independent predictor of pain relief following spinal cord stimulation (SCS). Using aggregate data, our objective is to produce a map of paresthesia coverage as a function of electrode location in SCS. This retrospective analysis used x-rays, SCS programming data, and paresthesia coverage maps from the EMPOWER registry of SCS implants for chronic neuropathic pain. Spinal level of dorsal column stimulation was determined by x-ray adjudication and active cathodes in patient programs. Likelihood of paresthesia coverage was determined as a function of stimulating electrode location. Segments of paresthesia coverage were grouped anatomically. Fisher's exact test was used to identify significant differences in likelihood of paresthesia coverage as a function of spinal stimulation level. In the 178 patients analyzed, the most prevalent areas of paresthesia coverage were buttocks, anterior and posterior thigh (each 98%), and low back (94%). Unwanted paresthesia at the ribs occurred in 8% of patients. There were significant differences in the likelihood of achieving paresthesia, with higher thoracic levels (T5, T6, and T7) more likely to achieve low back coverage but also more likely to introduce paresthesia felt at the ribs. Higher levels in the thoracic spine were associated with greater coverage of the buttocks, back, and thigh, and with lesser coverage of the leg and foot. This paresthesia atlas uses real-world, aggregate data to determine likelihood of paresthesia coverage as a function of stimulating electrode location. It represents an application of "big data" techniques, and a step toward achieving personalized SCS therapy tailored to the individual's chronic pain. © 2017 International Neuromodulation Society.

  13. Diagonal Likelihood Ratio Test for Equality of Mean Vectors in High-Dimensional Data

    KAUST Repository

    Hu, Zongliang

    2017-10-27

    We propose a likelihood ratio test framework for testing normal mean vectors in high-dimensional data under two common scenarios: the one-sample test and the two-sample test with equal covariance matrices. We derive the test statistics under the assumption that the covariance matrices follow a diagonal matrix structure. In comparison with the diagonal Hotelling's tests, our proposed test statistics display some interesting characteristics. In particular, they are a summation of the log-transformed squared t-statistics rather than a direct summation of those components. More importantly, to derive the asymptotic normality of our test statistics under the null and local alternative hypotheses, we do not require the assumption that the covariance matrix follows a diagonal matrix structure. As a consequence, our proposed test methods are very flexible and can be widely applied in practice. Finally, simulation studies and a real data analysis are also conducted to demonstrate the advantages of our likelihood ratio test method.

  14. ROC [Receiver Operating Characteristics] study of maximum likelihood estimator human brain image reconstructions in PET [Positron Emission Tomography] clinical practice

    International Nuclear Information System (INIS)

    Llacer, J.; Veklerov, E.; Nolan, D.; Grafton, S.T.; Mazziotta, J.C.; Hawkins, R.A.; Hoh, C.K.; Hoffman, E.J.

    1990-10-01

    This paper will report on the progress to date in carrying out Receiver Operating Characteristics (ROC) studies comparing Maximum Likelihood Estimator (MLE) and Filtered Backprojection (FBP) reconstructions of normal and abnormal human brain PET data in a clinical setting. A previous statistical study of reconstructions of the Hoffman brain phantom with real data indicated that the pixel-to-pixel standard deviation in feasible MLE images is approximately proportional to the square root of the number of counts in a region, as opposed to a standard deviation which is high and largely independent of the number of counts in FBP. A preliminary ROC study carried out with 10 non-medical observers performing a relatively simple detectability task indicates that, for the majority of observers, lower standard deviation translates itself into a statistically significant detectability advantage in MLE reconstructions. The initial results of ongoing tests with four experienced neurologists/nuclear medicine physicians are presented. Normal cases of 18F-fluorodeoxyglucose (FDG) cerebral metabolism studies and abnormal cases in which a variety of lesions have been introduced into normal data sets have been evaluated. We report on the results of reading the reconstructions of 90 data sets, each corresponding to a single brain slice. It has become apparent that the design of the study based on reading single brain slices is too insensitive and we propose a variation based on reading three consecutive slices at a time, rating only the center slice. 9 refs., 2 figs., 1 tab

  15. Frequency-Domain Maximum-Likelihood Estimation of High-Voltage Pulse Transformer Model Parameters

    CERN Document Server

    Aguglia, D; Martins, C.D.A.

    2014-01-01

    This paper presents an offline frequency-domain nonlinear and stochastic identification method for equivalent model parameter estimation of high-voltage pulse transformers. Such kinds of transformers are widely used in the pulsed-power domain, and the difficulty in deriving pulsed-power converter optimal control strategies is directly linked to the accuracy of the equivalent circuit parameters. These components require models which take into account electric field energies represented by stray capacitances in the equivalent circuit. These capacitive elements must be accurately identified, since they greatly influence the general converter performances. A nonlinear frequency-based identification method, based on maximum-likelihood estimation, is presented, and a sensitivity analysis of the best experimental test to be considered is carried out. The procedure takes into account magnetic saturation and skin effects occurring in the windings during the frequency tests. The presented method is validated by experim...

  16. Direct reconstruction of the source intensity distribution of a clinical linear accelerator using a maximum likelihood expectation maximization algorithm.

    Science.gov (United States)

    Papaconstadopoulos, P; Levesque, I R; Maglieri, R; Seuntjens, J

    2016-02-07

    Direct determination of the source intensity distribution of clinical linear accelerators is still a challenging problem for small field beam modeling. Current techniques most often involve special equipment and are difficult to implement in the clinic. In this work we present a maximum-likelihood expectation-maximization (MLEM) approach to the source reconstruction problem utilizing small fields and a simple experimental set-up. The MLEM algorithm iteratively ray-traces photons from the source plane to the exit plane and extracts corrections based on photon fluence profile measurements. The photon fluence profiles were determined by dose profile film measurements in air using a high density thin foil as build-up material and an appropriate point spread function (PSF). The effect of other beam parameters and scatter sources was minimized by using the smallest field size ([Formula: see text] cm(2)). The source occlusion effect was reproduced by estimating the position of the collimating jaws during this process. The method was first benchmarked against simulations for a range of typical accelerator source sizes. The sources were reconstructed with an accuracy better than 0.12 mm in the full width at half maximum (FWHM) relative to the respective electron sources incident on the target. The estimated jaw positions agreed within 0.2 mm with the expected values. The reconstruction technique was also tested against measurements on a Varian Novalis Tx linear accelerator and compared to a previously commissioned Monte Carlo model. The reconstructed FWHM of the source agreed within 0.03 mm and 0.11 mm with the commissioned electron source in the crossplane and inplane orientations respectively. The impact of the jaw positioning, experimental and PSF uncertainties on the reconstructed source distribution was evaluated, with the former presenting the dominant effect.

  17. Empirical likelihood

    CERN Document Server

    Owen, Art B

    2001-01-01

    Empirical likelihood provides inferences whose validity does not depend on specifying a parametric model for the data. Because it uses a likelihood, the method has certain inherent advantages over resampling methods: it uses the data to determine the shape of the confidence regions, and it makes it easy to combine data from multiple sources. It also facilitates incorporating side information, and it simplifies accounting for censored, truncated, or biased sampling. One of the first books published on the subject, Empirical Likelihood offers an in-depth treatment of this method for constructing confidence regions and testing hypotheses. The author applies empirical likelihood to a range of problems, from those as simple as setting a confidence region for a univariate mean under IID sampling, to problems defined through smooth functions of means, regression models, generalized linear models, estimating equations, or kernel smooths, and to sampling with non-identically distributed data. Abundant figures offer vi...
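    A compact sketch of empirical likelihood for a univariate mean under IID sampling, the book's opening example; the Lagrange-multiplier root-finding below is one standard implementation, not code from the book:

    ```python
    import numpy as np
    from scipy.optimize import brentq

    def el_log_ratio(x, mu):
        """-2 log empirical likelihood ratio for a univariate mean mu.
        Weights w_i = 1/(n(1 + lam*(x_i - mu))) maximize prod(n*w_i) subject
        to sum(w_i*(x_i - mu)) = 0; lam solves the estimating equation below.
        """
        d = np.asarray(x, dtype=float) - mu
        if d.min() >= 0 or d.max() <= 0:
            return np.inf  # mu outside the convex hull of the data
        g = lambda lam: np.sum(d / (1.0 + lam * d))
        eps = 1e-10
        lam = brentq(g, -1.0 / d.max() + eps, -1.0 / d.min() - eps)
        return 2.0 * np.sum(np.log1p(lam * d))

    x = np.random.default_rng(3).normal(0.3, 1.0, 100)
    # 95% confidence region: {mu : el_log_ratio(x, mu) <= 3.841 (chi2, 1 df)}
    print(el_log_ratio(x, 0.3), el_log_ratio(x, 1.0))
    ```

    Note how the confidence region's shape is driven by the data themselves, which is the advantage over resampling highlighted above.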

  18. Local likelihood estimation of complex tail dependence structures in high dimensions, applied to US precipitation extremes

    KAUST Repository

    Camilo, Daniela Castro

    2017-10-02

    In order to model the complex non-stationary dependence structure of precipitation extremes over the entire contiguous U.S., we propose a flexible local approach based on factor copula models. Our sub-asymptotic spatial modeling framework yields non-trivial tail dependence structures, with a weakening dependence strength as events become more extreme, a feature commonly observed with precipitation data but not accounted for in classical asymptotic extreme-value models. To estimate the local extremal behavior, we fit the proposed model in small regional neighborhoods to high threshold exceedances, under the assumption of local stationarity. This allows us to gain in flexibility, while making inference for such a large and complex dataset feasible. Adopting a local censored likelihood approach, inference is made on a fine spatial grid, and local estimation is performed taking advantage of distributed computing resources and of the embarrassingly parallel nature of this estimation procedure. The local model is efficiently fitted at all grid points, and uncertainty is measured using a block bootstrap procedure. An extensive simulation study shows that our approach is able to adequately capture complex, non-stationary dependencies, while our study of U.S. winter precipitation data reveals interesting differences in local tail structures over space, which has important implications on regional risk assessment of extreme precipitation events. A comparison between past and current data suggests that extremes in certain areas might be slightly wider in extent nowadays than during the first half of the twentieth century.

  19. Local likelihood estimation of complex tail dependence structures in high dimensions, applied to US precipitation extremes

    KAUST Repository

    Camilo, Daniela Castro; Huser, Raphaël

    2017-01-01

    In order to model the complex non-stationary dependence structure of precipitation extremes over the entire contiguous U.S., we propose a flexible local approach based on factor copula models. Our sub-asymptotic spatial modeling framework yields non-trivial tail dependence structures, with a weakening dependence strength as events become more extreme, a feature commonly observed with precipitation data but not accounted for in classical asymptotic extreme-value models. To estimate the local extremal behavior, we fit the proposed model in small regional neighborhoods to high threshold exceedances, under the assumption of local stationarity. This allows us to gain in flexibility, while making inference for such a large and complex dataset feasible. Adopting a local censored likelihood approach, inference is made on a fine spatial grid, and local estimation is performed taking advantage of distributed computing resources and of the embarrassingly parallel nature of this estimation procedure. The local model is efficiently fitted at all grid points, and uncertainty is measured using a block bootstrap procedure. An extensive simulation study shows that our approach is able to adequately capture complex, non-stationary dependencies, while our study of U.S. winter precipitation data reveals interesting differences in local tail structures over space, which has important implications on regional risk assessment of extreme precipitation events. A comparison between past and current data suggests that extremes in certain areas might be slightly wider in extent nowadays than during the first half of the twentieth century.

  20. Assessing the Impact of Peer Educator Outreach on the Likelihood and Acceleration of Clinic Utilization among Sex Workers.

    Science.gov (United States)

    Krishnamurthy, Parthasarathy; Hui, Sam K; Shivkumar, Narayanan; Gowda, Chandrasekhar; Pushpalatha, R

    2016-01-01

    Peer-led outreach is a critical element of HIV and STI-reduction interventions aimed at sex workers. We study the association between peer-led outreach to sex workers and the time to utilize health facilities for timely STI syndromic-detection and treatment. Using data on the timing of peer-outreach interventions and clinic visits, we utilize an Extended Cox model to assess whether peer educator outreach intensity is associated with accelerated clinic utilization among sex workers. Our data comes from 2705 female sex workers registered into Pragati, a women-in-sex-work outreach program, and followed from 2008 through 2012. We analyze this data using an Extended Cox model with the density of peer educator visits in a 30-day rolling window as the key predictor, while controlling for the sex workers' age, client volume, location of sex work, and education level. The principal outcome of interest is the timing of the first voluntary clinic utilization. More frequent peer visit is associated with earlier first clinic visit (HR: 1.83, 95% CI, 1.75-1.91, p < .001). In addition, 18% of all syndrome-based STI detected come from clinic visits in which the sex worker reports no symptoms, underscoring the importance of inducing clinic visits in the detection of STI. Additional models to test the robustness of these findings indicate consistent beneficial effect of peer educator outreach. Peer outreach density is associated with increased likelihood of, and shortened duration to, clinic utilization among female sex workers, suggesting potential staff resourcing implications. Given the observational nature of our study, however, these findings should be interpreted as an association rather than as a causal relationship.

  21. Assessing the Impact of Peer Educator Outreach on the Likelihood and Acceleration of Clinic Utilization among Sex Workers.

    Directory of Open Access Journals (Sweden)

    Parthasarathy Krishnamurthy

    Full Text Available Peer-led outreach is a critical element of HIV and STI-reduction interventions aimed at sex workers. We study the association between peer-led outreach to sex workers and the time to utilize health facilities for timely STI syndromic-detection and treatment. Using data on the timing of peer-outreach interventions and clinic visits, we utilize an Extended Cox model to assess whether peer educator outreach intensity is associated with accelerated clinic utilization among sex workers. Our data comes from 2705 female sex workers registered into Pragati, a women-in-sex-work outreach program, and followed from 2008 through 2012. We analyze this data using an Extended Cox model with the density of peer educator visits in a 30-day rolling window as the key predictor, while controlling for the sex workers' age, client volume, location of sex work, and education level. The principal outcome of interest is the timing of the first voluntary clinic utilization. More frequent peer visit is associated with earlier first clinic visit (HR: 1.83, 95% CI, 1.75-1.91, p < .001). In addition, 18% of all syndrome-based STI detected come from clinic visits in which the sex worker reports no symptoms, underscoring the importance of inducing clinic visits in the detection of STI. Additional models to test the robustness of these findings indicate consistent beneficial effect of peer educator outreach. Peer outreach density is associated with increased likelihood of, and shortened duration to, clinic utilization among female sex workers, suggesting potential staff resourcing implications. Given the observational nature of our study, however, these findings should be interpreted as an association rather than as a causal relationship.
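    A hedged sketch of an extended Cox model with a time-varying outreach covariate, assuming the lifelines package's CoxTimeVaryingFitter API; the panel below is synthetic and the effect size invented, so this only illustrates the long-format data layout and the model call:

    ```python
    import numpy as np
    import pandas as pd
    from lifelines import CoxTimeVaryingFitter

    # Hypothetical long-format panel: one row per sex worker per 30-day
    # interval; peer_visits is the rolling count of peer-educator contacts
    # (the time-varying exposure) and event marks the first clinic visit.
    rng = np.random.default_rng(4)
    rows = []
    for pid in range(200):
        t, visited = 0, False
        while t < 360 and not visited:
            visits = rng.poisson(1.0)
            # Built-in assumption: higher outreach density raises the hazard.
            visited = rng.random() < 0.05 * (1 + visits)
            rows.append(dict(id=pid, start=t, stop=t + 30,
                             peer_visits=visits, event=int(visited)))
            t += 30

    ctv = CoxTimeVaryingFitter()
    ctv.fit(pd.DataFrame(rows), id_col="id", event_col="event",
            start_col="start", stop_col="stop")
    ctv.print_summary()  # exp(coef) for peer_visits ~ hazard ratio per visit
    ```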

  22. Likelihood of Null Effects of Large NHLBI Clinical Trials Has Increased over Time.

    Directory of Open Access Journals (Sweden)

    Robert M Kaplan

    Full Text Available We explore whether the number of null results in large National Heart, Lung, and Blood Institute (NHLBI) funded trials has increased over time. We identified all large NHLBI-supported RCTs between 1970 and 2012 evaluating drugs or dietary supplements for the treatment or prevention of cardiovascular disease. Trials were included if direct costs were >$500,000/year, participants were adult humans, and the primary outcome was cardiovascular risk, disease or death. The 55 trials meeting these criteria were coded for whether they were published prior to or after the year 2000, whether they were registered in clinicaltrials.gov prior to publication, used an active or placebo comparator, and whether or not the trial had industry co-sponsorship. We tabulated whether the study reported a positive, negative, or null result on the primary outcome variable and for total mortality. 17 of 30 studies (57%) published prior to 2000 showed a significant benefit of intervention on the primary outcome, in comparison to only 2 among the 25 (8%) trials published after 2000 (χ2 = 12.2, df = 1, p = 0.0005). There has been no change in the proportion of trials that compared treatment to placebo versus an active comparator. Industry co-sponsorship was unrelated to the probability of reporting a significant benefit. Pre-registration in clinicaltrials.gov was strongly associated with the trend toward null findings. The number of NHLBI trials reporting positive results declined after the year 2000. Prospective declaration of outcomes in RCTs, and the adoption of transparent reporting standards, as required by clinicaltrials.gov, may have contributed to the trend toward null findings.
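    The headline comparison can be checked directly from the counts in the abstract; a Yates-corrected chi-square on the 2x2 table of significant versus null results reproduces the reported statistic (scipy applies the continuity correction by default for 2x2 tables):

    ```python
    from scipy.stats import chi2_contingency

    # Rows: published before vs. after 2000; columns: benefit vs. null.
    table = [[17, 30 - 17],
             [2, 25 - 2]]
    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi2 = {chi2:.1f}, df = {dof}, p = {p:.4f}")  # ~12.2, 1, 0.0005
    ```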

  23. Reducing the likelihood of future human activities that could affect geologic high-level waste repositories

    International Nuclear Information System (INIS)

    1984-05-01

    The disposal of radioactive wastes in deep geologic formations provides a means of isolating the waste from people until the radioactivity has decayed to safe levels. However, isolating people from the wastes is a different problem, since we do not know what the future condition of society will be. The Human Interference Task Force was convened by the US Department of Energy to determine whether reasonable means exist (or could be developed) to reduce the likelihood of future humans unintentionally intruding on radioactive waste isolation systems. The task force concluded that significant reductions in the likelihood of human interference could be achieved, for perhaps thousands of years into the future, if appropriate steps are taken to communicate the existence of the repository. Consequently, for two years the task force directed most of its study toward the area of long-term communication. Methods are discussed for achieving long-term communication by using permanent markers and widely disseminated records, with various steps taken to provide multiple levels of protection against loss, destruction, and major language/societal changes. Also developed is the concept of a universal symbol to denote "Caution - Biohazardous Waste Buried Here". If used for the thousands of non-radioactive biohazardous waste sites in this country alone, a symbol could transcend generations and language changes, thereby vastly improving the likelihood of successful isolation of all buried biohazardous wastes

  24. Reducing the likelihood of future human activities that could affect geologic high-level waste repositories

    Energy Technology Data Exchange (ETDEWEB)

    1984-05-01

    The disposal of radioactive wastes in deep geologic formations provides a means of isolating the waste from people until the radioactivity has decayed to safe levels. However, isolating people from the wastes is a different problem, since we do not know what the future condition of society will be. The Human Interference Task Force was convened by the US Department of Energy to determine whether reasonable means exist (or could be developed) to reduce the likelihood of future humans unintentionally intruding on radioactive waste isolation systems. The task force concluded that significant reductions in the likelihood of human interference could be achieved, for perhaps thousands of years into the future, if appropriate steps are taken to communicate the existence of the repository. Consequently, for two years the task force directed most of its study toward the area of long-term communication. Methods are discussed for achieving long-term communication by using permanent markers and widely disseminated records, with various steps taken to provide multiple levels of protection against loss, destruction, and major language/societal changes. Also developed is the concept of a universal symbol to denote "Caution - Biohazardous Waste Buried Here". If used for the thousands of non-radioactive biohazardous waste sites in this country alone, a symbol could transcend generations and language changes, thereby vastly improving the likelihood of successful isolation of all buried biohazardous wastes.

  25. MAXIMUM LIKELIHOOD CLASSIFICATION OF HIGH-RESOLUTION SAR IMAGES IN URBAN AREA

    Directory of Open Access Journals (Sweden)

    M. Soheili Majd

    2012-09-01

    Full Text Available In this work, we present a state-of-the-art statistical analysis of polarimetric synthetic aperture radar (SAR) data, through the modeling of several indices. We concentrate on eight ground classes which have been derived from amplitudes, the co-polarisation ratio, depolarization ratios, and other polarimetric descriptors. To study their different statistical behaviours, we consider Gauss, log-normal, Beta I, Weibull, Gamma, and Fisher statistical models and estimate their parameters using three methods: the method of moments (MoM), maximum-likelihood (ML) methodology, and the log-cumulants method (MoML). Then, we study the opportunity of introducing this information in an adapted supervised classification scheme based on the Maximum-Likelihood and Fisher pdf. Our work relies on an image of a suburban area, acquired by the airborne RAMSES SAR sensor of ONERA. The results prove the potential of such data to discriminate urban surfaces and show the usefulness of adapting any classical classification algorithm, although classification maps present a persistent class confusion between flat gravelled or concrete roofs and trees.

  26. THE GENERALIZED MAXIMUM LIKELIHOOD METHOD APPLIED TO HIGH PRESSURE PHASE EQUILIBRIUM

    Directory of Open Access Journals (Sweden)

    Lúcio CARDOZO-FILHO

    1997-12-01

    Full Text Available The generalized maximum likelihood method was used to determine binary interaction parameters between carbon dioxide and components of orange essential oil. Vapor-liquid equilibrium was modeled with Peng-Robinson and Soave-Redlich-Kwong equations, using a methodology proposed in 1979 by Asselineau, Bogdanic and Vidal. Experimental vapor-liquid equilibrium data on binary mixtures formed with carbon dioxide and compounds usually found in orange essential oil were used to test the model. These systems were chosen to demonstrate that the maximum likelihood method produces binary interaction parameters for cubic equations of state capable of satisfactorily describing phase equilibrium, even for a binary such as ethanol/CO2. Results corroborate that the Peng-Robinson, as well as the Soave-Redlich-Kwong, equation can be used to describe phase equilibrium for the following systems: components of essential oil of orange/CO2.

  27. The phylogenetic likelihood library.

    Science.gov (United States)

    Flouri, T; Izquierdo-Carrasco, F; Darriba, D; Aberer, A J; Nguyen, L-T; Minh, B Q; Von Haeseler, A; Stamatakis, A

    2015-03-01

    We introduce the Phylogenetic Likelihood Library (PLL), a highly optimized application programming interface for developing likelihood-based phylogenetic inference and postanalysis software. The PLL implements appropriate data structures and functions that allow users to quickly implement common, error-prone, and labor-intensive tasks, such as likelihood calculations, model parameter as well as branch length optimization, and tree space exploration. The highly optimized and parallelized implementation of the phylogenetic likelihood function and a thorough documentation provide a framework for rapid development of scalable parallel phylogenetic software. By example of two likelihood-based phylogenetic codes we show that the PLL improves the sequential performance of current software by a factor of 2-10 while requiring only 1 month of programming time for integration. We show that, when numerical scaling for preventing floating point underflow is enabled, the double precision likelihood calculations in the PLL are up to 1.9 times faster than those in BEAGLE. On an empirical DNA dataset with 2000 taxa the AVX version of PLL is 4 times faster than BEAGLE (scaling enabled and required). The PLL is available at http://www.libpll.org under the GNU General Public License (GPL). © The Author(s) 2014. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.

  28. Myocardial perfusion assessment by dual-energy computed tomography in patients with intermediate to high likelihood of coronary artery disease

    International Nuclear Information System (INIS)

    De Zam, M.C.; Capunay, C.; Rodriguez Granillo, G.A.; Deviggiano, A.; Campisi, R.; Munain, M. López de; Vallejos, J.; Carrascosa, P.M.

    2015-01-01

    Objectives. We sought to explore the feasibility and diagnostic performance of dual-energy computed tomography (DECT) for the evaluation of myocardial perfusion in patients with intermediate to high likelihood of coronary artery disease (CAD), and to assess the impact of beam hardening artifacts (HAE). Methods. The present prospective study involved patients with known or suspected CAD referred for myocardial perfusion imaging by single-photon emission computed tomography (SPECT). Twenty patients were included in the study protocol, and scanned using DECT imaging (n = 20). The same pharmacological stress was used for DECT and SPECT scans. Results. A total of 680 left ventricular segments were evaluated by DECT and SPECT. The contrast to noise ratio was 8.8±2.9. The diagnostic performance of DECT was very good in identifying perfusion defects [area under ROC curve (AUC) of DECT 0.90 (0.86-0.94)] compared with SPECT, and remained unaffected when including only segments affected by beam hardening artifacts (BHA) [AUC= DECT 0.90 (0.84-0.96)]. Conclusions. In this pilot investigation, myocardial perfusion assessment by DECT imaging in patients with intermediate to high likelihood of CAD was feasible and remained unaffected by the presence of BHA. (authors)

  29. Receiver-operating characteristic curves and likelihood ratios: improvements over traditional methods for the evaluation and application of veterinary clinical pathology tests

    DEFF Research Database (Denmark)

    Gardner, Ian A.; Greiner, Matthias

    2006-01-01

    Receiver-operating characteristic (ROC) curves provide a cutoff-independent method for the evaluation of continuous or ordinal tests used in clinical pathology laboratories. The area under the curve is a useful overall measure of test accuracy and can be used to compare different tests (or different equipment) used by the same tester, as well as the accuracy of different diagnosticians that use the same test material. To date, ROC analysis has not been widely used in veterinary clinical pathology studies, although it should be considered a useful complement to estimates of sensitivity and specificity in test evaluation studies. In addition, calculation of likelihood ratios can potentially improve the clinical utility of such studies because likelihood ratios provide an indication of how the post-test probability changes as a function of the magnitude of the test results. For ordinal test...
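    A brief sketch of the two quantities discussed, on synthetic test scores: the cutoff-independent AUC, and likelihood ratios at a chosen cutoff (interval likelihood ratios for ordinal tests follow the same logic); the distributions and cutoff are illustrative assumptions:

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(5)
    # Hypothetical continuous test: diseased animals score higher on average.
    diseased = rng.normal(2.0, 1.0, 300)
    healthy = rng.normal(0.0, 1.0, 700)
    y = np.r_[np.ones(300), np.zeros(700)]
    score = np.r_[diseased, healthy]

    print("AUC:", round(roc_auc_score(y, score), 3))  # overall accuracy, no cutoff

    cutoff = 1.0
    se = (diseased >= cutoff).mean()   # sensitivity at this cutoff
    sp = (healthy < cutoff).mean()     # specificity at this cutoff
    # Post-test odds = pre-test odds * LR: the LR shows how a result of a
    # given magnitude shifts the probability of disease.
    print("LR+ =", round(se / (1 - sp), 2), " LR- =", round((1 - se) / sp, 2))
    ```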

  30. Likelihood of Bone Recurrence in Prior Sites of Metastasis in Patients With High-Risk Neuroblastoma

    Energy Technology Data Exchange (ETDEWEB)

    Polishchuk, Alexei L. [Department of Radiation Oncology, University of California at San Francisco School of Medicine and UCSF Benioff Children's Hospital, San Francisco, California (United States); Li, Richard [Division of Radiation Oncology, Dana Farber/Boston Children's Cancer and Blood Disorders Center, Brigham and Women's Hospital, Harvard Medical School, Boston, Massachusetts (United States); Hill-Kayser, Christine [Department of Radiation Oncology, University of Pennsylvania School of Medicine, Philadelphia, Pennsylvania (United States); Little, Anthony [Division of Oncology, Children's Hospital of Philadelphia, Philadelphia, Pennsylvania (United States); Hawkins, Randall A. [Department of Radiology, University of California at San Francisco School of Medicine and UCSF Benioff Children's Hospital, San Francisco, California (United States); Hamilton, Jeffrey; Lau, Michael [Department of Radiation Oncology, University of California at San Francisco School of Medicine and UCSF Benioff Children's Hospital, San Francisco, California (United States); Tran, Hung Chi [Division of Hematology/Oncology, Children's Hospital of Los Angeles, Los Angeles, California (United States); Strahlendorf, Caron [Division of Hematology and Oncology, Department of Pediatrics, The University of British Columbia, Vancouver, British Columbia (Canada); Lemons, Richard S. [Division of Pediatric Hematology/Oncology, University of Utah School of Medicine, Salt Lake City, Utah (United States); Weinberg, Vivian [Department of Radiation Oncology, University of California at San Francisco School of Medicine and UCSF Benioff Children's Hospital, San Francisco, California (United States); Matthay, Katherine K.; DuBois, Steven G. [Department of Pediatrics, University of California at San Francisco School of Medicine and UCSF Benioff Children's Hospital, San Francisco, California (United States); and others

    2014-07-15

    Purpose/Objectives: Despite recent improvements in outcomes, 40% of children with high-risk neuroblastoma will experience relapse, facing a guarded prognosis for long-term cure. Whether recurrences are at new sites or sites of original disease may guide decision making during initial therapy. Methods and Materials: Eligible patients were retrospectively identified from institutional databases at first metastatic relapse of high-risk neuroblastoma. Included patients had disease involving metaiodobenzylguanidine (MIBG)-avid metastatic sites at diagnosis and first relapse, achieved a complete or partial response with no more than one residual MIBG-avid site before first relapse, and received no total body irradiation or therapy with 131I-MIBG before first relapse. Anatomically defined metastatic sites were tracked from diagnosis through first relapse to determine the tendency of disease to recur at previously involved versus uninvolved sites and to assess whether this pattern was influenced by site irradiation. Results: Of 159 MIBG-avid metastatic sites identified among 43 patients at first relapse, 131 (82.4%) overlapped anatomically with the set of 525 sites present at diagnosis. This distribution was similar for bone sites, but patterns of relapse were more varied for the smaller subset of soft tissue metastases. Among all metastatic sites at diagnosis in our subsequently relapsed patient cohort, only 3 of 19 irradiated sites (15.8%) recurred as compared with 128 of 506 (25.3%) unirradiated sites. Conclusions: Metastatic bone relapse in neuroblastoma usually occurs at anatomic sites of previous disease. Metastatic sites identified at diagnosis that did not receive radiation during frontline therapy appeared to have a higher risk of involvement at first relapse relative to previously irradiated metastatic sites. These observations support the current paradigm of irradiating metastases that persist after induction chemotherapy in high-risk patients. Furthermore ...

  31. Statistical Identification of Composed Visual Features Indicating High Likelihood of Grasp Success

    DEFF Research Database (Denmark)

    Thomsen, Mikkel Tang; Bodenhagen, Leon; Krüger, Norbert

    2013-01-01

    ... configurations of three 3D surface features that predict grasping actions with a high success probability. The strategy is based on first computing spatial relations between visual entities and, secondly, exploring the cross-space of these relational features and grasping actions. The data foundation for identifying such indicative feature constellations is generated in a simulated environment wherein visual features are extracted and a large number of grasping actions are evaluated through dynamic simulation. Based on the identified feature constellations, we validate by applying the acquired knowledge ...

  32. Local Likelihood Approach for High-Dimensional Peaks-Over-Threshold Inference

    KAUST Repository

    Baki, Zhuldyzay

    2018-05-14

    Global warming is affecting the Earth climate year by year, the biggest difference being observable in increasing temperatures in the World Ocean. Following the long-term global ocean warming trend, average sea surface temperatures across the global tropics and subtropics have increased by 0.4–1°C in the last 40 years. These rates become even higher in semi-enclosed southern seas, such as the Red Sea, threatening the survival of thermal-sensitive species. As average sea surface temperatures are projected to continue to rise, careful study of future developments of extreme temperatures is paramount for the sustainability of marine ecosystem and biodiversity. In this thesis, we use Extreme-Value Theory to study sea surface temperature extremes from a gridded dataset comprising 16703 locations over the Red Sea. The data were provided by Operational SST and Sea Ice Analysis (OSTIA), a satellite-based data system designed for numerical weather prediction. After pre-processing the data to account for seasonality and global trends, we analyze the marginal distribution of extremes, defined as observations exceeding a high spatially varying threshold, using the Generalized Pareto distribution. This model allows us to extrapolate beyond the observed data to compute the 100-year return levels over the entire Red Sea, confirming the increasing trend of extreme temperatures. To understand the dynamics governing the dependence of extreme temperatures in the Red Sea, we propose a flexible local approach based on R-Pareto processes, which extend the univariate Generalized Pareto distribution to the spatial setting. Assuming that the sea surface temperature varies smoothly over space, we perform inference based on the gradient score method over small regional neighborhoods, in which the data are assumed to be stationary in space. This approach allows us to capture spatial non-stationarity, and to reduce the overall computational cost by taking advantage of ...
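    A minimal peaks-over-threshold sketch of the marginal analysis described, fitting a Generalized Pareto distribution to threshold exceedances and extrapolating a 100-year return level; the data are synthetic and the 98% threshold choice is illustrative:

    ```python
    import numpy as np
    from scipy.stats import genpareto

    rng = np.random.default_rng(6)
    x = rng.gumbel(28.0, 1.2, 14600)     # ~40 years of daily values (toy data)

    u = np.quantile(x, 0.98)             # high threshold
    exc = x[x > u] - u                   # peaks over threshold
    zeta = (x > u).mean()                # exceedance probability

    # Fit GP(sigma, xi) to the excesses (location pinned at 0).
    xi, _, sigma = genpareto.fit(exc, floc=0)

    # m-observation return level: x_m = u + (sigma/xi) * ((m * zeta)^xi - 1)
    # (for xi -> 0 this reduces to u + sigma * log(m * zeta)).
    m = 100 * 365                        # 100-year level for daily data
    rl = u + sigma / xi * ((m * zeta) ** xi - 1.0)
    print(f"u={u:.2f}, xi={xi:.3f}, sigma={sigma:.3f}, 100-yr level={rl:.2f}")
    ```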

  33. ColliderBit. A GAMBIT module for the calculation of high-energy collider observables and likelihoods

    Energy Technology Data Exchange (ETDEWEB)

    Balazs, Csaba [Monash University, School of Physics and Astronomy, Melbourne, VIC (Australia); Australian Research Council Centre of Excellence for Particle Physics at the Tera-scale (Australia); Buckley, Andy [University of Glasgow, SUPA, School of Physics and Astronomy, Glasgow (United Kingdom); Dal, Lars A.; Krislock, Abram; Raklev, Are [University of Oslo, Department of Physics, Oslo (Norway); Farmer, Ben [AlbaNova University Centre, Oskar Klein Centre for Cosmoparticle Physics, Stockholm (Sweden); Jackson, Paul; Murnane, Daniel; White, Martin [Australian Research Council Centre of Excellence for Particle Physics at the Tera-scale (Australia); University of Adelaide, Department of Physics, Adelaide, SA (Australia); Kvellestad, Anders [NORDITA, Stockholm (Sweden); Putze, Antje [Universite de Savoie, LAPTh, Annecy-le-Vieux (France); Rogan, Christopher [Harvard University, Department of Physics, Cambridge, MA (United States); Saavedra, Aldo [Australian Research Council Centre of Excellence for Particle Physics at the Tera-scale (Australia); The University of Sydney, Faculty of Engineering and Information Technologies, Centre for Translational Data Science, School of Physics, Sydney, NSW (Australia); Scott, Pat [Imperial College London, Blackett Laboratory, Department of Physics, London (United Kingdom); Weniger, Christoph [University of Amsterdam, GRAPPA, Institute of Physics, Amsterdam (Netherlands); Collaboration: The GAMBIT Scanner Workgroup

    2017-11-15

    We describe ColliderBit, a new code for the calculation of high energy collider observables in theories of physics beyond the Standard Model (BSM). ColliderBit features a generic interface to BSM models, a unique parallelised Monte Carlo event generation scheme suitable for large-scale supercomputer applications, and a number of LHC analyses, covering a reasonable range of the BSM signatures currently sought by ATLAS and CMS. ColliderBit also calculates likelihoods for Higgs sector observables, and LEP searches for BSM particles. These features are provided by a combination of new code unique to ColliderBit, and interfaces to existing state-of-the-art public codes. ColliderBit is both an important part of the GAMBIT framework for BSM inference, and a standalone tool for efficiently applying collider constraints to theories of new physics. (orig.)

  14. SkyFACT: high-dimensional modeling of gamma-ray emission with adaptive templates and penalized likelihoods

    Energy Technology Data Exchange (ETDEWEB)

    Storm, Emma; Weniger, Christoph [GRAPPA, Institute of Physics, University of Amsterdam, Science Park 904, 1090 GL Amsterdam (Netherlands); Calore, Francesca, E-mail: e.m.storm@uva.nl, E-mail: c.weniger@uva.nl, E-mail: francesca.calore@lapth.cnrs.fr [LAPTh, CNRS, 9 Chemin de Bellevue, BP-110, Annecy-le-Vieux, 74941, Annecy Cedex (France)

    2017-08-01

    We present SkyFACT (Sky Factorization with Adaptive Constrained Templates), a new approach for studying, modeling and decomposing diffuse gamma-ray emission. Like most previous analyses, the approach relies on predictions from cosmic-ray propagation codes like GALPROP and DRAGON. However, in contrast to previous approaches, we account for the fact that models are not perfect and allow for a very large number (≳ 10^5) of nuisance parameters to parameterize these imperfections. We combine methods of image reconstruction and adaptive spatio-spectral template regression in one coherent hybrid approach. To this end, we use penalized Poisson likelihood regression, with regularization functions that are motivated by the maximum entropy method. We introduce methods to efficiently handle the high dimensionality of the convex optimization problem as well as the associated semi-sparse covariance matrix, using the L-BFGS-B algorithm and Cholesky factorization. We test the method both on synthetic data as well as on gamma-ray emission from the inner Galaxy, |ℓ| < 90° and |b| < 20°, as observed by the Fermi Large Area Telescope. We finally define a simple reference model that removes most of the residual emission from the inner Galaxy, based on conventional diffuse emission components as well as components for the Fermi bubbles, the Fermi Galactic center excess, and extended sources along the Galactic disk. Variants of this reference model can serve as basis for future studies of diffuse emission in and outside the Galactic disk.
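
    As a hedged illustration of the penalized Poisson likelihood idea (not SkyFACT itself, which uses maximum-entropy-motivated regularizers and handles ~10^5 parameters), the sketch below fits template normalizations to Poisson counts with a simple quadratic penalty, minimized with L-BFGS-B as in the paper. The template matrix, penalty form, and penalty strength are assumptions.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
T = rng.uniform(0.5, 1.5, size=(200, 10))       # fixed template matrix (assumed)
theta_true = rng.uniform(0.8, 1.2, size=10)
k = rng.poisson(T @ theta_true)                 # observed counts per pixel

lam = 10.0                                      # regularization strength (assumed)

def objective(theta):
    mu = T @ theta                              # model expectation
    nll = np.sum(mu - k * np.log(mu))           # negative Poisson log-likelihood
    penalty = lam * np.sum((theta - 1.0) ** 2)  # quadratic stand-in regularizer
    return nll + penalty

res = minimize(objective, x0=np.ones(10), method="L-BFGS-B",
               bounds=[(1e-6, None)] * 10)      # keep rates positive
print(res.x)
```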

  15. Maximum likelihood phylogenetic reconstruction from high-resolution whole-genome data and a tree of 68 eukaryotes.

    Science.gov (United States)

    Lin, Yu; Hu, Fei; Tang, Jijun; Moret, Bernard M E

    2013-01-01

    The rapid accumulation of whole-genome data has renewed interest in the study of the evolution of genomic architecture under events such as rearrangements, duplications, and losses. Comparative genomics, evolutionary biology, and cancer research all require tools to elucidate the mechanisms, history, and consequences of those evolutionary events, while phylogenetics could use whole-genome data to enhance its picture of the Tree of Life. Current approaches in the area of phylogenetic analysis are limited to very small collections of closely related genomes using low-resolution data (typically a few hundred syntenic blocks); moreover, these approaches typically do not include duplication and loss events. We describe a maximum likelihood (ML) approach for phylogenetic analysis that takes into account genome rearrangements as well as duplications, insertions, and losses. Our approach can handle high-resolution genomes (with 40,000 or more markers) and can use in the same analysis genomes with very different numbers of markers. Because our approach uses a standard ML reconstruction program (RAxML), it scales up to large trees. We present the results of extensive testing on both simulated and real data showing that our approach returns very accurate results very quickly. In particular, we analyze a dataset of 68 high-resolution eukaryotic genomes, with 3,000 to 42,000 genes each, from the eGOB database; the analysis, including bootstrapping, takes just 3 hours on a desktop system and returns a tree in agreement with all well-supported branches, while also suggesting resolutions for some disputed placements.

  16. A Penalized Likelihood Framework For High-Dimensional Phylogenetic Comparative Methods And An Application To New-World Monkeys Brain Evolution.

    Science.gov (United States)

    Clavel, Julien; Aristide, Leandro; Morlon, Hélène

    2018-06-19

    Working with high-dimensional phylogenetic comparative datasets is challenging because likelihood-based multivariate methods suffer from low statistical performance as the number of traits p approaches the number of species n, and because computational complications occur when p exceeds n. Alternative phylogenetic comparative methods have recently been proposed to deal with the large-p, small-n scenario, but their use and performance are limited. Here we develop a penalized likelihood framework to deal with high-dimensional comparative datasets. We propose various penalizations and methods for selecting the intensity of the penalties. We apply this general framework to the estimation of parameters (the evolutionary trait covariance matrix and parameters of the evolutionary model) and model comparison for the high-dimensional multivariate Brownian (BM), Early-burst (EB), Ornstein-Uhlenbeck (OU) and Pagel's lambda models. We show using simulations that our penalized likelihood approach dramatically improves the estimation of evolutionary trait covariance matrices and model parameters when p approaches n, and allows for their accurate estimation when p equals or exceeds n. In addition, we show that penalized likelihood models can be efficiently compared using the Generalized Information Criterion (GIC). We implement these methods, as well as the related estimation of ancestral states and the computation of phylogenetic PCA, in the R packages RPANDA and mvMORPH. Finally, we illustrate the utility of the new framework by evaluating evolutionary model fit, analyzing integration patterns, and reconstructing evolutionary trajectories for a high-dimensional 3-D dataset of brain shape in the New World monkeys. We find clear support for an Early-burst model, suggesting an early diversification of brain morphology during the ecological radiation of the clade. Penalized likelihood offers an efficient way to deal with high-dimensional multivariate comparative data.
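
    A minimal sketch of the core idea, under strong simplifications: when p approaches or exceeds n, the sample covariance is singular, and a linear ("ridge"-type) shrinkage toward a diagonal target restores a well-conditioned estimate. This ignores the phylogenetic correlation structure that RPANDA/mvMORPH model, so it only illustrates the penalty's effect; the intensity would in practice be tuned (e.g., by GIC or cross-validation).

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 30, 40                        # fewer species than traits (p > n)
X = rng.normal(size=(n, p))          # trait matrix (simulated)

S = np.cov(X, rowvar=False)          # sample covariance: singular when p >= n
alpha = 0.3                          # penalty intensity (assumed, not tuned)
target = np.diag(np.diag(S))         # diagonal shrinkage target
S_pen = (1 - alpha) * S + alpha * target

# The penalized estimate regains full rank and is invertible.
print(np.linalg.matrix_rank(S), np.linalg.matrix_rank(S_pen))
```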

  17. Evaluation of the likelihood of reflux developing in patients with recurrent upper respiratory infections, recurrent sinusitis or recurrent otitis seen in ear-nose-throat outpatient clinics.

    Science.gov (United States)

    Önal, Zerrin; Çullu-Çokuğraş, Fügen; Işıldak, Hüseyin; Kaytaz, Asım; Kutlu, Tufan; Erkan, Tülay; Doğusoy, Gülen

    2015-01-01

    Gastroesophageal reflux is considered a risk factor for recurrent or persistent upper and lower respiratory tract conditions including asthma, chronic cough, sinusitis, laryngitis, serous otitis and paroxysmal laryngospasm. Fifty-one subjects with recurrent (more than three) episodes of upper respiratory tract infection (URTI), serous otitis or sinusitis who had been admitted to an ear-nose-throat (ENT) outpatient clinic during the previous year were enrolled in the present study to evaluate the presence of laryngeal and/or esophageal reflux. The participants, who were randomly selected, were questioned about symptoms of reflux, including vomiting, abdominal pain, failure to thrive, halitosis, bitter taste in the mouth, chronic cough, heartburn, constipation and hoarseness. All subjects had an endoscopic examination, an otoscopic examination, a tympanogram and upper GI system endoscopy. Esophagitis was diagnosed endoscopically and histologically. The likelihood of occurrence of esophagitis was found to be higher only among subjects with postglottic edema/erythema as determined by pathological laryngeal examination. The reflux complaints reported did not predict the development of esophagitis, but the odds of esophagitis occurring were ninefold greater among subjects with recurrent otitis. Of the subjects, 45.1% were Helicobacter pylori-positive. However, no association was found between esophagitis and Helicobacter pylori positivity. The likelihood of the occurrence of esophagitis was found to be increased in the presence of recurrent otitis media and/or postglottic edema, irrespective of the presence of reflux symptoms. We concluded that, in contrast to the situation where adults are concerned, the boundaries for discriminating laryngopharyngeal reflux from gastroesophageal reflux are somewhat blurred in pediatric patients.

  18. Logic of likelihood

    International Nuclear Information System (INIS)

    Wall, M.J.W.

    1992-01-01

    The notion of "probability" is generalized to that of "likelihood," and a natural logical structure is shown to exist for any physical theory which predicts likelihoods. Two physically based axioms are given for this logical structure to form an orthomodular poset, with an order-determining set of states. The results strengthen the basis of the quantum logic approach to axiomatic quantum theory. 25 refs.

  19. Clinical high risk for psychosis

    DEFF Research Database (Denmark)

    van der Steen, Y; Gimpel-Drees, J; Lataster, T

    2017-01-01

    OBJECTIVE: The aim of this study was to assess associations between momentary stress and both affective and psychotic symptoms in everyday life of individuals at clinical high risk (CHR), compared to chronic psychotic patients and healthy controls, in search of evidence of early stress...... and 26 healthy controls. RESULTS: Multilevel models showed significantly larger associations between negative affect (NA) and activity-related stress for CHR patients than for psychotic patients (P = 0.008) and for CHR compared to controls (P

  20. Predicting reattendance at a high-risk breast cancer clinic.

    Science.gov (United States)

    Ormseth, Sarah R; Wellisch, David K; Aréchiga, Adam E; Draper, Taylor L

    2015-10-01

    The research about follow-up patterns of women attending high-risk breast-cancer clinics is sparse. This study sought to profile daughters of breast-cancer patients who are likely to return versus those unlikely to return for follow-up care in a high-risk clinic. Our investigation included 131 patients attending the UCLA Revlon Breast Center High Risk Clinic. Predictor variables included age, computed breast-cancer risk, participants' perceived personal risk, clinically significant depressive symptomatology (CES-D score ≥ 16), current level of anxiety (State-Trait Anxiety Inventory), and survival status of participants' mothers (survived or passed away from breast cancer). A greater likelihood of reattendance was associated with older age (adjusted odds ratio [AOR] = 1.07, p = 0.004), computed breast-cancer risk (AOR = 1.10, p = 0.017), absence of depressive symptomatology (AOR = 0.25, p = 0.009), past psychiatric diagnosis (AOR = 3.14, p = 0.029), and maternal loss to breast cancer (AOR = 2.59, p = 0.034). Also, an interaction was found between mother's survival and perceived risk (p = 0.019), such that reattendance was associated with higher perceived risk among participants whose mothers survived (AOR = 1.04, p = 0.002), but not those whose mothers died (AOR = 0.99, p = 0.685). Furthermore, a nonlinear inverted "U" relationship was observed between state anxiety and reattendance (p = 0.037); participants with moderate anxiety were more likely to reattend than those with low or high anxiety levels. Demographic, medical, and psychosocial factors were found to be independently associated with reattendance to a high-risk breast-cancer clinic. Explication of the profiles of women who may or may not reattend may serve to inform the development and implementation of interventions to increase the likelihood of follow-up care.

  1. Marijuana usage in relation to harmfulness ratings, perceived likelihood of negative consequences, and defense mechanisms in high school students.

    Science.gov (United States)

    Como-Lesko, N; Primavera, L H; Szeszko, P R

    1994-08-01

    This study investigated high school students' marijuana usage patterns in relation to their harmfulness ratings of 15 licit and illicit drugs, perceived negative consequences from using marijuana, and types of defense mechanisms employed. Subjects were classified into one of five pattern-of-use groups based on marijuana usage: principled nonusers, nonusers, light users, moderate users, and heavy users. Principled nonusers (individuals who have never used marijuana and would not do so if it was legalized) rated marijuana, hashish, cocaine, and alcohol as significantly more harmful than heavy users. A cluster analysis of the drugs' harmfulness ratings best fit a three cluster solution and were named medicinal drugs, recreational drugs, and hard drugs. In general, principled nonusers rated negative consequences from using marijuana as significantly more likely to occur than other groups. Principled nonusers and heavy users utilized reversal from the Defense Mechanism Inventory, which includes repression and denial, significantly more than nonusers, indicating some trait common to the two extreme pattern-of-use groups.

  2. Incremental value of myocardial perfusion over coronary angiography by spectral computed tomography in patients with intermediate to high likelihood of coronary artery disease

    Energy Technology Data Exchange (ETDEWEB)

    Carrascosa, Patricia M., E-mail: investigacion@diagnosticomaipu.com.ar; Deviggiano, Alejandro; Capunay, Carlos; Campisi, Roxana; López Munain, Marina de; Vallejos, Javier; Tajer, Carlos; Rodriguez-Granillo, Gaston A.

    2015-04-15

    Highlights: • We evaluated myocardial perfusion by dual energy computed tomography (DECT). • We included patients with intermediate to high likelihood of coronary artery disease. • Stress myocardial perfusion by DECT had a reliable accuracy for the detection of ischemia. • Stress myocardial perfusion with DECT showed an incremental value over anatomical evaluation. • DECT imaging was associated with a significant reduction in radiation dose compared to SPECT. -- Abstract: Purpose: We sought to explore the diagnostic performance of dual energy computed tomography (DECT) for the evaluation of myocardial perfusion in patients with intermediate to high likelihood of coronary artery disease (CAD). Materials and methods: Consecutive patients with known or suspected CAD referred for myocardial perfusion imaging by single-photon emission computed tomography (SPECT) constituted the study population and were scanned using a DECT scanner equipped with gemstone detectors for spectral imaging, and a SPECT scanner. The same pharmacological stress was used for both scans. Results: Twenty-five patients were prospectively included in the study protocol. The mean age was 63.4 ± 10.6 years. The total mean effective radiation dose was 7.5 ± 1.2 mSv with DECT and 8.2 ± 1.7 mSv with SPECT (p = 0.007). A total of 425 left ventricular segments were evaluated by DECT, showing a reliable accuracy for the detection of reversible perfusion defects [area under ROC curve (AUC) 0.84 (0.80–0.87)]. Furthermore, adding stress myocardial perfusion provided a significant incremental value over anatomical evaluation alone by computed tomography coronary angiography [AUC 0.70 (0.65–0.74), p = 0.003]. Conclusions: In this pilot investigation, stress myocardial perfusion by DECT demonstrated a significant incremental value over anatomical evaluation alone by CTCA for the detection of reversible perfusion defects.

  3. Earthquake likelihood model testing

    Science.gov (United States)

    Schorlemmer, D.; Gerstenberger, M.C.; Wiemer, S.; Jackson, D.D.; Rhoades, D.A.

    2007-01-01

    INTRODUCTION: The Regional Earthquake Likelihood Models (RELM) project aims to produce and evaluate alternate models of earthquake potential (probability per unit volume, magnitude, and time) for California. Based on differing assumptions, these models are produced to test the validity of their assumptions and to explore which models should be incorporated in seismic hazard and risk evaluation. Tests based on physical and geological criteria are useful but we focus on statistical methods using future earthquake catalog data only. We envision two evaluations: a test of consistency with observed data and a comparison of all pairs of models for relative consistency. Both tests are based on the likelihood method, and both are fully prospective (i.e., the models are not adjusted to fit the test data). To be tested, each model must assign a probability to any possible event within a specified region of space, time, and magnitude. For our tests the models must use a common format: earthquake rates in specified “bins” with location, magnitude, time, and focal mechanism limits. Seismology cannot yet deterministically predict individual earthquakes; however, it should seek the best possible models for forecasting earthquake occurrence. This paper describes the statistical rules of an experiment to examine and test earthquake forecasts. The primary purposes of the tests described below are to evaluate physical models for earthquakes, assure that source models used in seismic hazard and risk studies are consistent with earthquake data, and provide quantitative measures by which models can be assigned weights in a consensus model or be judged as suitable for particular regions. In this paper we develop a statistical method for testing earthquake likelihood models. A companion paper (Schorlemmer and Gerstenberger 2007, this issue) discusses the actual implementation of these tests in the framework of the RELM initiative. Statistical testing of hypotheses is a common task and a
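
    The core quantity of such a likelihood test can be shown with a short, hedged sketch: a model forecasts an expected earthquake rate per space-magnitude bin, the catalog supplies observed counts, and the joint log-likelihood is a sum of Poisson log-probabilities. The rates and counts below are invented; RELM's actual tests add consistency checks and pairwise comparison machinery on top of this quantity.

```python
import numpy as np
from scipy.stats import poisson

forecast_rates = np.array([0.2, 1.5, 0.05, 0.8])   # expected events per bin (invented)
observed_counts = np.array([0, 2, 0, 1])           # events in the test catalog (invented)

# Joint log-likelihood of the catalog under the forecast.
log_like = poisson.logpmf(observed_counts, forecast_rates).sum()
print(f"joint log-likelihood: {log_like:.3f}")

# Comparing two forecasts then reduces to comparing their joint
# log-likelihoods on the same observed catalog (higher is better).
```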

  4. Likelihood estimators for multivariate extremes

    KAUST Repository

    Huser, Raphaël; Davison, Anthony C.; Genton, Marc G.

    2015-01-01

    The main approach to inference for multivariate extremes consists in approximating the joint upper tail of the observations by a parametric family arising in the limit for extreme events. The latter may be expressed in terms of componentwise maxima, high threshold exceedances or point processes, yielding different but related asymptotic characterizations and estimators. The present paper clarifies the connections between the main likelihood estimators, and assesses their practical performance. We investigate their ability to estimate the extremal dependence structure and to predict future extremes, using exact calculations and simulation, in the case of the logistic model.

  6. Likelihood devices in spatial statistics

    NARCIS (Netherlands)

    Zwet, E.W. van

    1999-01-01

    One of the main themes of this thesis is the application to spatial data of modern semi- and nonparametric methods. Another, closely related theme is maximum likelihood estimation from spatial data. Maximum likelihood estimation is not common practice in spatial statistics. The method of moments

  7. Efficient Bit-to-Symbol Likelihood Mappings

    Science.gov (United States)

    Moision, Bruce E.; Nakashima, Michael A.

    2010-01-01

    This innovation is an efficient algorithm designed to perform bit-to-symbol and symbol-to-bit likelihood mappings that represent a significant portion of the complexity of an error-correction code decoder for high-order constellations. Recent implementation of the algorithm in hardware has yielded an 8-percent reduction in overall area relative to the prior design.

  8. Extended likelihood inference in reliability

    International Nuclear Information System (INIS)

    Martz, H.F. Jr.; Beckman, R.J.; Waller, R.A.

    1978-10-01

    Extended likelihood methods of inference are developed in which subjective information in the form of a prior distribution is combined with sampling results by means of an extended likelihood function. The extended likelihood function is standardized for use in obtaining extended likelihood intervals. Extended likelihood intervals are derived for the mean of a normal distribution with known variance, the failure-rate of an exponential distribution, and the parameter of a binomial distribution. Extended second-order likelihood methods are developed and used to solve several prediction problems associated with the exponential and binomial distributions. In particular, such quantities as the next failure-time, the number of failures in a given time period, and the time required to observe a given number of failures are predicted for the exponential model with a gamma prior distribution on the failure-rate. In addition, six types of life testing experiments are considered. For the binomial model with a beta prior distribution on the probability of nonsurvival, methods are obtained for predicting the number of nonsurvivors in a given sample size and for predicting the required sample size for observing a specified number of nonsurvivors. Examples illustrate each of the methods developed. Finally, comparisons are made with Bayesian intervals in those cases where these are known to exist.
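
    One of the predictions mentioned above has a convenient closed form and makes a good worked example: with exponential failure times and a gamma prior on the failure rate, the predictive distribution of the next failure time is Lomax (Pareto type II). The prior values and observed times below are invented.

```python
import numpy as np

a, b = 2.0, 100.0                       # gamma prior: shape a, rate b (hours; assumed)
times = np.array([40.0, 65.0, 120.0])   # observed failure times (hypothetical)

a_post = a + times.size                 # posterior shape
b_post = b + times.sum()                # posterior rate

# Predictive density of the next failure time t (Lomax):
#   p(t) = a' * b'**a' / (b' + t)**(a' + 1),  t > 0
mean_next = b_post / (a_post - 1)       # predictive mean (finite for a' > 1)
print(f"predicted mean time to next failure: {mean_next:.1f} hours")
```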

  9. Obtaining reliable Likelihood Ratio tests from simulated likelihood functions

    DEFF Research Database (Denmark)

    Andersen, Laura Mørch

    It is standard practice by researchers and the default option in many statistical programs to base test statistics for mixed models on simulations using asymmetric draws (e.g. Halton draws). This paper shows that when the estimated likelihood functions depend on standard deviations of mixed param...

  10. Composite likelihood estimation of demographic parameters

    Directory of Open Access Journals (Sweden)

    Garrigan Daniel

    2009-11-01

    accuracy, demographic parameters from three simulated data sets that vary in the magnitude of a founder event and a skew in the effective population size of the X chromosome relative to the autosomes. The behavior of the Markov chain is also examined and shown to converge to its stationary distribution, while also showing high levels of parameter mixing. The analysis of three pairwise comparisons of sub-Saharan African human populations with non-African human populations does not provide unequivocal support for a strong non-African founder event from these nuclear data. The estimates do, however, suggest a skew in the ratio of X chromosome to autosome effective population size that is greater than one. However, in all three cases, the 95% highest posterior density interval for this ratio does include three-fourths, the value expected under an equal breeding sex ratio. Conclusion: The implementation of composite and approximate likelihood methods in a framework that includes MCMCMC demographic parameter estimation shows great promise for being flexible and computationally efficient enough to scale up to the level of whole-genome polymorphism and divergence analysis. Further work must be done to characterize the effects of the assumption of linkage equilibrium among genomic regions that is crucial to the validity of applying the composite likelihood method.

  11. Planck intermediate results: XVI. Profile likelihoods for cosmological parameters

    DEFF Research Database (Denmark)

    Bartlett, J.G.; Cardoso, J.-F.; Delabrouille, J.

    2014-01-01

    We explore the 2013 Planck likelihood function with a high-precision multi-dimensional minimizer (Minuit). This allows a refinement of the ΛCDM best-fit solution with respect to previously-released results, and the construction of frequentist confidence intervals using profile likelihoods. The agr...

  12. Ego involvement increases doping likelihood.

    Science.gov (United States)

    Ring, Christopher; Kavussanu, Maria

    2018-08-01

    Achievement goal theory provides a framework to help understand how individuals behave in achievement contexts, such as sport. Evidence concerning the role of motivation in the decision to use banned performance enhancing substances (i.e., doping) is equivocal on this issue. The extant literature shows that dispositional goal orientation has been weakly and inconsistently associated with doping intention and use. It is possible that goal involvement, which describes the situational motivational state, is a stronger determinant of doping intention. Accordingly, the current study used an experimental design to examine the effects of goal involvement, manipulated using direct instructions and reflective writing, on doping likelihood in hypothetical situations in college athletes. The ego-involving goal increased doping likelihood compared to no goal and a task-involving goal. The present findings provide the first evidence that ego involvement can sway the decision to use doping to improve athletic performance.

  13. Phylogenetic analysis using parsimony and likelihood methods.

    Science.gov (United States)

    Yang, Z

    1996-02-01

    The assumptions underlying the maximum-parsimony (MP) method of phylogenetic tree reconstruction were intuitively examined by studying the way the method works. Computer simulations were performed to corroborate the intuitive examination. Parsimony appears to involve very stringent assumptions concerning the process of sequence evolution, such as constancy of substitution rates between nucleotides, constancy of rates across nucleotide sites, and equal branch lengths in the tree. For practical data analysis, the requirement of equal branch lengths means similar substitution rates among lineages (the existence of an approximate molecular clock), relatively long interior branches, and also few species in the data. However, a small amount of evolution is neither a necessary nor a sufficient requirement of the method. The difficulties involved in the application of current statistical estimation theory to tree reconstruction were discussed, and it was suggested that the approach proposed by Felsenstein (1981, J. Mol. Evol. 17: 368-376) for topology estimation, as well as its many variations and extensions, differs fundamentally from the maximum likelihood estimation of a conventional statistical parameter. Evidence was presented showing that the Felsenstein approach does not share the asymptotic efficiency of the maximum likelihood estimator of a statistical parameter. Computer simulations were performed to study the probability that MP recovers the true tree under a hierarchy of models of nucleotide substitution; its performance relative to the likelihood method was especially noted. The results appeared to support the intuitive examination of the assumptions underlying MP. When a simple model of nucleotide substitution was assumed to generate data, the probability that MP recovers the true topology could be as high as, or even higher than, that for the likelihood method. When the assumed model became more complex and realistic, e.g., when substitution rates were

  14. Factors Associated with Young Adults’ Pregnancy Likelihood

    Science.gov (United States)

    Kitsantas, Panagiota; Lindley, Lisa L.; Wu, Huichuan

    2014-01-01

    OBJECTIVES: While progress has been made to reduce adolescent pregnancies in the United States, rates of unplanned pregnancy among young adults (18–29 years) remain high. In this study, we assessed factors associated with perceived likelihood of pregnancy (likelihood of getting pregnant/getting partner pregnant in the next year) among sexually experienced young adults who were not trying to get pregnant and had ever used contraceptives. METHODS: We conducted a secondary analysis of 660 young adults, 18–29 years old in the United States, from the cross-sectional National Survey of Reproductive and Contraceptive Knowledge. Logistic regression and classification tree analyses were conducted to generate profiles of young adults most likely to report anticipating a pregnancy in the next year. RESULTS: Nearly one-third (32%) of young adults indicated they believed they had at least some likelihood of becoming pregnant in the next year. Young adults who believed that avoiding pregnancy was not very important were most likely to report pregnancy likelihood (odds ratio [OR], 5.21; 95% CI, 2.80–9.69), as were young adults for whom avoiding a pregnancy was important but who were not satisfied with their current contraceptive method (OR, 3.93; 95% CI, 1.67–9.24), attended religious services frequently (OR, 3.0; 95% CI, 1.52–5.94), were uninsured (OR, 2.63; 95% CI, 1.31–5.26), and were likely to have unprotected sex in the next three months (OR, 1.77; 95% CI, 1.04–3.01). DISCUSSION: These results may help guide future research and the development of pregnancy prevention interventions targeting sexually experienced young adults. PMID:25782849
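
    A hedged sketch of the reported analysis style: a logistic regression of a binary outcome on a few binary predictors, with adjusted odds ratios obtained by exponentiating the fitted coefficients. The data below are simulated stand-ins, not the survey data, and the predictor names are hypothetical.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 660
# Three hypothetical binary predictors, e.g. uninsured, frequent religious
# attendance, likely unprotected sex (assumption for illustration).
X = rng.integers(0, 2, size=(n, 3)).astype(float)
logit = -1.0 + X @ np.array([0.9, 1.1, 0.6])
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

model = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
odds_ratios = np.exp(model.params)          # first entry is the intercept
ci = np.exp(model.conf_int())               # 95% CIs on the odds-ratio scale
print(odds_ratios)
print(ci)
```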

  15. Efficient Detection of Repeating Sites to Accelerate Phylogenetic Likelihood Calculations.

    Science.gov (United States)

    Kobert, K; Stamatakis, A; Flouri, T

    2017-03-01

    The phylogenetic likelihood function (PLF) is the major computational bottleneck in several applications of evolutionary biology such as phylogenetic inference, species delimitation, model selection, and divergence times estimation. Given the alignment, a tree and the evolutionary model parameters, the likelihood function computes the conditional likelihood vectors for every node of the tree. Vector entries for which all input data are identical result in redundant likelihood operations which, in turn, yield identical conditional values. Such operations can be omitted for improving run-time and, using appropriate data structures, reducing memory usage. We present a fast, novel method for identifying and omitting such redundant operations in phylogenetic likelihood calculations, and assess the performance improvement and memory savings attained by our method. Using empirical and simulated data sets, we show that a prototype implementation of our method yields up to 12-fold speedups and uses up to 78% less memory than one of the fastest and most highly tuned implementations of the PLF currently available. Our method is generic and can seamlessly be integrated into any phylogenetic likelihood implementation. [Algorithms; maximum likelihood; phylogenetic likelihood function; phylogenetics]. © The Author(s) 2016. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.
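
    The idea of omitting redundant operations can be shown with a toy sketch: alignment columns with identical data produce identical conditional likelihoods, so each distinct site pattern is computed once and cached. The per-pattern computation below is a trivial stand-in for the real conditional-likelihood recursion.

```python
import numpy as np
from functools import lru_cache

alignment_columns = ["ACGT", "ACGT", "AGGT", "ACGT", "AGGT"]  # toy alignment

@lru_cache(maxsize=None)
def site_likelihood(pattern: str) -> float:
    # placeholder for the expensive per-site conditional-likelihood recursion
    return float(np.prod([0.25 for _ in pattern]))

log_like = sum(np.log(site_likelihood(col)) for col in alignment_columns)
hits = site_likelihood.cache_info().hits     # repeated patterns served from cache
print(f"log-likelihood: {log_like:.3f}; cache hits: {hits}")
```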

  16. Maximum likelihood of phylogenetic networks.

    Science.gov (United States)

    Jin, Guohua; Nakhleh, Luay; Snir, Sagi; Tuller, Tamir

    2006-11-01

    Horizontal gene transfer (HGT) is believed to be ubiquitous among bacteria, and plays a major role in their genome diversification as well as their ability to develop resistance to antibiotics. In light of its evolutionary significance and implications for human health, developing accurate and efficient methods for detecting and reconstructing HGT is imperative. In this article we provide a new HGT-oriented likelihood framework for many problems that involve phylogeny-based HGT detection and reconstruction. Beside the formulation of various likelihood criteria, we show that most of these problems are NP-hard, and offer heuristics for efficient and accurate reconstruction of HGT under these criteria. We implemented our heuristics and used them to analyze biological as well as synthetic data. In both cases, our criteria and heuristics exhibited very good performance with respect to identifying the correct number of HGT events as well as inferring their correct location on the species tree. Implementation of the criteria as well as heuristics and hardness proofs are available from the authors upon request. Hardness proofs can also be downloaded at http://www.cs.tau.ac.il/~tamirtul/MLNET/Supp-ML.pdf

  17. Dimension-Independent Likelihood-Informed MCMC

    KAUST Repository

    Cui, Tiangang; Law, Kody; Marzouk, Youssef

    2015-01-01

    Many Bayesian inference problems require exploring the posterior distribution of high-dimensional parameters, which in principle can be described as functions. By exploiting low-dimensional structure in the change from prior to posterior [distributions], we introduce a suite of MCMC samplers that can adapt to the complex structure of the posterior distribution, yet are well-defined on function space. Posterior sampling in nonlinear inverse problems arising from various partial differential equations and also a stochastic differential equation are used to demonstrate the efficiency of these dimension-independent likelihood-informed samplers.

  19. Dissociating response conflict and error likelihood in anterior cingulate cortex.

    Science.gov (United States)

    Yeung, Nick; Nieuwenhuis, Sander

    2009-11-18

    Neuroimaging studies consistently report activity in anterior cingulate cortex (ACC) in conditions of high cognitive demand, leading to the view that ACC plays a crucial role in the control of cognitive processes. According to one prominent theory, the sensitivity of ACC to task difficulty reflects its role in monitoring for the occurrence of competition, or "conflict," between responses to signal the need for increased cognitive control. However, a contrasting theory proposes that ACC is the recipient rather than source of monitoring signals, and that ACC activity observed in relation to task demand reflects the role of this region in learning about the likelihood of errors. Response conflict and error likelihood are typically confounded, making the theories difficult to distinguish empirically. The present research therefore used detailed computational simulations to derive contrasting predictions regarding ACC activity and error rate as a function of response speed. The simulations demonstrated a clear dissociation between conflict and error likelihood: fast response trials are associated with low conflict but high error likelihood, whereas slow response trials show the opposite pattern. Using the N2 component as an index of ACC activity, an EEG study demonstrated that when conflict and error likelihood are dissociated in this way, ACC activity tracks conflict and is negatively correlated with error likelihood. These findings support the conflict-monitoring theory and suggest that, in speeded decision tasks, ACC activity reflects current task demands rather than the retrospective coding of past performance.

  20. Understanding the properties of diagnostic tests - Part 2: Likelihood ratios.

    Science.gov (United States)

    Ranganathan, Priya; Aggarwal, Rakesh

    2018-01-01

    Diagnostic tests are used to identify subjects with and without disease. In a previous article in this series, we examined some attributes of diagnostic tests - sensitivity, specificity, and predictive values. In this second article, we look at likelihood ratios, which are useful for the interpretation of diagnostic test results in everyday clinical practice.
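
    A small worked example of these quantities, with invented sensitivity and specificity: LR+ = sensitivity / (1 - specificity), LR- = (1 - sensitivity) / specificity, and a positive result updates the pre-test probability through odds.

```python
sensitivity, specificity = 0.90, 0.80         # invented test characteristics

lr_positive = sensitivity / (1 - specificity)  # 4.5
lr_negative = (1 - sensitivity) / specificity  # 0.125

pre_test_prob = 0.20                           # assumed disease prevalence
pre_test_odds = pre_test_prob / (1 - pre_test_prob)
post_test_odds = pre_test_odds * lr_positive
post_test_prob = post_test_odds / (1 + post_test_odds)
print(f"post-test probability after a positive result: {post_test_prob:.2f}")  # ~0.53
```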

  1. The Laplace Likelihood Ratio Test for Heteroscedasticity

    Directory of Open Access Journals (Sweden)

    J. Martin van Zyl

    2011-01-01

    It is shown that the likelihood ratio test for heteroscedasticity, assuming the Laplace distribution, gives good results for Gaussian and fat-tailed data. The likelihood ratio test, assuming normality, is very sensitive to any deviation from normality, especially when the observations are from a distribution with fat tails. Such a likelihood test can also be used as a robust test for a constant variance in residuals or a time series if the data is partitioned into groups.
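
    A hedged sketch of such a test for two groups, under the simplification that each group keeps its own location: with Laplace errors the ML scale is the mean absolute deviation from the median, the maximized log-likelihood has a closed form, and twice the log-likelihood difference is referred to a chi-squared distribution with one degree of freedom. The data are simulated.

```python
import numpy as np
from scipy.stats import chi2

def max_loglik(abs_dev):
    # Maximized Laplace log-likelihood given absolute deviations from the
    # (group-specific) median: b_hat = mean(|x - m|), ll = -n*(log(2*b_hat) + 1).
    n = abs_dev.size
    b = abs_dev.mean()
    return -n * (np.log(2.0 * b) + 1.0)

rng = np.random.default_rng(4)
g1 = rng.laplace(scale=1.0, size=100)
g2 = rng.laplace(scale=2.0, size=100)          # genuinely different scale

d1 = np.abs(g1 - np.median(g1))
d2 = np.abs(g2 - np.median(g2))
ll_alt = max_loglik(d1) + max_loglik(d2)        # separate scales
ll_null = max_loglik(np.concatenate([d1, d2]))  # common scale
lr_stat = 2.0 * (ll_alt - ll_null)
print(f"LR = {lr_stat:.2f}, p = {chi2.sf(lr_stat, df=1):.4f}")
```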

  2. MXLKID: a maximum likelihood parameter identifier

    International Nuclear Information System (INIS)

    Gavel, D.T.

    1980-07-01

    MXLKID (MaXimum LiKelihood IDentifier) is a computer program designed to identify unknown parameters in a nonlinear dynamic system. Using noisy measurement data from the system, the maximum likelihood identifier computes a likelihood function (LF). Identification of system parameters is accomplished by maximizing the LF with respect to the parameters. The main body of this report briefly summarizes the maximum likelihood technique and gives instructions and examples for running the MXLKID program. MXLKID is implemented in LRLTRAN on the CDC7600 computer at LLNL. A detailed mathematical description of the algorithm is given in the appendices. 24 figures, 6 tables
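
    A minimal analogue of what such an identifier does, under strong simplifications: noisy measurements of a dynamic system are used to maximize a Gaussian likelihood over one unknown parameter. The "system" here is scalar exponential decay, standing in for the general nonlinear dynamics MXLKID handles.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(7)
t = np.linspace(0.0, 5.0, 50)
theta_true, noise_sd = 0.7, 0.05
y = np.exp(-theta_true * t) + rng.normal(0.0, noise_sd, t.size)  # noisy measurements

def neg_log_likelihood(theta):
    resid = y - np.exp(-theta * t)
    return 0.5 * np.sum(resid**2) / noise_sd**2   # Gaussian NLL up to a constant

fit = minimize_scalar(neg_log_likelihood, bounds=(0.01, 5.0), method="bounded")
print(f"ML estimate of theta: {fit.x:.3f}")
```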

  3. Maximum Likelihood Reconstruction for Magnetic Resonance Fingerprinting.

    Science.gov (United States)

    Zhao, Bo; Setsompop, Kawin; Ye, Huihui; Cauley, Stephen F; Wald, Lawrence L

    2016-08-01

    This paper introduces a statistical estimation framework for magnetic resonance (MR) fingerprinting, a recently proposed quantitative imaging paradigm. Within this framework, we present a maximum likelihood (ML) formalism to estimate multiple MR tissue parameter maps directly from highly undersampled, noisy k-space data. A novel algorithm, based on variable splitting, the alternating direction method of multipliers, and the variable projection method, is developed to solve the resulting optimization problem. Representative results from both simulations and in vivo experiments demonstrate that the proposed approach yields significantly improved accuracy in parameter estimation, compared to the conventional MR fingerprinting reconstruction. Moreover, the proposed framework provides new theoretical insights into the conventional approach. We show analytically that the conventional approach is an approximation to the ML reconstruction; more precisely, it is exactly equivalent to the first iteration of the proposed algorithm for the ML reconstruction, provided that a gridding reconstruction is used as an initialization.

  4. Essays on empirical likelihood in economics

    NARCIS (Netherlands)

    Gao, Z.

    2012-01-01

    This thesis intends to exploit the roots of empirical likelihood and its related methods in mathematical programming and computation. The roots will be connected and the connections will induce new solutions for the problems of estimation, computation, and generalization of empirical likelihood.

  5. Highly effective cystic fibrosis clinical research teams: critical success factors.

    Science.gov (United States)

    Retsch-Bogart, George Z; Van Dalfsen, Jill M; Marshall, Bruce C; George, Cynthia; Pilewski, Joseph M; Nelson, Eugene C; Goss, Christopher H; Ramsey, Bonnie W

    2014-08-01

    Bringing new therapies to patients with rare diseases depends in part on optimizing clinical trial conduct through efficient study start-up processes and rapid enrollment. Suboptimal execution of clinical trials in academic medical centers not only results in high cost to institutions and sponsors, but also delays the availability of new therapies. Addressing the factors that contribute to poor outcomes requires novel, systematic approaches tailored to the institution and disease under study. Objective: To use clinical trial performance metrics data analysis to select high-performing cystic fibrosis (CF) clinical research teams and then identify factors contributing to their success. Design: Mixed-methods research, including semi-structured qualitative interviews of high-performing research teams. Setting: CF research teams at nine clinical centers from the CF Foundation Therapeutics Development Network. Measurements: Survey of site characteristics, direct observation of team meetings and facilities, and semi-structured interviews with clinical research team members and institutional program managers and leaders in clinical research. Results: Critical success factors noted at all nine high-performing centers were: 1) strong leadership, 2) established and effective communication within the research team and with the clinical care team, and 3) adequate staff. Other frequent characteristics included a mature culture of research, customer service orientation in interactions with study participants, shared efficient processes, continuous process improvement activities, and a businesslike approach to clinical research. Conclusions: Clinical research metrics allowed identification of high-performing clinical research teams. Site visits identified several critical factors leading to highly successful teams that may help other clinical research teams improve clinical trial performance.

  6. Dimension-independent likelihood-informed MCMC

    KAUST Repository

    Cui, Tiangang; Law, Kody; Marzouk, Youssef M.

    2015-10-08

    Many Bayesian inference problems require exploring the posterior distribution of high-dimensional parameters that represent the discretization of an underlying function. This work introduces a family of Markov chain Monte Carlo (MCMC) samplers that can adapt to the particular structure of a posterior distribution over functions. Two distinct lines of research intersect in the methods developed here. First, we introduce a general class of operator-weighted proposal distributions that are well defined on function space, such that the performance of the resulting MCMC samplers is independent of the discretization of the function. Second, by exploiting local Hessian information and any associated low-dimensional structure in the change from prior to posterior distributions, we develop an inhomogeneous discretization scheme for the Langevin stochastic differential equation that yields operator-weighted proposals adapted to the non-Gaussian structure of the posterior. The resulting dimension-independent and likelihood-informed (DILI) MCMC samplers may be useful for a large class of high-dimensional problems where the target probability measure has a density with respect to a Gaussian reference measure. Two nonlinear inverse problems are used to demonstrate the efficiency of these DILI samplers: an elliptic PDE coefficient inverse problem and path reconstruction in a conditioned diffusion.

  8. Asymptotic Likelihood Distribution for Correlated & Constrained Systems

    CERN Document Server

    Agarwal, Ujjwal

    2016-01-01

    This report describes my work as a summer student at CERN. It discusses the asymptotic distribution of the likelihood ratio when the total number of parameters is h and 2 of these are constrained and correlated.

  9. Maximum-Likelihood Detection Of Noncoherent CPM

    Science.gov (United States)

    Divsalar, Dariush; Simon, Marvin K.

    1993-01-01

    Simplified detectors proposed for use in maximum-likelihood-sequence detection of symbols in alphabet of size M transmitted by uncoded, full-response continuous phase modulation over radio channel with additive white Gaussian noise. Structures of receivers derived from particular interpretation of maximum-likelihood metrics. Receivers include front ends, structures of which depend only on M, analogous to those in receivers of coherent CPM. Parts of receivers following front ends have structures whose complexity depends on N.

  10. Multilevel maximum likelihood estimation with application to covariance matrices

    Czech Academy of Sciences Publication Activity Database

    Turčičová, Marie; Mandel, J.; Eben, Kryštof

    Published online: 23 January (2018). ISSN 0361-0926. R&D Projects: GA ČR GA13-34856S. Institutional support: RVO:67985807. Keywords: Fisher information * High dimension * Hierarchical maximum likelihood * Nested parameter spaces * Spectral diagonal covariance model * Sparse inverse covariance model. Subject RIV: BB - Applied Statistics, Operational Research. Impact factor: 0.311, year: 2016

  11. Maximum Likelihood and Restricted Likelihood Solutions in Multiple-Method Studies.

    Science.gov (United States)

    Rukhin, Andrew L

    2011-01-01

    A formulation of the problem of combining data from several sources is discussed in terms of random effects models. The unknown measurement precision is assumed not to be the same for all methods. We investigate maximum likelihood solutions in this model. By representing the likelihood equations as simultaneous polynomial equations, the exact form of the Groebner basis for their stationary points is derived when there are two methods. A parametrization of these solutions which allows their comparison is suggested. A numerical method for solving likelihood equations is outlined, and an alternative to the maximum likelihood method, the restricted maximum likelihood, is studied. In the situation when the methods' variances are considered known, an upper bound on the between-method variance is obtained. The relationship between likelihood equations and moment-type equations is also discussed.

  12. Likelihood inference for unions of interacting discs

    DEFF Research Database (Denmark)

    Møller, Jesper; Helisová, Katarina

    To the best of our knowledge, this is the first paper which discusses likelihood inference for a random set using a germ-grain model, where the individual grains are unobservable, edge effects occur, and other complications appear. We consider the case where the grains form a disc process modelled...... is specified with respect to a given marked Poisson model (i.e. a Boolean model). We show how edge effects and other complications can be handled by considering a certain conditional likelihood. Our methodology is illustrated by analyzing Peter Diggle's heather dataset, where we discuss the results...... of simulation-based maximum likelihood inference and the effect of specifying different reference Poisson models....

  13. Maximum likelihood estimation for integrated diffusion processes

    DEFF Research Database (Denmark)

    Baltazar-Larios, Fernando; Sørensen, Michael

    We propose a method for obtaining maximum likelihood estimates of parameters in diffusion models when the data is a discrete time sample of the integral of the process, while no direct observations of the process itself are available. The data are, moreover, assumed to be contaminated...... EM-algorithm to obtain maximum likelihood estimates of the parameters in the diffusion model. As part of the algorithm, we use a recent simple method for approximate simulation of diffusion bridges. In simulation studies for the Ornstein-Uhlenbeck process and the CIR process the proposed method works...... by measurement errors. Integrated volatility is an example of this type of observations. Another example is ice-core data on oxygen isotopes used to investigate paleo-temperatures. The data can be viewed as incomplete observations of a model with a tractable likelihood function. Therefore we propose a simulated...

  14. Clinical and molecular features of high-grade osteosarcoma

    NARCIS (Netherlands)

    Anninga, Jakob Klaas

    2013-01-01

    It can be concluded from this thesis that high-grade osteosarcoma is, at the clinical, pathological and molecular levels, a heterogeneous disease. To treat high-grade osteosarcoma, neo-adjuvant chemotherapy should be combined with radical surgery, irrespective of the localization. There are only 4 effective

  15. Maintaining symmetry of simulated likelihood functions

    DEFF Research Database (Denmark)

    Andersen, Laura Mørch

    This paper suggests solutions to two different types of simulation errors related to Quasi-Monte Carlo integration. Likelihood functions which depend on standard deviations of mixed parameters are symmetric in nature. This paper shows that antithetic draws preserve this symmetry and thereby...... improves precision substantially. Another source of error is that models testing away mixing dimensions must replicate the relevant dimensions of the quasi-random draws in the simulation of the restricted likelihood. These simulation errors are ignored in the standard estimation procedures used today...

  16. Likelihood inference for unions of interacting discs

    DEFF Research Database (Denmark)

    Møller, Jesper; Helisova, K.

    2010-01-01

    This is probably the first paper which discusses likelihood inference for a random set using a germ-grain model, where the individual grains are unobservable, edge effects occur and other complications appear. We consider the case where the grains form a disc process modelled by a marked point...... process, where the germs are the centres and the marks are the associated radii of the discs. We propose to use a recent parametric class of interacting disc process models, where the minimal sufficient statistic depends on various geometric properties of the random set, and the density is specified......-based maximum likelihood inference and the effect of specifying different reference Poisson models....

  17. GENERALIZATION OF RAYLEIGH MAXIMUM LIKELIHOOD DESPECKLING FILTER USING QUADRILATERAL KERNELS

    Directory of Open Access Journals (Sweden)

    S. Sridevi

    2013-02-01

    Speckle noise is the most prevalent noise in clinical ultrasound images. It appears as light and dark spots and renders pixel intensities murky. In fetal ultrasound images, edges and local fine details are especially important for obstetricians and gynecologists carrying out prenatal diagnosis of congenital heart disease. A robust despeckling filter must therefore be devised to suppress speckle noise proficiently while preserving these features. The proposed filter generalizes the Rayleigh maximum likelihood filter by exploiting statistical tools as tuning parameters and using different shapes of quadrilateral kernels to estimate the noise-free pixel from the neighborhood. The performance of various filters, namely the Median, Kuwahara, Frost, homogeneous mask, and Rayleigh maximum likelihood filters, is compared with that of the proposed filter in terms of PSNR and image profile. Comparatively, the proposed filter surpasses the conventional filters.
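
    A hedged sketch of the basic Rayleigh maximum likelihood step, using a plain square window instead of the quadrilateral kernels proposed in the paper: within each local window, the ML estimate of the Rayleigh parameter is sigma_hat = sqrt(mean(x^2) / 2), which is taken as the restored pixel value.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def rayleigh_ml_filter(image: np.ndarray, size: int = 5) -> np.ndarray:
    # Local mean of squared intensities, then the Rayleigh ML estimate.
    local_mean_sq = uniform_filter(image.astype(float) ** 2, size=size)
    return np.sqrt(local_mean_sq / 2.0)

rng = np.random.default_rng(5)
clean = np.full((64, 64), 100.0)                              # synthetic "tissue"
speckled = clean * rng.rayleigh(scale=1.0, size=clean.shape)  # multiplicative speckle
restored = rayleigh_ml_filter(speckled, size=5)
```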

  18. Planck 2013 results. XV. CMB power spectra and likelihood

    CERN Document Server

    Ade, P.A.R.; Armitage-Caplan, C.; Arnaud, M.; Ashdown, M.; Atrio-Barandela, F.; Aumont, J.; Baccigalupi, C.; Banday, A.J.; Barreiro, R.B.; Bartlett, J.G.; Battaner, E.; Benabed, K.; Benoit, A.; Benoit-Levy, A.; Bernard, J.P.; Bersanelli, M.; Bielewicz, P.; Bobin, J.; Bock, J.J.; Bonaldi, A.; Bonavera, L.; Bond, J.R.; Borrill, J.; Bouchet, F.R.; Boulanger, F.; Bridges, M.; Bucher, M.; Burigana, C.; Butler, R.C.; Calabrese, E.; Cardoso, J.F.; Catalano, A.; Challinor, A.; Chamballu, A.; Chiang, L.Y.; Chiang, H.C.; Christensen, P.R.; Church, S.; Clements, D.L.; Colombi, S.; Colombo, L.P.L.; Combet, C.; Couchot, F.; Coulais, A.; Crill, B.P.; Curto, A.; Cuttaia, F.; Danese, L.; Davies, R.D.; Davis, R.J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Delouis, J.M.; Desert, F.X.; Dickinson, C.; Diego, J.M.; Dole, H.; Donzelli, S.; Dore, O.; Douspis, M.; Dunkley, J.; Dupac, X.; Efstathiou, G.; Elsner, F.; Ensslin, T.A.; Eriksen, H.K.; Finelli, F.; Forni, O.; Frailis, M.; Fraisse, A.A.; Franceschi, E.; Gaier, T.C.; Galeotta, S.; Galli, S.; Ganga, K.; Giard, M.; Giardino, G.; Giraud-Heraud, Y.; Gjerlow, E.; Gonzalez-Nuevo, J.; Gorski, K.M.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Gudmundsson, J.E.; Hansen, F.K.; Hanson, D.; Harrison, D.; Helou, G.; Henrot-Versille, S.; Hernandez-Monteagudo, C.; Herranz, D.; Hildebrandt, S.R.; Hivon, E.; Hobson, M.; Holmes, W.A.; Hornstrup, A.; Hovest, W.; Huffenberger, K.M.; Hurier, G.; Jaffe, T.R.; Jaffe, A.H.; Jewell, J.; Jones, W.C.; Juvela, M.; Keihanen, E.; Keskitalo, R.; Kiiveri, K.; Kisner, T.S.; Kneissl, R.; Knoche, J.; Knox, L.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lahteenmaki, A.; Lamarre, J.M.; Lasenby, A.; Lattanzi, M.; Laureijs, R.J.; Lawrence, C.R.; Le Jeune, M.; Leach, S.; Leahy, J.P.; Leonardi, R.; Leon-Tavares, J.; Lesgourgues, J.; Liguori, M.; Lilje, P.B.; Lindholm, V.; Linden-Vornle, M.; Lopez-Caniego, M.; Lubin, P.M.; Macias-Perez, J.F.; Maffei, B.; Maino, D.; Mandolesi, N.; Marinucci, D.; Maris, M.; Marshall, D.J.; Martin, P.G.; Martinez-Gonzalez, E.; Masi, S.; Matarrese, S.; Matthai, F.; Mazzotta, P.; Meinhold, P.R.; Melchiorri, A.; Mendes, L.; Menegoni, E.; Mennella, A.; Migliaccio, M.; Millea, M.; Mitra, S.; Miville-Deschenes, M.A.; Molinari, D.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Moss, A.; Munshi, D.; Naselsky, P.; Nati, F.; Natoli, P.; Netterfield, C.B.; Norgaard-Nielsen, H.U.; Noviello, F.; Novikov, D.; Novikov, I.; O'Dwyer, I.J.; Orieux, F.; Osborne, S.; Oxborrow, C.A.; Paci, F.; Pagano, L.; Pajot, F.; Paladini, R.; Paoletti, D.; Partridge, B.; Pasian, F.; Patanchon, G.; Paykari, P.; Perdereau, O.; Perotto, L.; Perrotta, F.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Ponthieu, N.; Popa, L.; Poutanen, T.; Pratt, G.W.; Prezeau, G.; Prunet, S.; Puget, J.L.; Rachen, J.P.; Rahlin, A.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Ricciardi, S.; Riller, T.; Ringeval, C.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Roudier, G.; Rowan-Robinson, M.; Rubino-Martin, J.A.; Rusholme, B.; Sandri, M.; Sanselme, L.; Santos, D.; Savini, G.; Scott, D.; Seiffert, M.D.; Shellard, E.P.S.; Spencer, L.D.; Starck, J.L.; Stolyarov, V.; Stompor, R.; Sudiwala, R.; Sureau, F.; Sutton, D.; Suur-Uski, A.S.; Sygnet, J.F.; Tauber, J.A.; Tavagnacco, D.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Tucci, M.; Tuovinen, J.; Turler, M.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Varis, J.; Vielva, P.; Villa, F.; Vittorio, N.; Wade, L.A.; Wandelt, B.D.; 
Wehus, I.K.; White, M.; White, S.D.M.; Yvon, D.; Zacchei, A.; Zonca, A.

    2014-01-01

    We present the Planck likelihood, a complete statistical description of the two-point correlation function of the CMB temperature fluctuations. We use this likelihood to derive the Planck CMB power spectrum over three decades in ℓ, covering 2 ≤ ℓ ≤ 2500. For ℓ ≥ 50, we employ a correlated Gaussian likelihood approximation based on angular cross-spectra derived from the 100, 143 and 217 GHz channels. We validate our likelihood through an extensive suite of consistency tests, and assess the impact of residual foreground and instrumental uncertainties on cosmological parameters. We find good internal agreement among the high-ℓ cross-spectra with residuals of a few μK² at ℓ ≤ 1000. We compare our results with foreground-cleaned CMB maps, and with cross-spectra derived from the 70 GHz Planck map, and find broad agreement in terms of spectrum residuals and cosmological parameters. The best-fit ΛCDM cosmology is in excellent agreement with preliminary Planck polarisation spectra. The standard ΛCDM cosmology is well constrained b...

  19. Likelihood-ratio-based biometric verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    2002-01-01

    This paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that for single-user verification the likelihood ratio is optimal.

  20. Likelihood Ratio-Based Biometric Verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    The paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that, for single-user verification, the likelihood ratio is optimal.

  1. Gaussian copula as a likelihood function for environmental models

    Science.gov (United States)

    Wani, O.; Espadas, G.; Cecinati, F.; Rieckermann, J.

    2017-12-01

    Parameter estimation of environmental models always comes with uncertainty. To formally quantify this parametric uncertainty, a likelihood function needs to be formulated, which is defined as the probability of observations given fixed values of the parameter set. A likelihood function allows us to infer parameter values from observations using Bayes' theorem. The challenge is to formulate a likelihood function that reliably describes the error-generating processes which lead to the observed monitoring data, such as rainfall and runoff. If the likelihood function is not representative of the error statistics, the parameter inference will give biased parameter values. Several uncertainty estimation methods that are currently in use employ Gaussian processes as a likelihood function, because of their favourable analytical properties. The Box-Cox transformation is suggested to deal with non-symmetric and heteroscedastic errors, e.g. for flow data, which are typically more uncertain in high flows than in periods with low flows. The problem with transformations is that the results are conditional on hyper-parameters, for which it is difficult to formulate the analyst's belief a priori. In an attempt to address this problem, in this research work we suggest learning the nature of the error distribution from the errors made by the model in "past" forecasts. We use a Gaussian copula to generate semiparametric error distributions. 1) We show that this copula can then be used as a likelihood function to infer parameters, breaking away from the practice of using multivariate normal distributions. Based on the results from a didactic example of predicting rainfall runoff, 2) we demonstrate that the copula captures the predictive uncertainty of the model. 3) Finally, we find that the properties of autocorrelation and heteroscedasticity of errors are captured well by the copula, eliminating the need to use transforms. In summary, our findings suggest that copulas are an
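
    As a hedged illustration of the idea in this record (the function and variable names below are assumptions of the sketch, not the authors' code), a Gaussian copula likelihood with semiparametric margins can be evaluated by mapping model errors through their empirical CDFs to Gaussian scores:

      import numpy as np
      from scipy import stats

      def gaussian_copula_loglik(errors, corr):
          """Joint log-likelihood of model errors under a Gaussian copula with
          semiparametric (empirical) margins. errors is (n, d): n replicates of
          a d-dimensional error vector; corr is a d x d correlation matrix."""
          n, d = errors.shape
          u = stats.rankdata(errors, axis=0) / (n + 1.0)  # empirical-CDF scores in (0, 1)
          z = stats.norm.ppf(u)                           # map to Gaussian scores
          mvn = stats.multivariate_normal(mean=np.zeros(d), cov=corr)
          # copula density: joint Gaussian density over the product of N(0,1) margins
          return float(np.sum(mvn.logpdf(z) - stats.norm.logpdf(z).sum(axis=1)))

    Maximizing this quantity over model parameters, with corr estimated from the Gaussian scores of past errors, is one way to realize the copula-based inference the record describes.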

  2. A clinical gamma camera-based pinhole collimated system for high resolution small animal SPECT imaging

    Energy Technology Data Exchange (ETDEWEB)

    Mejia, J.; Galvis-Alonso, O.Y., E-mail: mejia_famerp@yahoo.com.b [Faculdade de Medicina de Sao Jose do Rio Preto (FAMERP), SP (Brazil). Dept. de Biologia Molecular; Castro, A.A. de; Simoes, M.V. [Faculdade de Medicina de Sao Jose do Rio Preto (FAMERP), SP (Brazil). Dept. de Clinica Medica; Leite, J.P. [Universidade de Sao Paulo (FMRP/USP), Ribeirao Preto, SP (Brazil). Fac. de Medicina. Dept. de Neurociencias e Ciencias do Comportamento; Braga, J. [Instituto Nacional de Pesquisas Espaciais (INPE), Sao Jose dos Campos, SP (Brazil). Div. de Astrofisica

    2010-11-15

    The main objective of the present study was to upgrade a clinical gamma camera to obtain high resolution tomographic images of small animal organs. The system is based on a clinical gamma camera to which we have adapted a special-purpose pinhole collimator and a device for positioning and rotating the target based on a computer-controlled step motor. We developed a software tool to reconstruct the target's three-dimensional distribution of emission from a set of planar projections, based on the maximum likelihood algorithm. We present details on the hardware and software implementation. We imaged phantoms and heart and kidneys of rats. When using pinhole collimators, the spatial resolution and sensitivity of the imaging system depend on parameters such as the detector-to-collimator and detector-to-target distances and pinhole diameter. In this study, we reached an object voxel size of 0.6 mm and spatial resolution better than 2.4 and 1.7 mm full width at half maximum when 1.5- and 1.0-mm diameter pinholes were used, respectively. Appropriate sensitivity to study the target of interest was attained in both cases. Additionally, we show that as few as 12 projections are sufficient to attain good quality reconstructions, a result that implies a significant reduction of acquisition time and opens the possibility for radiotracer dynamic studies. In conclusion, a high resolution single photon emission computed tomography (SPECT) system was developed using a commercial clinical gamma camera, allowing the acquisition of detailed volumetric images of small animal organs. This type of system has important implications for research areas such as Cardiology, Neurology or Oncology. (author)
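
    The record gives no reconstruction code; the sketch below shows the standard multiplicative MLEM update that maximum-likelihood reconstruction from a set of planar projections is commonly built on, with the system matrix A (the projection geometry) as an assumed input:

      import numpy as np

      def mlem(A, y, n_iter=50):
          """Maximum-likelihood EM (MLEM) reconstruction sketch.
          A: (n_bins, n_voxels) system matrix mapping voxel activity to
          detector bins; y: measured counts per bin."""
          x = np.ones(A.shape[1])                  # flat initial estimate
          sens = np.maximum(A.sum(axis=0), 1e-12)  # per-voxel sensitivity
          for _ in range(n_iter):
              proj = np.maximum(A @ x, 1e-12)      # forward projection
              x *= (A.T @ (y / proj)) / sens       # multiplicative EM update
          return x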

  3. Maximum likelihood convolutional decoding (MCD) performance due to system losses

    Science.gov (United States)

    Webster, L.

    1976-01-01

    A model for predicting the computational performance of a maximum likelihood convolutional decoder (MCD) operating in a noisy carrier reference environment is described. This model is used to develop a subroutine that will be utilized by the Telemetry Analysis Program to compute the MCD bit error rate. When this computational model is averaged over noisy reference phase errors using a high-rate interpolation scheme, the results are found to agree quite favorably with experimental measurements.

  4. The clinical profile of high-risk mentally disordered offenders.

    Science.gov (United States)

    Yiend, Jenny; Freestone, Mark; Vazquez-Montes, Maria; Holland, Josephine; Burns, Tom

    2013-07-01

    High-risk mentally disordered offenders present a diverse array of clinical characteristics. To contain and effectively treat this heterogeneous population requires a full understanding of the group's clinical profile. This study aimed to identify and validate clusters of clinically coherent profiles within one high-risk mentally disordered population in the UK. Latent class analysis (a statistical technique to identify clustering of variance from a set of categorical variables) was applied to 174 cases using clinical diagnostic information to identify the most parsimonious model of best fit. Validity analyses were performed. Three identified classes were a 'delinquent' group (n = 119) characterised by poor educational history, strong criminal careers and high recidivism risk; a 'primary psychopathy' group (n = 38) characterised by good educational profiles and homicide offences and an 'expressive psychopathy' group (n = 17) presenting the lowest risk and characterised by more special educational needs and sexual offences. Individuals classed as high-risk mentally disordered offenders can be loosely segregated into three discrete subtypes: 'delinquent', 'psychopathic' or 'expressive psychopathic', respectively. These groups represent different levels of risk to society and reflect differing treatment needs.

  5. Review of Elaboration Likelihood Model of persuasion

    OpenAIRE

    藤原, 武弘; 神山, 貴弥

    1989-01-01

    This article mainly introduces the Elaboration Likelihood Model (ELM), proposed by Petty & Cacioppo, that is, a general attitude change theory. The ELM postulates two routes to persuasion: central and peripheral. Attitude change by the central route is viewed as resulting from a diligent consideration of the issue-relevant information presented. On the other hand, attitude change by the peripheral route is viewed as resulting from peripheral cues in the persuasion context. Secondly we compare these tw...

  6. Unbinned likelihood analysis of EGRET observations

    International Nuclear Information System (INIS)

    Digel, Seth W.

    2000-01-01

    We present a newly-developed likelihood analysis method for EGRET data that defines the likelihood function without binning the photon data or averaging the instrumental response functions. The standard likelihood analysis applied to EGRET data requires the photons to be binned spatially and in energy, and the point-spread functions to be averaged over energy and inclination angle. The full-width half maximum of the point-spread function increases by about 40% from on-axis to 30° inclination, and depending on the binning in energy can vary by more than that in a single energy bin. The new unbinned method avoids the loss of information that binning and averaging cause and can properly analyze regions where EGRET viewing periods overlap and photons with different inclination angles would otherwise be combined in the same bin. In the poster, we describe the unbinned analysis method and compare its sensitivity with binned analysis for detecting point sources in EGRET data.
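
    For illustration, the extended unbinned log-likelihood that analyses of this kind build on can be written in a few lines; the intensity model and its normalization are assumptions of this sketch:

      import numpy as np

      def unbinned_loglik(photon_coords, intensity, expected_total):
          """Extended unbinned log-likelihood: sum of log source-model
          intensities at each detected photon, minus the total expected
          number of photons (the integral of the intensity over the
          exposure)."""
          return np.sum(np.log(intensity(photon_coords))) - expected_total

    Maximizing this over the source parameters hidden inside intensity gives the unbinned point-source fit, with no spatial or energy binning of the photons.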

  7. Maximum Likelihood Compton Polarimetry with the Compton Spectrometer and Imager

    Energy Technology Data Exchange (ETDEWEB)

    Lowell, A. W.; Boggs, S. E.; Chiu, C. L.; Kierans, C. A.; Sleator, C.; Tomsick, J. A.; Zoglauer, A. C. [Space Sciences Laboratory, University of California, Berkeley (United States); Chang, H.-K.; Tseng, C.-H.; Yang, C.-Y. [Institute of Astronomy, National Tsing Hua University, Taiwan (China); Jean, P.; Ballmoos, P. von [IRAP Toulouse (France); Lin, C.-H. [Institute of Physics, Academia Sinica, Taiwan (China); Amman, M. [Lawrence Berkeley National Laboratory (United States)

    2017-10-20

    Astrophysical polarization measurements in the soft gamma-ray band are becoming more feasible as detectors with high position and energy resolution are deployed. Previous work has shown that the minimum detectable polarization (MDP) of an ideal Compton polarimeter can be improved by ∼21% when an unbinned, maximum likelihood method (MLM) is used instead of the standard approach of fitting a sinusoid to a histogram of azimuthal scattering angles. Here we outline a procedure for implementing this maximum likelihood approach for real, nonideal polarimeters. As an example, we use the recent observation of GRB 160530A with the Compton Spectrometer and Imager. We find that the MDP for this observation is reduced by 20% when the MLM is used instead of the standard method.
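
    A minimal sketch of an unbinned maximum likelihood fit for polarimetry, assuming the standard azimuthal modulation curve and an illustrative modulation factor mu100 (a real, nonideal instrument requires the response corrections the record addresses):

      import numpy as np
      from scipy.optimize import minimize

      def nll(params, phi, mu100):
          """Negative log-likelihood for azimuthal scattering angles phi;
          mu100 is the modulation factor for a 100% polarized beam."""
          p, phi0 = params  # polarization fraction and angle
          pdf = (1.0 + p * mu100 * np.cos(2.0 * (phi - phi0))) / (2.0 * np.pi)
          return -np.sum(np.log(pdf))

      # toy usage with unpolarized synthetic angles (illustrative only)
      rng = np.random.default_rng(1)
      phi = rng.uniform(0.0, 2.0 * np.pi, size=5000)
      fit = minimize(nll, x0=[0.1, 0.0], args=(phi, 0.3),
                     bounds=[(0.0, 1.0), (-np.pi, np.pi)])

    Fitting every event's angle directly, rather than a histogram of angles, is what yields the quoted improvement in minimum detectable polarization.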

  8. Approximate maximum likelihood estimation for population genetic inference.

    Science.gov (United States)

    Bertl, Johanna; Ewing, Gregory; Kosiol, Carolin; Futschik, Andreas

    2017-11-27

    In many population genetic problems, parameter estimation is obstructed by an intractable likelihood function. Therefore, approximate estimation methods have been developed, and with growing computational power, sampling-based methods became popular. However, these methods, such as Approximate Bayesian Computation (ABC), can be inefficient in high-dimensional problems. This led to the development of more sophisticated iterative estimation methods like particle filters. Here, we propose an alternative approach that is based on stochastic approximation. By moving along a simulated gradient or ascent direction, the algorithm produces a sequence of estimates that eventually converges to the maximum likelihood estimate, given a set of observed summary statistics. This strategy does not sample much from low-likelihood regions of the parameter space, and is fast, even when many summary statistics are involved. We put considerable effort into providing tuning guidelines that improve the robustness and lead to good performance on problems with high-dimensional summary statistics and a low signal-to-noise ratio. We then investigate the performance of our resulting approach and study its properties in simulations. Finally, we re-estimate parameters describing the demographic history of Bornean and Sumatran orang-utans.
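
    A minimal sketch of the flavour of this approach, assuming a user-supplied stochastic simulator and using a squared distance between simulated and observed summaries as a surrogate objective (the authors' actual objective and gain sequences may differ):

      import numpy as np

      def spsa_fit(simulate, s_obs, theta0, n_iter=500, a=0.05, c=0.1, seed=0):
          """Stochastic-approximation search along a simulated descent
          direction; the summary-statistic distance stands in for the
          intractable negative log-likelihood (an assumption here)."""
          rng = np.random.default_rng(seed)
          theta = np.asarray(theta0, dtype=float)
          loss = lambda t: float(np.sum((simulate(t, rng) - s_obs) ** 2))
          for k in range(1, n_iter + 1):
              delta = rng.choice([-1.0, 1.0], size=theta.size)  # random perturbation
              ck = c / k ** 0.101
              ghat = (loss(theta + ck * delta) - loss(theta - ck * delta)) / (2 * ck) * delta
              theta -= (a / k ** 0.602) * ghat                  # descend the surrogate
          return theta

    Only two simulations per iteration are needed regardless of the parameter dimension, which is what makes such simultaneous-perturbation schemes attractive when each simulation is costly.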

  9. [Targeting high-risk drugs to optimize clinical pharmacists' intervention].

    Science.gov (United States)

    Mouterde, Anne-Laure; Bourdelin, Magali; Maison, Ophélie; Coursier, Sandra; Bontemps, Hervé

    2016-12-01

    By the Order of 6 April 2011, the pharmacist must validate all prescriptions containing "high-risk drugs" or those of "patients at risk". To optimize this clinical pharmacy activity, we identified high-risk drugs. A list of high-risk drugs was established using the literature, pharmacists' interventions (PI) performed in our hospital and a survey sent to hospital pharmacists. In a prospective study (analysis of 100 prescriptions for each high-risk drug selected), we identified the drugs most relevant to target. We obtained a statistically significant PI rate (P<0.05) for digoxin, direct oral anticoagulants, oral methotrexate and colchicine. This method of targeted pharmaceutical validation based on high-risk drugs is relevant for detecting patients at high risk of medicine-related illness. Copyright © 2016 Société française de pharmacologie et de thérapeutique. Published by Elsevier Masson SAS. All rights reserved.

  10. Early detection of psychosis: finding those at clinical high risk.

    Science.gov (United States)

    Addington, Jean; Epstein, Irvin; Reynolds, Andrea; Furimsky, Ivana; Rudy, Laura; Mancini, Barbara; McMillan, Simone; Kirsopp, Diane; Zipursky, Robert B

    2008-08-01

    In early detection work, recruiting individuals who meet the prodromal criteria is difficult. The aim of this paper was to describe the development of a research clinic for individuals who appear to be at risk of developing a psychosis and the process for educating the community and obtaining referrals. The outcome of all referrals to the clinic over a 4-year period was examined. Following an ongoing education campaign that was over-inclusive in order to aid recruitment, approximately 27% of all referrals met the criteria for being at clinical high risk of psychosis. We are seeing only a small proportion of those in the community who eventually go on to develop a psychotic illness. This raises two important issues: first, how to remedy the situation, and second, the impact of this on current research in terms of sampling bias and generalizability of research findings. © 2008 The Authors. Journal compilation © 2008 Blackwell Publishing Asia Pty Ltd.

  11. Bias correction in the hierarchical likelihood approach to the analysis of multivariate survival data.

    Science.gov (United States)

    Jeon, Jihyoun; Hsu, Li; Gorfine, Malka

    2012-07-01

    Frailty models are useful for measuring unobserved heterogeneity in risk of failures across clusters, providing cluster-specific risk prediction. In a frailty model, the latent frailties shared by members within a cluster are assumed to act multiplicatively on the hazard function. In order to obtain parameter and frailty variate estimates, we consider the hierarchical likelihood (H-likelihood) approach (Ha, Lee and Song, 2001. Hierarchical-likelihood approach for frailty models. Biometrika 88, 233-243) in which the latent frailties are treated as "parameters" and estimated jointly with other parameters of interest. We find that the H-likelihood estimators perform well when the censoring rate is low; however, they are substantially biased when the censoring rate is moderate to high. In this paper, we propose a simple and easy-to-implement bias correction method for the H-likelihood estimators under a shared frailty model. We also extend the method to a multivariate frailty model, which incorporates a complex dependence structure within clusters. We conduct an extensive simulation study and show that the proposed approach performs very well for censoring rates as high as 80%. We also illustrate the method with a breast cancer data set. Since the H-likelihood is the same as the penalized likelihood function, the proposed bias correction method is also applicable to the penalized likelihood estimators.

  12. The STAR Data Reporting Guidelines for Clinical High Altitude Research.

    Science.gov (United States)

    Brodmann Maeder, Monika; Brugger, Hermann; Pun, Matiram; Strapazzon, Giacomo; Dal Cappello, Tomas; Maggiorini, Marco; Hackett, Peter; Bärtsch, Peter; Swenson, Erik R; Zafren, Ken

    2018-03-01

    Brodmann Maeder, Monika, Hermann Brugger, Matiram Pun, Giacomo Strapazzon, Tomas Dal Cappello, Marco Maggiorini, Peter Hackett, Peter Bärtsch, Erik R. Swenson, Ken Zafren (STAR Core Group), and the STAR Delphi Expert Group. The STAR data reporting guidelines for clinical high altitude research. High Alt Med Biol. 19:7-14, 2018. The goal of the STAR (STrengthening Altitude Research) initiative was to produce a uniform set of key elements for research and reporting in clinical high-altitude (HA) medicine. The STAR initiative was inspired by research on treatment of cardiac arrest, in which the establishment of the Utstein Style, a uniform data reporting protocol, substantially contributed to improving data reporting and subsequently the quality of scientific evidence. The STAR core group used the Delphi method, in which a group of experts reaches a consensus over multiple rounds using a formal method. We selected experts in the field of clinical HA medicine based on their scientific credentials and identified an initial set of parameters for evaluation by the experts. Of 51 experts in HA research who were identified initially, 21 experts completed both rounds. The experts identified 42 key parameters in 5 categories (setting, individual factors, acute mountain sickness and HA cerebral edema, HA pulmonary edema, and treatment) that were considered essential for research and reporting in clinical HA research. An additional 47 supplemental parameters were identified that should be reported depending on the nature of the research. The STAR initiative, using the Delphi method, identified a set of key parameters essential for research and reporting in clinical HA medicine.

  13. Comparisons of likelihood and machine learning methods of individual classification

    Science.gov (United States)

    Guinand, B.; Topchy, A.; Page, K.S.; Burnham-Curtis, M. K.; Punch, W.F.; Scribner, K.T.

    2002-01-01

    Classification methods used in machine learning (e.g., artificial neural networks, decision trees, and k-nearest neighbor clustering) are rarely used with population genetic data. We compare different nonparametric machine learning techniques with parametric likelihood estimations commonly employed in population genetics for purposes of assigning individuals to their population of origin (“assignment tests”). Classifier accuracy was compared across simulated data sets representing different levels of population differentiation (low and high FST), number of loci surveyed (5 and 10), and allelic diversity (average of three or eight alleles per locus). Empirical data for the lake trout (Salvelinus namaycush) exhibiting levels of population differentiation comparable to those used in simulations were examined to further evaluate and compare classification methods. Classification error rates associated with artificial neural networks and likelihood estimators were lower for simulated data sets compared to k-nearest neighbor and decision tree classifiers over the entire range of parameters considered. Artificial neural networks only marginally outperformed the likelihood method for simulated data (0–2.8% lower error rates). The relative performance of each machine learning classifier improved relative to likelihood estimators for empirical data sets, suggesting an ability to “learn” and utilize properties of empirical genotypic arrays intrinsic to each population. Likelihood-based estimation methods provide a more accessible option for reliable assignment of individuals to the population of origin due to the intricacies in development and evaluation of artificial neural networks. In recent years, characterization of highly polymorphic molecular markers such as mini- and microsatellites and development of novel methods of analysis have enabled researchers to extend investigations of ecological and evolutionary processes below the population level to the level of
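
    For illustration, the parametric likelihood assignment test has a compact form, assuming Hardy-Weinberg proportions and known per-population allele frequencies (both assumptions of this sketch):

      import math

      def assign_individual(genotype, allele_freqs):
          """Likelihood-based assignment test: score each candidate population
          by the log-probability of the multilocus genotype under
          Hardy-Weinberg proportions, then assign to the best-scoring one.
          genotype: list of (allele1, allele2) per locus;
          allele_freqs: {population: [dict of allele -> frequency, per locus]}."""
          scores = {}
          for pop, freqs in allele_freqs.items():
              ll = 0.0
              for locus, (a1, a2) in enumerate(genotype):
                  p = freqs[locus].get(a1, 1e-6)  # small floor for unseen alleles
                  q = freqs[locus].get(a2, 1e-6)
                  ll += math.log((2.0 if a1 != a2 else 1.0) * p * q)
              scores[pop] = ll
          return max(scores, key=scores.get), scores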

  14. Caching and interpolated likelihoods: accelerating cosmological Monte Carlo Markov chains

    Energy Technology Data Exchange (ETDEWEB)

    Bouland, Adam; Easther, Richard; Rosenfeld, Katherine, E-mail: adam.bouland@aya.yale.edu, E-mail: richard.easther@yale.edu, E-mail: krosenfeld@cfa.harvard.edu [Department of Physics, Yale University, New Haven CT 06520 (United States)

    2011-05-01

    We describe a novel approach to accelerating Monte Carlo Markov Chains. Our focus is cosmological parameter estimation, but the algorithm is applicable to any problem for which the likelihood surface is a smooth function of the free parameters and computationally expensive to evaluate. We generate a high-order interpolating polynomial for the log-likelihood using the first points gathered by the Markov chains as a training set. This polynomial then accurately computes the majority of the likelihoods needed in the latter parts of the chains. We implement a simple version of this algorithm as a patch (InterpMC) to CosmoMC and show that it accelerates parameter estimation by a factor of between two and four for well-converged chains. The current code is primarily intended as a "proof of concept", and we argue that there is considerable room for further performance gains. Unlike other approaches to accelerating parameter fits, we make no use of precomputed training sets or special choices of variables, and InterpMC is almost entirely transparent to the user.
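
    A toy sketch of the interpolation idea (not the InterpMC code itself): fit a low-order polynomial to early (parameter, log-likelihood) pairs, then evaluate the cheap surrogate in place of the expensive likelihood:

      import numpy as np

      rng = np.random.default_rng(0)
      theta = rng.uniform(-2.0, 2.0, size=(500, 2))  # early training points
      loglike = -0.5 * (theta ** 2).sum(axis=1)      # stand-in for the expensive log-likelihood

      # quadratic design matrix: 1, t1, t2, t1^2, t2^2, t1*t2
      X = np.column_stack([np.ones(len(theta)), theta, theta ** 2,
                           theta[:, 0] * theta[:, 1]])
      coef, *_ = np.linalg.lstsq(X, loglike, rcond=None)

      def surrogate_loglike(t):
          """Cheap polynomial approximation used for later chain steps."""
          t = np.asarray(t, dtype=float)
          features = np.concatenate([[1.0], t, t ** 2, [t[0] * t[1]]])
          return float(features @ coef)

    In practice one would monitor the surrogate's error against occasional exact evaluations and fall back to the true likelihood when the approximation degrades.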

  15. Caching and interpolated likelihoods: accelerating cosmological Monte Carlo Markov chains

    International Nuclear Information System (INIS)

    Bouland, Adam; Easther, Richard; Rosenfeld, Katherine

    2011-01-01

    We describe a novel approach to accelerating Monte Carlo Markov Chains. Our focus is cosmological parameter estimation, but the algorithm is applicable to any problem for which the likelihood surface is a smooth function of the free parameters and computationally expensive to evaluate. We generate a high-order interpolating polynomial for the log-likelihood using the first points gathered by the Markov chains as a training set. This polynomial then accurately computes the majority of the likelihoods needed in the latter parts of the chains. We implement a simple version of this algorithm as a patch (InterpMC) to CosmoMC and show that it accelerates parameter estimation by a factor of between two and four for well-converged chains. The current code is primarily intended as a "proof of concept", and we argue that there is considerable room for further performance gains. Unlike other approaches to accelerating parameter fits, we make no use of precomputed training sets or special choices of variables, and InterpMC is almost entirely transparent to the user.

  16. Corporate governance effect on financial distress likelihood: Evidence from Spain

    Directory of Open Access Journals (Sweden)

    Montserrat Manzaneque

    2016-01-01

    Full Text Available The paper explores some mechanisms of corporate governance (ownership and board characteristics) in Spanish listed companies and their impact on the likelihood of financial distress. An empirical study was conducted between 2007 and 2012 using a matched-pairs research design with 308 observations, half of them classified as distressed and half as non-distressed. Based on the previous study by Pindado, Rodrigues, and De la Torre (2008), a broader concept of bankruptcy is used to define business failure. Employing several conditional logistic models, and in line with other previous studies on bankruptcy, the results confirm that in difficult situations prior to bankruptcy, the impact of board ownership and the proportion of independent directors on business failure likelihood is similar to that exerted in more extreme situations. The results go one step further and offer a negative relationship between board size and the likelihood of financial distress. This result is interpreted as a means of creating diversity and improving access to information and resources, especially in contexts where ownership is highly concentrated and large shareholders have great power to influence the board structure. However, the results confirm that ownership concentration does not have a significant impact on financial distress likelihood in the Spanish context. It is argued that large shareholders are passive as regards enhanced monitoring of management and, alternatively, that they do not have enough incentives to hold back the financial distress. These findings have important implications in the Spanish context, where several changes in the regulatory listing requirements have been carried out with respect to corporate governance, and where there is no empirical evidence on this issue.

  17. The likelihood principle and its proof – a never-ending story…

    DEFF Research Database (Denmark)

    Jørgensen, Thomas Martini

    2015-01-01

    An ongoing controversy in the philosophy of statistics is the so-called "likelihood principle", essentially stating that all evidence which is obtained from an experiment about an unknown quantity θ is contained in the likelihood function of θ. Common classical statistical methodology, such as the use of significance tests and confidence intervals, depends on the experimental procedure and unrealized events and thus violates the likelihood principle. The likelihood principle was identified by that name and proved in a famous paper by Allan Birnbaum in 1962. However, ever since, both the principle itself and the proof have been highly debated. This presentation will illustrate the debate over both the principle and its proof, from 1962 up to today. An often-used experiment to illustrate the controversy between classical interpretation and evidential confirmation based on the likelihood principle

  18. Multi-Channel Maximum Likelihood Pitch Estimation

    DEFF Research Database (Denmark)

    Christensen, Mads Græsbøll

    2012-01-01

    In this paper, a method for multi-channel pitch estimation is proposed. The method is a maximum likelihood estimator and is based on a parametric model where the signals in the various channels share the same fundamental frequency but can have different amplitudes, phases, and noise characteristics. This essentially means that the model allows for different conditions in the various channels, like different signal-to-noise ratios, microphone characteristics and reverberation. Moreover, the method does not assume that a certain array structure is used but rather relies on a more general model and is hence

  19. Approximate maximum parsimony and ancestral maximum likelihood.

    Science.gov (United States)

    Alon, Noga; Chor, Benny; Pardi, Fabio; Rapoport, Anat

    2010-01-01

    We explore the maximum parsimony (MP) and ancestral maximum likelihood (AML) criteria in phylogenetic tree reconstruction. Both problems are NP-hard, so we seek approximate solutions. We formulate the two problems as Steiner tree problems under appropriate distances. The gist of our approach is the succinct characterization of Steiner trees for a small number of leaves for the two distances. This enables the use of known Steiner tree approximation algorithms. The approach leads to a 16/9 approximation ratio for AML and asymptotically to a 1.55 approximation ratio for MP.

  20. Predicting the onset of psychosis in patients at clinical high risk: practical guide to probabilistic prognostic reasoning.

    Science.gov (United States)

    Fusar-Poli, P; Schultze-Lutter, F

    2016-02-01

    Prediction of psychosis in patients at clinical high risk (CHR) has become a mainstream focus of clinical and research interest worldwide. When using CHR instruments for clinical purposes, the predicted outcome is only a probability; and, consequently, any therapeutic action following the assessment is based on probabilistic prognostic reasoning. Yet, probabilistic reasoning makes considerable demands on clinicians. We provide here a scholarly practical guide summarising the key concepts to support clinicians with probabilistic prognostic reasoning in the CHR state. We review risk or cumulative incidence of psychosis, person-time rate of psychosis, Kaplan-Meier estimates of psychosis risk, measures of prognostic accuracy, sensitivity and specificity in receiver operator characteristic curves, positive and negative predictive values, Bayes' theorem, likelihood ratios, and the potentials and limits of real-life applications of prognostic probabilistic reasoning in the CHR state. Understanding basic measures used for prognostic probabilistic reasoning is a prerequisite for successfully implementing the early detection and prevention of psychosis in clinical practice. Future refinement of these measures for CHR patients may actually influence risk management, especially as regards initiating or withholding treatment. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
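
    One of the reviewed measures, updating a pre-test probability with a likelihood ratio via Bayes' theorem, is compact enough to state directly (the numbers in the usage line are hypothetical):

      def post_test_probability(pretest_p, likelihood_ratio):
          """Bayes' theorem in odds form: convert the pre-test probability to
          odds, multiply by the likelihood ratio, convert back."""
          pre_odds = pretest_p / (1.0 - pretest_p)
          post_odds = pre_odds * likelihood_ratio
          return post_odds / (1.0 + post_odds)

      # e.g. a hypothetical 20% pre-test risk and a test with LR+ = 5
      print(round(post_test_probability(0.20, 5.0), 3))  # 0.556

    The same function handles negative results: a likelihood ratio below 1 shrinks the post-test probability below the pre-test value.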

  1. Lumbar disc herniation at high levels : MRI and clinical findings

    International Nuclear Information System (INIS)

    Paek, Chung Ho; Kwon, Soon Tae; Lee, Jun Kyu; Ahn, Jae Sung; Lee, Hwan Do; Chung, Yon Su; Jeong, Ki Ho; Cho, Jun Sik

    1999-01-01

    To assess the frequency, location, associated MR findings, and clinical symptoms of high-level lumbar disc herniation (HLDH). A total of 1076 patients with lumbar disc herniation were retrospectively reviewed. MR images of the 41 of these with HLDH (T12-L1, L1-2, L2-3) were analysed in terms of frequency, location, and associated MR findings, and correlated with the clinical symptoms of HLDH. The prevalence of HLDH was 3.8% (41/1076). HLDH was located at the T12-L1 level in four patients (10%), at the L1-2 level in 14 (34%), at the L2-3 level in 21 (51%), and at both the L1-2 and L2-3 levels in two. The age of patients ranged from 20 to 72 years (mean, 44), and there were 26 men and 16 women. In 11 (27%), whose mean age was 32 years, isolated disc herniation was limited to these high lumbar segments. The remaining 30 patients had HLDH associated with variable involvement of the lower lumbar segments. Associated lesions were as follows: lower-level disc herniation (14 patients, 34%); apophyseal ring fracture (8 patients, 19%); Schmorl's node and spondylolisthesis (6 patients each, 14% each); spondylolysis (3 patients, 7%); and retrolisthesis (2 patients, 5%). In 20 patients (49%) with HLDH (n=41), there was a previous history of trauma. Patients with HLDH showed a relatively high incidence of associated coexisting abnormalities such as lower lumbar disc herniation, apophyseal ring fracture, Schmorl's node, spondylolysis, and retrolisthesis. In about half of all patients with HLDH there was a previous history of trauma. The mean age of patients with isolated HLDH was lower; clinical symptoms of the condition were relatively nonspecific and their incidence was low.

  2. Lumbar disc herniation at high levels : MRI and clinical findings

    Energy Technology Data Exchange (ETDEWEB)

    Paek, Chung Ho; Kwon, Soon Tae; Lee, Jun Kyu; Ahn, Jae Sung; Lee, Hwan Do; Chung, Yon Su; Jeong, Ki Ho; Cho, Jun Sik [Chungnam National Univ. College of Medicine, Taejon (Korea, Republic of)

    1999-04-01

    To assess the frequency, location, associated MR findings, and clinical symptoms of high-level lumbar disc herniation (HLDH). A total of 1076 patients with lumbar disc herniation were retrospectively reviewed. MR images of the 41 of these with HLDH (T12-L1, L1-2, L2-3) were analysed in terms of frequency, location, and associated MR findings, and correlated with the clinical symptoms of HLDH. The prevalence of HLDH was 3.8% (41/1076). HLDH was located at the T12-L1 level in four patients (10%), at the L1-2 level in 14 (34%), at the L2-3 level in 21 (51%), and at both the L1-2 and L2-3 levels in two. The age of patients ranged from 20 to 72 years (mean, 44), and there were 26 men and 16 women. In 11 (27%), whose mean age was 32 years, isolated disc herniation was limited to these high lumbar segments. The remaining 30 patients had HLDH associated with variable involvement of the lower lumbar segments. Associated lesions were as follows: lower-level disc herniation (14 patients, 34%); apophyseal ring fracture (8 patients, 19%); Schmorl's node and spondylolisthesis (6 patients each, 14% each); spondylolysis (3 patients, 7%); and retrolisthesis (2 patients, 5%). In 20 patients (49%) with HLDH (n=41), there was a previous history of trauma. Patients with HLDH showed a relatively high incidence of associated coexisting abnormalities such as lower lumbar disc herniation, apophyseal ring fracture, Schmorl's node, spondylolysis, and retrolisthesis. In about half of all patients with HLDH there was a previous history of trauma. The mean age of patients with isolated HLDH was lower; clinical symptoms of the condition were relatively nonspecific and their incidence was low.

  3. Elemental composition of cosmic rays using a maximum likelihood method

    International Nuclear Information System (INIS)

    Ruddick, K.

    1996-01-01

    We present a progress report on our attempts to determine the composition of cosmic rays in the knee region of the energy spectrum. We have used three different devices to measure properties of the extensive air showers produced by primary cosmic rays: the Soudan 2 underground detector measures the muon flux deep underground, a proportional tube array samples shower density at the surface of the earth, and a Cherenkov array observes light produced high in the atmosphere. We have begun maximum likelihood fits to these measurements with the hope of determining the nuclear mass number A on an event by event basis. (orig.)

  4. Estimating likelihood of future crashes for crash-prone drivers

    OpenAIRE

    Subasish Das; Xiaoduan Sun; Fan Wang; Charles Leboeuf

    2015-01-01

    At-fault crash-prone drivers are usually considered as the high risk group for possible future incidents or crashes. In Louisiana, 34% of crashes are repeatedly committed by the at-fault crash-prone drivers who represent only 5% of the total licensed drivers in the state. This research has conducted an exploratory data analysis based on the driver faultiness and proneness. The objective of this study is to develop a crash prediction model to estimate the likelihood of future crashes for the a...

  5. Maximum Likelihood Joint Tracking and Association in Strong Clutter

    Directory of Open Access Journals (Sweden)

    Leonid I. Perlovsky

    2013-01-01

    Full Text Available We have developed a maximum likelihood formulation for a joint detection, tracking and association problem. An efficient non-combinatorial algorithm for this problem is developed in case of strong clutter for radar data. By using an iterative procedure of the dynamic logic process “from vague-to-crisp” explained in the paper, the new tracker overcomes the combinatorial complexity of tracking in highly-cluttered scenarios and results in an orders-of-magnitude improvement in signal-to-clutter ratio.

  6. A Predictive Likelihood Approach to Bayesian Averaging

    Directory of Open Access Journals (Sweden)

    Tomáš Jeřábek

    2015-01-01

    Full Text Available Multivariate time series forecasting is applied in a wide range of economic activities related to regional competitiveness and is the basis of almost all macroeconomic analysis. In this paper we combine multivariate density forecasts of GDP growth, inflation and real interest rates from four different models: two types of Bayesian vector autoregression (BVAR) models, a New Keynesian dynamic stochastic general equilibrium (DSGE) model of a small open economy, and a DSGE-VAR model. The performance of the models is assessed using historical data covering the domestic economy and a foreign economy represented by the countries of the Eurozone. Because the forecast accuracies of the observed models differ, weighting schemes based on the predictive likelihood, the trace of the past MSE matrix, and model ranks are used to combine the models. The equal-weight scheme is used as a simple combination scheme. The results show that optimally combined densities are comparable to the best individual models.
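
    For illustration, the predictive-likelihood weighting scheme mentioned above reduces to a normalisation of the models' log predictive likelihoods (the input values below are hypothetical):

      import numpy as np
      from scipy.special import logsumexp

      def predictive_likelihood_weights(log_pred_likes):
          """Combination weights proportional to each model's predictive
          likelihood over an evaluation window, normalised to sum to one
          (computed in log space for numerical stability)."""
          lpl = np.asarray(log_pred_likes, dtype=float)
          return np.exp(lpl - logsumexp(lpl))

      # e.g. four hypothetical models' log predictive likelihoods
      print(predictive_likelihood_weights([-102.3, -101.7, -104.9, -101.9]))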

  7. Subtracting and Fitting Histograms using Profile Likelihood

    CERN Document Server

    D'Almeida, F M L

    2008-01-01

    It is known that many interesting signals expected at the LHC are of unknown shape and strongly contaminated by background events. These signals will be difficult to detect during the first years of LHC operation due to the initial low luminosity. In this work, one presents a method of subtracting histograms based on the profile likelihood function when the background is previously estimated by Monte Carlo events and one has low statistics. Estimators for the signal in each bin of the histogram difference are calculated, as well as limits for the signals at a 68.3% confidence level, in a low-statistics case with an exponential background and a Gaussian signal. The method can also be used to fit histograms when the signal shape is known. Our results show a good performance and avoid the problem of negative values when subtracting histograms.

  8. High blood pressure in acute ischemic stroke and clinical outcome.

    Science.gov (United States)

    Manabe, Yasuhiro; Kono, Syoichiro; Tanaka, Tomotaka; Narai, Hisashi; Omori, Nobuhiko

    2009-11-16

    This study aimed to evaluate the prognostic value of acute phase blood pressure in patients with acute ischemic stroke by determining whether or not it contributes to clinical outcome. We studied 515 consecutive patients admitted within the first 48 hours after the onset of ischemic strokes, employing systolic and diastolic blood pressure measurements recorded within 36 hours after admission. High blood pressure was defined when the mean of at least 2 blood pressure measurements was ≥200 mmHg systolic and/or ≥110 mmHg diastolic at 6 to 24 hours after admission or ≥180 mmHg systolic and/or ≥105 mmHg diastolic at 24 to 36 hours after admission. The high blood pressure group was found to include 16% of the patients. Age, sex, diabetes mellitus, hypercholesterolemia, atrial fibrillation, ischemic heart disease, stroke history, carotid artery stenosis, leukoaraiosis, NIH Stroke Scale (NIHSS) on admission and mortality were not significantly correlated with either the high blood pressure or non-high blood pressure group. High blood pressure on admission was significantly associated with a past history of hypertension, kidney disease, the modified Rankin Scale (mRS) on discharge and the length of stay. On logistic regression analysis, in patients with no previous history of hypertension, diabetes mellitus, atrial fibrillation, and kidney disease were independent risk factors associated with the presence of high blood pressure [odds ratio (OR), 1.85 (95% confidence interval (CI): 1.06-3.22), 1.89 (95% CI: 1.11-3.22), and 3.31 (95% CI: 1.36-8.04), respectively]. Multi-organ injury may be present in acute stroke patients with high blood pressure. Patients with high blood pressure had a poor functional outcome after acute ischemic stroke.

  9. A maximum likelihood framework for protein design

    Directory of Open Access Journals (Sweden)

    Philippe Hervé

    2006-06-01

    Full Text Available Abstract Background The aim of protein design is to predict amino-acid sequences compatible with a given target structure. Traditionally envisioned as a purely thermodynamic question, this problem can also be understood in a wider context, where additional constraints are captured by learning the sequence patterns displayed by natural proteins of known conformation. In this latter perspective, however, we still need a theoretical formalization of the question, leading to general and efficient learning methods, and allowing for the selection of fast and accurate objective functions quantifying sequence/structure compatibility. Results We propose a formulation of the protein design problem in terms of model-based statistical inference. Our framework uses the maximum likelihood principle to optimize the unknown parameters of a statistical potential, which we call an inverse potential to contrast with classical potentials used for structure prediction. We propose an implementation based on Markov chain Monte Carlo, in which the likelihood is maximized by gradient descent and is numerically estimated by thermodynamic integration. The fit of the models is evaluated by cross-validation. We apply this to a simple pairwise contact potential, supplemented with a solvent-accessibility term, and show that the resulting models have a better predictive power than currently available pairwise potentials. Furthermore, the model comparison method presented here allows one to measure the relative contribution of each component of the potential, and to choose the optimal number of accessibility classes, which turns out to be much higher than classically considered. Conclusion Altogether, this reformulation makes it possible to test a wide diversity of models, using different forms of potentials, or accounting for other factors than just the constraint of thermodynamic stability. Ultimately, such model-based statistical analyses may help to understand the forces

  10. Modelling maximum likelihood estimation of availability

    International Nuclear Information System (INIS)

    Waller, R.A.; Tietjen, G.L.; Rock, G.W.

    1975-01-01

    Suppose the performance of a nuclear powered electrical generating power plant is continuously monitored to record the sequence of failures and repairs during sustained operation. The purpose of this study is to assess one method of estimating the performance of the power plant when the measure of performance is availability. That is, we determine the probability that the plant is operational at time t. To study the availability of a power plant, we first assume statistical models for the variables, X and Y, which denote the time-to-failure and the time-to-repair variables, respectively. Once those statistical models are specified, the availability, A(t), can be expressed as a function of some or all of their parameters. Usually those parameters are unknown in practice and so A(t) is unknown. This paper discusses the maximum likelihood estimator of A(t) when the time-to-failure model for X is an exponential density with parameter lambda, and the time-to-repair model for Y is an exponential density with parameter theta. Under the assumption of exponential models for X and Y, it follows that the instantaneous availability at time t is A(t) = lambda/(lambda+theta) + theta/(lambda+theta) exp[-((1/lambda)+(1/theta))t] with t > 0. Also, the steady-state availability is A(infinity) = lambda/(lambda+theta). We use the observations from n failure-repair cycles of the power plant, say X_1, X_2, ..., X_n, Y_1, Y_2, ..., Y_n, to present the maximum likelihood estimators of A(t) and A(infinity). The exact sampling distributions for those estimators and some statistical properties are discussed before a simulation model is used to determine 95% simulation intervals for A(t). The methodology is applied to two examples which approximate the operating history of two nuclear power plants. (author)
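
    Reading lambda and theta as the exponential means (as the record's formulas do, with A(infinity) = lambda/(lambda+theta)), the plug-in maximum likelihood estimate of A(t) follows directly from the sample means; the cycle data below are hypothetical:

      import numpy as np

      def availability_mle(x, y, t):
          """Plug-in MLE of A(t): lambda and theta are the mean time-to-failure
          and mean time-to-repair, whose MLEs are the sample means of the
          observed cycles."""
          lam, theta = np.mean(x), np.mean(y)
          steady = lam / (lam + theta)  # A(infinity)
          return steady + (theta / (lam + theta)) * np.exp(-(1.0/lam + 1.0/theta) * t)

      # e.g. ten hypothetical failure/repair cycles (hours)
      x = [900., 1100., 950., 1200., 1000., 980., 1050., 1150., 990., 1020.]
      y = [20., 35., 25., 30., 40., 22., 28., 33., 26., 31.]
      print(availability_mle(x, y, t=100.0))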

  11. High-resolution multimodal clinical multiphoton tomography of skin

    Science.gov (United States)

    König, Karsten

    2011-03-01

    This review focuses on multimodal multiphoton tomography based on near infrared femtosecond lasers. Clinical multiphoton tomographs for 3D high-resolution in vivo imaging were placed on the market several years ago. The second generation of this Prism-Award winning High-Tech skin imaging tool (MPTflex) was introduced in 2010. The same year, the world's first clinical CARS studies were performed with a hybrid multimodal multiphoton tomograph. In particular, non-fluorescent lipids and water as well as mitochondrial fluorescent NAD(P)H, fluorescent elastin, keratin, and melanin as well as SHG-active collagen have been imaged with submicron resolution in patients suffering from psoriasis. Further multimodal approaches include the combination of multiphoton tomographs with low-resolution wide-field systems such as ultrasound, optoacoustical, OCT, and dermoscopy systems. Multiphoton tomographs are currently employed in Australia, Japan, the US, and in several European countries for early diagnosis of skin cancer, optimization of treatment strategies, and cosmetic research including long-term testing of sunscreen nanoparticles as well as anti-aging products.

  12. Massive optimal data compression and density estimation for scalable, likelihood-free inference in cosmology

    Science.gov (United States)

    Alsing, Justin; Wandelt, Benjamin; Feeney, Stephen

    2018-03-01

    Many statistical models in cosmology can be simulated forwards but have intractable likelihood functions. Likelihood-free inference methods allow us to perform Bayesian inference from these models using only forward simulations, free from any likelihood assumptions or approximations. Likelihood-free inference generically involves simulating mock data and comparing to the observed data; this comparison in data-space suffers from the curse of dimensionality and requires compression of the data to a small number of summary statistics to be tractable. In this paper we use massive asymptotically-optimal data compression to reduce the dimensionality of the data-space to just one number per parameter, providing a natural and optimal framework for summary statistic choice for likelihood-free inference. Secondly, we present the first cosmological application of Density Estimation Likelihood-Free Inference (DELFI), which learns a parameterized model for the joint distribution of data and parameters, yielding both the parameter posterior and the model evidence. This approach is conceptually simple, requires less tuning than traditional Approximate Bayesian Computation approaches to likelihood-free inference and can give high-fidelity posteriors from orders of magnitude fewer forward simulations. As an additional bonus, it enables parameter inference and Bayesian model comparison simultaneously. We demonstrate Density Estimation Likelihood-Free Inference with massive data compression on an analysis of the joint light-curve analysis supernova data, as a simple validation case study. We show that high-fidelity posterior inference is possible for full-scale cosmological data analyses with as few as ~10^4 simulations, with substantial scope for further improvement, demonstrating the scalability of likelihood-free inference to large and complex cosmological datasets.

  13. Evidence based exercise - clinical benefits of high intensity interval training.

    Science.gov (United States)

    Shiraev, Tim; Barclay, Gabriella

    2012-12-01

    Aerobic exercise has a marked impact on cardiovascular disease risk. Benefits include improved serum lipid profiles, blood pressure and inflammatory markers as well as reduced risk of stroke, acute coronary syndrome and overall cardiovascular mortality. Most exercise programs prescribed for fat reduction involve continuous, moderate aerobic exercise, as per Australian Heart Foundation clinical guidelines. This article describes the benefits of exercise for patients with cardiovascular and metabolic disease and details the numerous benefits of high intensity interval training (HIIT) in particular. Aerobic exercise has numerous benefits for high-risk populations and such benefits, especially weight loss, are amplified with HIIT. High intensity interval training involves repeatedly exercising at a high intensity for 30 seconds to several minutes, separated by 1-5 minutes of recovery (either no or low intensity exercise). HIIT is associated with increased patient compliance and improved cardiovascular and metabolic outcomes and is suitable for implementation in both healthy and 'at risk' populations. Importantly, as some types of exercise are contraindicated in certain patient populations and HIIT is a complex concept for those unfamiliar to exercise, some patients may require specific assessment or instruction before commencing a HIIT program.

  14. Estimation of Model's Marginal likelihood Using Adaptive Sparse Grid Surrogates in Bayesian Model Averaging

    Science.gov (United States)

    Zeng, X.

    2015-12-01

    A large number of model executions are required to obtain alternative conceptual models' predictions and their posterior probabilities in Bayesian model averaging (BMA). The posterior model probability is estimated through a model's marginal likelihood and prior probability. The heavy computation burden hinders the implementation of BMA prediction, especially for elaborated marginal likelihood estimators. To overcome the computation burden of BMA, an adaptive sparse grid (SG) stochastic collocation method is used to build surrogates for alternative conceptual models through a numerical experiment on a synthetic groundwater model. BMA predictions depend on model posterior weights (or marginal likelihoods), and this study also evaluated four marginal likelihood estimators: the arithmetic mean estimator (AME), harmonic mean estimator (HME), stabilized harmonic mean estimator (SHME), and thermodynamic integration estimator (TIE). The results demonstrate that TIE is accurate in estimating conceptual models' marginal likelihoods, and BMA-TIE has better predictive performance than the other BMA predictions. TIE is highly stable in estimating a conceptual model's marginal likelihood: the marginal likelihoods repeatedly estimated by TIE have significantly less variability than those estimated by the other estimators. In addition, the SG surrogates are efficient for facilitating BMA predictions, especially for BMA-TIE. The number of model executions needed for building surrogates is 4.13%, 6.89%, 3.44%, and 0.43% of the model executions required by BMA-AME, BMA-HME, BMA-SHME, and BMA-TIE, respectively.
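
    Of the four estimators compared, the harmonic mean estimator is the simplest to state; a sketch (written in log space for numerical stability) is below, and its well-known instability is consistent with the record's preference for TIE:

      import numpy as np
      from scipy.special import logsumexp

      def log_marginal_hme(loglikes):
          """Harmonic-mean estimator of the log marginal likelihood from the
          log-likelihood values of posterior samples:
          p(y) ~= [ (1/n) * sum_i 1/L_i ]^(-1)."""
          ll = np.asarray(loglikes, dtype=float)
          return np.log(len(ll)) - logsumexp(-ll)

    Thermodynamic integration instead averages the log-likelihood along a path of tempered posteriors, trading many more model runs for a far more stable estimate.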

  15. Preliminary attempt on maximum likelihood tomosynthesis reconstruction of DEI data

    International Nuclear Information System (INIS)

    Wang Zhentian; Huang Zhifeng; Zhang Li; Kang Kejun; Chen Zhiqiang; Zhu Peiping

    2009-01-01

    Tomosynthesis is a three-dimensional reconstruction method that can remove the effect of superimposition with limited-angle projections. It is especially promising in mammography, where radiation dose is a concern. In this paper, we propose a maximum likelihood tomosynthesis reconstruction algorithm (ML-TS) for the apparent absorption data of diffraction enhanced imaging (DEI). The motivation of this contribution is to develop a tomosynthesis algorithm for low-dose or noisy circumstances and bring DEI closer to clinical application. The theoretical statistical models of DEI data in physics are analyzed and the proposed algorithm is validated with experimental data from the Beijing Synchrotron Radiation Facility (BSRF). The results of ML-TS have better contrast compared with the well-known 'shift-and-add' algorithm and the FBP algorithm. (authors)

  16. Likelihood-Based Inference of B Cell Clonal Families.

    Directory of Open Access Journals (Sweden)

    Duncan K Ralph

    2016-10-01

    Full Text Available The human immune system depends on a highly diverse collection of antibody-making B cells. B cell receptor sequence diversity is generated by a random recombination process called "rearrangement" forming progenitor B cells, then a Darwinian process of lineage diversification and selection called "affinity maturation." The resulting receptors can be sequenced in high throughput for research and diagnostics. Such a collection of sequences contains a mixture of various lineages, each of which may be quite numerous, or may consist of only a single member. As a step to understanding the process and result of this diversification, one may wish to reconstruct lineage membership, i.e. to cluster sampled sequences according to which came from the same rearrangement events. We call this clustering problem "clonal family inference." In this paper we describe and validate a likelihood-based framework for clonal family inference based on a multi-hidden Markov Model (multi-HMM framework for B cell receptor sequences. We describe an agglomerative algorithm to find a maximum likelihood clustering, two approximate algorithms with various trade-offs of speed versus accuracy, and a third, fast algorithm for finding specific lineages. We show that under simulation these algorithms greatly improve upon existing clonal family inference methods, and that they also give significantly different clusters than previous methods when applied to two real data sets.

  17. Likelihood analysis of the minimal AMSB model

    Energy Technology Data Exchange (ETDEWEB)

    Bagnaschi, E.; Weiglein, G. [DESY, Hamburg (Germany); Borsato, M.; Chobanova, V.; Lucio, M.; Santos, D.M. [Universidade de Santiago de Compostela, Santiago de Compostela (Spain); Sakurai, K. [Institute for Particle Physics Phenomenology, University of Durham, Science Laboratories, Department of Physics, Durham (United Kingdom); University of Warsaw, Faculty of Physics, Institute of Theoretical Physics, Warsaw (Poland); Buchmueller, O.; Citron, M.; Costa, J.C.; Richards, A. [Imperial College, High Energy Physics Group, Blackett Laboratory, London (United Kingdom); Cavanaugh, R. [Fermi National Accelerator Laboratory, Batavia, IL (United States); University of Illinois at Chicago, Physics Department, Chicago, IL (United States); De Roeck, A. [Experimental Physics Department, CERN, Geneva (Switzerland); Antwerp University, Wilrijk (Belgium); Dolan, M.J. [School of Physics, University of Melbourne, ARC Centre of Excellence for Particle Physics at the Terascale, Melbourne (Australia); Ellis, J.R. [King's College London, Theoretical Particle Physics and Cosmology Group, Department of Physics, London (United Kingdom); CERN, Theoretical Physics Department, Geneva (Switzerland); Flaecher, H. [University of Bristol, H.H. Wills Physics Laboratory, Bristol (United Kingdom); Heinemeyer, S. [Campus of International Excellence UAM+CSIC, Madrid (Spain); Instituto de Fisica Teorica UAM-CSIC, Madrid (Spain); Instituto de Fisica de Cantabria (CSIC-UC), Cantabria (Spain); Isidori, G. [Physik-Institut, Universitaet Zuerich, Zurich (Switzerland); Luo, F. [Kavli IPMU (WPI), UTIAS, The University of Tokyo, Kashiwa, Chiba (Japan); Olive, K.A. [School of Physics and Astronomy, University of Minnesota, William I. Fine Theoretical Physics Institute, Minneapolis, MN (United States)

    2017-04-15

    We perform a likelihood analysis of the minimal anomaly-mediated supersymmetry-breaking (mAMSB) model using constraints from cosmology and accelerator experiments. We find that either a wino-like or a Higgsino-like neutralino LSP, χ^0_1, may provide the cold dark matter (DM), both with similar likelihoods. The upper limit on the DM density from Planck and other experiments enforces m_{χ^0_1} ≲ 3 TeV after the inclusion of Sommerfeld enhancement in its annihilations. The measured value of the Higgs mass favours a limited range of tan β ∼ 5 (and also tan β ∼ 45 if μ > 0) but the scalar mass m_0 is poorly constrained. In the wino-LSP case, m_{3/2} is constrained to about 900 TeV and m_{χ^0_1} to 2.9 ± 0.1 TeV, whereas in the Higgsino-LSP case m_{3/2} has just a lower limit ≳ 650 TeV (≳ 480 TeV) and m_{χ^0_1} is constrained to 1.12 (1.13) ± 0.02 TeV in the μ > 0 (μ < 0) scenario. In neither case can the anomalous magnetic moment of the muon, (g-2)_μ, be improved significantly relative to its Standard Model (SM) value, nor do flavour measurements constrain the model significantly, and there are poor prospects for discovering supersymmetric particles at the LHC, though there are some prospects for direct DM detection. On the other hand, if the χ^0_1 contributes only a fraction of the cold DM density, future LHC E_T-based searches for gluinos, squarks and heavier chargino and neutralino states as well as disappearing track searches in the wino-like LSP region will be relevant, and interference effects enable BR(B_{s,d} → μ+μ-) to agree with the data better than in the SM in the case of wino-like DM with μ > 0. (orig.)

  18. Cyberbullying in those at clinical high risk for psychosis.

    Science.gov (United States)

    Magaud, Emilie; Nyman, Karissa; Addington, Jean

    2013-11-01

    Several studies suggest an association between experiences of childhood trauma including bullying and the development of psychotic symptoms. The use of communications technology has created a new medium for bullying, called 'cyberbullying'. Research has demonstrated associations between traditional bullying and cyberbullying. Negative effects of cyberbullying appear similar in nature and severity to the reported effects of traditional bullying. Our aim was to examine the prevalence and correlates of cyberbullying in those at clinical high risk (CHR) for psychosis. Fifty young people at CHR for psychosis were administered the Childhood Trauma Questionnaire with added questions about cyberbullying. Cyberbullying was reported in 38% of the sample. Those who experienced cyberbullying also reported experiencing previous trauma. It is possible that cyberbullying may be a problem for those at CHR of psychosis and, given the vulnerable nature of these young people, may have longitudinal implications. © 2013 Wiley Publishing Asia Pty Ltd.

  19. Zero-inflated Poisson model based likelihood ratio test for drug safety signal detection.

    Science.gov (United States)

    Huang, Lan; Zheng, Dan; Zalkikar, Jyoti; Tiwari, Ram

    2017-02-01

    In recent decades, numerous methods have been developed for data mining of large drug safety databases, such as the Food and Drug Administration's (FDA's) Adverse Event Reporting System, where data matrices are formed with drugs as columns and adverse events as rows. Often, a large number of cells in these data matrices have zero counts. Some of them are "true zeros", indicating that the drug-adverse event pair cannot occur; these are distinguished from the remaining zeros, which simply indicate that the drug-adverse event pair has not occurred, or has not been reported, yet. In this paper, a zero-inflated Poisson model based likelihood ratio test method is proposed to identify drug-adverse event pairs that have disproportionately high reporting rates, which are also called signals. The maximum likelihood estimates of the model parameters are obtained using the expectation-maximization algorithm. The test is also modified to handle stratified analyses for binary and categorical covariates (e.g. gender and age) in the data. The proposed zero-inflated Poisson model based likelihood ratio test method is shown to asymptotically control the type I error and false discovery rate, and its finite sample performance for signal detection is evaluated through a simulation study. The simulation results show that the zero-inflated Poisson model based likelihood ratio test method performs similarly to the Poisson model based likelihood ratio test method when the estimated percentage of true zeros in the database is small. Both methods are applied to six selected drugs, from the 2006 to 2011 Adverse Event Reporting System database, with varying percentages of observed zero-count cells.
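
    The expectation-maximization step for a single zero-inflated Poisson cell can be sketched as follows (a generic illustration, not the FDA implementation): the E-step computes the posterior probability that each observed zero is structural, and the M-step updates the mixing weight and the Poisson mean; these MLEs are the ingredients of the likelihood ratio statistic.

        import numpy as np

        def fit_zip_em(y, n_iter=200, tol=1e-10):
            # With probability pi the count is a structural zero,
            # otherwise it is Poisson(lam).
            y = np.asarray(y, dtype=float)
            pi, lam = 0.5, max(y.mean(), 1e-6)
            for _ in range(n_iter):
                # E-step: posterior probability that each zero is structural.
                p0 = pi + (1 - pi) * np.exp(-lam)
                z = np.where(y == 0, pi / p0, 0.0)
                # M-step: update mixing weight and Poisson mean.
                pi_new = z.mean()
                lam_new = ((1 - z) * y).sum() / (1 - z).sum()
                if abs(pi_new - pi) + abs(lam_new - lam) < tol:
                    pi, lam = pi_new, lam_new
                    break
                pi, lam = pi_new, lam_new
            return pi, lam

        rng = np.random.default_rng(0)
        y = np.where(rng.random(5000) < 0.3, 0, rng.poisson(2.5, 5000))
        print(fit_zip_em(y))  # should recover roughly (0.3, 2.5)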

  20. Likelihood Analysis of Supersymmetric SU(5) GUTs

    CERN Document Server

    Bagnaschi, E.

    2017-01-01

    We perform a likelihood analysis of the constraints from accelerator experiments and astrophysical observations on supersymmetric (SUSY) models with SU(5) boundary conditions on soft SUSY-breaking parameters at the GUT scale. The parameter space of the models studied has 7 parameters: a universal gaugino mass $m_{1/2}$, distinct masses for the scalar partners of matter fermions in five- and ten-dimensional representations of SU(5), $m_5$ and $m_{10}$, and for the $\mathbf{5}$ and $\mathbf{\bar 5}$ Higgs representations $m_{H_u}$ and $m_{H_d}$, a universal trilinear soft SUSY-breaking parameter $A_0$, and the ratio of Higgs vevs $\tan \beta$. In addition to previous constraints from direct sparticle searches, low-energy and flavour observables, we incorporate constraints based on preliminary results from 13 TeV LHC searches for jets + MET events and long-lived particles, as well as the latest PandaX-II and LUX searches for direct Dark Matter detection. In addition to previously-identified mechanisms for bringing the supersymmetric relic density into the range allowed by cosmology, we identify a novel $u_R/c_R - \chi^0_1$ coannihilation mechanism that appears in the supersymmetric SU(5) GUT model and discuss the role of $\nu_\tau$ coannihilation. We find complementarity between the prospects for direct Dark Matter detection and SUSY searches at the LHC.

  1. Reducing the likelihood of long tennis matches.

    Science.gov (United States)

    Barnett, Tristan; Brown, Alan; Pollard, Graham

    2006-01-01

    Long matches can cause problems for tournaments. For example, the starting times of subsequent matches can be substantially delayed causing inconvenience to players, spectators, officials and television scheduling. They can even be seen as unfair in the tournament setting when the winner of a very long match, who may have negative aftereffects from such a match, plays the winner of an average or shorter length match in the next round. Long matches can also lead to injuries to the participating players. One factor that can lead to long matches is the use of the advantage set as the fifth set, as in the Australian Open, the French Open and Wimbledon. Another factor is long rallies and a greater than average number of points per game. This tends to occur more frequently on the slower surfaces such as at the French Open. The mathematical method of generating functions is used to show that the likelihood of long matches can be substantially reduced by using the tiebreak game in the fifth set, or more effectively by using a new type of game, the 50-40 game, throughout the match. Key points: (1) the cumulant generating function has nice properties for calculating the parameters of distributions in a tennis match; (2) a final tiebreaker set reduces the length of matches, as currently used in the US Open; (3) a new 50-40 game reduces the length of matches whilst maintaining comparable probabilities for the better player to win the match.

  2. Maximum likelihood window for time delay estimation

    International Nuclear Information System (INIS)

    Lee, Young Sup; Yoon, Dong Jin; Kim, Chi Yup

    2004-01-01

    Time delay estimation for the detection of leak location in underground pipelines is critically important. Because the exact leak location depends upon the precision of the time delay between sensor signals due to leak noise and the speed of elastic waves, the estimation of time delay has been one of the key issues in leak locating with the time arrival difference method. In this study, an optimal Maximum Likelihood window is considered to obtain a better estimation of the time delay. This method was validated in experiments, where it provided much clearer and more precise peaks in cross-correlation functions of leak signals. The leak location error was less than 1% of the distance between sensors; for example, the error was not greater than 3 m for 300 m long underground pipelines. In addition to the experiments, an intensive theoretical analysis in terms of signal processing is presented. The improved leak locating with the suggested method is due to the windowing effect in the frequency domain, which offers a weighting of significant frequencies.
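
    The underlying time-delay estimate that the Maximum Likelihood window refines is the peak of the cross-correlation between the two sensor signals. A basic FFT-based version, with an assumed sampling rate and synthetic leak-like noise (the ML frequency weighting itself is omitted), looks like the sketch below; the leak position then follows from the delay and the elastic wave speed.

        import numpy as np

        def time_delay(x, y, fs):
            # Delay of y relative to x (seconds) from the peak of the
            # cross-correlation, computed via FFT.
            n = len(x) + len(y) - 1
            nfft = 1 << (n - 1).bit_length()          # next power of two
            cc = np.fft.irfft(np.fft.rfft(x, nfft).conj()
                              * np.fft.rfft(y, nfft), nfft)
            # Reorder circular lags to -(N-1) .. N-1.
            cc = np.concatenate((cc[-(len(x) - 1):], cc[:len(y)]))
            return (np.argmax(cc) - (len(x) - 1)) / fs

        fs = 10_000.0
        rng = np.random.default_rng(1)
        s = rng.standard_normal(10_000)               # broadband "leak" noise
        delay_samples = 37
        x = s + 0.1 * rng.standard_normal(len(s))
        y = np.roll(s, delay_samples) + 0.1 * rng.standard_normal(len(s))
        print(time_delay(x, y, fs))                   # ~ 37 / 10000 = 3.7 ms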

  3. Anticipating cognitive effort: roles of perceived error-likelihood and time demands.

    Science.gov (United States)

    Dunn, Timothy L; Inzlicht, Michael; Risko, Evan F

    2017-11-13

    Why are some actions evaluated as effortful? In the present set of experiments we address this question by examining individuals' perception of effort when faced with a trade-off between two putative cognitive costs: how much time a task takes vs. how error-prone it is. Specifically, we were interested in whether individuals anticipate engaging in a small amount of hard work (i.e., low time requirement, but high error-likelihood) vs. a large amount of easy work (i.e., high time requirement, but low error-likelihood) as being more effortful. In between-subject designs, Experiments 1 through 3 demonstrated that individuals anticipate options that are high in perceived error-likelihood (yet less time consuming) as more effortful than options that are perceived to be more time consuming (yet low in error-likelihood). Further, when asked to evaluate which of the two tasks was (a) more effortful, (b) more error-prone, and (c) more time consuming, effort-based and error-based choices closely tracked one another, but this was not the case for time-based choices. Utilizing a within-subject design, Experiment 4 demonstrated an overall pattern of judgments similar to that of Experiments 1 through 3. However, both judgments of error-likelihood and time demand similarly predicted effort judgments. Results are discussed within the context of extant accounts of cognitive control, with considerations of how error-likelihood and time demands may independently and conjunctively factor into judgments of cognitive effort.

  4. Optimized Large-scale CMB Likelihood and Quadratic Maximum Likelihood Power Spectrum Estimation

    Science.gov (United States)

    Gjerløw, E.; Colombo, L. P. L.; Eriksen, H. K.; Górski, K. M.; Gruppuso, A.; Jewell, J. B.; Plaszczynski, S.; Wehus, I. K.

    2015-11-01

    We revisit the problem of exact cosmic microwave background (CMB) likelihood and power spectrum estimation with the goal of minimizing computational costs through linear compression. This idea was originally proposed for CMB purposes by Tegmark et al., and here we develop it into a fully functioning computational framework for large-scale polarization analysis, adopting WMAP as a working example. We compare five different linear bases (pixel space, harmonic space, noise covariance eigenvectors, signal-to-noise covariance eigenvectors, and signal-plus-noise covariance eigenvectors) in terms of compression efficiency, and find that the computationally most efficient basis is the signal-to-noise eigenvector basis, which is closely related to the Karhunen-Loeve and Principal Component transforms, in agreement with previous suggestions. For this basis, the information in 6836 unmasked WMAP sky map pixels can be compressed into a smaller set of 3102 modes, with a maximum error increase of any single multipole of 3.8% at ℓ ≤ 32 and a maximum shift in the mean values of a joint distribution of an amplitude-tilt model of 0.006σ. This compression reduces the computational cost of a single likelihood evaluation by a factor of 5, from 38 to 7.5 CPU seconds, and it also results in a more robust likelihood by implicitly regularizing nearly degenerate modes. Finally, we use the same compression framework to formulate a numerically stable and computationally efficient variation of the Quadratic Maximum Likelihood implementation, which requires less than 3 GB of memory and 2 CPU minutes per iteration for ℓ ≤ 32, rendering low-ℓ QML CMB power spectrum analysis fully tractable on a standard laptop.
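
    The compression step can be mimicked on a toy 1-D "sky" (Python/scipy; the covariances and thresholds here are invented for illustration): solving the generalized eigenproblem S v = λ N v gives the signal-to-noise basis, and keeping only high-λ modes shrinks the Gaussian likelihood evaluation while approximately preserving its dependence on, e.g., a signal amplitude.

        import numpy as np
        from scipy.linalg import eigh

        rng = np.random.default_rng(2)
        npix = 300
        x = np.arange(npix)
        d = np.minimum(np.abs(x[:, None] - x[None, :]),
                       npix - np.abs(x[:, None] - x[None, :]))
        S = np.exp(-(d / 15.0) ** 2)      # smooth "signal" covariance (ring)
        N = 0.5 * np.eye(npix)            # white "noise" covariance

        lam, V = eigh(S, N)               # S v = lam N v (signal-to-noise basis)
        B = V[:, lam > 0.1]               # keep only high-S/N modes
        print(f"kept {B.shape[1]} of {npix} modes")

        data = rng.multivariate_normal(np.zeros(npix), S + N)

        def loglik(A, dvec, basis=None):
            # Gaussian log-likelihood of a signal-amplitude parameter A,
            # optionally evaluated in the compressed basis.
            C = A * S + N
            if basis is not None:
                dvec, C = basis.T @ dvec, basis.T @ C @ basis
            sign, logdet = np.linalg.slogdet(C)
            return -0.5 * (dvec @ np.linalg.solve(C, dvec) + logdet)

        # Likelihood curves (relative to A = 1) agree closely after compression.
        for A in (0.8, 1.0, 1.2):
            print(A, round(loglik(A, data) - loglik(1.0, data), 3),
                  round(loglik(A, data, B) - loglik(1.0, data, B), 3))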

  5. Maximum likelihood versus likelihood-free quantum system identification in the atom maser

    International Nuclear Information System (INIS)

    Catana, Catalin; Kypraios, Theodore; Guţă, Mădălin

    2014-01-01

    We consider the problem of estimating a dynamical parameter of a Markovian quantum open system (the atom maser), by performing continuous time measurements in the system's output (outgoing atoms). Two estimation methods are investigated and compared. Firstly, the maximum likelihood estimator (MLE) takes into account the full measurement data and is asymptotically optimal in terms of its mean square error. Secondly, the ‘likelihood-free’ method of approximate Bayesian computation (ABC) produces an approximation of the posterior distribution for a given set of summary statistics, by sampling trajectories at different parameter values and comparing them with the measurement data via chosen statistics. Building on previous results which showed that atom counts are poor statistics for certain values of the Rabi angle, we apply MLE to the full measurement data and estimate its Fisher information. We then select several correlation statistics such as waiting times, distribution of successive identical detections, and use them as input of the ABC algorithm. The resulting posterior distribution follows closely the data likelihood, showing that the selected statistics capture ‘most’ statistical information about the Rabi angle. (paper)
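
    The contrast between the two approaches can be boiled down to a generic ABC rejection sampler (with an invented exponential-waiting-time simulator standing in for the atom maser): draw parameters from the prior, simulate, and keep draws whose chosen summary statistics fall close to the observed ones.

        import numpy as np

        rng = np.random.default_rng(3)

        def simulate(theta, n=200):
            # Stand-in simulator: exponential waiting times with rate theta.
            return rng.exponential(1.0 / theta, n)

        def summaries(x):
            # The chosen summary statistics; ABC quality hinges on this choice.
            return np.array([x.mean(), np.median(x)])

        s_obs = summaries(simulate(2.0))  # "observed" data at theta = 2

        prior_draws = rng.uniform(0.1, 10.0, 50_000)
        accepted = [th for th in prior_draws
                    if np.linalg.norm(summaries(simulate(th)) - s_obs) < 0.05]
        post = np.array(accepted)
        print(len(post), post.mean(), post.std())  # approximate posterior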

  6. Likelihood analysis of supersymmetric SU(5) GUTs

    Energy Technology Data Exchange (ETDEWEB)

    Bagnaschi, E.; Weiglein, G. [DESY, Hamburg (Germany); Costa, J.C.; Buchmueller, O.; Citron, M.; Richards, A.; De Vries, K.J. [Imperial College, High Energy Physics Group, Blackett Laboratory, London (United Kingdom); Sakurai, K. [University of Durham, Science Laboratories, Department of Physics, Institute for Particle Physics Phenomenology, Durham (United Kingdom); University of Warsaw, Faculty of Physics, Institute of Theoretical Physics, Warsaw (Poland); Borsato, M.; Chobanova, V.; Lucio, M.; Martinez Santos, D. [Universidade de Santiago de Compostela, Santiago de Compostela (Spain); Cavanaugh, R. [Fermi National Accelerator Laboratory, Batavia, IL (United States); University of Illinois at Chicago, Physics Department, Chicago, IL (United States); Roeck, A. de [CERN, Experimental Physics Department, Geneva (Switzerland); Antwerp University, Wilrijk (Belgium); Dolan, M.J. [University of Melbourne, ARC Centre of Excellence for Particle Physics at the Terascale, School of Physics, Parkville (Australia); Ellis, J.R. [King's College London, Theoretical Particle Physics and Cosmology Group, Department of Physics, London (United Kingdom); Theoretical Physics Department, CERN, Geneva 23 (Switzerland); Flaecher, H. [University of Bristol, H.H. Wills Physics Laboratory, Bristol (United Kingdom); Heinemeyer, S. [Campus of International Excellence UAM+CSIC, Cantoblanco, Madrid (Spain); Instituto de Fisica Teorica UAM-CSIC, Madrid (Spain); Instituto de Fisica de Cantabria (CSIC-UC), Santander (Spain); Isidori, G. [Universitaet Zuerich, Physik-Institut, Zurich (Switzerland); Olive, K.A. [University of Minnesota, William I. Fine Theoretical Physics Institute, School of Physics and Astronomy, Minneapolis, MN (United States)

    2017-02-15

    We perform a likelihood analysis of the constraints from accelerator experiments and astrophysical observations on supersymmetric (SUSY) models with SU(5) boundary conditions on soft SUSY-breaking parameters at the GUT scale. The parameter space of the models studied has seven parameters: a universal gaugino mass m{sub 1/2}, distinct masses for the scalar partners of matter fermions in five- and ten-dimensional representations of SU(5), m{sub 5} and m{sub 10}, and for the 5 and anti 5 Higgs representations m{sub H{sub u}} and m{sub H{sub d}}, a universal trilinear soft SUSY-breaking parameter A{sub 0}, and the ratio of Higgs vevs tan β. In addition to previous constraints from direct sparticle searches, low-energy and flavour observables, we incorporate constraints based on preliminary results from 13 TeV LHC searches for jets + E{sub T} events and long-lived particles, as well as the latest PandaX-II and LUX searches for direct Dark Matter detection. In addition to previously identified mechanisms for bringing the supersymmetric relic density into the range allowed by cosmology, we identify a novel u{sub R}/c{sub R} - χ{sup 0}{sub 1} coannihilation mechanism that appears in the supersymmetric SU(5) GUT model and discuss the role of ν{sub τ} coannihilation. We find complementarity between the prospects for direct Dark Matter detection and SUSY searches at the LHC. (orig.)

  7. Likelihood analysis of supersymmetric SU(5) GUTs

    Energy Technology Data Exchange (ETDEWEB)

    Bagnaschi, E. [DESY, Hamburg (Germany); Costa, J.C. [Imperial College, London (United Kingdom). Blackett Lab.; Sakurai, K. [Durham Univ. (United Kingdom). Inst. for Particle Physics Phenomenology; Warsaw Univ. (Poland). Inst. of Theoretical Physics; Collaboration: MasterCode Collaboration; and others

    2016-10-15

    We perform a likelihood analysis of the constraints from accelerator experiments and astrophysical observations on supersymmetric (SUSY) models with SU(5) boundary conditions on soft SUSY-breaking parameters at the GUT scale. The parameter space of the models studied has 7 parameters: a universal gaugino mass m{sub 1/2}, distinct masses for the scalar partners of matter fermions in five- and ten-dimensional representations of SU(5), m{sub 5} and m{sub 10}, and for the 5 and anti 5 Higgs representations m{sub H{sub u}} and m{sub H{sub d}}, a universal trilinear soft SUSY-breaking parameter A{sub 0}, and the ratio of Higgs vevs tan β. In addition to previous constraints from direct sparticle searches, low-energy and flavour observables, we incorporate constraints based on preliminary results from 13 TeV LHC searches for jets+E{sub T} events and long-lived particles, as well as the latest PandaX-II and LUX searches for direct Dark Matter detection. In addition to previously-identified mechanisms for bringing the supersymmetric relic density into the range allowed by cosmology, we identify a novel u{sub R}/c{sub R}-χ{sup 0}{sub 1} coannihilation mechanism that appears in the supersymmetric SU(5) GUT model and discuss the role of ν{sub τ} coannihilation. We find complementarity between the prospects for direct Dark Matter detection and SUSY searches at the LHC.

  8. Epilepsy and Intellectual Disability: Does Epilepsy Increase the Likelihood of Co-Morbid Psychopathology?

    Science.gov (United States)

    Arshad, Saadia; Winterhalder, Robert; Underwood, Lisa; Kelesidi, Katerina; Chaplin, Eddie; Kravariti, Eugenia; Anagnostopoulos, Dimitrios; Bouras, Nick; McCarthy, Jane; Tsakanikos, Elias

    2011-01-01

    Although epilepsy is particularly common among people with intellectual disability (ID) it remains unclear whether it is associated with an increased likelihood of co-morbid psychopathology. We therefore investigated rates of mental health problems and other clinical characteristics in patients with ID and epilepsy (N=156) as compared to patients…

  9. Norplant's high cost may prohibit use in Title 10 clinics.

    Science.gov (United States)

    1991-04-01

    The article discusses the prohibitive cost of Norplant for the Title 10 low-income population served in public family planning clinics in the U.S. It is argued that it is unfair for U.S. users to pay $350 to Wyeth-Ayerst when another pharmaceutical company provides developing countries with Norplant at a cost of $14-23. Although the public sector and private foundations funded the development, the company maintains that it needs to recoup its investment in training and education. Medicaid and third-party payers such as insurance companies will reimburse at the higher price, but if the public sector price were lowered, the company would not make a profit and everyone would argue for reimbursement at the lower price. It was suggested that a boycott of American Home Products, Wyeth-Ayerst's parent company, be organized. One public family planning provider with particularly low funding noted that a budget of $30,000 would cover only 85 users, who under these constraints would be limited to drug users and women with multiple pregnancies, leaving the needs of teenagers unmet. Another remarked that the client population served is 4700 with $54,000 in funding, which is already fully committed. The general trend of comments was that for low-income women the cost is too high.

  10. The behavior of the likelihood ratio test for testing missingness

    OpenAIRE

    Hens, Niel; Aerts, Marc; Molenberghs, Geert; Thijs, Herbert

    2003-01-01

    To assess the sensitivity of conclusions to model choices in the context of selection models for non-random dropout, one can contrast the different missingness mechanisms with each other, e.g. by likelihood ratio tests. The finite sample behavior of the null distribution and the power of the likelihood ratio test are studied under a variety of missingness mechanisms. Keywords: missing data; sensitivity analysis; likelihood ratio test; missingness mechanisms.

  11. Clinical evaluation of a medical high dynamic range display

    International Nuclear Information System (INIS)

    Marchessoux, Cedric; Paepe, Lode de; Vanovermeire, Olivier; Albani, Luigi

    2016-01-01

    Purpose: Recent medical displays have higher contrast and higher luminance but do not have a high dynamic range (HDR). HDR implies a minimum luminance value close to zero. A medical HDR display prototype based on two liquid crystal layers has been developed. The goal of this study is to evaluate the potential clinical benefit of such a display in comparison with a low dynamic range (LDR) display. Methods: The study evaluated the clinical performance of the displays in a search and detection task. Eight radiologists read chest x-ray images, some of which contained simulated lung nodules. The study used a JAFROC (Jackknife Free-response Receiver Operating Characteristic) approach for analyzing FROC data. The calculated figure of merit (FoM) is the probability that a lesion is rated higher than all rated nonlesions on all images. Time per case and accuracy for locating the center of the nodules were also compared. The nodules were simulated using Samei's model. 214 CR and DR images [half "healthy images" (chest nodule-free) and half "diseased images"] were used, resulting in a total of 199 nodules: 25 images with 1 nodule, 51 images with 2 nodules, and 24 images with 3 nodules. A dedicated software interface was designed for visualizing the images for each session. For the JAFROC1 statistical analysis, the study was done per nodule category: all nodules, difficult nodules, and very difficult nodules. Results: For all nodules, the averaged FoM HDR was slightly higher than FoM LDR, with a difference of 0.09%. For the difficult nodules, the averaged FoM HDR was slightly higher than FoM LDR, with a difference of 1.38%; for the very difficult nodules, the difference was 0.71%. For the true positive fraction (TPF), both displays (the HDR and the LDR ones) have similar TPF for all nodules, but looking at difficult and very difficult nodules, there are more true positives for the HDR display. The true positive fraction has been also computed in

  12. Penalized Maximum Likelihood Estimation for univariate normal mixture distributions

    International Nuclear Information System (INIS)

    Ridolfi, A.; Idier, J.

    2001-01-01

    Due to singularities of the likelihood function, the maximum likelihood approach for the estimation of the parameters of normal mixture models is an acknowledged ill-posed optimization problem. Ill-posedness is resolved by penalizing the likelihood function. In the Bayesian framework, this amounts to incorporating an inverted gamma prior into the likelihood function. A penalized version of the EM algorithm is derived, which is still explicit and which intrinsically ensures that the estimates are not singular. Numerical evidence of the latter property is put forward with a test
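
    A univariate version of such a penalized EM can be sketched as follows (hedged: the inverted-gamma hyperparameters a and b below are arbitrary illustration values, and the variance update shown is the standard MAP step var_j = (b + 0.5*SSE_j) / (a + 0.5*n_j + 1) rather than the paper's exact derivation): the extra (a, b) terms keep every component variance strictly positive, so no component can collapse onto a single data point.

        import numpy as np

        def penalized_em(x, k=2, a=2.0, b=0.5, n_iter=300, seed=0):
            # EM for a univariate k-component normal mixture with an
            # inverted-gamma(a, b) penalty on each component variance.
            rng = np.random.default_rng(seed)
            w = np.full(k, 1.0 / k)
            mu = rng.choice(x, k, replace=False)
            var = np.full(k, x.var())
            for _ in range(n_iter):
                # E-step: responsibilities (log-space for stability).
                logp = (np.log(w) - 0.5 * np.log(2 * np.pi * var)
                        - (x[:, None] - mu) ** 2 / (2 * var))
                logp -= logp.max(axis=1, keepdims=True)
                r = np.exp(logp)
                r /= r.sum(axis=1, keepdims=True)
                # M-step with the penalty terms keeping var > 0.
                nj = r.sum(axis=0)
                w = nj / len(x)
                mu = (r * x[:, None]).sum(axis=0) / nj
                sse = (r * (x[:, None] - mu) ** 2).sum(axis=0)
                var = (b + 0.5 * sse) / (a + 0.5 * nj + 1.0)
            return w, mu, var

        rng = np.random.default_rng(4)
        x = np.concatenate([rng.normal(-2, 1, 400), rng.normal(3, 0.5, 600)])
        print(penalized_em(x))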

  13. 21 CFR 862.2260 - High pressure liquid chromatography system for clinical use.

    Science.gov (United States)

    2010-04-01

    21 CFR Part 862 (Food and Drugs), Clinical Laboratory Instruments, § 862.2260 High pressure liquid chromatography system for clinical use. (a) Identification. A high pressure liquid chromatography system for clinical use is a device intended to separate...

  14. Physical activity may decrease the likelihood of children developing constipation.

    Science.gov (United States)

    Seidenfaden, Sandra; Ormarsson, Orri Thor; Lund, Sigrun H; Bjornsson, Einar S

    2018-01-01

    Childhood constipation is common. We evaluated children diagnosed with constipation, who were referred to an Icelandic paediatric emergency department, and determined the effect of lifestyle factors on its aetiology. The parents of children who were diagnosed with constipation and participated in a phase IIB clinical trial on laxative suppositories answered an online questionnaire about their children's lifestyle and constipation in March-April 2013. The parents of nonconstipated children that visited the paediatric department of Landspitali University Hospital or an Icelandic outpatient clinic answered the same questionnaire. We analysed responses regarding 190 children aged one year to 18 years: 60 with constipation and 130 without. We found that 40% of the constipated children had recurrent symptoms, 27% had to seek medical attention more than once and 33% received medication per rectum. The 47 of 130 control group subjects aged 10-18 were much more likely to exercise more than three times a week (72%) and for more than an hour (62%) than the 26 of 60 constipated children of the same age (42% and 35%, respectively). Constipation risk factors varied with age and many children diagnosed with constipation had recurrent symptoms. Physical activity may affect the likelihood of developing constipation in older children. ©2017 Foundation Acta Paediatrica. Published by John Wiley & Sons Ltd.

  15. CONSTRUCTING A FLEXIBLE LIKELIHOOD FUNCTION FOR SPECTROSCOPIC INFERENCE

    International Nuclear Information System (INIS)

    Czekala, Ian; Andrews, Sean M.; Mandel, Kaisey S.; Green, Gregory M.; Hogg, David W.

    2015-01-01

    We present a modular, extensible likelihood framework for spectroscopic inference based on synthetic model spectra. The subtraction of an imperfect model from a continuously sampled spectrum introduces covariance between adjacent datapoints (pixels) into the residual spectrum. For the high signal-to-noise data with large spectral range that is commonly employed in stellar astrophysics, that covariant structure can lead to dramatically underestimated parameter uncertainties (and, in some cases, biases). We construct a likelihood function that accounts for the structure of the covariance matrix, utilizing the machinery of Gaussian process kernels. This framework specifically addresses the common problem of mismatches in model spectral line strengths (with respect to data) due to intrinsic model imperfections (e.g., in the atomic/molecular databases or opacity prescriptions) by developing a novel local covariance kernel formalism that identifies and self-consistently downweights pathological spectral line "outliers." By fitting many spectra in a hierarchical manner, these local kernels provide a mechanism to learn about and build data-driven corrections to synthetic spectral libraries. An open-source software implementation of this approach is available at http://iancze.github.io/Starfish, including a sophisticated probabilistic scheme for spectral interpolation when using model libraries that are sparsely sampled in the stellar parameters. We demonstrate some salient features of the framework by fitting the high-resolution V-band spectrum of WASP-14, an F5 dwarf with a transiting exoplanet, and the moderate-resolution K-band spectrum of Gliese 51, an M5 field dwarf.
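
    The core construction, in miniature and with all numbers invented: the residual (data minus model) is scored with a multivariate-normal log-likelihood whose covariance adds a global Gaussian-process kernel to the per-pixel noise, and a "local" kernel, here a squared-exponential patch tapered around one wavelength, soaks up a single outlier line.

        import numpy as np

        def sqexp(x, amp, ell, center=None, width=None):
            # Squared-exponential kernel; if center/width are given, taper it
            # so it acts only locally around one spectral line (a simplified
            # stand-in for the paper's local kernels).
            K = amp ** 2 * np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / ell ** 2)
            if center is not None:
                t = np.exp(-0.5 * ((x - center) / width) ** 2)
                K = K * t[:, None] * t[None, :]
            return K

        def residual_loglik(resid, sigma, K):
            # Multivariate-normal log-likelihood of the residual spectrum with
            # per-pixel noise sigma plus correlated covariance K.
            C = np.diag(sigma ** 2) + K
            sign, logdet = np.linalg.slogdet(C)
            return -0.5 * (resid @ np.linalg.solve(C, resid)
                           + logdet + len(resid) * np.log(2 * np.pi))

        x = np.linspace(5000.0, 5100.0, 400)          # wavelength grid (Angstrom)
        rng = np.random.default_rng(5)
        resid = 0.01 * rng.standard_normal(400)
        resid += 0.05 * np.exp(-0.5 * ((x - 5050.0) / 0.4) ** 2)  # one bad line

        K_global = sqexp(x, amp=0.005, ell=2.0)
        K_local = K_global + sqexp(x, amp=0.05, ell=0.5, center=5050.0, width=1.0)
        sigma = np.full(400, 0.01)
        print(residual_loglik(resid, sigma, K_global),
              residual_loglik(resid, sigma, K_local))  # local kernel fits better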

  16. The high cost of clinical negligence litigation in the NHS.

    Science.gov (United States)

    Tingle, John

    2017-03-09

    John Tingle, Reader in Health Law at Nottingham Trent University, discusses a consultation document from the Department of Health on introducing fixed recoverable costs in lower-value clinical negligence claims.

  17. Planck 2013 results. XV. CMB power spectra and likelihood

    DEFF Research Database (Denmark)

    Tauber, Jan; Bartlett, J.G.; Bucher, M.

    2014-01-01

    This paper presents the Planck 2013 likelihood, a complete statistical description of the two-point correlation function of the CMB temperature fluctuations that accounts for all known relevant uncertainties, both instrumental and astrophysical in nature. We use this likelihood to derive our best...

  18. The modified signed likelihood statistic and saddlepoint approximations

    DEFF Research Database (Denmark)

    Jensen, Jens Ledet

    1992-01-01

    SUMMARY: For a number of tests in exponential families we show that the use of a normal approximation to the modified signed likelihood ratio statistic r* is equivalent to the use of a saddlepoint approximation. This is also true in a large deviation region where the signed likelihood ratio statistic r is of order √n. © 1992 Biometrika Trust.
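
    For orientation, the modified signed likelihood ratio statistic in question is usually written in Barndorff-Nielsen's form (the precise definition of the correction term u is model-dependent, so this is quoted only as a reminder):

        r^* = r + \frac{1}{r}\,\log\frac{u}{r},
        \qquad
        r = \operatorname{sign}(\hat\theta - \theta)\,
            \sqrt{2\,\{\ell(\hat\theta) - \ell(\theta)\}},

    where ℓ is the log-likelihood and u is a Wald-type quantity; r* is standard normal to high order, which is what makes its normal approximation match a saddlepoint approximation.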

  19. Predictors of Likelihood of Speaking Up about Safety Concerns in Labour and Delivery

    Science.gov (United States)

    Lyndon, Audrey; Sexton, J. Bryan; Simpson, Kathleen Rice; Rosenstein, Alan; Lee, Kathryn A.; Wachter, Robert M.

    2011-01-01

    Background Despite widespread emphasis on promoting "assertive communication" by caregivers as essential to patient safety improvement efforts, fairly little is known about when and how clinicians speak up to address safety concerns. In this cross-sectional study we use a new measure of speaking up to begin exploring this issue in maternity care. Methods We developed a scenario-based measure of clinicians' assessment of potential harm and likelihood of speaking up in response to perceived harm. We embedded this scale in a survey with measures of safety climate, teamwork climate, disruptive behaviour, work stress, and personality traits of bravery and assertiveness. The survey was distributed to all registered nurses and obstetricians practicing in two US Labour & Delivery units. Results The response rate was 54% (125 of 230 potential respondents). Respondents were experienced clinicians (13.7 ± 11 years in specialty). Higher perception of harm, respondent role, specialty experience, and site predicted likelihood of speaking up when controlling for bravery and assertiveness. Physicians rated potential harm in common clinical scenarios lower than nurses did (7.5 vs. 8.4 on a 2–10 scale; p<0.001). Some participants (12%) indicated they were unlikely to speak up despite perceiving high potential for harm in certain situations. Discussion This exploratory study found nurses and physicians differed in their harm ratings, and harm rating was a predictor of speaking up. This may partially explain persistent discrepancies between physicians and nurses in teamwork climate scores. Differing assessments of potential harms inherent in everyday practice may be a target for teamwork intervention in maternity care. PMID:22927492

  20. Likelihood analysis of parity violation in the compound nucleus

    International Nuclear Information System (INIS)

    Bowman, D.; Sharapov, E.

    1993-01-01

    We discuss the determination of the root mean-squared matrix element of the parity-violating interaction between compound-nuclear states using likelihood analysis. We briefly review the relevant features of the statistical model of the compound nucleus and the formalism of likelihood analysis. We then discuss the application of likelihood analysis to data on parity-violating longitudinal asymmetries. The reliability of the extracted value of the matrix element and the errors assigned to it is stressed. Using experimental data and Monte Carlo techniques, we treat the situations where the spins of the p-wave resonances are known and where they are not. We conclude that likelihood analysis provides a reliable way to determine M and its confidence interval. We briefly discuss some problems associated with the normalization of the likelihood function

  1. Determinants of women's likelihood of vaginal self-sampling for human papillomavirus to screen for cervical cancer in Taiwan: a cross-sectional study.

    Science.gov (United States)

    Chen, Shu-Ling; Hsieh, Pao-Chun; Chou, Chia-Hui; Tzeng, Ya-Ling

    2014-11-25

    Many Taiwanese women (43.8%) did not participate in regular cervical screening in 2011. An alternative to cervical screening, self-sampling for human papillomavirus (HPV), has been available at no cost under Taiwan's National Health Insurance since 2010, but the extent and likelihood of HPV self-sampling were unknown. A cross-sectional study was performed to explore determinants of women's likelihood of HPV self-sampling. Data were collected by questionnaire from a convenience sample of 500 women attending hospital gynecologic clinics in central Taiwan from June to October 2012. Data were analyzed by descriptive statistics, chi-square test, and logistic regression. Of 500 respondents, 297 (59.4%) had heard of HPV; of these 297 women, 69 (23%) had self-sampled for HPV. Among the 297 women who had heard of HPV, 234 (78.8%) considered cost a priority for HPV self-sampling. Likelihood of HPV self-sampling was determined by previous Pap testing, high perceived risk of cervical cancer, willingness to self-sample for HPV, high HPV knowledge, and cost as a priority consideration. Outreach efforts to increase the acceptability of self-sampling for HPV testing rates should target women who have had a Pap test, perceive themselves at high risk for cervical cancer, are willing to self-sample for HPV, have a high level of HPV knowledge, and for whom the cost of self-sampling covered by health insurance is a priority.

  2. Calibration of two complex ecosystem models with different likelihood functions

    Science.gov (United States)

    Hidy, Dóra; Haszpra, László; Pintér, Krisztina; Nagy, Zoltán; Barcza, Zoltán

    2014-05-01

    goodness metric on calibration. The different likelihoods are different functions of RMSE (root mean squared error) weighted by measurement uncertainty: exponential / linear / quadratic / linear normalized by correlation. As a first calibration step, sensitivity analysis was performed in order to select the influential parameters which have a strong effect on the output data. In the second calibration step only the sensitive parameters were calibrated (optimal values and confidence intervals were calculated). In the case of PaSim, more parameters were found responsible for 95% of the output data variance than in the case of BBGC MuSo. Analysis of the results of the optimized models revealed that the exponential likelihood estimation proved to be the most robust (best model simulation with optimized parameters, highest confidence interval increase). The cross-validation of the model simulations can help in constraining the highly uncertain greenhouse gas budget of grasslands.

  3. Parallelization of maximum likelihood fits with OpenMP and CUDA

    CERN Document Server

    Jarp, S; Leduc, J; Nowak, A; Pantaleo, F

    2011-01-01

    Data analyses based on maximum likelihood fits are commonly used in the high energy physics community for fitting statistical models to data samples. This technique requires the numerical minimization of the negative log-likelihood function. MINUIT is the most common package used for this purpose in the high energy physics community. The main algorithm in this package, MIGRAD, searches for the minimum using gradient information. The procedure requires several evaluations of the function, depending on the number of free parameters and their initial values. The whole procedure can be very CPU-time consuming in the case of complex functions, with several free parameters, many independent variables and large data samples. Therefore, it becomes particularly important to speed up the evaluation of the negative log-likelihood function. In this paper we present an algorithm and its implementation which benefits from data vectorization and parallelization (based on OpenMP) and which was also ported to Graphics Processi...
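
    The parallelization pattern (not the paper's OpenMP/CUDA code) reduces to splitting the per-event sum of the negative log-likelihood across workers and adding the partial sums; a Python/numpy sketch with an assumed Gaussian model:

        import numpy as np
        from multiprocessing import Pool

        def chunk_nll(args):
            # Per-chunk negative log-likelihood for a Gaussian model; each
            # worker evaluates its share of events.
            data, mu, sigma = args
            return np.sum(0.5 * np.log(2 * np.pi * sigma ** 2)
                          + (data - mu) ** 2 / (2 * sigma ** 2))

        def parallel_nll(data, mu, sigma, n_workers=4):
            chunks = np.array_split(data, n_workers)
            with Pool(n_workers) as pool:
                return sum(pool.map(chunk_nll, [(c, mu, sigma) for c in chunks]))

        if __name__ == "__main__":
            rng = np.random.default_rng(6)
            data = rng.normal(1.0, 2.0, 1_000_000)
            # A minimizer such as MIGRAD would call this repeatedly.
            print(parallel_nll(data, 1.0, 2.0))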

  4. Expert elicitation on ultrafine particles: likelihood of health effects and causal pathways

    Directory of Open Access Journals (Sweden)

    Brunekreef Bert

    2009-07-01

    Full Text Available Abstract Background Exposure to fine ambient particulate matter (PM) has consistently been associated with increased morbidity and mortality. The relationship between exposure to ultrafine particles (UFP) and health effects is less firmly established. If UFP cause health effects independently of coarser fractions, this could affect health impact assessment of air pollution, which would possibly lead to alternative policy options to be considered to reduce the disease burden of PM. Therefore, we organized an expert elicitation workshop to assess the evidence for a causal relationship between exposure to UFP and health endpoints. Methods An expert elicitation on the health effects of ambient ultrafine particle exposure was carried out, focusing on: (1) the likelihood of causal relationships with key health endpoints, and (2) the likelihood of potential causal pathways for cardiac events. Based on a systematic peer-nomination procedure, fourteen European experts (epidemiologists, toxicologists and clinicians) were selected, of whom twelve attended. They were provided with a briefing book containing key literature. After a group discussion, individual expert judgments in the form of ratings of the likelihood of causal relationships and pathways were obtained using a confidence scheme adapted from the one used by the Intergovernmental Panel on Climate Change. Results The likelihood of an independent causal relationship between increased short-term UFP exposure and increased all-cause mortality, hospital admissions for cardiovascular and respiratory diseases, aggravation of asthma symptoms and lung function decrements was rated medium to high by most experts. The likelihood for long-term UFP exposure to be causally related to all cause mortality, cardiovascular and respiratory morbidity and lung cancer was rated slightly lower, mostly medium. The experts rated the likelihood of each of the six identified possible causal pathways separately. Out of these

  5. Predicting malaria in an highly endemic country using clinical and ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Kate Zinszer

    evaluate statistical models that integrate environmental and clinical data to forecast malaria. A further aim was to identify and assess forecasting methods used to forecast malaria.

  6. High prevalence of sexual dysfunction in a vulvovaginal specialty clinic

    Science.gov (United States)

    Gordon, Dina; Gardella, Carolyn; Eschenbach, David; Mitchell, Caroline M.

    2014-01-01

    Objective Our study evaluated the presence and predictors of sexual dysfunction in a vulvovaginal specialty clinic population. Materials & Methods Women who presented to a vulvovaginal specialty clinic were eligible to enroll. Participants completed a questionnaire, including Female Sexual Function Index (FSFI) to assess sexual dysfunction and Patient Health Questionnaire (PHQ)-9 depression screen, and underwent a standardized physical exam, with vaginal swabs collected for wet mount and culture. Logistic regression assessed the relationship between sexual dysfunction and clinical diagnosis. Results We enrolled 161 women, aged 18–80 years (median = 36), presenting with vulvovaginal complaints. Median symptom duration was 24 months; 131 women (81%) reported chronic symptoms (≥12 months). By PHQ-9, 28 (17%) women met depression criteria. In the month prior to assessment, 86 (53%) women experienced sexual dysfunction. Women were primarily diagnosed with vaginitis (n = 46, 29%), vestibulodynia/vulvitis (n = 70; 43%), lichen planus or lichen sclerosus (n = 24; 15%). Controlling for age, sexual dysfunction did not correlate with chronic symptoms (IRR 0.86, 95% CI 0.50–1.48), depression (IRR 1.24; 95% CI 0.59, 2.58), or presence of any of the three main diagnoses (IRR 1.16, 95% CI 0.47, 2.88). Discussion Sexual dysfunction is present in over half of women presenting to a vulvovaginitis referral clinic, more than twice the rate in the wider population. PMID:25259664

  7. Clinical management of highly resorbed mandibular ridge without fibrous tissue

    Directory of Open Access Journals (Sweden)

    Veeramalai N Devaki

    2012-01-01

    Full Text Available Alveolar ridge atrophy poses a clinical challenge to the fabrication of a successful prosthesis. Resorption of mandibular denture-bearing areas results in unstable, non-retentive dentures associated with pain and discomfort. This article describes the rehabilitation of a patient with a resorbed ridge, using maximal coverage of the denture-bearing areas to improve support and a neutral-zone arrangement of teeth to improve the stability of the denture.

  8. Likelihood Ratio Based Mixed Resolution Facial Comparison

    NARCIS (Netherlands)

    Peng, Y.; Spreeuwers, Lieuwe Jan; Veldhuis, Raymond N.J.

    2015-01-01

    In this paper, we propose a novel method for low-resolution face recognition. It is especially useful for a common situation in forensic search where faces of low resolution, e.g. on surveillance footage or in a crowd, must be compared to a high-resolution reference. This method is based on the

  9. Coronary CT angiography in clinical triage of patients at high risk of coronary artery disease

    DEFF Research Database (Denmark)

    Kühl, J Tobias; Hove, Jens D; Kristensen, Thomas S

    2017-01-01

    OBJECTIVES: To test if cardiac computed tomography angiography (CCTA) can be used in the triage of patients at high risk of coronary artery disease. DESIGN: The diagnostic value of 64-detector CCTA was evaluated in 400 patients presenting with non-ST segment elevation myocardial infarction using invasive coronary angiography (ICA) as the reference method. The relation between the severity of disease by CCTA and a combined endpoint of death, re-hospitalization due to new myocardial infarction, or symptom-driven coronary revascularization was assessed. RESULTS: CCTA detects significant (>50 ...) in patients with high likelihood of coronary artery disease and could, in theory, be used to triage high risk patients. As many obstacles remain, including logistical and safety issues, our study does not support the use of CCTA as an additional diagnostic test before ICA in an all-comer NSTEMI population.

  10. Intensive Auditory Cognitive Training Improves Verbal Memory in Adolescents and Young Adults at Clinical High Risk for Psychosis.

    Science.gov (United States)

    Loewy, Rachel; Fisher, Melissa; Schlosser, Danielle A; Biagianti, Bruno; Stuart, Barbara; Mathalon, Daniel H; Vinogradov, Sophia

    2016-07-01

    Individuals at clinical high risk (CHR) for psychosis demonstrate cognitive impairments that predict later psychotic transition and real-world functioning. Cognitive training has shown benefits in schizophrenia, but has not yet been adequately tested in the CHR population. In this double-blind randomized controlled trial, CHR individuals (N = 83) were given laptop computers and trained at home on 40 hours of auditory processing-based exercises designed to target verbal learning and memory operations, or on computer games (CG). Participants were assessed with neurocognitive tests based on the Measurement and Treatment Research to Improve Cognition in Schizophrenia initiative (MATRICS) battery and rated on symptoms and functioning. Groups were compared before and after training using a mixed-effects model with restricted maximum likelihood estimation, given the high study attrition rate (42%). Participants in the targeted cognitive training group showed a significant improvement in Verbal Memory compared to CG participants (effect size = 0.61). Positive and Total symptoms improved in both groups over time. CHR individuals showed patterns of training-induced cognitive improvement in verbal memory consistent with prior observations in schizophrenia. This is a particularly vulnerable domain in individuals at risk for psychosis that predicts later functioning and psychotic transition. Ongoing follow-up of this cohort will assess the durability of training effects in CHR individuals, as well as the potential impact on symptoms and functioning over time. Clinical Trials Number: NCT00655239. URL: https://clinicaltrials.gov/ct2/show/NCT00655239?term=vinogradov&rank=5. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center 2016.
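
    As an illustration of this kind of analysis (simulated data with invented effect sizes, not the study's), statsmodels fits a random-intercept mixed-effects model by REML and uses whatever observations survive attrition:

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(7)
        n_subj, n_visits = 80, 2
        subj = np.repeat(np.arange(n_subj), n_visits)
        time = np.tile(np.arange(n_visits), n_subj)
        group = np.repeat(rng.integers(0, 2, n_subj), n_visits)  # 0 = CG, 1 = training
        u = np.repeat(rng.normal(0, 1, n_subj), n_visits)        # random intercepts
        score = 50 + 2 * time + 3 * group * time + u + rng.normal(0, 2, len(subj))
        df = pd.DataFrame(dict(subject=subj, time=time, group=group, score=score))
        df = df.sample(frac=0.7, random_state=0)  # mimic attrition: drop 30% of rows

        # Random-intercept model fit by REML; the time:group term plays the
        # role of the training effect.
        model = smf.mixedlm("score ~ time * group", df, groups=df["subject"])
        print(model.fit(reml=True).summary())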

  11. Regularization parameter selection methods for ill-posed Poisson maximum likelihood estimation

    International Nuclear Information System (INIS)

    Bardsley, Johnathan M; Goldes, John

    2009-01-01

    In image processing applications, image intensity is often measured via the counting of incident photons emitted by the object of interest. In such cases, image data noise is accurately modeled by a Poisson distribution. This motivates the use of Poisson maximum likelihood estimation for image reconstruction. However, when the underlying model equation is ill-posed, regularization is needed. Regularized Poisson likelihood estimation has been studied extensively by the authors, though a problem of high importance remains: the choice of the regularization parameter. We will present three statistically motivated methods for choosing the regularization parameter, and numerical examples will be presented to illustrate their effectiveness
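
    The flavor of the problem, with an invented 1-D blurring operator and a simple discrepancy-style selection rule standing in for the paper's statistically motivated criteria: reconstruct x ≥ 0 from y ~ Poisson(Ax) by minimizing the negative Poisson log-likelihood plus λ‖x‖², then scan λ and check the Pearson discrepancy against its expected value.

        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(8)
        n = 60
        t = np.linspace(0, 1, n)
        A = np.exp(-((t[:, None] - t[None, :]) / 0.05) ** 2)   # blur operator
        x_true = 20 * np.exp(-((t - 0.5) / 0.1) ** 2)
        y = rng.poisson(A @ x_true)

        def objective(x, lam):
            mu = A @ x + 1e-9
            # Negative Poisson log-likelihood (dropping the log y! constant)
            # plus a Tikhonov penalty.
            return np.sum(mu - y * np.log(mu)) + lam * np.sum(x ** 2)

        def reconstruct(lam):
            res = minimize(objective, np.ones(n), args=(lam,),
                           bounds=[(0, None)] * n, method="L-BFGS-B")
            return res.x

        # Discrepancy-style selection: the Pearson statistic is ~ n at the
        # truth, so pick the largest lam whose fit still matches that target.
        for lam in (1e-4, 1e-3, 1e-2, 1e-1):
            mu = A @ reconstruct(lam) + 1e-9
            disc = np.sum((y - mu) ** 2 / mu)
            print(f"lam={lam:g}  discrepancy={disc:.1f}  (target ~ {n})")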

  12. Maximum likelihood pedigree reconstruction using integer linear programming.

    Science.gov (United States)

    Cussens, James; Bartlett, Mark; Jones, Elinor M; Sheehan, Nuala A

    2013-01-01

    Large population biobanks of unrelated individuals have been highly successful in detecting common genetic variants affecting diseases of public health concern. However, they lack the statistical power to detect more modest gene-gene and gene-environment interaction effects or the effects of rare variants for which related individuals are ideally required. In reality, most large population studies will undoubtedly contain sets of undeclared relatives, or pedigrees. Although a crude measure of relatedness might sometimes suffice, having a good estimate of the true pedigree would be much more informative if this could be obtained efficiently. Relatives are more likely to share longer haplotypes around disease susceptibility loci and are hence biologically more informative for rare variants than unrelated cases and controls. Distant relatives are arguably more useful for detecting variants with small effects because they are less likely to share masking environmental effects. Moreover, the identification of relatives enables appropriate adjustments of statistical analyses that typically assume unrelatedness. We propose to exploit an integer linear programming optimisation approach to pedigree learning, which is adapted to find valid pedigrees by imposing appropriate constraints. Our method is not restricted to small pedigrees and is guaranteed to return a maximum likelihood pedigree. With additional constraints, we can also search for multiple high-probability pedigrees and thus account for the inherent uncertainty in any particular pedigree reconstruction. The true pedigree is found very quickly by comparison with other methods when all individuals are observed. Extensions to more complex problems seem feasible. © 2012 Wiley Periodicals, Inc.

  13. Race of source effects in the elaboration likelihood model.

    Science.gov (United States)

    White, P H; Harkins, S G

    1994-11-01

    In a series of experiments, we investigated the effect of race of source on persuasive communications in the Elaboration Likelihood Model (R.E. Petty & J.T. Cacioppo, 1981, 1986). In Experiment 1, we found no evidence that White participants responded to a Black source as a simple negative cue. Experiment 2 suggested the possibility that exposure to a Black source led to low-involvement message processing. In Experiments 3 and 4, a distraction paradigm was used to test this possibility, and it was found that participants under low involvement were highly motivated to process a message presented by a Black source. In Experiment 5, we found that attitudes toward the source's ethnic group, rather than violations of expectancies, accounted for this processing effect. Taken together, the results of these experiments are consistent with S.L. Gaertner and J.F. Dovidio's (1986) theory of aversive racism, which suggests that Whites, because of a combination of egalitarian values and underlying negative racial attitudes, are very concerned about not appearing unfavorable toward Blacks, leading them to be highly motivated to process messages presented by a source from this group.

  14. Radiofrequency solutions in clinical high field magnetic resonance

    NARCIS (Netherlands)

    Andreychenko, A.

    2013-01-01

    Magnetic resonance imaging (MRI) and spectroscopy (MRS) benefit from the sensitivity gain at high field (≥7T). However, high field also brings certain challenges associated with the growing frequency and spectral dispersion. Frequency growth results in degraded performance of large volume radiofrequency

  15. Posterior distributions for likelihood ratios in forensic science.

    Science.gov (United States)

    van den Hout, Ardo; Alberink, Ivo

    2016-09-01

    Evaluation of evidence in forensic science is discussed using posterior distributions for likelihood ratios. Instead of eliminating the uncertainty by integrating (Bayes factor) or by conditioning on parameter values, uncertainty in the likelihood ratio is retained by parameter uncertainty derived from posterior distributions. A posterior distribution for a likelihood ratio can be summarised by the median and credible intervals. Using the posterior mean of the distribution is not recommended. An analysis of forensic data for body height estimation is undertaken. The posterior likelihood approach has been criticised both theoretically and with respect to applicability. This paper addresses the latter and illustrates an interesting application area. Copyright © 2016 The Chartered Society of Forensic Sciences. Published by Elsevier Ireland Ltd. All rights reserved.

  16. Practical likelihood analysis for spatial generalized linear mixed models

    DEFF Research Database (Denmark)

    Bonat, W. H.; Ribeiro, Paulo Justiniano

    2016-01-01

    We investigate an algorithm for maximum likelihood estimation of spatial generalized linear mixed models based on the Laplace approximation. We compare our algorithm with a set of alternative approaches for two datasets from the literature. The Rhizoctonia root rot and the Rongelap are, respectively, examples of binomial and count datasets modeled by spatial generalized linear mixed models. Our results show that the Laplace approximation provides similar estimates to Markov Chain Monte Carlo likelihood, Monte Carlo expectation maximization, and modified Laplace approximation. Some advantages of the Laplace approximation include the computation of the maximized log-likelihood value, which can be used for model selection and tests, and the possibility to obtain realistic confidence intervals for model parameters based on profile likelihoods. The Laplace approximation also avoids the tuning

  17. Algorithms of maximum likelihood data clustering with applications

    Science.gov (United States)

    Giada, Lorenzo; Marsili, Matteo

    2002-12-01

    We address the problem of data clustering by introducing an unsupervised, parameter-free approach based on the maximum likelihood principle. Starting from the observation that data sets belonging to the same cluster share common information, we construct an expression for the likelihood of any possible cluster structure. The likelihood in turn depends only on the Pearson correlation coefficients of the data. We discuss clustering algorithms that provide a fast and reliable approximation to maximum likelihood configurations. Compared to standard clustering methods, our approach has the advantages that (i) it is parameter free, (ii) the number of clusters need not be fixed in advance and (iii) the interpretation of the results is transparent. In order to test our approach and compare it with standard clustering algorithms, we analyze two very different data sets: time series of financial market returns and gene expression data. We find that different maximization algorithms produce similar cluster structures whereas the outcome of standard algorithms has a much wider variability.

  18. Generalized empirical likelihood methods for analyzing longitudinal data

    KAUST Repository

    Wang, S.; Qian, L.; Carroll, R. J.

    2010-01-01

    Efficient estimation of parameters is a major objective in analyzing longitudinal data. We propose two generalized empirical likelihood based methods that take into consideration within-subject correlations. A nonparametric version of the Wilks

  19. Maximum likelihood estimation of finite mixture model for economic data

    Science.gov (United States)

    Phoong, Seuk-Yen; Ismail, Mohd Tahir

    2014-06-01

    A finite mixture model is a mixture model with a finite number of components. These models provide a natural representation of heterogeneity across a finite number of latent classes, and are also known as latent class models or unsupervised learning models. Recently, maximum likelihood estimation of finite mixture models has drawn considerable attention from statisticians, mainly because maximum likelihood estimation is a powerful statistical method that provides consistent estimates as the sample size increases to infinity. Thus, maximum likelihood estimation is used in the present paper to fit a finite mixture model in order to explore the relationship between nonlinear economic data. A two-component normal mixture model is fitted by maximum likelihood estimation in order to investigate the relationship between stock market prices and rubber prices for the sampled countries. The results show a negative relationship between rubber price and stock market price for Malaysia, Thailand, the Philippines and Indonesia.
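
    A direct numerical version of this kind of fit (synthetic data in place of the stock and rubber price series; starting values invented) maximizes the two-component normal mixture likelihood, with unconstrained reparameterizations keeping the weight in (0, 1) and the standard deviations positive:

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import norm

        rng = np.random.default_rng(9)
        x = np.concatenate([rng.normal(-1.0, 0.5, 300), rng.normal(2.0, 1.0, 700)])

        def nll(params):
            logit_w, mu1, mu2, log_s1, log_s2 = params
            w = 1.0 / (1.0 + np.exp(-logit_w))       # weight stays in (0, 1)
            pdf = (w * norm.pdf(x, mu1, np.exp(log_s1))
                   + (1 - w) * norm.pdf(x, mu2, np.exp(log_s2)))
            return -np.sum(np.log(pdf + 1e-300))     # guard against log(0)

        res = minimize(nll, x0=[0.0, -2.0, 1.0, 0.0, 0.0], method="Nelder-Mead",
                       options={"maxiter": 5000})
        logit_w, mu1, mu2, log_s1, log_s2 = res.x
        print(1 / (1 + np.exp(-logit_w)), mu1, mu2,
              np.exp(log_s1), np.exp(log_s2))        # ~ (0.3, -1, 2, 0.5, 1)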

  20. Attitude towards, and likelihood of, complaining in the banking ...

    African Journals Online (AJOL)

    aims to determine customers' attitudes towards complaining as well as their likelihood of voicing a complaint ... is particularly powerful and impacts greatly on customer satisfaction and retention.

  1. Narrow band interference cancelation in OFDM: A structured maximum likelihood approach

    KAUST Repository

    Sohail, Muhammad Sadiq; Al-Naffouri, Tareq Y.; Al-Ghadhban, Samir N.

    2012-01-01

    This paper presents a maximum likelihood (ML) approach to mitigate the effect of narrow band interference (NBI) in a zero padded orthogonal frequency division multiplexing (ZP-OFDM) system. The NBI is assumed to be time variant and asynchronous

  2. Parental family variables and likelihood of divorce.

    Science.gov (United States)

    Skalkidou, A

    2000-01-01

    It has long been established that divorced men and women have substantially higher standardized general mortality than same-gender persons. Because the incidence of divorce is increasing in many countries, determinants of divorce rates assume great importance as indirect risk factors for several diseases and conditions that adversely affect health. We have undertaken a study in Athens, Greece, to evaluate whether sibship size, birth order, and the gender composition of spousal sibships are related to the probability of divorce. 358 high school students, aged between 15 and 17 years, satisfactorily completed anonymous questionnaires, indicating whether their natural parents have been separated or divorced, their parents' educational achievement, birth order and sibship size by gender. The study was analyzed as a twin case-control investigation, treating those divorced or separated as cases and those who were not divorced or separated as controls. A man who grew up as an only child was almost three times as likely to divorce compared to a man with siblings, and this association was highly significant (p ≈ 0.004). There was no such evidence with respect to women. After controlling for sibship size, earlier born men--but not women--appeared to be at higher risk for divorce compared to those later born. There was no evidence that the gender structure of the sibship substantially affects the risk for divorce. Even though divorce is not an organic disease, it indirectly affects health as well as social well-being. The findings of this study need to be replicated, but, if confirmed, they could contribute to our understanding of the roots of some instances of marital dysfunction.

  3. On the likelihood function of Gaussian max-stable processes

    KAUST Repository

    Genton, M. G.; Ma, Y.; Sang, H.

    2011-01-01

    We derive a closed form expression for the likelihood function of a Gaussian max-stable process indexed by ℝ^d at p ≤ d+1 sites, d ≥ 1. We demonstrate the gain in efficiency in the maximum composite likelihood estimators of the covariance matrix from p=2 to p=3 sites in ℝ^2 by means of a Monte Carlo simulation study. © 2011 Biometrika Trust.

  4. Incorporating Nuisance Parameters in Likelihoods for Multisource Spectra

    CERN Document Server

    Conway, J.S.

    2011-01-01

    We describe here the general mathematical approach to constructing likelihoods for fitting observed spectra in one or more dimensions with multiple sources, including the effects of systematic uncertainties represented as nuisance parameters, when the likelihood is to be maximized with respect to these parameters. We consider three types of nuisance parameters: simple multiplicative factors, source spectra "morphing" parameters, and parameters representing statistical uncertainties in the predicted source spectra.

  5. On the likelihood function of Gaussian max-stable processes

    KAUST Repository

    Genton, M. G.

    2011-05-24

    We derive a closed form expression for the likelihood function of a Gaussian max-stable process indexed by ℝ^d at p ≤ d+1 sites, d ≥ 1. We demonstrate the gain in efficiency in the maximum composite likelihood estimators of the covariance matrix from p=2 to p=3 sites in ℝ^2 by means of a Monte Carlo simulation study. © 2011 Biometrika Trust.

  6. Tapered composite likelihood for spatial max-stable models

    KAUST Repository

    Sang, Huiyan

    2014-05-01

    Spatial extreme value analysis is useful to environmental studies, in which extreme value phenomena are of interest and meaningful spatial patterns can be discerned. Max-stable process models are able to describe such phenomena. This class of models is asymptotically justified to characterize the spatial dependence among extremes. However, likelihood inference is challenging for such models because their corresponding joint likelihood is unavailable and only bivariate or trivariate distributions are known. In this paper, we propose a tapered composite likelihood approach by utilizing lower dimensional marginal likelihoods for inference on parameters of various max-stable process models. We consider a weighting strategy based on a "taper range" to exclude distant pairs or triples. The "optimal taper range" is selected to maximize various measures of the Godambe information associated with the tapered composite likelihood function. This method substantially reduces the computational cost and improves the efficiency over equally weighted composite likelihood estimators. We illustrate its utility with simulation experiments and an analysis of rainfall data in Switzerland.
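
    As a schematic illustration of the weighting idea (not the authors' implementation), the sketch below computes a pairwise composite log-likelihood in which pairs of sites separated by more than the taper range receive weight zero; the bivariate density is a toy Gaussian placeholder, not a max-stable density.

        import numpy as np

        def toy_pair_loglik(theta, xi, xj, dist):
            # Toy stand-in for the known bivariate density: standard normal
            # pairs with exponential correlation rho = exp(-dist / theta).
            rho = np.exp(-dist / theta)
            q = (xi ** 2 - 2 * rho * xi * xj + xj ** 2) / (1 - rho ** 2)
            return np.sum(-0.5 * q - 0.5 * np.log(1 - rho ** 2) - np.log(2 * np.pi))

        def tapered_composite_loglik(theta, data, coords, taper_range,
                                     pair_loglik=toy_pair_loglik):
            # data: (n_replicates, n_sites); coords: (n_sites, 2)
            total = 0.0
            for i in range(coords.shape[0]):
                for j in range(i + 1, coords.shape[0]):
                    dist = np.linalg.norm(coords[i] - coords[j])
                    if dist <= taper_range:   # binary taper weight: distant pairs drop out
                        total += pair_loglik(theta, data[:, i], data[:, j], dist)
            return total

    In the paper the taper range itself is chosen by maximizing measures of the Godambe information; the sketch only shows how distant pairs drop out of the objective.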

  7. Tapered composite likelihood for spatial max-stable models

    KAUST Repository

    Sang, Huiyan; Genton, Marc G.

    2014-01-01

    Spatial extreme value analysis is useful to environmental studies, in which extreme value phenomena are of interest and meaningful spatial patterns can be discerned. Max-stable process models are able to describe such phenomena. This class of models is asymptotically justified to characterize the spatial dependence among extremes. However, likelihood inference is challenging for such models because their corresponding joint likelihood is unavailable and only bivariate or trivariate distributions are known. In this paper, we propose a tapered composite likelihood approach by utilizing lower dimensional marginal likelihoods for inference on parameters of various max-stable process models. We consider a weighting strategy based on a "taper range" to exclude distant pairs or triples. The "optimal taper range" is selected to maximize various measures of the Godambe information associated with the tapered composite likelihood function. This method substantially reduces the computational cost and improves the efficiency over equally weighted composite likelihood estimators. We illustrate its utility with simulation experiments and an analysis of rainfall data in Switzerland.

  8. Clinical applications of a high quantum utilization scanner. Final report

    International Nuclear Information System (INIS)

    Crandall, P.H.; Cassen, B.

    1973-04-01

    The clinical usefulness of a tomographic imaging process was evaluated using a fast rectilinear scanner consisting of a spherical-cap nest of seven individual detectors (3'' x 1/2'' activated sodium iodide) collimated to a common focal point at 10 cm. Hydraulic drives permitted a fast rectilinear scan to be made and, when the nest was raised or lowered, scans at different planes. Patients with well-defined brain lesions were studied using 99mTc-pertechnetate or 203Hg-chlormerodrin as tracers, by measuring the three dimensions of their lesions and their anatomical location at the time of operation. Brain maps were used to identify this location at operation and also the location of images traced from film density displays of a conventional radioisotopic scan, the tomographic scan, and cerebral angiograms. (U.S.)

  9. Planck 2013 results. XV. CMB power spectra and likelihood

    Science.gov (United States)

    Planck Collaboration; Ade, P. A. R.; Aghanim, N.; Armitage-Caplan, C.; Arnaud, M.; Ashdown, M.; Atrio-Barandela, F.; Aumont, J.; Baccigalupi, C.; Banday, A. J.; Barreiro, R. B.; Bartlett, J. G.; Battaner, E.; Benabed, K.; Benoît, A.; Benoit-Lévy, A.; Bernard, J.-P.; Bersanelli, M.; Bielewicz, P.; Bobin, J.; Bock, J. J.; Bonaldi, A.; Bonavera, L.; Bond, J. R.; Borrill, J.; Bouchet, F. R.; Boulanger, F.; Bridges, M.; Bucher, M.; Burigana, C.; Butler, R. C.; Calabrese, E.; Cardoso, J.-F.; Catalano, A.; Challinor, A.; Chamballu, A.; Chiang, H. C.; Chiang, L.-Y.; Christensen, P. R.; Church, S.; Clements, D. L.; Colombi, S.; Colombo, L. P. L.; Combet, C.; Couchot, F.; Coulais, A.; Crill, B. P.; Curto, A.; Cuttaia, F.; Danese, L.; Davies, R. D.; Davis, R. J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Delouis, J.-M.; Désert, F.-X.; Dickinson, C.; Diego, J. M.; Dole, H.; Donzelli, S.; Doré, O.; Douspis, M.; Dunkley, J.; Dupac, X.; Efstathiou, G.; Elsner, F.; Enßlin, T. A.; Eriksen, H. K.; Finelli, F.; Forni, O.; Frailis, M.; Fraisse, A. A.; Franceschi, E.; Gaier, T. C.; Galeotta, S.; Galli, S.; Ganga, K.; Giard, M.; Giardino, G.; Giraud-Héraud, Y.; Gjerløw, E.; González-Nuevo, J.; Górski, K. M.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Gudmundsson, J. E.; Hansen, F. K.; Hanson, D.; Harrison, D.; Helou, G.; Henrot-Versillé, S.; Hernández-Monteagudo, C.; Herranz, D.; Hildebrandt, S. R.; Hivon, E.; Hobson, M.; Holmes, W. A.; Hornstrup, A.; Hovest, W.; Huffenberger, K. M.; Hurier, G.; Jaffe, A. H.; Jaffe, T. R.; Jewell, J.; Jones, W. C.; Juvela, M.; Keihänen, E.; Keskitalo, R.; Kiiveri, K.; Kisner, T. S.; Kneissl, R.; Knoche, J.; Knox, L.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lähteenmäki, A.; Lamarre, J.-M.; Lasenby, A.; Lattanzi, M.; Laureijs, R. J.; Lawrence, C. R.; Le Jeune, M.; Leach, S.; Leahy, J. P.; Leonardi, R.; León-Tavares, J.; Lesgourgues, J.; Liguori, M.; Lilje, P. B.; Linden-Vørnle, M.; Lindholm, V.; López-Caniego, M.; Lubin, P. M.; Macías-Pérez, J. F.; Maffei, B.; Maino, D.; Mandolesi, N.; Marinucci, D.; Maris, M.; Marshall, D. J.; Martin, P. G.; Martínez-González, E.; Masi, S.; Massardi, M.; Matarrese, S.; Matthai, F.; Mazzotta, P.; Meinhold, P. R.; Melchiorri, A.; Mendes, L.; Menegoni, E.; Mennella, A.; Migliaccio, M.; Millea, M.; Mitra, S.; Miville-Deschênes, M.-A.; Molinari, D.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Moss, A.; Munshi, D.; Murphy, J. A.; Naselsky, P.; Nati, F.; Natoli, P.; Netterfield, C. B.; Nørgaard-Nielsen, H. U.; Noviello, F.; Novikov, D.; Novikov, I.; O'Dwyer, I. J.; Orieux, F.; Osborne, S.; Oxborrow, C. A.; Paci, F.; Pagano, L.; Pajot, F.; Paladini, R.; Paoletti, D.; Partridge, B.; Pasian, F.; Patanchon, G.; Paykari, P.; Perdereau, O.; Perotto, L.; Perrotta, F.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Ponthieu, N.; Popa, L.; Poutanen, T.; Pratt, G. W.; Prézeau, G.; Prunet, S.; Puget, J.-L.; Rachen, J. P.; Rahlin, A.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Ricciardi, S.; Riller, T.; Ringeval, C.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Roudier, G.; Rowan-Robinson, M.; Rubiño-Martín, J. A.; Rusholme, B.; Sandri, M.; Sanselme, L.; Santos, D.; Savini, G.; Scott, D.; Seiffert, M. D.; Shellard, E. P. S.; Spencer, L. D.; Starck, J.-L.; Stolyarov, V.; Stompor, R.; Sudiwala, R.; Sureau, F.; Sutton, D.; Suur-Uski, A.-S.; Sygnet, J.-F.; Tauber, J. 
A.; Tavagnacco, D.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Tucci, M.; Tuovinen, J.; Türler, M.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Varis, J.; Vielva, P.; Villa, F.; Vittorio, N.; Wade, L. A.; Wandelt, B. D.; Wehus, I. K.; White, M.; White, S. D. M.; Yvon, D.; Zacchei, A.; Zonca, A.

    2014-11-01

    This paper presents the Planck 2013 likelihood, a complete statistical description of the two-point correlation function of the CMB temperature fluctuations that accounts for all known relevant uncertainties, both instrumental and astrophysical in nature. We use this likelihood to derive our best estimate of the CMB angular power spectrum from Planck over three decades in multipole moment, ℓ, covering 2 ≤ ℓ ≤ 2500. The main source of uncertainty at ℓ ≲ 1500 is cosmic variance. Uncertainties in small-scale foreground modelling and instrumental noise dominate the error budget at higher ℓs. For ℓ < 50, our likelihood exploits all Planck frequency channels, separating the cosmological CMB signal from diffuse Galactic foregrounds through a physically motivated Bayesian component separation technique. At higher multipoles, we employ a correlated Gaussian likelihood approximation based on angular cross-spectra derived from the 100, 143, and 217 GHz channels, marginalising over power spectrum foreground templates, and we assess the impact of residual foreground and instrumental uncertainties on the final cosmological parameters. We find good internal agreement among the high-ℓ cross-spectra with residuals below a few μK² at ℓ ≲ 1000, in agreement with estimated calibration uncertainties. We compare our results with foreground-cleaned CMB maps derived from all Planck frequencies, as well as with cross-spectra derived from the 70 GHz Planck map, and find broad agreement in terms of spectrum residuals and cosmological parameters. We further show that the best-fit ΛCDM cosmology is in excellent agreement with preliminary Planck EE and TE polarisation spectra. We find that the standard ΛCDM cosmology is well constrained by Planck from the measurements at ℓ ≲ 1500. One specific example is the spectral index of scalar perturbations, for which we report a 5.4σ deviation from scale invariance, ns = 1. Increasing the multipole range beyond ℓ ≃ 1500 does not increase our accuracy for the ΛCDM parameters, but instead allows us to study extensions beyond the standard model. We find no indication of significant departures from the ΛCDM framework. Finally, we report a tension between the Planck best-fit ΛCDM model and the low-ℓ spectrum in the form of a power deficit of 5-10% at ℓ ≲ 40, with a statistical significance of 2.5-3σ. Without a theoretically motivated model for

  10. Likelihood ratio model for classification of forensic evidence

    Energy Technology Data Exchange (ETDEWEB)

    Zadora, G., E-mail: gzadora@ies.krakow.pl [Institute of Forensic Research, Westerplatte 9, 31-033 Krakow (Poland); Neocleous, T., E-mail: tereza@stats.gla.ac.uk [University of Glasgow, Department of Statistics, 15 University Gardens, Glasgow G12 8QW (United Kingdom)

    2009-05-29

    One of the problems in the analysis of forensic evidence such as glass fragments is the determination of their use-type category, e.g. does a glass fragment originate from an unknown window or container? Very small glass fragments arise during various accidents and criminal offences, and could be carried on the clothes, shoes and hair of participants. It is therefore necessary to obtain information on their physicochemical composition in order to solve the classification problem. Scanning Electron Microscopy coupled with an Energy Dispersive X-ray Spectrometer and the Glass Refractive Index Measurement method are routinely used in many forensic institutes for the investigation of glass. A natural form of glass evidence evaluation for forensic purposes is the likelihood ratio, LR = p(E|H1)/p(E|H2). The main aim of this paper was to study the performance of LR models for glass object classification which considered one or two sources of data variability, i.e. between-glass-object variability and(or) within-glass-object variability. Within the proposed model a multivariate kernel density approach was adopted for modelling the between-object distribution and a multivariate normal distribution was adopted for modelling within-object distributions. Moreover, a graphical method of estimating the dependence structure was employed to reduce the highly multivariate problem to several lower-dimensional problems. The performed analysis showed that the best likelihood model was the one which included information about both between- and within-object variability, with variables derived from elemental compositions measured by SEM-EDX and refractive index values determined before (RIb) and after (RIa) the annealing process, in the form dRI = log10|RIa - RIb|. This model gave better results than the model with only between-object variability considered. In addition, when dRI and variables derived from elemental compositions were used, this
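
    A minimal sketch of the likelihood ratio idea for a single one-dimensional feature is given below; the reference samples and evidence value are hypothetical, and a kernel density estimate is used for both hypotheses here, whereas the paper's multivariate model combines kernel and normal components.

        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(0)
        window_ri = rng.normal(1.518, 0.001, 200)     # hypothetical H1 reference data
        container_ri = rng.normal(1.520, 0.002, 200)  # hypothetical H2 reference data

        p_h1 = gaussian_kde(window_ri)                # density of the evidence under H1
        p_h2 = gaussian_kde(container_ri)             # density of the evidence under H2

        e = 1.5185                                    # measured evidence value
        lr = p_h1(e)[0] / p_h2(e)[0]                  # LR > 1 supports H1 (window origin)
        print(f"LR = {lr:.2f}")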

  11. Likelihood ratio model for classification of forensic evidence

    International Nuclear Information System (INIS)

    Zadora, G.; Neocleous, T.

    2009-01-01

    One of the problems in the analysis of forensic evidence such as glass fragments is the determination of their use-type category, e.g. does a glass fragment originate from an unknown window or container? Very small glass fragments arise during various accidents and criminal offences, and could be carried on the clothes, shoes and hair of participants. It is therefore necessary to obtain information on their physicochemical composition in order to solve the classification problem. Scanning Electron Microscopy coupled with an Energy Dispersive X-ray Spectrometer and the Glass Refractive Index Measurement method are routinely used in many forensic institutes for the investigation of glass. A natural form of glass evidence evaluation for forensic purposes is the likelihood ratio, LR = p(E|H1)/p(E|H2). The main aim of this paper was to study the performance of LR models for glass object classification which considered one or two sources of data variability, i.e. between-glass-object variability and(or) within-glass-object variability. Within the proposed model a multivariate kernel density approach was adopted for modelling the between-object distribution and a multivariate normal distribution was adopted for modelling within-object distributions. Moreover, a graphical method of estimating the dependence structure was employed to reduce the highly multivariate problem to several lower-dimensional problems. The performed analysis showed that the best likelihood model was the one which included information about both between- and within-object variability, with variables derived from elemental compositions measured by SEM-EDX and refractive index values determined before (RIb) and after (RIa) the annealing process, in the form dRI = log10|RIa - RIb|. This model gave better results than the model with only between-object variability considered. In addition, when dRI and variables derived from elemental compositions were used, this model outperformed two other

  12. Radical prostatectomy in clinically localized high-risk prostate cancer

    DEFF Research Database (Denmark)

    Røder, Martin Andreas; Berg, Kasper Drimer; Christensen, Ib Jarle

    2013-01-01

    Abstract Objective. The optimal therapeutic strategy for high-risk localized prostate cancer (PCa) is controversial. Supported by randomized trials, the combination of external beam radiation therapy (EBRT) and endocrine therapy (ET) is advocated by many, while radical prostatectomy (RP) is regarded as primary therapy by others. This study examined the outcome for high-risk localized PCa patients treated with RP. Material and methods. Of 1300 patients who underwent RP, 231 were identified as high-risk. Patients were followed for biochemical recurrence (BCR) (defined as prostate-specific antigen ≥ 0.2 ng/ml), metastatic disease and survival. Excluding node-positive patients, none of the patients received adjuvant therapy before BCR was confirmed. Univariate and multivariate analysis was performed with Kaplan-Meier and Cox proportional hazard models. Results. Median follow-up was 4.4 years...

  13. Epidemiology of anal HPV infection in high-risk men attending a sexually transmitted infection clinic in Puerto Rico.

    Directory of Open Access Journals (Sweden)

    Vivian Colón-López

    Full Text Available Recent studies in Puerto Rico have reported an increasing incidence of anal cancer in Puerto Rican men. The objective of this study was to determine the prevalence, genotype distribution and risk factors associated with anal HPV infection among men attending an STI clinic in Puerto Rico. We conducted a cross-sectional study among 205 men 18 years and older. A comprehensive survey was administered that included a demographic and a behavioral assessment. Separate logistic regression models were performed to determine factors associated with any, high-risk (HR), and multiple anal HPV infection. The mean age of the study sample was 38.0±13.5 years. The most common HR types were 58, 51 and 31. Overall, HR anal HPV infection was found in 53.5% of the participants. Multiple HPV types in the anal canal were found in 47.6% of the sample. A third (29.8%) of participants reported being men who had sex with men (MSM). MSM had a significantly higher prevalence of any, HR and multiple HPV infection (p-value < 0.05). Separate multivariate logistic regression analyses showed that being MSM was associated with any (OR = 4.5; [95%CI: 1.9-10.7]), HR (OR = 3.4; [95%CI: 1.1-10.3]) and multiple anal HPV infection (OR = 3.6; [95%CI: 1.5-9.1]). HIV was marginally associated with multiple anal HPV infection in multivariate analysis (OR = 3.3; 95%CI = 1.0-11.0). Anal HPV is common among sexually active men attending this STI clinic, with higher likelihood of anal HPV infection among MSM.

  14. Ringing Artefact Reduction By An Efficient Likelihood Improvement Method

    Science.gov (United States)

    Fuderer, Miha

    1989-10-01

    In MR imaging, the extent of the acquired spatial frequencies of the object is necessarily finite. The resulting image shows artefacts caused by "truncation" of its Fourier components. These are known as Gibbs artefacts or ringing artefacts. These artefacts are particularly visible when the time-saving reduced acquisition method is used, say, when scanning only the lowest 70% of the 256 data lines. Filtering the data results in loss of resolution. A method is described that estimates the high frequency data from the low-frequency data lines, with the likelihood of the image as criterion. It is a computationally very efficient method, since it requires practically only two extra Fourier transforms in addition to the normal reconstruction. The results of this method on MR images of human subjects are promising. Evaluations on a 70% acquisition image show about a 20% decrease of the error energy after processing. "Error energy" is defined as the total power of the difference from a 256-data-lines reference image. The elimination of ringing artefacts then appears almost complete.

  15. Maximum-likelihood estimation of recent shared ancestry (ERSA).

    Science.gov (United States)

    Huff, Chad D; Witherspoon, David J; Simonson, Tatum S; Xing, Jinchuan; Watkins, W Scott; Zhang, Yuhua; Tuohy, Therese M; Neklason, Deborah W; Burt, Randall W; Guthery, Stephen L; Woodward, Scott R; Jorde, Lynn B

    2011-05-01

    Accurate estimation of recent shared ancestry is important for genetics, evolution, medicine, conservation biology, and forensics. Established methods estimate kinship accurately for first-degree through third-degree relatives. We demonstrate that chromosomal segments shared by two individuals due to identity by descent (IBD) provide much additional information about shared ancestry. We developed a maximum-likelihood method for the estimation of recent shared ancestry (ERSA) from the number and lengths of IBD segments derived from high-density SNP or whole-genome sequence data. We used ERSA to estimate relationships from SNP genotypes in 169 individuals from three large, well-defined human pedigrees. ERSA is accurate to within one degree of relationship for 97% of first-degree through fifth-degree relatives and 80% of sixth-degree and seventh-degree relatives. We demonstrate that ERSA's statistical power approaches the maximum theoretical limit imposed by the fact that distant relatives frequently share no DNA through a common ancestor. ERSA greatly expands the range of relationships that can be estimated from genetic data and is implemented in a freely available software package.

  16. CORRECTING LABIAL THICK AND HIGH ATTACHED FRENUM (CLINICAL OBSERVATION.

    Directory of Open Access Journals (Sweden)

    Silvia Krusteva

    2012-11-01

    Full Text Available A labial thick and high attached maxillary frenum is commonly regarded as a contributing etiology for maintaining midline diastema and delayed development of the upper jaw. The surgical modalities used to solve this problem are known to be quite stressful for children. Dental lasers have recently been increasingly used to treat a wide variety of problems in medicine. AIM: To use a high energy diode laser to remove a short, high attached frenum of the upper lip and to present the results of the procedure. MATERIAL AND METHODS: We performed frenectomy in 10 randomly selected patients of both sexes aged 7-9 years with short, thick frena of the upper lip. A Picasso soft tissue diode laser, class IV, power output 7 W, λ = 810 nm, was used for the procedure. RESULTS AND DISCUSSION: The healing process was uneventful, painless and without edema developing in the soft tissues. No inflammation was found in the treated tissues. The children undergoing the procedure showed no fear. This is why we preferred lasers as a modern therapeutic modality for soft tissue correction in the mouth. CONCLUSION: Using lasers to remove a short, high attached maxillary labial frenum has the benefit of inducing less stress in children than they experience if anaesthesia and surgery are administered. Topical anaesthetics are used for anaesthesia in the procedure. The postoperative period is free of pain and far from severe. This makes this technique particularly useful for children.

  17. Stealing among High School Students: Prevalence and Clinical Correlates

    OpenAIRE

    Grant, Jon E.; Potenza, Marc N.; Krishnan-Sarin, Suchitra; Cavallo, Dana A.; Desai, Rani A.

    2011-01-01

    Although stealing among adolescents appears to be fairly common, an assessment of adolescent stealing and its relationship to other behaviors and health issues is incompletely understood. A large sample of high school students (n=3999) was examined using a self-report survey with 153 questions concerning demographic characteristics, stealing behaviors, other health behaviors including substance use, and functioning variables such as grades and violent behavior. The overall prevalence of steal...

  18. 42 CFR 493.1453 - Condition: Laboratories performing high complexity testing; clinical consultant.

    Science.gov (United States)

    2010-10-01

    ... Condition: Laboratories performing high complexity testing; clinical consultant. The laboratory must have a... 42 Public Health 5 2010-10-01 2010-10-01 false Condition: Laboratories performing high complexity testing; clinical consultant. 493.1453 Section 493.1453 Public Health CENTERS FOR MEDICARE & MEDICAID...

  19. Headache at high school: clinical characteristics and impact.

    Science.gov (United States)

    Tonini, M C; Frediani, F

    2012-05-01

    Although migraine (MH) and tension type headache (TTH) are the most common and important causes of recurrent headache in adolescents, they are poorly understood and often not recognized by parents and teachers, delaying the first physician evaluation for correct diagnosis and management. The purpose of this study is to assess the knowledge about headache impact among the students of a communication private high school in the city of Rimini, and to evaluate the main types of headaches interfering with school and social day activities. A self-administered questionnaire interview was given to students of the last 2 years of high school; ten items assessed the headache experience during the prior 12 months, especially during school time: the features and diagnosis of headache types (based on the 2004 IHS criteria), precipitating factors, disability measured using the Migraine Disability Assessment (MIDAS), and therapeutic intervention. Of the 60 students, 84 % experienced recurrent headache during the last 12 months; 79 % were females, aged 17-20 years. A family history was present in 74 % of headache students, in the maternal line; 45 % of subjects were identified as having MH and 27 % TTH; 25 % had morning headache and 20 % in the afternoon. Fatigue, emotional stress and lack of sleep were the main trigger factors for headache, in 86, 50 and 50 % of students respectively; 92 % of headache students could not follow the lessons or participate in exercises and physical activity because of the headache; none had consulted a medical doctor, and 90 % of all students had never read, listened to or watched television programmes about headache. This study underlines the need to promote headache educational programs, starting from high school, to increase communication between teachers, family, physician and adolescent patients, with the goal of early appropriate therapeutic intervention, improvement of quality of life, and prevention of long-term headache disease in the

  20. Stealing among High School Students: Prevalence and Clinical Correlates

    Science.gov (United States)

    Grant, Jon E.; Potenza, Marc N.; Krishnan-Sarin, Suchitra; Cavallo, Dana A.; Desai, Rani A.

    2013-01-01

    Although stealing among adolescents appears to be fairly common, an assessment of adolescent stealing and its relationship to other behaviors and health issues is incompletely understood. A large sample of high school students (n=3999) was examined using a self-report survey with 153 questions concerning demographic characteristics, stealing behaviors, other health behaviors including substance use, and functioning variables such as grades and violent behavior. The overall prevalence of stealing was 15.2% (95%CI: 14.8–17.0). Twenty-nine (0.72%) students endorsed symptoms consistent with a diagnosis of DSM-IV kleptomania. Poor grades, alcohol and drug use, regular smoking, sadness and hopelessness, and other antisocial behaviors were all significantly (p<.05) associated with any stealing behavior. Stealing appears fairly common among high school students and is associated with a range of potentially addictive and antisocial behaviors. Significant distress and loss of control over this behavior suggests that stealing often has significant associated morbidity. PMID:21389165

  1. Stealing among high school students: prevalence and clinical correlates.

    Science.gov (United States)

    Grant, Jon E; Potenza, Marc N; Krishnan-Sarin, Suchitra; Cavallo, Dana A; Desai, Rani A

    2011-01-01

    Although stealing among adolescents appears to be fairly common, an assessment of adolescent stealing and its relationship to other behaviors and health problems is incompletely understood. A large sample of high school students (n = 3,999) was examined by self-report survey with 153 questions concerning demographic characteristics, stealing behaviors, other health behaviors including substance use, and functioning variables, such as grades and violent behavior. The overall prevalence of stealing was 15.2 percent (95% confidence interval (CI), 14.8-17.0). Twenty-nine (0.72%) students endorsed symptoms consistent with a diagnosis of DSM-IV-TR kleptomania. Poor grades, alcohol and drug use, regular smoking, sadness and hopelessness, and other antisocial behaviors were all significantly (p < .05) associated with any stealing behavior. Stealing appears to be fairly common among high school students and is associated with a range of potentially addictive and antisocial behaviors. Significant distress and loss of control over this behavior suggest that stealing often has significant associated morbidity.

  2. Constraint likelihood analysis for a network of gravitational wave detectors

    International Nuclear Information System (INIS)

    Klimenko, S.; Rakhmanov, M.; Mitselmakher, G.; Mohanty, S.

    2005-01-01

    We propose a coherent method for detection and reconstruction of gravitational wave signals with a network of interferometric detectors. The method is derived by using the likelihood ratio functional for unknown signal waveforms. In the likelihood analysis, the global maximum of the likelihood ratio over the space of waveforms is used as the detection statistic. We identify a problem with this approach. In the case of an aligned pair of detectors, the detection statistic depends on the cross correlation between the detectors as expected, but this dependence disappears even for infinitesimally small misalignments. We solve the problem by applying constraints on the likelihood functional and obtain a new class of statistics. The resulting method can be applied to data from a network consisting of any number of detectors with arbitrary detector orientations. The method allows reconstruction of the source coordinates and the waveforms of the two polarization components of a gravitational wave. We study the performance of the method with numerical simulations and find the reconstruction of the source coordinates to be more accurate than in the standard likelihood method.

  3. Profile-likelihood Confidence Intervals in Item Response Theory Models.

    Science.gov (United States)

    Chalmers, R Philip; Pek, Jolynn; Liu, Yang

    2017-01-01

    Confidence intervals (CIs) are fundamental inferential devices which quantify the sampling variability of parameter estimates. In item response theory, CIs have been primarily obtained from large-sample Wald-type approaches based on standard error estimates, derived from the observed or expected information matrix, after parameters have been estimated via maximum likelihood. An alternative approach to constructing CIs is to quantify sampling variability directly from the likelihood function with a technique known as profile-likelihood confidence intervals (PL CIs). In this article, we introduce PL CIs for item response theory models, compare PL CIs to classical large-sample Wald-type CIs, and demonstrate important distinctions among these CIs. CIs are then constructed for parameters directly estimated in the specified model and for transformed parameters which are often obtained post-estimation. Monte Carlo simulation results suggest that PL CIs perform consistently better than Wald-type CIs for both non-transformed and transformed parameters.
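
    The construction can be illustrated with a minimal sketch (our own toy example, not the article's IRT setting): the profile log-likelihood for a normal mean, with the variance maximised out, is scanned over a grid, and the 95% CI collects the values whose log-likelihood drop from the maximum stays below half the chi-square(1) quantile.

        import numpy as np

        x = np.random.default_rng(1).normal(10.0, 2.0, 50)
        n = len(x)

        def profile_loglik(mu):
            # The variance is profiled out analytically: s2_hat(mu) = mean((x - mu)^2).
            s2 = np.mean((x - mu) ** 2)
            return -0.5 * n * np.log(s2)      # up to an additive constant

        grid = np.linspace(x.mean() - 3, x.mean() + 3, 2001)
        ll = np.array([profile_loglik(m) for m in grid])
        cutoff = ll.max() - 0.5 * 3.841       # 3.841 = 95% quantile of chi-square(1)
        inside = grid[ll >= cutoff]
        print(f"95% PL CI for mu: [{inside.min():.3f}, {inside.max():.3f}]")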

  4. Generalized empirical likelihood methods for analyzing longitudinal data

    KAUST Repository

    Wang, S.

    2010-02-16

    Efficient estimation of parameters is a major objective in analyzing longitudinal data. We propose two generalized empirical likelihood based methods that take into consideration within-subject correlations. A nonparametric version of the Wilks theorem for the limiting distributions of the empirical likelihood ratios is derived. It is shown that one of the proposed methods is locally efficient among a class of within-subject variance-covariance matrices. A simulation study is conducted to investigate the finite sample properties of the proposed methods and compare them with the block empirical likelihood method by You et al. (2006) and the normal approximation with a correctly estimated variance-covariance. The results suggest that the proposed methods are generally more efficient than existing methods which ignore the correlation structure, and better in coverage compared to the normal approximation with correctly specified within-subject correlation. An application illustrating our methods and supporting the simulation study results is also presented.

  5. Likelihood ratio sequential sampling models of recognition memory.

    Science.gov (United States)

    Osth, Adam F; Dennis, Simon; Heathcote, Andrew

    2017-02-01

    The mirror effect - a phenomenon whereby a manipulation produces opposite effects on hit and false alarm rates - is benchmark regularity of recognition memory. A likelihood ratio decision process, basing recognition on the relative likelihood that a stimulus is a target or a lure, naturally predicts the mirror effect, and so has been widely adopted in quantitative models of recognition memory. Glanzer, Hilford, and Maloney (2009) demonstrated that likelihood ratio models, assuming Gaussian memory strength, are also capable of explaining regularities observed in receiver-operating characteristics (ROCs), such as greater target than lure variance. Despite its central place in theorising about recognition memory, however, this class of models has not been tested using response time (RT) distributions. In this article, we develop a linear approximation to the likelihood ratio transformation, which we show predicts the same regularities as the exact transformation. This development enabled us to develop a tractable model of recognition-memory RT based on the diffusion decision model (DDM), with inputs (drift rates) provided by an approximate likelihood ratio transformation. We compared this "LR-DDM" to a standard DDM where all targets and lures receive their own drift rate parameters. Both were implemented as hierarchical Bayesian models and applied to four datasets. Model selection taking into account parsimony favored the LR-DDM, which requires fewer parameters than the standard DDM but still fits the data well. These results support log-likelihood based models as providing an elegant explanation of the regularities of recognition memory, not only in terms of choices made but also in terms of the times it takes to make them. Copyright © 2016 Elsevier Inc. All rights reserved.

  6. Unbinned likelihood maximisation framework for neutrino clustering in Python

    Energy Technology Data Exchange (ETDEWEB)

    Coenders, Stefan [Technische Universitaet Muenchen, Boltzmannstr. 2, 85748 Garching (Germany)

    2016-07-01

    Although an astrophysical neutrino flux has been detected with IceCube, the sources of astrophysical neutrinos remain hidden up to now. A detection of a neutrino point source would be a smoking gun for hadronic processes and the acceleration of cosmic rays. The search for neutrino sources has many degrees of freedom, for example steady versus transient, or point-like versus extended sources. Here, we introduce a Python framework designed for unbinned likelihood maximisations as used in searches for neutrino point sources by IceCube. By implementing source scenarios in a modular way, likelihood searches of various kinds can be set up in a user-friendly manner, without sacrificing speed and memory management.

  7. Nearly Efficient Likelihood Ratio Tests of the Unit Root Hypothesis

    DEFF Research Database (Denmark)

    Jansson, Michael; Nielsen, Morten Ørregaard

    Seemingly absent from the arsenal of currently available "nearly efficient" testing procedures for the unit root hypothesis, i.e. tests whose local asymptotic power functions are indistinguishable from the Gaussian power envelope, is a test admitting a (quasi-)likelihood ratio interpretation. We show that the likelihood ratio unit root test derived in a Gaussian AR(1) model with standard normal innovations is nearly efficient in that model. Moreover, these desirable properties carry over to more complicated models allowing for serially correlated and/or non-Gaussian innovations.

  8. A note on estimating errors from the likelihood function

    International Nuclear Information System (INIS)

    Barlow, Roger

    2005-01-01

    The points at which the log likelihood falls by 1/2 from its maximum value are often used to give the 'errors' on a result, i.e. the 68% central confidence interval. The validity of this is examined for two simple cases: a lifetime measurement and a Poisson measurement. Results are compared with the exact Neyman construction and with the simple Bartlett approximation. It is shown that the accuracy of the log likelihood method is poor, and the Bartlett construction explains why it is flawed.
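
    For the Poisson case the rule is easy to reproduce in a few lines; the sketch below (a toy example with a hypothetical count) finds where the log likelihood log L(mu) = n log(mu) - mu, up to a constant, has fallen by 1/2 from its maximum at mu = n.

        import numpy as np

        n = 5                                  # hypothetical observed count
        mu = np.linspace(0.01, 20, 20000)
        loglik = n * np.log(mu) - mu           # Poisson log likelihood (up to a constant)
        cut = loglik.max() - 0.5               # a drop of 1/2 defines the interval
        inside = mu[loglik >= cut]
        print(f"mu_hat = {n}, 68% interval ~ [{inside.min():.2f}, {inside.max():.2f}]")

    Comparing such intervals with the exact Neyman construction is precisely the check the note performs.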

  9. Nearly Efficient Likelihood Ratio Tests for Seasonal Unit Roots

    DEFF Research Database (Denmark)

    Jansson, Michael; Nielsen, Morten Ørregaard

    In an important generalization of zero frequency autoregressive unit root tests, Hylleberg, Engle, Granger, and Yoo (1990) developed regression-based tests for unit roots at the seasonal frequencies in quarterly time series. We develop likelihood ratio tests for seasonal unit roots and show that these tests are "nearly efficient" in the sense of Elliott, Rothenberg, and Stock (1996), i.e. that their local asymptotic power functions are indistinguishable from the Gaussian power envelope. Currently available nearly efficient testing procedures for seasonal unit roots are regression-based and require the choice of a GLS detrending parameter, which our likelihood ratio tests do not.

  10. LDR: A Package for Likelihood-Based Sufficient Dimension Reduction

    Directory of Open Access Journals (Sweden)

    R. Dennis Cook

    2011-03-01

    Full Text Available We introduce a new Matlab software package that implements several recently proposed likelihood-based methods for sufficient dimension reduction. Current capabilities include estimation of reduced subspaces with a fixed dimension d, as well as estimation of d by use of likelihood-ratio testing, permutation testing and information criteria. The methods are suitable for preprocessing data for both regression and classification. Implementations of related estimators are also available. Although the software is more oriented to command-line operation, a graphical user interface is also provided for prototype computations.

  11. Likelihood ratio decisions in memory: three implied regularities.

    Science.gov (United States)

    Glanzer, Murray; Hilford, Andrew; Maloney, Laurence T

    2009-06-01

    We analyze four general signal detection models for recognition memory that differ in their distributional assumptions. Our analyses show that a basic assumption of signal detection theory, the likelihood ratio decision axis, implies three regularities in recognition memory: (1) the mirror effect, (2) the variance effect, and (3) the z-ROC length effect. For each model, we present the equations that produce the three regularities and show, in computed examples, how they do so. We then show that the regularities appear in data from a range of recognition studies. The analyses and data in our study support the following generalization: Individuals make efficient recognition decisions on the basis of likelihood ratios.

  12. Balance improvement and reduction of likelihood of falls in older women after Cawthorne and Cooksey exercises.

    Science.gov (United States)

    Ribeiro, Angela dos Santos Bersot; Pereira, João Santos

    2005-01-01

    The vestibular system is the absolute reference for the maintenance of balance, and functional deficits with aging can result in balance disturbance and an increased likelihood of falls. Objective: to verify whether a specific therapeutic approach to this system can promote motor learning and contribute to improved balance and a decreased likelihood of falls. Design: clinical prospective study. Fifteen women, aged 60 to 69 (mean = 64.8 ± 2.95 years), resident in Barra Mansa-RJ, were submitted to Cawthorne and Cooksey exercises for three months, three times a week, for sixty minutes. They were evaluated with the Berg Balance Scale (BBS), whose scores determine the likelihood of falls. Comparing the data obtained before and after the intervention, we observed a significant difference (p < 0.05), indicating improved balance and a reduced likelihood of falls in elderly people.

  13. Maximum likelihood pixel labeling using a spatially variant finite mixture model

    International Nuclear Information System (INIS)

    Gopal, S.S.; Hebert, T.J.

    1996-01-01

    We propose a spatially-variant mixture model for pixel labeling. Based on this spatially-variant mixture model we derive an expectation maximization algorithm for maximum likelihood estimation of the pixel labels. While most algorithms using mixture models entail the subsequent use of a Bayes classifier for pixel labeling, the proposed algorithm yields maximum likelihood estimates of the labels themselves and results in unambiguous pixel labels. The proposed algorithm is fast, robust, easy to implement, flexible in that it can be applied to any arbitrary image data where the number of classes is known and, most importantly, obviates the need for an explicit labeling rule. The algorithm is evaluated both quantitatively and qualitatively on simulated data and on clinical magnetic resonance images of the human brain
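
    A schematic sketch of the idea is shown below (not the published algorithm, which also couples neighbouring pixels through the spatially variant prior): each pixel carries its own label-probability vector, the EM-style update renormalises it against the class densities, and the output is the argmax label itself, with no separate Bayes classifier; class means and variances are assumed known here to keep the sketch short.

        import numpy as np

        def label_pixels(img, means, sigmas, n_iter=20):
            x = img.ravel()[:, None]                        # (n_pixels, 1)
            k = len(means)
            w = np.full((x.shape[0], k), 1.0 / k)           # per-pixel label probabilities
            dens = np.exp(-0.5 * ((x - means) / sigmas) ** 2) / sigmas
            for _ in range(n_iter):
                post = w * dens                             # E-step numerator
                w = post / post.sum(axis=1, keepdims=True)  # per-pixel probability update
            return w.argmax(axis=1).reshape(img.shape)      # unambiguous ML labels

        # Example: label a noisy two-class image.
        rng = np.random.default_rng(0)
        truth = (rng.random((32, 32)) > 0.5).astype(float)
        noisy = truth + rng.normal(0.0, 0.3, truth.shape)
        labels = label_pixels(noisy, np.array([0.0, 1.0]), np.array([0.3, 0.3]))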

  14. Likelihood Estimation of the Systemic Poison-Induced Morbidity in an Adult North Eastern Romanian Population

    Directory of Open Access Journals (Sweden)

    Cătălina Lionte

    2016-12-01

    Full Text Available Purpose: Acute exposure to a systemic poison represents an important segment of medical emergencies. We aimed to estimate the likelihood of systemic poison-induced morbidity in a population admitted to a tertiary referral center in North East Romania, based on its determinant factors. Methodology: This was a prospective observational cohort study of adult poisoned patients. Demographic, clinical and laboratory characteristics were recorded in all patients. We analyzed three groups of patients, based on the associated morbidity during hospitalization. We identified significant differences between groups, and predictors with significant effects on morbidity, using multiple multinomial logistic regressions. ROC analysis showed that a combination of tests could improve the diagnostic accuracy of poison-related morbidity. Main findings: Of the 180 patients included, aged 44.7 ± 17.2 years, 51.1% males, 49.4% had no poison-related morbidity, 28.9% developed a mild morbidity, and 21.7% had a severe morbidity, followed by death in 16 patients (8.9%). Multiple complications and deaths were recorded in patients aged 53.4 ± 17.6 years (p < .001), with a lower Glasgow Coma Scale (GCS) score upon admission and a significantly higher heart rate (101 ± 32 beats/min, p < .011). Routine laboratory tests were significantly higher in patients with a recorded morbidity. Multiple logistic regression analysis demonstrated that a GCS < 8, a high white blood cell count (WBC), alanine aminotransferase (ALAT), myoglobin, glycemia and brain natriuretic peptide (BNP) are strongly predictive of in-hospital severe morbidity. Originality: This is the first Romanian prospective study of adult poisoned patients which identifies the factors responsible for in-hospital morbidity using logistic regression analyses, with resulting receiver operating characteristic (ROC) curves. Conclusion: In acute intoxication with systemic poisons, we identified several clinical and laboratory variables

  15. Seasonal species interactions minimize the impact of species turnover on the likelihood of community persistence.

    Science.gov (United States)

    Saavedra, Serguei; Rohr, Rudolf P; Fortuna, Miguel A; Selva, Nuria; Bascompte, Jordi

    2016-04-01

    Many of the observed species interactions embedded in ecological communities are not permanent, but are characterized by temporal changes that are observed along with abiotic and biotic variations. While work has been done describing and quantifying these changes, little is known about their consequences for species coexistence. Here, we investigate the extent to which changes of species composition impact the likelihood of persistence of the predator-prey community in the highly seasonal Białowieża Primeval Forest (northeast Poland), and the extent to which seasonal changes of species interactions (predator diet) modulate the expected impact. This likelihood is estimated by extending recent developments on the study of structural stability in ecological communities. We find that the observed species turnover strongly varies the likelihood of community persistence between summer and winter. Importantly, we demonstrate that the observed seasonal interaction changes minimize the variation in the likelihood of persistence associated with species turnover across the year. We find that these community dynamics can be explained as the coupling of individual species to their environment by minimizing both the variation in persistence conditions and the interaction changes between seasons. Our results provide a homeostatic explanation for seasonal species interactions and suggest that monitoring the association of interaction changes with the level of variation in community dynamics can provide a good indicator of the response of species to environmental pressures.

  16. Self-Reflection of Video-Recorded High-Fidelity Simulations and Development of Clinical Judgment.

    Science.gov (United States)

    Bussard, Michelle E

    2016-09-01

    Nurse educators are increasingly using high-fidelity simulators to improve prelicensure nursing students' ability to develop clinical judgment. Traditionally, oral debriefing sessions have immediately followed the simulation scenarios as a method for students to connect theory to practice and therefore develop clinical judgment. Recently, video recording of the simulation scenarios is being incorporated. This qualitative, interpretive description study was conducted to identify whether self-reflection on video-recorded high-fidelity simulation (HFS) scenarios helped prelicensure nursing students to develop clinical judgment. Tanner's clinical judgment model was the framework for this study. Four themes emerged from this study: Confidence, Communication, Decision Making, and Change in Clinical Practice. This study indicated that self-reflection of video-recorded HFS scenarios is beneficial for prelicensure nursing students to develop clinical judgment. [J Nurs Educ. 2016;55(9):522-527.]. Copyright 2016, SLACK Incorporated.

  17. Estimating likelihood of future crashes for crash-prone drivers

    Directory of Open Access Journals (Sweden)

    Subasish Das

    2015-06-01

    Full Text Available At-fault crash-prone drivers are usually considered the high-risk group for possible future incidents or crashes. In Louisiana, 34% of crashes are repeatedly committed by at-fault crash-prone drivers, who represent only 5% of the total licensed drivers in the state. This research conducted an exploratory data analysis based on driver faultiness and proneness. The objective of this study is to develop a crash prediction model to estimate the likelihood of future crashes for at-fault drivers. The logistic regression method is used, employing eight years of traffic crash data (2004-2011) in Louisiana. Crash predictors such as the driver's crash involvement, crash and road characteristics, human factors, collision type, and environmental factors are considered in the model. The at-fault and not-at-fault status of the crashes is used as the response variable. The developed model identified a few important variables and correctly classifies at-fault crashes up to 62.40%, with a specificity of 77.25%; it can identify as many as 62.40% of the crash incidences of at-fault drivers in the upcoming year. Traffic agencies can use the model for monitoring the performance of at-fault crash-prone drivers and making roadway improvements meant to reduce crash proneness. From the findings, it is recommended that crash-prone drivers be targeted regularly for special safety programs through education and regulations.
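
    A minimal sketch of this kind of binary logistic regression is given below; the predictors and data are synthetic placeholders, not the Louisiana crash variables.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(7)
        X = rng.normal(size=(1000, 3))            # stand-ins, e.g. prior crashes, age, road class
        beta = np.array([1.2, -0.4, 0.6])         # assumed true coefficients
        y = (rng.random(1000) < 1 / (1 + np.exp(-X @ beta))).astype(int)  # 1 = at-fault

        model = LogisticRegression().fit(X, y)
        pred = model.predict(X)
        sensitivity = (pred[y == 1] == 1).mean()  # share of at-fault crashes classified correctly
        specificity = (pred[y == 0] == 0).mean()
        print(sensitivity, specificity)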

  18. Reconsidering Clinical Staging Model: A Case of Genetic High Risk for Schizophrenia

    OpenAIRE

    Lee, Tae Young; Kim, Minah; Kim, Sung Nyun; Kwon, Jun Soo

    2016-01-01

    The clinical staging model is considered a useful and practical method not only in dealing with the early stage of psychosis overcoming the debate about diagnostic boundaries but also in emerging mood disorder. However, its one limitation is that it cannot discriminate the heterogeneity of individuals at clinical high risk for psychosis, but lumps them all together. Even a healthy offspring of schizophrenia can eventually show clinical symptoms and progress to schizophrenia under the influenc...

  19. Exploring neural dysfunction in 'clinical high risk' for psychosis: a quantitative review of fMRI studies.

    Science.gov (United States)

    Dutt, Anirban; Tseng, Huai-Hsuan; Fonville, Leon; Drakesmith, Mark; Su, Liang; Evans, John; Zammit, Stanley; Jones, Derek; Lewis, Glyn; David, Anthony S

    2015-02-01

    Individuals at clinical high risk (CHR) of developing psychosis present with widespread functional abnormalities in the brain. Cognitive deficits, including working memory (WM) problems, as commonly elicited by n-back tasks, are observed in CHR individuals. However, functional MRI (fMRI) studies, comprising a heterogeneous cluster of general and social cognition paradigms, have not necessarily demonstrated consistent and conclusive results in this population. Hence, a comprehensive review of fMRI studies, spanning almost one decade, was carried out to look for general trends with respect to the brain regions and cognitive systems most likely to be dysfunctional in CHR individuals. 32 studies were included in this review, out of which 22 met the criteria for quantitative analysis using activation likelihood estimation (ALE). Task-related contrast activations were first analysed by comparing CHR and healthy control participants in the total pooled sample, followed by a comparison of general cognitive function studies (excluding social cognition paradigms), and finally by looking only at n-back working memory task based studies. Findings from the ALE implicated four key dysfunctional and distinct neural regions in the CHR group, namely the right inferior parietal lobule (rIPL), the left medial frontal gyrus (lmFG), the left superior temporal gyrus (lSTG) and the right fronto-polar cortex (rFPC) of the superior frontal gyrus (SFG). Narrowing down to relatively few significant dysfunctional neural regions is a step forward in reducing the apparent ambiguity of the overall findings, which should help to target specific neural regions and pathways of interest in future research on CHR populations. Copyright © 2014. Published by Elsevier Ltd.

  20. Comparison of likelihood testing procedures for parallel systems with covariances

    International Nuclear Information System (INIS)

    Ayman Baklizi; Isa Daud; Noor Akma Ibrahim

    1998-01-01

    In this paper we investigate and compare the behavior of the likelihood ratio, Rao's and Wald's statistics for testing hypotheses on the parameters of the simple linear regression model based on parallel systems with covariances. These statistics are asymptotically equivalent (Barndorff-Nielsen and Cox, 1994); however, their relative performances in finite samples are generally unknown. A Monte Carlo experiment is conducted to simulate the sizes and the powers of these statistics for complete samples and in the presence of time censoring. Comparisons of the statistics are made according to the attainment of the assumed size of the test and their powers at various points in the parameter space. The results show that the likelihood ratio statistic appears to have the best performance in terms of the attainment of the assumed size of the test. Power comparisons show that the Rao statistic has some advantage over the Wald statistic in almost all of the space of alternatives, while the likelihood ratio statistic occupies either the first or the last position in terms of power. Overall, the likelihood ratio statistic appears to be more appropriate for the model under study, especially for small sample sizes.

  1. Maximum likelihood estimation of the attenuated ultrasound pulse

    DEFF Research Database (Denmark)

    Rasmussen, Klaus Bolding

    1994-01-01

    The attenuated ultrasound pulse is divided into two parts: a stationary basic pulse and a nonstationary attenuation pulse. A standard ARMA model is used for the basic pulse, and a nonstandard ARMA model is derived for the attenuation pulse. The maximum likelihood estimator of the attenuated...

  2. Robust Gaussian Process Regression with a Student-t Likelihood

    NARCIS (Netherlands)

    Jylänki, P.P.; Vanhatalo, J.; Vehtari, A.

    2011-01-01

    This paper considers the robust and efficient implementation of Gaussian process regression with a Student-t observation model, which has a non-log-concave likelihood. The challenge with the Student-t model is the analytically intractable inference which is why several approximative methods have

  3. MAXIMUM-LIKELIHOOD-ESTIMATION OF THE ENTROPY OF AN ATTRACTOR

    NARCIS (Netherlands)

    SCHOUTEN, JC; TAKENS, F; VANDENBLEEK, CM

    In this paper, a maximum-likelihood estimate of the (Kolmogorov) entropy of an attractor is proposed that can be obtained directly from a time series. Also, the relative standard deviation of the entropy estimate is derived; it is dependent on the entropy and on the number of samples used in the

  4. A simplification of the likelihood ratio test statistic for testing ...

    African Journals Online (AJOL)

    The traditional likelihood ratio test statistic for testing hypothesis about goodness of fit of multinomial probabilities in one, two and multi – dimensional contingency table was simplified. Advantageously, using the simplified version of the statistic to test the null hypothesis is easier and faster because calculating the expected ...
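
    For reference, the unsimplified statistic itself is short to compute; the sketch below (with hypothetical counts) evaluates G = 2 * sum(O * ln(O / E)) for a one-dimensional table and compares it with the chi-square distribution on k - 1 degrees of freedom.

        import numpy as np
        from scipy.stats import chi2

        observed = np.array([18, 22, 30, 30])              # hypothetical cell counts
        expected = observed.sum() * np.full(4, 0.25)       # H0: equal cell probabilities

        G = 2 * np.sum(observed * np.log(observed / expected))
        p_value = chi2.sf(G, df=len(observed) - 1)         # asymptotic chi-square reference
        print(G, p_value)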

  5. Adaptive Unscented Kalman Filter using Maximum Likelihood Estimation

    DEFF Research Database (Denmark)

    Mahmoudi, Zeinab; Poulsen, Niels Kjølstad; Madsen, Henrik

    2017-01-01

    The purpose of this study is to develop an adaptive unscented Kalman filter (UKF) by tuning the measurement noise covariance. We use the maximum likelihood estimation (MLE) and the covariance matching (CM) method to estimate the noise covariance. The multi-step prediction errors generated...

  6. LIKELIHOOD ESTIMATION OF PARAMETERS USING SIMULTANEOUSLY MONITORED PROCESSES

    DEFF Research Database (Denmark)

    Friis-Hansen, Peter; Ditlevsen, Ove Dalager

    2004-01-01

    The topic is maximum likelihood inference from several simultaneously monitored response processes of a structure to obtain knowledge about the parameters of other not monitored but important response processes when the structure is subject to some Gaussian load field in space and time. The considered example is a ship sailing with a given speed through a Gaussian wave field.

  7. Likelihood-based inference for clustered line transect data

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus; Schweder, Tore

    2006-01-01

    The uncertainty in estimation of spatial animal density from line transect surveys depends on the degree of spatial clustering in the animal population. To quantify the clustering we model line transect data as independent thinnings of spatial shot-noise Cox processes. Likelihood-based inference...

  8. Likelihood-based Dynamic Factor Analysis for Measurement and Forecasting

    NARCIS (Netherlands)

    Jungbacker, B.M.J.P.; Koopman, S.J.

    2015-01-01

    We present new results for the likelihood-based analysis of the dynamic factor model. The latent factors are modelled by linear dynamic stochastic processes. The idiosyncratic disturbance series are specified as autoregressive processes with mutually correlated innovations. The new results lead to

  9. Likelihood-based inference for clustered line transect data

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus Plenge; Schweder, Tore

    The uncertainty in estimation of spatial animal density from line transect surveys depends on the degree of spatial clustering in the animal population. To quantify the clustering we model line transect data as independent thinnings of spatial shot-noise Cox processes. Likelihood-based inference...

  10. Composite likelihood and two-stage estimation in family studies

    DEFF Research Database (Denmark)

    Andersen, Elisabeth Anne Wreford

    2004-01-01

    In this paper register based family studies provide the motivation for linking a two-stage estimation procedure in copula models for multivariate failure time data with a composite likelihood approach. The asymptotic properties of the estimators in both parametric and semi-parametric models are d...

  11. Reconceptualizing Social Influence in Counseling: The Elaboration Likelihood Model.

    Science.gov (United States)

    McNeill, Brian W.; Stoltenberg, Cal D.

    1989-01-01

    Presents Elaboration Likelihood Model (ELM) of persuasion (a reconceptualization of the social influence process) as alternative model of attitude change. Contends ELM unifies conflicting social psychology results and can potentially account for inconsistent research findings in counseling psychology. Provides guidelines on integrating…

  12. Counseling Pretreatment and the Elaboration Likelihood Model of Attitude Change.

    Science.gov (United States)

    Heesacker, Martin

    1986-01-01

    Results of the application of the Elaboration Likelihood Model (ELM) to a counseling context revealed that more favorable attitudes toward counseling occurred as subjects' ego involvement increased and as intervention quality improved. Counselor credibility affected the degree to which subjects' attitudes reflected argument quality differences.…

  13. Cases in which ancestral maximum likelihood will be confusingly misleading.

    Science.gov (United States)

    Handelman, Tomer; Chor, Benny

    2017-05-07

    Ancestral maximum likelihood (AML) is a phylogenetic tree reconstruction criterion that "lies between" maximum parsimony (MP) and maximum likelihood (ML). ML has long been known to be statistically consistent. On the other hand, Felsenstein (1978) showed that MP is statistically inconsistent, and even positively misleading: There are cases where the parsimony criterion, applied to data generated according to one tree topology, will be optimized on a different tree topology. The question of whether AML is statistically consistent or not has been open for a long time. Mossel et al. (2009) have shown that AML can "shrink" short tree edges, resulting in a star tree with no internal resolution, which yields a better AML score than the original (resolved) model. This result implies that AML is statistically inconsistent, but not that it is positively misleading, because the star tree is compatible with any other topology. We show that AML is confusingly misleading: For some simple four-taxon (resolved) tree, the ancestral likelihood optimization criterion is maximized on an incorrect (resolved) tree topology, as well as on a star tree (both with specific edge lengths), while the tree with the original, correct topology has strictly lower ancestral likelihood. Interestingly, the two short edges in the incorrect, resolved tree topology are of length zero and are not adjacent, so this resolved tree is in fact a simple path. While for MP the underlying phenomenon can be described as long edge attraction, it turns out that here we have long edge repulsion. Copyright © 2017. Published by Elsevier Ltd.

  14. Pendeteksian Outlier pada Regresi Nonlinier dengan Metode statistik Likelihood Displacement

    Directory of Open Access Journals (Sweden)

    Siti Tabi'atul Hasanah

    2012-11-01

    An outlier is an observation that differs markedly (is extreme) relative to the other observations, or data that do not follow the general pattern of the model. Outliers sometimes carry information that no other data point provides, which is why they should not simply be eliminated; they can also be influential observations. Many methods exist for detecting outliers, and previous studies addressed outlier detection in linear regression. Here, detection is developed for nonlinear regression, specifically multiplicative nonlinear regression, using the likelihood displacement (LD) statistic. LD detects outliers by removing the suspected observation, re-estimating the parameters by maximum likelihood, and measuring the resulting displacement of the likelihood. The accuracy of the LD method in detecting outliers is shown by comparing the MSE of LD with the MSE of the regression in general, using the test statistic Λ: when the initial hypothesis is rejected, the observation is declared an outlier.
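
    A minimal sketch of the statistic in Python, assuming a nonlinear model fit by maximum likelihood with Gaussian errors on a suitable scale: each observation is deleted in turn, the model is refit, and LD_i = 2[l(theta_hat) - l(theta_hat_(i))] is evaluated on the full data; a large LD_i flags observation i as a suspected outlier.

    import numpy as np
    from scipy.optimize import curve_fit

    def log_lik(params, x, y, model):
        # Gaussian log-likelihood with the error variance profiled out.
        resid = y - model(x, *params)
        s2 = resid @ resid / len(y)
        return -0.5 * len(y) * (np.log(2 * np.pi * s2) + 1.0)

    def likelihood_displacement(x, y, model, p0):
        theta_hat, _ = curve_fit(model, x, y, p0=p0)
        l_full = log_lik(theta_hat, x, y, model)
        ld = []
        for i in range(len(y)):
            keep = np.arange(len(y)) != i
            theta_i, _ = curve_fit(model, x[keep], y[keep], p0=p0)
            ld.append(2.0 * (l_full - log_lik(theta_i, x, y, model)))
        return np.array(ld)                    # large LD_i -> suspected outlier

    # Example: multiplicative power model fit on the log scale
    model = lambda x, a, b: a + b * np.log(x)  # log of y = exp(a) * x**b
    rng = np.random.default_rng(1)
    x = np.linspace(1.0, 10.0, 30)
    y = model(x, 1.0, 2.0) + rng.normal(scale=0.1, size=30)
    y[12] += 1.5                               # planted outlier
    print(np.argmax(likelihood_displacement(x, y, model, p0=[1.0, 1.0])))  # 12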

  15. High-dose intravenous vitamin C combined with cytotoxic chemotherapy in patients with advanced cancer: a phase I-II clinical trial.

    Directory of Open Access Journals (Sweden)

    L John Hoffer

    Biological and some clinical evidence suggest that high-dose intravenous vitamin C (IVC) could increase the effectiveness of cancer chemotherapy. IVC is widely used by integrative and complementary cancer therapists, but rigorous data are lacking as to its safety and which cancers and chemotherapy regimens would be the most promising to investigate in detail. We carried out a phase I-II safety, tolerability, pharmacokinetic and efficacy trial of IVC combined with chemotherapy in patients whose treating oncologist judged that standard-of-care or off-label chemotherapy offered less than a 33% likelihood of a meaningful response. We documented adverse events and toxicity associated with IVC infusions, determined pre- and post-chemotherapy vitamin C and oxalic acid pharmacokinetic profiles, and monitored objective clinical responses, mood and quality of life. Fourteen patients were enrolled. IVC was safe and generally well tolerated, although some patients experienced transient adverse events during or after IVC infusions. The pre- and post-chemotherapy pharmacokinetic profiles suggested that tissue uptake of vitamin C increases after chemotherapy, with no increase in urinary oxalic acid excretion. Three patients with different types of cancer experienced unexpected transient stable disease, increased energy and functional improvement. Despite IVC's biological and clinical plausibility, career cancer investigators currently ignore it while integrative cancer therapists use it widely but without reporting the kind of clinical data that is normally gathered in cancer drug development. The present study neither proves nor disproves IVC's value in cancer therapy, but it provides practical information, and indicates a feasible way to evaluate this plausible but unproven therapy in an academic environment that is currently uninterested in it. If carried out in sufficient numbers, simple studies like this one could identify specific clusters of cancer type

  16. High-dose intravenous vitamin C combined with cytotoxic chemotherapy in patients with advanced cancer: a phase I-II clinical trial.

    Science.gov (United States)

    Hoffer, L John; Robitaille, Line; Zakarian, Robert; Melnychuk, David; Kavan, Petr; Agulnik, Jason; Cohen, Victor; Small, David; Miller, Wilson H

    2015-01-01

    Biological and some clinical evidence suggest that high-dose intravenous vitamin C (IVC) could increase the effectiveness of cancer chemotherapy. IVC is widely used by integrative and complementary cancer therapists, but rigorous data are lacking as to its safety and which cancers and chemotherapy regimens would be the most promising to investigate in detail. We carried out a phase I-II safety, tolerability, pharmacokinetic and efficacy trial of IVC combined with chemotherapy in patients whose treating oncologist judged that standard-of-care or off-label chemotherapy offered less than a 33% likelihood of a meaningful response. We documented adverse events and toxicity associated with IVC infusions, determined pre- and post-chemotherapy vitamin C and oxalic acid pharmacokinetic profiles, and monitored objective clinical responses, mood and quality of life. Fourteen patients were enrolled. IVC was safe and generally well tolerated, although some patients experienced transient adverse events during or after IVC infusions. The pre- and post-chemotherapy pharmacokinetic profiles suggested that tissue uptake of vitamin C increases after chemotherapy, with no increase in urinary oxalic acid excretion. Three patients with different types of cancer experienced unexpected transient stable disease, increased energy and functional improvement. Despite IVC's biological and clinical plausibility, career cancer investigators currently ignore it while integrative cancer therapists use it widely but without reporting the kind of clinical data that is normally gathered in cancer drug development. The present study neither proves nor disproves IVC's value in cancer therapy, but it provides practical information, and indicates a feasible way to evaluate this plausible but unproven therapy in an academic environment that is currently uninterested in it. If carried out in sufficient numbers, simple studies like this one could identify specific clusters of cancer type, chemotherapy

  17. Characterizing clinical isolates of Acanthamoeba castellanii with high resistance to polyhexamethylene biguanide in Taiwan

    Directory of Open Access Journals (Sweden)

    Fu-Chin Huang

    2017-10-01

    Conclusion: Our results confirm the existence of clinical isolates of A. castellanii with high resistance to PHMB in Taiwan, and describe an alternative drug tolerance of A. castellanii in addition to pseudocyst/cyst transformation.

  18. Modeling gene expression measurement error: a quasi-likelihood approach

    Directory of Open Access Journals (Sweden)

    Strimmer Korbinian

    2003-03-01

    Abstract Background Using suitable error models for gene expression measurements is essential in the statistical analysis of microarray data. However, the true probabilistic model underlying gene expression intensity readings is generally not known. Instead, in currently used approaches some simple parametric model is assumed (usually a transformed normal distribution) or the empirical distribution is estimated. However, both these strategies may not be optimal for gene expression data, as the non-parametric approach ignores known structural information whereas the fully parametric models run the risk of misspecification. A further related problem is the choice of a suitable scale for the model (e.g. observed vs. log-scale). Results Here a simple semi-parametric model for gene expression measurement error is presented. In this approach inference is based on an approximate likelihood function (the extended quasi-likelihood). Only partial knowledge about the unknown true distribution is required to construct this function. In the case of gene expression this information is available in the form of the postulated (e.g. quadratic) variance structure of the data. As the quasi-likelihood behaves (almost) like a proper likelihood, it allows for the estimation of calibration and variance parameters, and it is also straightforward to obtain corresponding approximate confidence intervals. Unlike most other frameworks, it also allows analysis on any preferred scale, i.e. both on the original linear scale as well as on a transformed scale. It can also be employed in regression approaches to model systematic (e.g. array or dye) effects. Conclusions The quasi-likelihood framework provides a simple and versatile approach to analyze gene expression data that does not make any strong distributional assumptions about the underlying error model. For several simulated as well as real data sets it provides a better fit to the data than competing models. In an example it also
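
    A hedged illustration of fitting the postulated quadratic variance structure, with a Gaussian pseudo-likelihood standing in for the extended quasi-likelihood; the additive-plus-multiplicative noise model and all parameter values are assumptions made for the example.

    import numpy as np
    from scipy.optimize import minimize

    # Assumed variance structure: Var(y) = s_add^2 + s_mult^2 * mu^2.
    def neg_pseudo_loglik(params, y, mu):
        s2_add, s2_mult = np.exp(params)          # keep variances positive
        v = s2_add + s2_mult * mu**2              # quadratic variance function
        return 0.5 * np.sum(np.log(2 * np.pi * v) + (y - mu) ** 2 / v)

    # Example: replicated intensities around known means mu
    rng = np.random.default_rng(1)
    mu = np.repeat([50.0, 500.0, 5000.0], 200)
    y = mu + rng.normal(scale=np.sqrt(20.0**2 + (0.1 * mu) ** 2))
    fit = minimize(neg_pseudo_loglik, x0=[1.0, -1.0], args=(y, mu),
                   method="Nelder-Mead")          # rough starting values
    print(np.sqrt(np.exp(fit.x)))                 # roughly (20, 0.1)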

  19. Exclusion probabilities and likelihood ratios with applications to mixtures.

    Science.gov (United States)

    Slooten, Klaas-Jan; Egeland, Thore

    2016-01-01

    The statistical evidence obtained from mixed DNA profiles can be summarised in several ways in forensic casework including the likelihood ratio (LR) and the Random Man Not Excluded (RMNE) probability. The literature has seen a discussion of the advantages and disadvantages of likelihood ratios and exclusion probabilities, and part of our aim is to bring some clarification to this debate. In a previous paper, we proved that there is a general mathematical relationship between these statistics: RMNE can be expressed as a certain average of the LR, implying that the expected value of the LR, when applied to an actual contributor to the mixture, is at least equal to the inverse of the RMNE. While the mentioned paper presented applications for kinship problems, the current paper demonstrates the relevance for mixture cases, and for this purpose, we prove some new general properties. We also demonstrate how to use the distribution of the likelihood ratio for donors of a mixture, to obtain estimates for exceedance probabilities of the LR for non-donors, of which the RMNE is a special case corresponding to LR > 0. In order to derive these results, we need to view the likelihood ratio as a random variable. In this paper, we describe how such a randomization can be achieved. The RMNE is usually invoked only for mixtures without dropout. In mixtures, artefacts like dropout and drop-in are commonly encountered and we address this situation too, illustrating our results with a basic but widely implemented model, a so-called binary model. The precise definitions, modelling and interpretation of the required concepts of dropout and drop-in are not entirely obvious, and we attempt to clarify them here in a general likelihood framework for a binary model.
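
    A single-locus illustration of the RMNE quantity discussed above: under Hardy-Weinberg, a random person is not excluded from a mixture showing allele set A if and only if both of their alleles lie in A, so per-locus RMNE = (sum of allele frequencies in A)^2, and loci multiply. The frequencies below are invented for the example.

    import numpy as np

    # Per-locus RMNE = (sum of mixture allele frequencies)**2; loci multiply.
    def rmne(loci):
        return np.prod([sum(freqs.values()) ** 2 for freqs in loci])

    loci = [
        {"a1": 0.20, "a2": 0.15, "a3": 0.10},  # alleles seen in the mixture
        {"b1": 0.30, "b2": 0.05},
    ]
    print(rmne(loci))  # probability a random person is not excluded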

  20. FlowMax: A Computational Tool for Maximum Likelihood Deconvolution of CFSE Time Courses.

    Directory of Open Access Journals (Sweden)

    Maxim Nikolaievich Shokhirev

    The immune response is a concerted dynamic multi-cellular process. Upon infection, the dynamics of lymphocyte populations are an aggregate of molecular processes that determine the activation, division, and longevity of individual cells. The timing of these single-cell processes is remarkably widely distributed with some cells undergoing their third division while others undergo their first. High cell-to-cell variability and technical noise pose challenges for interpreting popular dye-dilution experiments objectively. It remains an unresolved challenge to avoid under- or over-interpretation of such data when phenotyping gene-targeted mouse models or patient samples. Here we develop and characterize a computational methodology to parameterize a cell population model in the context of noisy dye-dilution data. To enable objective interpretation of model fits, our method estimates fit sensitivity and redundancy by stochastically sampling the solution landscape, calculating parameter sensitivities, and clustering to determine the maximum-likelihood solution ranges. Our methodology accounts for both technical and biological variability by using a cell fluorescence model as an adaptor during population model fitting, resulting in improved fit accuracy without the need for ad hoc objective functions. We have incorporated our methodology into an integrated phenotyping tool, FlowMax, and used it to analyze B cells from two NFκB knockout mice with distinct phenotypes; we not only confirm previously published findings at a fraction of the expended effort and cost, but reveal a novel phenotype of nfkb1/p105/p50 in limiting the proliferative capacity of B cells following B-cell receptor stimulation. In addition to complementing experimental work, FlowMax is suitable for high throughput analysis of dye dilution studies within clinical and pharmacological screens with objective and quantitative conclusions.

  1. Efficient estimators for likelihood ratio sensitivity indices of complex stochastic dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Arampatzis, Georgios; Katsoulakis, Markos A.; Rey-Bellet, Luc [Department of Mathematics and Statistics, University of Massachusetts, Amherst, Massachusetts 01003 (United States)

    2016-03-14

    We demonstrate that centered likelihood ratio estimators for the sensitivity indices of complex stochastic dynamics are highly efficient with low, constant-in-time variance, and consequently they are suitable for sensitivity analysis in long-time and steady-state regimes. These estimators rely on a new covariance formulation of the likelihood ratio that includes as a submatrix a Fisher information matrix for stochastic dynamics and can also be used for fast screening of insensitive parameters and parameter combinations. The proposed methods are applicable to broad classes of stochastic dynamics such as chemical reaction networks, Langevin-type equations and stochastic models in finance, including systems with a high dimensional parameter space and/or disparate decorrelation times between different observables. Furthermore, they are simple to implement as a standard observable in any existing simulation algorithm without additional modifications.

  2. Concerns about Genetic Testing for Schizophrenia among Young Adults at Clinical High Risk for Psychosis.

    Science.gov (United States)

    Lawrence, Ryan E; Friesen, Phoebe; Brucato, Gary; Girgis, Ragy R; Dixon, Lisa

    Genetic tests for schizophrenia may introduce risks and benefits. Among young adults at clinical high-risk for psychosis, little is known about their concerns and how they assess potential risks. We conducted semi-structured interviews with 15 young adults at clinical high-risk for psychosis to ask about their concerns. Participants expressed concerns about test reliability, data interpretation, stigma, psychological harm, family planning, and privacy. Participants' responses showed some departure from the ethics literature insofar as participants were primarily interested in reporting their results to people to whom they felt emotionally close, and expressed little consideration of biological closeness. Additionally, if tests showed an increased genetic risk for schizophrenia, four clinical high-risk persons felt obligated to tell an employer and another three would "maybe" tell an employer, even in the absence of clinical symptoms. These findings suggest opportunities for clinicians and genetic counselors to intervene with education and support.

  3. Pre-market clinical evaluations of innovative high-risk medical devices in Europe.

    Science.gov (United States)

    Hulstaert, Frank; Neyt, Mattias; Vinck, Imgard; Stordeur, Sabine; Huić, Mirjana; Sauerland, Stefan; Kuijpers, Marja R; Abrishami, Payam; Vondeling, Hindrik; Flamion, Bruno; Garattini, Silvio; Pavlovic, Mira; van Brabandt, Hans

    2012-07-01

    High-quality clinical evidence is most often lacking when novel high-risk devices enter the European market. At the same time, a randomized controlled trial (RCT) is often initiated as a requirement for obtaining market access in the US. Should coverage in Europe be postponed until RCT data are available? We studied the premarket clinical evaluation of innovative high-risk medical devices in Europe compared with the US, and with medicines, where appropriate. The literature and regulatory documents were checked. Representatives from industry, Competent Authorities, Notified Bodies, Ethics Committees, and HTA agencies were consulted. We also discuss patient safety and the transparency of information. In contrast to the US, there is no requirement in Europe to demonstrate the clinical efficacy of high-risk devices in the premarket phase. Patients in Europe can thus have earlier access to a potentially lifesaving device, but at the risk of insufficiently documented efficacy and safety. Variations in the stringency of clinical reviews, both at the level of Notified Bodies and Competent Authorities, do not guarantee patient safety. We tried to document the design of premarket trials in Europe and number of patients exposed, but failed as this information is not made public. Furthermore, the Helsinki Declaration is not followed with respect to the registration and publication of premarket trials. For innovative high-risk devices, new EU legislation should require the premarket demonstration of clinical efficacy and safety, using an RCT if possible, and a transparent clinical review, preferably centralized.

  4. Development and external validation of a clinical prognostic score for death in visceral leishmaniasis patients in a high HIV co-infection burden area in Ethiopia.

    Directory of Open Access Journals (Sweden)

    Charles Abongomera

    In Ethiopia, case fatality rates among subgroups of visceral leishmaniasis (VL) patients are high. A clinical prognostic score for death in VL patients could contribute to optimal management and reduction of these case fatality rates. We aimed to identify predictors of death from VL, and to develop and externally validate a clinical prognostic score for death in VL patients, in a high HIV co-infection burden area in Ethiopia. We conducted a retrospective cohort study in north west Ethiopia. Predictors with an adjusted likelihood ratio ≥1.5 or ≤0.67 were retained to calculate the predictor score. The derivation cohort consisted of 1686 VL patients treated at an upgraded health center and the external validation cohort consisted of 404 VL patients treated in hospital. There were 99 deaths in the derivation cohort and 53 deaths in the external validation cohort. The predictors of death were: age >40 years (score +1); HIV seropositive (score +1); HIV seronegative (score -1); hemoglobin ≤6.5 g/dl (score +1); bleeding (score +1); jaundice (score +1); edema (score +1); ascites (score +2) and tuberculosis (score +1). The total predictor score per patient ranged from -1 to +5. A score of -1 indicated a low risk of death (1.0%), a score of 0 an intermediate risk of death (3.8%) and a score of +1 to +5 a high risk of death (10.4-85.7%). The area under the receiver operating characteristic curve was 0.83 (95% confidence interval: 0.79-0.87) in derivation, and 0.78 (95% confidence interval: 0.72-0.83) in external validation. The overall performance of the score was good. The score can enable the early detection of VL cases at high risk of death, which can inform operational, clinical management guidelines, and VL program management. Implementation of focused strategies could contribute to optimal management and reduction of the case fatality rates.
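
    The additive score can be computed directly from the points listed above; a short illustrative Python helper (function and argument names are ours, not the paper's):

    # Illustrative calculator for the predictor score described above.
    def vl_death_score(age, hiv_seropositive, hemoglobin_g_dl, bleeding,
                       jaundice, edema, ascites, tuberculosis):
        score = 1 if hiv_seropositive else -1            # HIV status: +1 / -1
        score += age > 40                                # +1 if age > 40 years
        score += hemoglobin_g_dl <= 6.5                  # +1 if Hb <= 6.5 g/dl
        score += bleeding + jaundice + edema + tuberculosis  # +1 each
        score += 2 * ascites                             # ascites scores +2
        return score                                     # observed range: -1..+5

    def risk_group(score):
        if score <= -1:
            return "low (1.0% case fatality)"
        if score == 0:
            return "intermediate (3.8%)"
        return "high (10.4-85.7%)"

    print(risk_group(vl_death_score(45, True, 6.0, False, True,
                                    False, False, False)))   # high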

  5. A composite likelihood approach for spatially correlated survival data

    Science.gov (United States)

    Paik, Jane; Ying, Zhiliang

    2013-01-01

    The aim of this paper is to provide a composite likelihood approach to handle spatially correlated survival data using pairwise joint distributions. With e-commerce data, a recent question of interest in marketing research has been to describe spatially clustered purchasing behavior and to assess whether geographic distance is the appropriate metric to describe purchasing dependence. We present a model for the dependence structure of time-to-event data subject to spatial dependence to characterize purchasing behavior from the motivating example from e-commerce data. We assume the Farlie-Gumbel-Morgenstern (FGM) distribution and then model the dependence parameter as a function of geographic and demographic pairwise distances. For estimation of the dependence parameters, we present pairwise composite likelihood equations. We prove that the resulting estimators exhibit key properties of consistency and asymptotic normality under certain regularity conditions in the increasing-domain framework of spatial asymptotic theory. PMID:24223450

  6. A composite likelihood approach for spatially correlated survival data.

    Science.gov (United States)

    Paik, Jane; Ying, Zhiliang

    2013-01-01

    The aim of this paper is to provide a composite likelihood approach to handle spatially correlated survival data using pairwise joint distributions. With e-commerce data, a recent question of interest in marketing research has been to describe spatially clustered purchasing behavior and to assess whether geographic distance is the appropriate metric to describe purchasing dependence. We present a model for the dependence structure of time-to-event data subject to spatial dependence to characterize purchasing behavior from the motivating example from e-commerce data. We assume the Farlie-Gumbel-Morgenstern (FGM) distribution and then model the dependence parameter as a function of geographic and demographic pairwise distances. For estimation of the dependence parameters, we present pairwise composite likelihood equations. We prove that the resulting estimators exhibit key properties of consistency and asymptotic normality under certain regularity conditions in the increasing-domain framework of spatial asymptotic theory.
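
    A hedged sketch of a pairwise composite likelihood under the FGM copula, simplified to known unit-exponential margins and a dependence parameter that is constant across pairs (the paper models it as a function of pairwise geographic and demographic distances):

    import numpy as np
    from scipy.optimize import minimize_scalar

    # FGM copula density: c(u, v; th) = 1 + th * (1 - 2u) * (1 - 2v), |th| <= 1.
    # With known margins, the marginal density terms are constant in th, so
    # only the copula terms matter for maximizing the composite likelihood.
    def pairwise_cl(theta, t, pairs):
        u = 1.0 - np.exp(-t)                   # unit-exponential margins
        i, j = pairs[:, 0], pairs[:, 1]
        c = 1.0 + theta * (1.0 - 2.0 * u[i]) * (1.0 - 2.0 * u[j])
        return np.sum(np.log(c))

    rng = np.random.default_rng(2)
    t = rng.exponential(size=100)
    pairs = np.array([(i, j) for i in range(100) for j in range(i + 1, 100)])
    fit = minimize_scalar(lambda th: -pairwise_cl(th, t, pairs),
                          bounds=(-0.99, 0.99), method="bounded")
    print(fit.x)                               # near 0 for independent data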

  7. Secondary Analysis under Cohort Sampling Designs Using Conditional Likelihood

    Directory of Open Access Journals (Sweden)

    Olli Saarela

    2012-01-01

    Under cohort sampling designs, additional covariate data are collected on cases of a specific type and a randomly selected subset of noncases, primarily for the purpose of studying associations with a time-to-event response of interest. With such data available, an interest may arise to reuse them for studying associations between the additional covariate data and a secondary non-time-to-event response variable, usually collected for the whole study cohort at the outset of the study. Following earlier literature, we refer to such a situation as secondary analysis. We outline a general conditional likelihood approach for secondary analysis under cohort sampling designs and discuss the specific situations of case-cohort and nested case-control designs. We also review alternative methods based on full likelihood and inverse probability weighting. We compare the alternative methods for secondary analysis in two simulated settings and apply them in a real-data example.

  8. Likelihood inference for a nonstationary fractional autoregressive model

    DEFF Research Database (Denmark)

    Johansen, Søren; Ørregård Nielsen, Morten

    2010-01-01

    This paper discusses model-based inference in an autoregressive model for fractional processes which allows the process to be fractional of order d or d-b. Fractional differencing involves infinitely many past values and because we are interested in nonstationary processes we model the data X_{1},...,X_{T} given the initial values X_{-n}, n=0,1,..., as is usually done. The initial values are not modeled but assumed to be bounded. This represents a considerable generalization relative to all previous work where it is assumed that initial values are zero. For the statistical analysis we assume the conditional Gaussian likelihood and for the probability analysis we also condition on initial values but assume that the errors in the autoregressive model are i.i.d. with suitable moment conditions. We analyze the conditional likelihood and its derivatives as stochastic processes in the parameters, including...
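
    A minimal sketch of a conditional-sum-of-squares Gaussian likelihood for the simplest fractional model Delta^d X_t = eps_t, with initial values set to zero (a special case of the bounded initial values allowed above); the fractional difference uses the standard coefficient recursion pi_0 = 1, pi_j = pi_{j-1} * (j - 1 - d) / j.

    import numpy as np

    # Fractional difference via pi_0 = 1, pi_j = pi_{j-1} * (j - 1 - d) / j.
    def frac_diff(x, d):
        n = len(x)
        pi = np.ones(n)
        for j in range(1, n):
            pi[j] = pi[j - 1] * (j - 1 - d) / j
        return np.array([pi[: t + 1][::-1] @ x[: t + 1] for t in range(n)])

    # Conditional (CSS) Gaussian log-likelihood for Delta^d X_t = eps_t.
    def css_loglik(d, x):
        eps = frac_diff(x, d)
        s2 = eps @ eps / len(x)
        return -0.5 * len(x) * (np.log(2 * np.pi * s2) + 1.0)

    # Profile over d on a grid; simulated white noise has true d = 0.
    x = np.random.default_rng(3).normal(size=400)
    grid = np.linspace(-0.4, 0.9, 27)
    print(grid[np.argmax([css_loglik(d, x) for d in grid])])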

  9. Physical constraints on the likelihood of life on exoplanets

    Science.gov (United States)

    Lingam, Manasvi; Loeb, Abraham

    2018-04-01

    One of the most fundamental questions in exoplanetology is to determine whether a given planet is habitable. We estimate the relative likelihood of a planet's propensity towards habitability by considering key physical characteristics such as the role of temperature on ecological and evolutionary processes, and atmospheric losses via hydrodynamic escape and stellar wind erosion. From our analysis, we demonstrate that Earth-sized exoplanets in the habitable zone around M-dwarfs seemingly display much lower prospects of being habitable relative to Earth, owing to the higher incident ultraviolet fluxes and closer distances to the host star. We illustrate our results by specifically computing the likelihood (of supporting life) for the recently discovered exoplanets, Proxima b and TRAPPIST-1e, which we find to be several orders of magnitude smaller than that of Earth.

  10. THESEUS: maximum likelihood superpositioning and analysis of macromolecular structures.

    Science.gov (United States)

    Theobald, Douglas L; Wuttke, Deborah S

    2006-09-01

    THESEUS is a command line program for performing maximum likelihood (ML) superpositions and analysis of macromolecular structures. While conventional superpositioning methods use ordinary least-squares (LS) as the optimization criterion, ML superpositions provide substantially improved accuracy by down-weighting variable structural regions and by correcting for correlations among atoms. ML superpositioning is robust and insensitive to the specific atoms included in the analysis, and thus it does not require subjective pruning of selected variable atomic coordinates. Output includes both likelihood-based and frequentist statistics for accurate evaluation of the adequacy of a superposition and for reliable analysis of structural similarities and differences. THESEUS performs principal components analysis for analyzing the complex correlations found among atoms within a structural ensemble. ANSI C source code and selected binaries for various computing platforms are available under the GNU open source license from http://monkshood.colorado.edu/theseus/ or http://www.theseus3d.org.
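
    The core computation can be illustrated by a single weighted Kabsch/SVD fit in which atoms with low estimated variance receive high weight; THESEUS iterates steps of this kind together with re-estimation of the variances (and correlations), so the sketch below shows only the inner rotation solve.

    import numpy as np

    def weighted_superpose(X, Y, w):
        # Superpose mobile Y onto target X ((n_atoms, 3) arrays) with
        # per-atom weights w, e.g. inverse variance estimates.
        w = np.asarray(w, dtype=float) / np.sum(w)
        xc, yc = w @ X, w @ Y                      # weighted centroids
        P, Q = Y - yc, X - xc                      # centered mobile, target
        H = P.T @ (w[:, None] * Q)                 # weighted 3x3 covariance
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))     # avoid reflections
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T    # optimal proper rotation
        return P @ R.T + xc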

  11. Impact of neurocognition on social and role functioning in individuals at clinical high risk for psychosis.

    Science.gov (United States)

    Carrión, Ricardo E; Goldberg, Terry E; McLaughlin, Danielle; Auther, Andrea M; Correll, Christoph U; Cornblatt, Barbara A

    2011-08-01

    Cognitive deficits have been well documented in schizophrenia and have been shown to impair quality of life and to compromise everyday functioning. Recent studies of adolescents and young adults at high risk for developing psychosis show that neurocognitive impairments are detectable before the onset of psychotic symptoms. However, it remains unclear how cognitive impairments affect functioning before the onset of psychosis. The authors assessed cognitive impairment in adolescents at clinical high risk for psychosis and examined its impact on social and role functioning. A sample of 127 treatment-seeking patients at clinical high risk for psychosis and a group of 80 healthy comparison subjects were identified and recruited for research in the Recognition and Prevention Program. At baseline, participants were assessed with a comprehensive neurocognitive battery as well as measures of social and role functioning. Relative to healthy comparison subjects, clinical high-risk patients showed significant impairments in the domains of processing speed, verbal memory, executive function, working memory, visuospatial processing, motor speed, sustained attention, and language. Clinical high-risk patients also displayed impaired social and role functioning at baseline. Among patients with attenuated positive symptoms, processing speed was related to social and role functioning at baseline. These findings demonstrate that cognitive and functional impairments are detectable in patients at clinical high risk for psychosis before the onset of psychotic illness and that processing speed appears to be an important cognitive predictor of poor functioning.

  12. Deformation of log-likelihood loss function for multiclass boosting.

    Science.gov (United States)

    Kanamori, Takafumi

    2010-09-01

    The purpose of this paper is to study loss functions in multiclass classification. In classification problems, the decision function is estimated by minimizing an empirical loss function, and then, the output label is predicted by using the estimated decision function. We propose a class of loss functions which is obtained by a deformation of the log-likelihood loss function. There are four main reasons why we focus on the deformed log-likelihood loss function: (1) this is a class of loss functions which has not been deeply investigated so far, (2) in terms of computation, a boosting algorithm with a pseudo-loss is available to minimize the proposed loss function, (3) the proposed loss functions provide a clear correspondence between the decision functions and conditional probabilities of output labels, (4) the proposed loss functions satisfy the statistical consistency of the classification error rate which is a desirable property in classification problems. Based on (3), we show that the deformed log-likelihood loss provides a model of mislabeling which is useful as a statistical model of medical diagnostics. We also propose a robust loss function against outliers in multiclass classification based on our approach. The robust loss function is a natural extension of the existing robust loss function for binary classification. A model of mislabeling and a robust loss function are useful to cope with noisy data. Some numerical studies are presented to show the robustness of the proposed loss function. A mathematical characterization of the deformed log-likelihood loss function is also presented. Copyright 2010 Elsevier Ltd. All rights reserved.
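
    The paper's own deformation family is not reproduced here; as an assumed stand-in, the sketch below deforms the log in the multiclass log-likelihood loss with the q-logarithm log_q(z) = (z^(1-q) - 1)/(1 - q), which recovers the ordinary loss as q approaches 1.

    import numpy as np

    # Assumed stand-in deformation: the q-logarithm
    # log_q(z) = (z**(1 - q) - 1) / (1 - q), which tends to log(z) as q -> 1.
    def deformed_log_loss(scores, y, q=0.9):
        # scores: (n, K) decision values; y: (n,) integer class labels
        p = np.exp(scores - scores.max(axis=1, keepdims=True))
        p /= p.sum(axis=1, keepdims=True)      # softmax link to probabilities
        pz = p[np.arange(len(y)), y]           # probability of the true label
        if q == 1.0:
            return -np.mean(np.log(pz))        # ordinary log-likelihood loss
        return -np.mean((pz ** (1.0 - q) - 1.0) / (1.0 - q))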

  13. Bayesian interpretation of Generalized empirical likelihood by maximum entropy

    OpenAIRE

    Rochet , Paul

    2011-01-01

    We study a parametric estimation problem related to moment condition models. As an alternative to the generalized empirical likelihood (GEL) and the generalized method of moments (GMM), a Bayesian approach to the problem can be adopted, extending the MEM procedure to parametric moment conditions. We show in particular that a large number of GEL estimators can be interpreted as a maximum entropy solution. Moreover, we provide a more general field of applications by proving the method to be rob...

  14. Menyoal Elaboration Likelihood Model (ELM) dan Teori Retorika

    OpenAIRE

    Yudi Perbawaningsih

    2012-01-01

    Abstract: Persuasion is a communication process to establish or change attitudes, which can be understood through the theory of Rhetoric and the theory of the Elaboration Likelihood Model (ELM). This study elaborates these theories in a Public Lecture series aimed at persuading students to choose their concentration of study. The result shows that, in terms of persuasion effectiveness, it is not quite relevant to separate the message and its source. The quality of source is determined by the quality of ...

  15. Maximum Likelihood, Consistency and Data Envelopment Analysis: A Statistical Foundation

    OpenAIRE

    Rajiv D. Banker

    1993-01-01

    This paper provides a formal statistical basis for the efficiency evaluation techniques of data envelopment analysis (DEA). DEA estimators of the best practice monotone increasing and concave production function are shown to be also maximum likelihood estimators if the deviation of actual output from the efficient output is regarded as a stochastic variable with a monotone decreasing probability density function. While the best practice frontier estimator is biased below the theoretical front...

  16. Multiple Improvements of Multiple Imputation Likelihood Ratio Tests

    OpenAIRE

    Chan, Kin Wai; Meng, Xiao-Li

    2017-01-01

    Multiple imputation (MI) inference handles missing data by first properly imputing the missing values $m$ times, and then combining the $m$ analysis results from applying a complete-data procedure to each of the completed datasets. However, the existing method for combining likelihood ratio tests has multiple defects: (i) the combined test statistic can be negative in practice when the reference null distribution is a standard $F$ distribution; (ii) it is not invariant to re-parametrization; ...

  17. Menyoal Elaboration Likelihood Model (ELM) Dan Teori Retorika

    OpenAIRE

    Perbawaningsih, Yudi

    2012-01-01

    Persuasion is a communication process to establish or change attitudes, which can be understood through the theory of Rhetoric and the theory of the Elaboration Likelihood Model (ELM). This study elaborates these theories in a Public Lecture series aimed at persuading students to choose their concentration of study. The result shows that, in terms of persuasion effectiveness, it is not quite relevant to separate the message and its source. The quality of source is determined by the quality of the mess...

  18. Penggunaan Elaboration Likelihood Model dalam Menganalisis Penerimaan Teknologi Informasi

    OpenAIRE

    vitrian, vitrian2

    2010-01-01

    This article discusses some technology acceptance models in an organization. A thorough analysis of how technology is accepted helps managers plan the implementation of new technology and make sure that the new technology could enhance the organization's performance. The Elaboration Likelihood Model (ELM) is one which sheds light on some behavioral factors in the acceptance of information technology. The basic tenet of ELM states that human behavior in principle can be influenced through central r...

  19. Statistical Bias in Maximum Likelihood Estimators of Item Parameters.

    Science.gov (United States)

    1982-04-01

  20. Empirical Likelihood in Nonignorable Covariate-Missing Data Problems.

    Science.gov (United States)

    Xie, Yanmei; Zhang, Biao

    2017-04-20

    Missing covariate data occurs often in regression analysis, which frequently arises in the health and social sciences as well as in survey sampling. We study methods for the analysis of a nonignorable covariate-missing data problem in an assumed conditional mean function when some covariates are completely observed but other covariates are missing for some subjects. We adopt the semiparametric perspective of Bartlett et al. (Improving upon the efficiency of complete case analysis when covariates are MNAR. Biostatistics 2014;15:719-30) on regression analyses with nonignorable missing covariates, in which they have introduced the use of two working models, the working probability model of missingness and the working conditional score model. In this paper, we study an empirical likelihood approach to nonignorable covariate-missing data problems with the objective of effectively utilizing the two working models in the analysis of covariate-missing data. We propose a unified approach to constructing a system of unbiased estimating equations, where there are more equations than unknown parameters of interest. One useful feature of these unbiased estimating equations is that they naturally incorporate the incomplete data into the data analysis, making it possible to seek efficient estimation of the parameter of interest even when the working regression function is not specified to be the optimal regression function. We apply the general methodology of empirical likelihood to optimally combine these unbiased estimating equations. We propose three maximum empirical likelihood estimators of the underlying regression parameters and compare their efficiencies with other existing competitors. We present a simulation study to compare the finite-sample performance of various methods with respect to bias, efficiency, and robustness to model misspecification. The proposed empirical likelihood method is also illustrated by an analysis of a data set from the US National Health and
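
    The computational core shared by empirical likelihood methods of this kind can be sketched via the standard dual: for stacked estimating functions g(x_i, theta), the profile log-EL ratio at theta satisfies -2 log R(theta) = 2 * sum_i log(1 + lambda' g_i), with lambda solving sum_i g_i / (1 + lambda' g_i) = 0. A bare Newton iteration (no step-halving safeguards) is shown below as a sketch.

    import numpy as np

    def neg2_log_el_ratio(G, n_newton=25):
        # G: (n, m) stacked estimating functions g(x_i, theta) at fixed theta.
        lam = np.zeros(G.shape[1])
        for _ in range(n_newton):
            w = 1.0 + G @ lam                    # must stay positive; no
            grad = (G / w[:, None]).sum(axis=0)  # step-halving safeguards here
            hess = -(G / w[:, None] ** 2).T @ G
            lam -= np.linalg.solve(hess, grad)   # Newton step on the dual
        return 2.0 * np.sum(np.log1p(G @ lam))

    # Example: testing mean zero; statistic is ~ chi-square(1) under the null
    x = np.random.default_rng(5).normal(size=200)
    print(neg2_log_el_ratio(x[:, None]))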

  1. Democracy, Autocracy and the Likelihood of International Conflict

    OpenAIRE

    Tangerås, Thomas

    2008-01-01

    This is a game-theoretic analysis of the link between regime type and international conflict. The democratic electorate can credibly punish the leader for bad conflict outcomes, whereas the autocratic selectorate cannot. For the fear of being thrown out of office, democratic leaders are (i) more selective about the wars they initiate and (ii) on average win more of the wars they start. Foreign policy behaviour is found to display strategic complementarities. The likelihood of interstate war, ...

  2. Moment Conditions Selection Based on Adaptive Penalized Empirical Likelihood

    Directory of Open Access Journals (Sweden)

    Yunquan Song

    2014-01-01

    Empirical likelihood is a very popular method that has been widely used in the fields of artificial intelligence (AI) and data mining as tablets, mobile applications and social media come to dominate the technology landscape. This paper proposes an empirical likelihood shrinkage method to efficiently estimate unknown parameters and select correct moment conditions simultaneously, when the model is defined by moment restrictions in which some are possibly misspecified. We show that our method enjoys oracle-like properties; that is, it consistently selects the correct moment conditions and at the same time its estimator is as efficient as the empirical likelihood estimator obtained by all correct moment conditions. Moreover, unlike the GMM, our proposed method allows us to carry out confidence regions for the parameters included in the model without estimating the covariances of the estimators. For empirical implementation, we provide some data-driven procedures for selecting the tuning parameter of the penalty function. The simulation results show that the method works remarkably well in terms of correct moment selection and the finite sample properties of the estimators. Also, a real-life example is carried out to illustrate the new methodology.

  3. Maximum likelihood as a common computational framework in tomotherapy

    International Nuclear Information System (INIS)

    Olivera, G.H.; Shepard, D.M.; Reckwerdt, P.J.; Ruchala, K.; Zachman, J.; Fitchard, E.E.; Mackie, T.R.

    1998-01-01

    Tomotherapy is a dose delivery technique using helical or axial intensity modulated beams. One of the strengths of the tomotherapy concept is that it can incorporate a number of processes into a single piece of equipment. These processes include treatment optimization planning, dose reconstruction and kilovoltage/megavoltage image reconstruction. A common computational technique that could be used for all of these processes would be very appealing. The maximum likelihood estimator, originally developed for emission tomography, can serve as a useful tool in imaging and radiotherapy. We believe that this approach can play an important role in the processes of optimization planning, dose reconstruction and kilovoltage and/or megavoltage image reconstruction. These processes involve computations that require comparable physical methods. They are also based on equivalent assumptions, and they have similar mathematical solutions. As a result, the maximum likelihood approach is able to provide a common framework for all three of these computational problems. We will demonstrate how maximum likelihood methods can be applied to optimization planning, dose reconstruction and megavoltage image reconstruction in tomotherapy. Results for planning optimization, dose reconstruction and megavoltage image reconstruction will be presented. Strengths and weaknesses of the methodology are analysed. Future directions for this work are also suggested. (author)

  4. Facial emotion perception differs in young persons at genetic and clinical high-risk for psychosis.

    Science.gov (United States)

    Kohler, Christian G; Richard, Jan A; Brensinger, Colleen M; Borgmann-Winter, Karin E; Conroy, Catherine G; Moberg, Paul J; Gur, Ruben C; Gur, Raquel E; Calkins, Monica E

    2014-05-15

    A large body of literature has documented facial emotion perception impairments in schizophrenia. More recently, emotion perception has been investigated in persons at genetic and clinical high-risk for psychosis. This study compared emotion perception abilities in groups of young persons with schizophrenia, clinical high-risk, genetic risk and healthy controls. Groups, ages 13-25, included 24 persons at clinical high-risk, 52 first-degree relatives at genetic risk, 91 persons with schizophrenia and 90 low risk persons who completed computerized testing of emotion recognition and differentiation. Groups differed by overall emotion recognition abilities and recognition of happy, sad, anger and fear expressions. Pairwise comparisons revealed comparable impairments in recognition of happy, angry, and fearful expressions for persons at clinical high-risk and schizophrenia, while genetic risk participants were less impaired, showing reduced recognition of fearful expressions. Groups also differed for differentiation of happy and sad expressions, but differences were mainly between schizophrenia and control groups. Emotion perception impairments are observable in young persons at-risk for psychosis. Preliminary results with clinical high-risk participants, when considered along findings in genetic risk relatives, suggest social cognition abilities to reflect pathophysiological processes involved in risk of schizophrenia. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  5. COSMIC MICROWAVE BACKGROUND LIKELIHOOD APPROXIMATION BY A GAUSSIANIZED BLACKWELL-RAO ESTIMATOR

    International Nuclear Information System (INIS)

    Rudjord, Oe.; Groeneboom, N. E.; Eriksen, H. K.; Huey, Greg; Gorski, K. M.; Jewell, J. B.

    2009-01-01

    We introduce a new cosmic microwave background (CMB) temperature likelihood approximation called the Gaussianized Blackwell-Rao estimator. This estimator is derived by transforming the observed marginal power spectrum distributions obtained by the CMB Gibbs sampler into standard univariate Gaussians, and then approximating their joint transformed distribution by a multivariate Gaussian. The method is exact for full-sky coverage and uniform noise and an excellent approximation for sky cuts and scanning patterns relevant for modern satellite experiments such as the Wilkinson Microwave Anisotropy Probe (WMAP) and Planck. The result is a stable, accurate, and computationally very efficient CMB temperature likelihood representation that allows the user to exploit the unique error propagation capabilities of the Gibbs sampler to high l. A single evaluation of this estimator between l = 2 and 200 takes ∼0.2 CPU milliseconds, while for comparison, a single pixel-space likelihood evaluation between l = 2 and 30 for a map with ∼2500 pixels requires ∼20 s. We apply this tool to the five-year WMAP temperature data, and re-estimate the angular temperature power spectrum, C_l, and likelihood, L(C_l), for l ≤ 200, and derive new cosmological parameters for the standard six-parameter ΛCDM model. Our spectrum is in excellent agreement with the official WMAP spectrum, but we find slight differences in the derived cosmological parameters. Most importantly, the spectral index of scalar perturbations is n_s = 0.973 ± 0.014, 1.9σ away from unity and 0.6σ higher than the official WMAP result, n_s = 0.965 ± 0.014. This suggests that an exact likelihood treatment is required to higher l than previously believed, reinforcing and extending our conclusions from the three-year WMAP analysis. In that case, we found that the suboptimal likelihood approximation adopted between l = 12 and 30 by the WMAP team biased n_s low by 0.4σ, while here we find that the same approximation
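
    The Gaussianization step can be sketched in a few lines: each multipole's sampled spectrum values are mapped to standard normals through their empirical CDF, and the joint transformed distribution is then approximated by a multivariate Gaussian. Random gamma draws stand in for Gibbs-sampler output below; everything here is illustrative.

    import numpy as np
    from scipy.stats import norm, rankdata

    def gaussianize(samples):
        # samples: (n_draws, n_ell) marginal C_ell draws, one column per ell
        u = (rankdata(samples, axis=0) - 0.5) / samples.shape[0]
        return norm.ppf(u)                      # per-ell standard normals

    draws = np.random.default_rng(4).gamma(3.0, size=(2000, 50))
    z = gaussianize(draws)
    cov = np.cov(z, rowvar=False)               # joint Gaussian approximation
    # A likelihood evaluation at a new spectrum would map it through the same
    # marginal transforms and evaluate this multivariate Gaussian density.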

  6. Communicating likelihoods and probabilities in forecasts of volcanic eruptions

    Science.gov (United States)

    Doyle, Emma E. H.; McClure, John; Johnston, David M.; Paton, Douglas

    2014-02-01

    The issuing of forecasts and warnings of natural hazard events, such as volcanic eruptions, earthquake aftershock sequences and extreme weather often involves the use of probabilistic terms, particularly when communicated by scientific advisory groups to key decision-makers, who can differ greatly in relative expertise and function in the decision making process. Recipients may also differ in their perception of relative importance of political and economic influences on interpretation. Consequently, the interpretation of these probabilistic terms can vary greatly due to the framing of the statements, and whether verbal or numerical terms are used. We present a review from the psychology literature on how the framing of information influences communication of these probability terms. It is also unclear as to how people rate their perception of an event's likelihood throughout a time frame when a forecast time window is stated. Previous research has identified that, when presented with a 10-year time window forecast, participants viewed the likelihood of an event occurring ‘today’ as being less than that in year 10. Here we show that this skew in perception also occurs for short-term time windows (under one week) that are of most relevance for emergency warnings. In addition, unlike the long-time window statements, the use of the phrasing “within the next…” instead of “in the next…” does not mitigate this skew, nor do we observe significant differences between the perceived likelihoods of scientists and non-scientists. This finding suggests that effects occurring due to the shorter time window may be ‘masking’ any differences in perception due to wording or career background observed for long-time window forecasts. These results have implications for scientific advice, warning forecasts, emergency management decision-making, and public information as any skew in perceived event likelihood towards the end of a forecast time window may result in

  7. Achieving organisational competence for clinical leadership: the role of high performance work systems.

    Science.gov (United States)

    Leggat, Sandra G; Balding, Cathy

    2013-01-01

    While there has been substantial discussion about the potential for clinical leadership in improving quality and safety in healthcare, there has been little robust study. The purpose of this paper is to present the results of a qualitative study with clinicians and clinician managers to gather opinions on the appropriate content of an educational initiative being planned to improve clinical leadership in quality and safety among medical, nursing and allied health professionals working in primary, community and secondary care. In total, 28 clinicians and clinician managers throughout the state of Victoria, Australia, participated in focus groups to provide advice on the development of a clinical leadership program in quality and safety. An inductive, thematic analysis was completed to enable the themes to emerge from the data. Overwhelmingly the participants conceptualised clinical leadership in relation to organisational factors. Only four individual factors, comprising emotional intelligence, resilience, self-awareness and understanding of other clinical disciplines, were identified as being important for clinical leaders. Conversely seven organisational factors, comprising role clarity and accountability, security and sustainability for clinical leaders, selective recruitment into clinical leadership positions, teamwork and decentralised decision making, training, information sharing, and transformational leadership, were seen as essential, but the participants indicated they were rarely addressed. The human resource management literature includes these seven components, with contingent reward, reduced status distinctions and measurement of management practices, as the essential organisational underpinnings of high performance work systems. The results of this study propose that clinical leadership is an organisational property, suggesting that capability frameworks and educational programs for clinical leadership need a broader organisation focus. The paper

  8. A score to estimate the likelihood of detecting advanced colorectal neoplasia at colonoscopy.

    Science.gov (United States)

    Kaminski, Michal F; Polkowski, Marcin; Kraszewska, Ewa; Rupinski, Maciej; Butruk, Eugeniusz; Regula, Jaroslaw

    2014-07-01

    This study aimed to develop and validate a model to estimate the likelihood of detecting advanced colorectal neoplasia in Caucasian patients. We performed a cross-sectional analysis of database records for 40-year-old to 66-year-old patients who entered a national primary colonoscopy-based screening programme for colorectal cancer in 73 centres in Poland in the year 2007. We used multivariate logistic regression to investigate the associations between clinical variables and the presence of advanced neoplasia in a randomly selected test set, and confirmed the associations in a validation set. We used model coefficients to develop a risk score for detection of advanced colorectal neoplasia. Advanced colorectal neoplasia was detected in 2544 of the 35,918 included participants (7.1%). In the test set, a logistic-regression model showed that independent risk factors for advanced colorectal neoplasia were: age, sex, family history of colorectal cancer, and cigarette smoking. In the validation set the model was well calibrated (ratio of expected to observed advanced neoplasia: 1.00 (95% CI 0.95 to 1.06)) and had moderate discriminatory power (c-statistic 0.62). We developed a score that estimated the likelihood of detecting advanced neoplasia in the validation set, from 1.32% for patients scoring 0, to 19.12% for patients scoring 7-8. The developed and internally validated score, consisting of simple clinical factors, successfully estimates the likelihood of detecting advanced colorectal neoplasia in asymptomatic Caucasian patients. Once externally validated, it may be useful for counselling or designing primary prevention studies. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
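
    The general technique of turning logistic-regression coefficients into an integer risk score can be sketched as follows; the coefficients and intercept below are invented for illustration and are not the published model.

    import numpy as np

    # Invented coefficients for illustration only (not the published model).
    beta = {"age 55-66": 0.55, "male sex": 0.50,
            "family history": 0.35, "current smoking": 0.60}
    unit = min(beta.values())                  # smallest effect = 1 point
    points = {k: round(b / unit) for k, b in beta.items()}
    print(points)                              # integer points per risk factor

    def detection_probability(intercept, total_points):
        # Map a total score back to an approximate probability of detecting
        # advanced neoplasia through the underlying logistic model.
        return 1.0 / (1.0 + np.exp(-(intercept + unit * total_points)))

    print(detection_probability(-4.0, 6))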

  9. Automatic identification of high impact articles in PubMed to support clinical decision making.

    Science.gov (United States)

    Bian, Jiantao; Morid, Mohammad Amin; Jonnalagadda, Siddhartha; Luo, Gang; Del Fiol, Guilherme

    2017-09-01

    The practice of evidence-based medicine involves integrating the latest best available evidence into patient care decisions. Yet, critical barriers exist for clinicians' retrieval of evidence that is relevant for a particular patient from primary sources such as randomized controlled trials and meta-analyses. To help address those barriers, we investigated machine learning algorithms that find clinical studies with high clinical impact from PubMed®. Our machine learning algorithms use a variety of features including bibliometric features (e.g., citation count), social media attention, journal impact factors, and citation metadata. The algorithms were developed and evaluated with a gold standard composed of 502 high impact clinical studies that are referenced in 11 clinical evidence-based guidelines on the treatment of various diseases. We tested the following hypotheses: (1) our high impact classifier outperforms a state-of-the-art classifier based on citation metadata and citation terms, and PubMed's® relevance sort algorithm; and (2) the performance of our high impact classifier does not decrease significantly after removing proprietary features such as citation count. The mean top 20 precision of our high impact classifier was 34% versus 11% for the state-of-the-art classifier and 4% for PubMed's® relevance sort (p=0.009); and the performance of our high impact classifier did not decrease significantly after removing proprietary features (mean top 20 precision=34% vs. 36%; p=0.085). The high impact classifier, using features such as bibliometrics, social media attention and MEDLINE® metadata, outperformed previous approaches and is a promising alternative to identifying high impact studies for clinical decision support. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. Perioperative mortality in cats and dogs undergoing spay or castration at a high-volume clinic.

    Science.gov (United States)

    Levy, J K; Bard, K M; Tucker, S J; Diskant, P D; Dingman, P A

    2017-06-01

    High volume spay-neuter (spay-castration) clinics have been established to improve population control of cats and dogs to reduce the number of animals admitted to and euthanized in animal shelters. The rise in the number of spay-neuter clinics in the USA has been accompanied by concern about the quality of animal care provided in high volume facilities, which focus on minimally invasive, time saving techniques, high throughput and simultaneous management of multiple animals under various stages of anesthesia. The aim of this study was to determine perioperative mortality for cats and dogs in a high volume spay-neuter clinic in the USA. Electronic medical records and a written mortality log were used to collect data for 71,557 cats and 42,349 dogs undergoing spay-neuter surgery from 2010 to 2016 at a single high volume clinic in Florida. Perioperative mortality was defined as deaths occurring in the 24h period starting with the administration of the first sedation or anesthetic drugs. Perioperative mortality was reported for 34 cats and four dogs for an overall mortality of 3.3 animals/10,000 surgeries (0.03%). The risk of mortality was more than twice as high for females (0.05%) as for males (0.02%) (P=0.008) and five times as high for cats (0.05%) as for dogs (0.009%) (P=0.0007). High volume spay-neuter surgery was associated with a lower mortality rate than that previously reported in low volume clinics, approaching that achieved in human surgery. This is likely to be due to the young, healthy population of dogs and cats, and the continuous refinement of techniques based on experience and the skills and proficiency of teams that specialize in a limited spectrum of procedures. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Maximum Likelihood DOA Estimation of Multiple Wideband Sources in the Presence of Nonuniform Sensor Noise

    Directory of Open Access Journals (Sweden)

    K. Yao

    2007-12-01

    Full Text Available We investigate the maximum likelihood (ML) direction-of-arrival (DOA) estimation of multiple wideband sources in the presence of unknown nonuniform sensor noise. A new closed-form expression for the direction-estimation Cramér-Rao bound (CRB) is derived. The performance of the conventional wideband uniform ML estimator under nonuniform noise is studied. To mitigate the performance degradation caused by the nonuniformity of the noise, a new deterministic wideband nonuniform ML DOA estimator is derived and two associated processing algorithms are proposed. The first algorithm is based on an iterative procedure which stepwise concentrates the log-likelihood function with respect to the DOAs and the noise nuisance parameters, while the second is a noniterative algorithm that maximizes the derived approximately concentrated log-likelihood function. The performance of the proposed algorithms is tested through extensive computer simulations. Simulation results show that the stepwise-concentrated ML algorithm (SC-ML) requires only a few iterations to converge, and that both the SC-ML and the approximately-concentrated ML algorithm (AC-ML) attain a solution close to the derived CRB at high signal-to-noise ratio.
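    For intuition, the Python sketch below implements the far simpler textbook case: deterministic ML DOA estimation of a single narrowband source on a half-wavelength uniform linear array with uniform noise, solved by scanning the concentrated criterion over an angle grid. It is a simplification for illustration only, not the paper's wideband nonuniform-noise estimator.

    import numpy as np

    rng = np.random.default_rng(1)
    M, N = 8, 200                 # sensors, snapshots
    true_doa = np.deg2rad(20.0)

    def steering(theta, M):
        # Half-wavelength ULA steering vector
        return np.exp(1j * np.pi * np.arange(M) * np.sin(theta))

    s = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)
    noise = 0.1 * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N)))
    x = np.outer(steering(true_doa, M), s) + noise

    R = x @ x.conj().T / N        # sample covariance
    grid = np.deg2rad(np.linspace(-90, 90, 1801))
    # Concentrated (deterministic) ML for one source: maximize a(th)^H R a(th) / ||a(th)||^2
    crit = [np.real(steering(g, M).conj() @ R @ steering(g, M)) / M for g in grid]
    print("estimated DOA (deg):", np.rad2deg(grid[int(np.argmax(crit))]))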

  12. On the performance of social network and likelihood-based expert weighting schemes

    International Nuclear Information System (INIS)

    Cooke, Roger M.; ElSaadany, Susie; Huang Xinzheng

    2008-01-01

    Using expert judgment data from the TU Delft's expert judgment database, we compare the performance of different weighting schemes, namely equal weighting, performance-based weighting from the classical model [Cooke RM. Experts in uncertainty. Oxford: Oxford University Press; 1991.], social network (SN) weighting and likelihood weighting. The picture that emerges with regard to SN weights is rather mixed. SN theory does not provide an alternative to performance-based combination of expert judgments, since the statistical accuracy of the SN decision maker is sometimes unacceptably low. On the other hand, it does outperform equal weighting in the majority of cases. The results here, though not overwhelmingly positive, do nonetheless motivate further research into social interaction methods for nominating and weighting experts. Indeed, a full expert judgment study with performance measurement requires an investment in time and effort, with a view to securing external validation. If high confidence in a comparable level of validation can be obtained by less intensive methods, this would be very welcome, and would facilitate the application of structured expert judgment in situations where the resources for a full study are not available. Likelihood weights are just as resource intensive as performance-based weights, and the evidence presented here suggests that they are inferior to performance-based weights with regard to those scoring variables which are optimized in performance weights (calibration and information). Perhaps surprisingly, they are also inferior with regard to likelihood. Their use is further discouraged by the fact that they constitute a strongly improper scoring rule

  13. Computation of the Likelihood in Biallelic Diffusion Models Using Orthogonal Polynomials

    Directory of Open Access Journals (Sweden)

    Claus Vogl

    2014-11-01

    Full Text Available In population genetics, parameters describing forces such as mutation, migration and drift are generally inferred from molecular data. Lately, approximate methods based on simulations and summary statistics have been widely applied for such inference, even though these methods waste information. In contrast, probabilistic methods of inference can be shown to be optimal, if their assumptions are met. In genomic regions where recombination rates are high relative to mutation rates, polymorphic nucleotide sites can be assumed to evolve independently from each other. The distribution of allele frequencies at a large number of such sites has been called the “allele-frequency spectrum” or “site-frequency spectrum” (SFS). Conditional on the allelic proportions, the likelihoods of such data can be modeled as binomial. A simple model representing the evolution of allelic proportions is the biallelic mutation-drift or mutation-directional selection-drift diffusion model. Using series of orthogonal polynomials, specifically Jacobi and Gegenbauer polynomials, or the related spheroidal wave function, the diffusion equations can be solved efficiently. In the neutral case, the product of the binomial likelihoods with the sum of such polynomials leads to finite series of polynomials, i.e., relatively simple equations, from which the exact likelihoods can be calculated. In this article, the use of orthogonal polynomials for inferring population genetic parameters is investigated.
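    The binomial building block is easy to demonstrate. The Python sketch below assumes, for illustration, a Beta(a, b) equilibrium distribution of allelic proportions (the stationary, neutral case), so that each site's marginal likelihood is beta-binomial, and recovers the parameters by maximum likelihood; the article's orthogonal-polynomial machinery for the full diffusion is not reproduced here.

    import numpy as np
    from scipy.special import betaln, gammaln
    from scipy.optimize import minimize

    def log_betabinom(k, n, a, b):
        # Marginal likelihood of k derived alleles out of n: Binom(k | n, x)
        # integrated against Beta(x; a, b), i.e. the beta-binomial
        return (gammaln(n + 1) - gammaln(k + 1) - gammaln(n - k + 1)
                + betaln(k + a, n - k + b) - betaln(a, b))

    rng = np.random.default_rng(2)
    n = 20                                    # sampled chromosomes per site
    x = rng.beta(0.2, 0.2, size=5000)         # hypothetical allelic proportions
    k = rng.binomial(n, x)                    # site-frequency data

    def neg_loglik(log_ab):
        a, b = np.exp(log_ab)
        return -log_betabinom(k, n, a, b).sum()

    fit = minimize(neg_loglik, x0=np.log([1.0, 1.0]), method="Nelder-Mead")
    print("ML parameters (a, b):", np.exp(fit.x))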

  14. Laser-Based SLAM with Efficient Occupancy Likelihood Map Learning for Dynamic Indoor Scenes

    Science.gov (United States)

    Li, Li; Yao, Jian; Xie, Renping; Tu, Jinge; Feng, Chen

    2016-06-01

    Location-Based Services (LBS) have attracted growing attention in recent years, especially in indoor environments. A fundamental technique for LBS is map building in unknown environments, also known as simultaneous localization and mapping (SLAM) in the robotics community. In this paper, we propose a novel approach for SLAM in dynamic indoor scenes based on a 2D laser scanner mounted on a mobile Unmanned Ground Vehicle (UGV) with the help of a grid-based occupancy likelihood map. Instead of applying scan matching to two adjacent scans, we propose to match the current scan against the occupancy likelihood map learned from all previous scans at multiple scales to avoid the accumulation of matching errors. Because the points in a scan are acquired sequentially rather than simultaneously, scan distortion inevitably arises to varying extents. To compensate for the scan distortion caused by the motion of the UGV, we propose to integrate the velocity of the laser range finder (LRF) into the scan-matching optimization framework. In addition, to reduce as much as possible the effect of dynamic objects, such as walking pedestrians, that are common in indoor scenes, we propose a new occupancy likelihood map learning strategy that increases or decreases the probability of each occupancy grid cell after each scan matching. Experimental results in several challenging indoor scenes demonstrate that our proposed approach is capable of providing high-precision SLAM results.
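    The map-learning step the authors describe, raising or lowering each cell's occupancy after a scan match, is commonly realized as a log-odds update. The Python sketch below shows that generic update with assumed increment values; it is a minimal stand-in for the paper's multi-scale strategy.

    import numpy as np

    # Assumed log-odds increments and clamping bounds
    L_HIT, L_MISS, L_MIN, L_MAX = 0.9, -0.4, -4.0, 4.0

    def update_map(logodds, hit_idx, miss_idx):
        logodds[hit_idx] += L_HIT      # beam endpoint cell: more likely occupied
        logodds[miss_idx] += L_MISS    # traversed cells: more likely free
        np.clip(logodds, L_MIN, L_MAX, out=logodds)  # clamp so cells stay revisable
        return logodds

    grid = np.zeros((100, 100))                        # log-odds 0 = p(occ) 0.5
    ray = (np.arange(40, 50), np.full(10, 60))         # cells crossed by one beam
    grid = update_map(grid, hit_idx=(50, 60), miss_idx=ray)
    prob = 1.0 / (1.0 + np.exp(-grid))                 # back to probabilities
    print("endpoint p(occ):", round(prob[50, 60], 3))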

  15. L.U.St: a tool for approximated maximum likelihood supertree reconstruction.

    Science.gov (United States)

    Akanni, Wasiu A; Creevey, Christopher J; Wilkinson, Mark; Pisani, Davide

    2014-06-12

    Supertrees combine disparate, partially overlapping trees to generate a synthesis that provides a high level perspective that cannot be attained from the inspection of individual phylogenies. Supertrees can be seen as meta-analytical tools that can be used to make inferences based on results of previous scientific studies. Their meta-analytical application has increased in popularity since it was realised that the power of statistical tests for the study of evolutionary trends critically depends on the use of taxon-dense phylogenies. Further to that, supertrees have found applications in phylogenomics where they are used to combine gene trees and recover species phylogenies based on genome-scale data sets. Here, we present the L.U.St package, a python tool for approximate maximum likelihood supertree inference, and illustrate its application using a genomic data set for the placental mammals. L.U.St allows the calculation of the approximate likelihood of a supertree, given a set of input trees, performs heuristic searches to look for the supertree of highest likelihood, and performs statistical tests of two or more supertrees. To this end, L.U.St implements a winning-sites test allowing ranking of a collection of a priori selected hypotheses, given as a collection of input supertree topologies. It also outputs a file of input-tree-wise likelihood scores that can be used as input to CONSEL for calculation of standard tests of two trees (e.g. the Kishino-Hasegawa, Shimodaira-Hasegawa and Approximately Unbiased tests). This is the first fully parametric implementation of a supertree method; it has clearly understood properties, and provides several advantages over currently available supertree approaches. It is easy to implement and works on any platform that has python installed. bitBucket page - https://afro-juju@bitbucket.org/afro-juju/l.u.st.git. Davide.Pisani@bristol.ac.uk.

  16. Applying exclusion likelihoods from LHC searches to extended Higgs sectors

    International Nuclear Information System (INIS)

    Bechtle, Philip; Heinemeyer, Sven; Staal, Oscar; Stefaniak, Tim; Weiglein, Georg

    2015-01-01

    LHC searches for non-standard Higgs bosons decaying into tau lepton pairs constitute a sensitive experimental probe for physics beyond the Standard Model (BSM), such as supersymmetry (SUSY). Recently, the limits obtained from these searches have been presented by the CMS collaboration in a nearly model-independent fashion - as a narrow resonance model - based on the full 8 TeV dataset. In addition to publishing a 95 % C.L. exclusion limit, the full likelihood information for the narrow resonance model has been released. This provides valuable information that can be incorporated into global BSM fits. We present a simple algorithm that maps an arbitrary model with multiple neutral Higgs bosons onto the narrow resonance model and derives the corresponding value for the exclusion likelihood from the CMS search. This procedure has been implemented into the public computer code HiggsBounds (version 4.2.0 and higher). We validate our implementation by cross-checking against the official CMS exclusion contours in three Higgs benchmark scenarios in the Minimal Supersymmetric Standard Model (MSSM), and find very good agreement. Going beyond validation, we discuss the combined constraints of the ττ search and the rate measurements of the SM-like Higgs at 125 GeV in a recently proposed MSSM benchmark scenario, where the lightest Higgs boson obtains SM-like couplings independently of the decoupling of the heavier Higgs states. Technical details for how to access the likelihood information within HiggsBounds are given in the appendix. The program is available at http://higgsbounds.hepforge.org. (orig.)

  17. A maximum-likelihood reconstruction algorithm for tomographic gamma-ray nondestructive assay

    International Nuclear Information System (INIS)

    Prettyman, T.H.; Estep, R.J.; Cole, R.A.; Sheppard, G.A.

    1994-01-01

    A new tomographic reconstruction algorithm for nondestructive assay with high resolution gamma-ray spectroscopy (HRGS) is presented. The reconstruction problem is formulated using a maximum-likelihood approach in which the statistical structure of both the gross and continuum measurements used to determine the full-energy response in HRGS is precisely modeled. An accelerated expectation-maximization algorithm is used to determine the optimal solution. The algorithm is applied to safeguards and environmental assays of large samples (for example, 55-gal. drums) in which high continuum levels caused by Compton scattering are routinely encountered. Details of the implementation of the algorithm and a comparative study of the algorithm's performance are presented
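    The multiplicative EM update at the heart of such reconstructions is compact enough to sketch. The Python below runs MLEM for a Poisson model y ~ Poisson(Ax + b), with b standing in for a measured continuum background; the system matrix and data are synthetic, not an HRGS response model.

    import numpy as np

    rng = np.random.default_rng(3)
    m, n = 64, 32
    A = rng.random((m, n))                   # synthetic system matrix
    x_true = rng.random(n) * 10
    b = np.full(m, 0.5)                      # continuum/background term
    y = rng.poisson(A @ x_true + b)          # measured counts

    x = np.ones(n)
    sens = A.sum(axis=0)                     # sensitivity image, A^T 1
    for _ in range(200):
        x *= (A.T @ (y / (A @ x + b))) / sens   # multiplicative EM update
    print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))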

  18. Penalised Maximum Likelihood Simultaneous Longitudinal PET Image Reconstruction with Difference-Image Priors.

    Science.gov (United States)

    Ellis, Sam; Reader, Andrew J

    2018-04-26

    Many clinical contexts require the acquisition of multiple positron emission tomography (PET) scans of a single subject, for example to observe and quantify changes in functional behaviour in tumours after treatment in oncology. Typically, the datasets from each of these scans are reconstructed individually, without exploiting the similarities between them. We have recently shown that sharing information between longitudinal PET datasets by penalising voxel-wise differences during image reconstruction can improve reconstructed images by reducing background noise and increasing the contrast-to-noise ratio of high activity lesions. Here we present two additional novel longitudinal difference-image priors and evaluate their performance using 2D simulation studies and a 3D real dataset case study. We have previously proposed a simultaneous difference-image-based penalised maximum likelihood (PML) longitudinal image reconstruction method that encourages sparse difference images (DS-PML), and in this work we propose two further novel prior terms. The priors are designed to encourage longitudinal images with corresponding differences which have i) low entropy (DE-PML), and ii) high sparsity in their spatial gradients (DTV-PML). These two new priors and the originally proposed longitudinal prior were applied to 2D simulated treatment response [18F]fluorodeoxyglucose (FDG) brain tumour datasets and compared to standard maximum likelihood expectation-maximisation (MLEM) reconstructions. These 2D simulation studies explored the effects of penalty strengths, tumour behaviour, and inter-scan coupling on reconstructed images. Finally, a real two-scan longitudinal data series acquired from a head and neck cancer patient was reconstructed with the proposed methods and the results compared to standard reconstruction methods. Using any of the three priors with an appropriate penalty strength produced images with noise levels equivalent to those seen when using standard

  19. The high vaginal swab in general practice: clinical correlates of possible pathogens.

    Science.gov (United States)

    Dykhuizen, R S; Harvey, G; Gould, I M

    1995-06-01

    Clinical features, diagnosis and treatment of 286 women whose high vaginal swabs (HVS) submitted by their general practitioners showed pure, heavy growth of Staphylococcus aureus, beta haemolytic streptococci groups A, C or G, Streptococcus milleri, Streptococcus pneumoniae or Haemophilus influenzae were analysed. Women with group A, C and G streptococci frequently had clinical vulvovaginitis and although the numbers were too small for statistical confirmation, S. pneumoniae and H. influenzae appeared to cause clinical disease as well. The association of S. aureus or S. milleri with clinical vulvovaginitis was much less convincing. It seems relevant for laboratories to report sensitivities for group A, C and G streptococci. Further research is needed to determine the pathogenicity of S. pneumoniae and H. influenzae.

  20. Australian food life style segments and elaboration likelihood differences

    DEFF Research Database (Denmark)

    Brunsø, Karen; Reid, Mike

    As the global food marketing environment becomes more competitive, the international and comparative perspective of consumers' attitudes and behaviours becomes more important for both practitioners and academics. This research employs the Food-Related Life Style (FRL) instrument in Australia...... in order to 1) determine Australian Life Style Segments and compare these with their European counterparts, and to 2) explore differences in elaboration likelihood among the Australian segments, e.g. consumers' interest and motivation to perceive product related communication. The results provide new...

  1. Maximum-likelihood method for numerical inversion of Mellin transform

    International Nuclear Information System (INIS)

    Iqbal, M.

    1997-01-01

    A method is described for inverting the Mellin transform which uses an expansion in Laguerre polynomials and converts the Mellin transform to Laplace transform, then the maximum-likelihood regularization method is used to recover the original function of the Mellin transform. The performance of the method is illustrated by the inversion of the test functions available in the literature (J. Inst. Math. Appl., 20 (1977) 73; Math. Comput., 53 (1989) 589). Effectiveness of the method is shown by results obtained through demonstration by means of tables and diagrams

  2. How to Improve the Likelihood of CDM Approval?

    DEFF Research Database (Denmark)

    Brandt, Urs Steiner; Svendsen, Gert Tinggaard

    2014-01-01

    How can the likelihood of Clean Development Mechanism (CDM) approval be improved in the face of institutional shortcomings? To answer this question, we focus on the three institutional shortcomings of income sharing, risk sharing and corruption prevention concerning afforestation/reforestation (A/R). Furthermore, three main stakeholders are identified, namely investors, governments and agents in a principal-agent model regarding monitoring and enforcement capacity. Developing regions such as West Africa have, despite huge potential, not yet been integrated into A/R CDM projects. Remote sensing, however...

  3. Maximum Likelihood and Bayes Estimation in Randomly Censored Geometric Distribution

    Directory of Open Access Journals (Sweden)

    Hare Krishna

    2017-01-01

    Full Text Available In this article, we study the geometric distribution under randomly censored data. Maximum likelihood estimators and confidence intervals based on the Fisher information matrix are derived for the unknown parameters with randomly censored data. Bayes estimators are also developed using beta priors under generalized entropy and LINEX loss functions. Also, Bayesian credible and highest posterior density (HPD) credible intervals are obtained for the parameters. Expected time on test and reliability characteristics are also analyzed in this article. To compare various estimates developed in the article, a Monte Carlo simulation study is carried out. Finally, for illustration purpose, a randomly censored real data set is discussed.

  4. Likelihood-Based Inference in Nonlinear Error-Correction Models

    DEFF Research Database (Denmark)

    Kristensen, Dennis; Rahbæk, Anders

    We consider a class of vector nonlinear error correction models where the transfer function (or loadings) of the stationary relationships is nonlinear. This includes in particular the smooth transition models. A general representation theorem is given which establishes the dynamic properties...... and a linear trend in general. Gaussian likelihood-based estimators are considered for the long-run cointegration parameters, and the short-run parameters. Asymptotic theory is provided for these and it is discussed to what extent asymptotic normality and mixed normality can be found. A simulation study...

  5. Process criticality accident likelihoods, consequences and emergency planning

    International Nuclear Information System (INIS)

    McLaughlin, T.P.

    1992-01-01

    Evaluation of criticality accident risks in the processing of significant quantities of fissile materials is both complex and subjective, largely due to the lack of accident statistics. Thus, complying with national and international standards and regulations which require an evaluation of the net benefit of a criticality accident alarm system, is also subjective. A review of guidance found in the literature on potential accident magnitudes is presented for different material forms and arrangements. Reasoned arguments are also presented concerning accident prevention and accident likelihoods for these material forms and arrangements. (Author)

  6. Likelihood Estimation of Gamma Ray Bursts Duration Distribution

    OpenAIRE

    Horvath, Istvan

    2005-01-01

    Two classes of Gamma Ray Bursts have been identified so far, characterized by T90 durations shorter and longer than approximately 2 seconds. It was shown that the BATSE 3B data allow a good fit with three Gaussian distributions in log T90. In the same volume of ApJ, another paper suggested that a third class of GRBs may exist. Using the full BATSE catalog here we present the maximum likelihood estimation, which gives a 0.5% probability of there being only two subclasses. The MC simulation co...

  7. Process criticality accident likelihoods, consequences, and emergency planning

    Energy Technology Data Exchange (ETDEWEB)

    McLaughlin, T.P.

    1991-01-01

    Evaluation of criticality accident risks in the processing of significant quantities of fissile materials is both complex and subjective, largely due to the lack of accident statistics. Thus, complying with standards such as ISO 7753 which mandates that the need for an alarm system be evaluated, is also subjective. A review of guidance found in the literature on potential accident magnitudes is presented for different material forms and arrangements. Reasoned arguments are also presented concerning accident prevention and accident likelihoods for these material forms and arrangements. 13 refs., 1 fig., 1 tab.

  8. Improved Likelihood Function in Particle-based IR Eye Tracking

    DEFF Research Database (Denmark)

    Satria, R.; Sorensen, J.; Hammoud, R.

    2005-01-01

    In this paper we propose a log likelihood-ratio function of foreground and background models used in a particle filter to track the eye region in dark-bright pupil image sequences. This model fuses information from both dark and bright pupil images and their difference image into one model. Our...... enhanced tracker overcomes the issues of prior selection of static thresholds during the detection of feature observations in the bright-dark difference images. The auto-initialization process is performed using a cascaded classifier trained with AdaBoost and adapted to IR eye images. Experiments show good...

  9. Similar tests and the standardized log likelihood ratio statistic

    DEFF Research Database (Denmark)

    Jensen, Jens Ledet

    1986-01-01

    When testing an affine hypothesis in an exponential family the 'ideal' procedure is to calculate the exact similar test, or an approximation to this, based on the conditional distribution given the minimal sufficient statistic under the null hypothesis. By contrast to this there is a 'primitive' approach in which the marginal distribution of a test statistic is considered and any nuisance parameter appearing in the test statistic is replaced by an estimate. We show here that when using standardized likelihood ratio statistics the 'primitive' procedure is in fact an 'ideal' procedure to order O(n -3...

  10. Factors Associated With the Likelihood of Hospitalization Following Emergency Department Visits for Behavioral Health Conditions.

    Science.gov (United States)

    Hamilton, Jane E; Desai, Pratikkumar V; Hoot, Nathan R; Gearing, Robin E; Jeong, Shin; Meyer, Thomas D; Soares, Jair C; Begley, Charles E

    2016-11-01

    Behavioral health-related emergency department (ED) visits have been linked with ED overcrowding, an increased demand on limited resources, and a longer length of stay (LOS) due in part to patients being admitted to the hospital but waiting for an inpatient bed. This study examines factors associated with the likelihood of hospital admission for ED patients with behavioral health conditions at 16 hospital-based EDs in a large urban area in the southern United States. Using Andersen's Behavioral Model of Health Service Use for guidance, the study examined the relationship between predisposing (characteristics of the individual, i.e., age, sex, race/ethnicity), enabling (system or structural factors affecting healthcare access), and need (clinical) factors and the likelihood of hospitalization following ED visits for behavioral health conditions (n = 28,716 ED visits). In the adjusted analysis, a logistic fixed-effects model with blockwise entry was used to estimate the relative importance of predisposing, enabling, and need variables added separately as blocks while controlling for variation in unobserved hospital-specific practices across hospitals and time in years. Significant predisposing factors associated with an increased likelihood of hospitalization following an ED visit included increasing age, while African American race was associated with a lower likelihood of hospitalization. Among enabling factors, arrival by emergency transport and a longer ED LOS were associated with a greater likelihood of hospitalization while being uninsured and the availability of community-based behavioral health services within 5 miles of the ED were associated with lower odds. Among need factors, having a discharge diagnosis of schizophrenia/psychotic spectrum disorder, an affective disorder, a personality disorder, dementia, or an impulse control disorder as well as secondary diagnoses of suicidal ideation and/or suicidal behavior increased the likelihood of hospitalization
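    A minimal sketch of that modeling strategy, on invented data: a logistic model with ED (hospital) dummies as fixed effects, fit three times with the predisposing, enabling, and need blocks entered cumulatively, comparing fit at each step. The variable choices and effect sizes below are assumptions for illustration, not the study's specification.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(4)
    n = 5000
    age = rng.normal(45, 15, n)                         # predisposing
    ed_los = rng.exponential(6, n)                      # enabling
    psychosis = rng.binomial(1, 0.2, n)                 # need
    hospital = rng.integers(0, 16, n)                   # 16 EDs as fixed effects
    logit = -3 + 0.02 * age + 0.08 * ed_los + 1.0 * psychosis
    admit = rng.binomial(1, 1 / (1 + np.exp(-logit)))

    hosp_dummies = np.eye(16)[hospital][:, 1:]          # drop one to avoid collinearity
    blocks = [np.column_stack([age]),                   # block 1: predisposing
              np.column_stack([age, ed_los]),           # block 2: + enabling
              np.column_stack([age, ed_los, psychosis])]# block 3: + need
    for X in blocks:
        exog = sm.add_constant(np.column_stack([X, hosp_dummies]))
        res = sm.Logit(admit, exog).fit(disp=0)
        print(f"{X.shape[1]} predictors: pseudo-R2 = {res.prsquared:.3f}")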

  11. Finding the Right Distribution for Highly Skewed Zero-inflated Clinical Data

    Directory of Open Access Journals (Sweden)

    Resmi Gupta

    2013-03-01

    Full Text Available Discrete, highly skewed distributions with excess numbers of zeros often result in biased estimates and misleading inferences if the zeros are not properly addressed. A clinical example of children with electrophysiologic disorders in which many of the children are treated without surgery is provided. The purpose of the current study was to identify the optimal modeling strategy for highly skewed, zero-inflated data often observed in the clinical setting by: (a) simulating skewed, zero-inflated count data; (b) fitting the simulated data with Poisson, Negative Binomial, Zero-Inflated Poisson (ZIP) and Zero-Inflated Negative Binomial (ZINB) models; and (c) applying the aforementioned models to actual, highly skewed, clinical data of children with an EP disorder. The ZIP model was observed to be the optimal model based on traditional fit statistics as well as estimates of bias, mean-squared error, and coverage.
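    On the software side, this comparison is straightforward to reproduce in outline. The Python sketch below simulates zero-inflated counts and compares intercept-only Poisson and ZIP fits by AIC using statsmodels; the data are synthetic, not the clinical EP dataset.

    import numpy as np
    import statsmodels.api as sm
    from statsmodels.discrete.count_model import ZeroInflatedPoisson

    rng = np.random.default_rng(5)
    n = 2000
    counts = rng.poisson(3.0, n)
    counts[rng.random(n) < 0.4] = 0                   # add structural zeros

    exog = np.ones((n, 1))                            # intercept-only models
    pois = sm.Poisson(counts, exog).fit(disp=0)
    zip_ = ZeroInflatedPoisson(counts, exog, exog_infl=exog,
                               inflation='logit').fit(disp=0)
    # The zero-inflated model should fit markedly better on these data
    print("Poisson AIC:", round(pois.aic, 1), " ZIP AIC:", round(zip_.aic, 1))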

  12. Chemotherapy and novel therapeutics before radical prostatectomy for high-risk clinically localized prostate cancer.

    Science.gov (United States)

    Cha, Eugene K; Eastham, James A

    2015-05-01

    Although both surgery and radiation are potential curative options for men with clinically localized prostate cancer, a significant proportion of men with high-risk and locally advanced disease will demonstrate biochemical and potentially clinical progression of their disease. Neoadjuvant systemic therapy before radical prostatectomy (RP) is a logical strategy to improve treatment outcomes for men with clinically localized high-risk prostate cancer. Furthermore, delivery of chemotherapy and other systemic agents before RP affords an opportunity to explore the efficacy of these agents with pathologic end points. Neoadjuvant chemotherapy, primarily with docetaxel (with or without androgen deprivation therapy), has demonstrated feasibility and safety in men undergoing RP, but no study to date has established the efficacy of neoadjuvant chemotherapy or neoadjuvant chemohormonal therapies. Other novel agents, such as those targeting the vascular endothelial growth factor receptor, epidermal growth factor receptor, platelet-derived growth factor receptor, clusterin, and immunomodulatory therapeutics, are currently under investigation. Copyright © 2015 Elsevier Inc. All rights reserved.

  13. Temporal association of cannabis use with symptoms in individuals at clinical high risk for psychosis.

    Science.gov (United States)

    Corcoran, Cheryl M; Kimhy, David; Stanford, Arielle; Khan, Shamir; Walsh, Julie; Thompson, Judy; Schobel, Scott; Harkavy-Friedman, Jill; Goetz, Ray; Colibazzi, Tiziano; Cressman, Victoria; Malaspina, Dolores

    2008-12-01

    Cannabis use is reported to increase the risk for psychosis, but no prospective study has longitudinally examined drug use and symptoms concurrently in clinical high risk cases. We prospectively followed, for up to 2 years, 32 cases who met research criteria for prodromal psychosis to examine the relationship between substance use and clinical measures. Cases with a baseline history of cannabis use (41%) were older, but did not differ in clinical measures. Longitudinal assessments showed these cases had significantly more perceptual disturbances and worse functioning during epochs of increased cannabis use that were unexplained by concurrent use of other drugs or medications. These data demonstrate that cannabis use may be a risk factor for the exacerbation of subthreshold psychotic symptoms, specifically perceptual disturbances, in high risk cases.

  14. Correlation of findings in clinical and high resolution ultrasonography examinations of the painful shoulder

    Directory of Open Access Journals (Sweden)

    Raphael Micheroli

    2015-03-01

    Full Text Available Objective: High resolution ultrasonography is a non-painful and non-invasive imaging technique which is useful for the assessment of shoulder pain causes, as clinical examination often does not allow an exact diagnosis. The aim of this study was to compare the findings of clinical examination and high resolution ultrasonography in patients presenting with painful shoulder. Methods: Non-interventional observational study of 100 adult patients suffering from unilateral shoulder pain. Exclusion criteria were shoulder fractures, prior shoulder joint surgery and shoulder injections in the past month. The physicians performing the most common clinical shoulder examinations were blinded to the results of the high resolution ultrasonography and vice versa. Results: In order to detect pathology of the m. supraspinatus tendon, the Hawkins and Kennedy impingement test showed the highest sensitivity (0.86), whereas the Jobe supraspinatus test showed the highest specificity (0.55). To identify m. subscapularis tendon pathology, the Gerber lift-off test showed a sensitivity of 1, whereas the belly press test showed the higher specificity (0.72). The infraspinatus test showed a high sensitivity (0.90) and specificity (0.74). All AC joint tests (painful arc II(a), AC joint tenderness(b), cross-body adduction stress test(c)) showed high specificities (a: 0.96, b: 0.99, c: 0.96). Evaluating the long biceps tendon, the palm-up test showed the highest sensitivity (0.47) and the Yergason test the highest specificity (0.88). Conclusion: Knowledge of the sensitivity and specificity of various clinical tests is important for the interpretation of clinical examination test results. High resolution ultrasonography is needed in most cases to establish a clear diagnosis.

  15. Detection of High Frequency Oscillations by Hybrid Depth Electrodes in Standard Clinical Intracranial EEG Recordings

    Directory of Open Access Journals (Sweden)

    Efstathios D Kondylis

    2014-08-01

    Full Text Available High frequency oscillations (HFOs) have been proposed as a novel marker for epileptogenic tissue, spurring tremendous research interest into the characterization of these transient events. A wealth of continuously recorded intracranial electroencephalographic (iEEG) data is currently available from patients undergoing invasive monitoring for the surgical treatment of epilepsy. In contrast to data recorded on research-customized recording systems, data from clinical acquisition systems remain an underutilized resource for HFO detection in most centers. The effective and reliable use of this clinically obtained data would be an important advance in the ongoing study of HFOs and their relationship to ictogenesis. The diagnostic utility of HFOs ultimately will be limited by the ability of clinicians to detect these brief, sporadic, and low amplitude events in an electrically noisy clinical environment. Indeed, one of the most significant factors limiting the use of such clinical recordings for research purposes is their low signal to noise ratio, especially in the higher frequency bands. In order to investigate the presence of HFOs in clinical data, we first obtained continuous intracranial recordings in a typical clinical environment using a commercially available, commonly utilized data acquisition system and off-the-shelf hybrid macro/micro depth electrodes. This data was then inspected for the presence of HFOs using semi-automated methods and expert manual review. With targeted removal of noise frequency content, HFOs were detected on both macro- and micro-contacts, and preferentially localized to seizure onset zones. HFOs detected by the offline, semi-automated method were also validated in the clinical viewer, demonstrating that 1) this clinical system allows for the visualization of HFOs, and 2) with effective signal processing, clinical recordings can yield valuable information for offline analysis.

  16. A combination of clinical balance measures and FRAX® to improve identification of high-risk fallers.

    Science.gov (United States)

    Najafi, David A; Dahlberg, Leif E; Hansson, Eva Ekvall

    2016-05-03

    The FRAX® algorithm quantifies a patient's 10-year probability of a hip or major osteoporotic fracture without taking an individual's balance into account. Balance measures assess the functional ability of an individual, while the FRAX® algorithm is a model that integrates the individual patient's clinical risk factors [not balance] and bone mineral density. Thus, clinical balance measures capture aspects that the FRAX® algorithm does not, and vice versa. It is therefore possible that combining FRAX® and clinical balance measures can improve the identification of patients at high fall risk and thereby high fracture risk. Our study aim was to explore whether there is an association between clinical balance measures and the fracture prediction obtained from FRAX®. A cross-sectional study design was used in which a post hoc analysis was performed on a dataset of 82 participants (54 to 89 years of age, mean age 71.4, 77 female) with a fall-related wrist fracture between 2008 and 2012. Balance was measured by the tandem stance, standing on one leg, walking in a figure of eight, walking heel to toe on a line, walking as fast as possible for 30 m, and five-times sit-to-stand balance measures [tandem stance and standing on one leg measured first with open and then with closed eyes], and each was analyzed for bivariate relations with the 10-year probability values for hip and major osteoporotic fractures as calculated by FRAX®, using Spearman's rank correlation test. Individuals with high FRAX® values had poor outcomes on the balance measures; however, the significance level of the correlation differed between tests. Standing on one leg with eyes closed had the strongest correlation to FRAX® (0.610, p = …), indicating an association between clinical balance measures and FRAX®. Hence, the use of clinical balance measures and FRAX® in combination might improve the identification of individuals with high risk of falls and thereby of subsequent fractures. Results enable healthcare providers to optimize treatment and prevention of fall-related fractures. The study has
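    The bivariate analysis itself is a one-liner with SciPy. In the Python sketch below, both the balance times and the FRAX® probabilities are invented for illustration; under this coding, a negative rho corresponds to poorer balance accompanying a higher fracture probability.

    import numpy as np
    from scipy.stats import spearmanr

    # Hypothetical data: one-leg-stance time with eyes closed (seconds) and
    # FRAX® 10-year major-fracture probability (%)
    stance_s = np.array([2, 5, 8, 12, 15, 20, 25, 30])
    frax_pct = np.array([28, 24, 20, 18, 14, 11, 9, 7])

    rho, p = spearmanr(stance_s, frax_pct)
    print(f"Spearman rho = {rho:.3f}, p = {p:.4f}")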

  17. High Intensity Laser Therapy (HILT) versus TENS and NSAIDs in low back pain: clinical study

    Science.gov (United States)

    Zati, Allesandro; Fortuna, Damiano; Valent, A.; Filippi, M. V.; Bilotta, Teresa W.

    2004-09-01

    Low back pain caused by lumbar disc herniation is prevalently treated with a conservative approach. In this study we evaluated the efficacy of High Intensity Laser Therapy (HILT) compared with accepted therapies such as TENS and NSAIDs. Laser therapy obtained similar results in the short term, but a better clinical effect over time, than TENS and NSAIDs. In conclusion, high intensity laser therapy appears to be an interesting new treatment, worthy of further research.

  18. Clinical profile of high-risk febrile neutropenia in a tertiary care hospital

    Directory of Open Access Journals (Sweden)

    Mohan V Bhojaraja

    2016-06-01

    Full Text Available Background Infection in the immunocompromised host has been a reason of concern in the clinical setting and a topic of debate for decades. In this study, the aim was to analyse the clinical profile of high-risk febrile neutropenic patients. Aims To study the clinical profile of high risk febrile neutropenia patients with the objective of identifying the most common associated malignancy, most common associated pathogen, the source of infection, to correlate the treatment and management with that of the Infectious Diseases Society of America (IDSA 2010 guidelines and to assess the clinical outcome. Methods A cross-sectional time bound study was carried out and a total of 80 episodes of high-risk febrile neutropenia were recorded among patients with malignancies from September 2011 to July 2013 with each episode being taken as a new case. Results Non-Hodgkin’s lymphoma (30 per cent was the most common malignancy associated, commonest source of infection was due to central venous catheters, the commonest pathogens were gram negative (52 per cent the treatment and management of each episode of high risk febrile neutropenia correlated with that of IDSA 2010 guidelines and the mortality rate was 13.75 per cent. Conclusion Febrile neutropenia is one of the major complications and cause of mortality in patients with malignancy and hence understanding its entire spectrum can help us reduce morbidity and mortality.

  19. Clinical use of pulmonary function tests and high-resolution tomography in interstitial lung diseases

    International Nuclear Information System (INIS)

    Garcia C, Clara P; Mejia M, Luis F

    2010-01-01

    Diagnosis of interstitial lung diseases is generally arrived at by clinical history, physical examination, and radiologic images, especially high-resolution CT-scanning. It is important to note that, while these diseases have different clinical and histological characteristics, they share a basic pattern of abnormal lung function. With regard to high-resolution tomography, the characteristics of these diseases are similar, although there are specific differences that can be helpful for correct diagnosis. These diseases have severe consequences on respiratory gas exchange. These alterations, combined with other abnormalities of lung function, cause the signs and symptoms and have an impact on quality of life. The use of physiologic parameters is not only helpful for diagnosis, but can also assess severity, help to define the consequences of treatment, and aid in the follow-up. Although some pulmonary function tests can remain completely normal with severe radiographic findings, 10% of patients have impaired lung function before radiologic changes. High-resolution tomography is an essential imaging tool for the study of these patients. This is true not only for diagnosis, but also with regard to clinical parameters and follow-up. Its prognostic use is continually gaining importance. In this article we assess the clinical use of pulmonary function tests and high-resolution tomography in interstitial lung diseases.

  20. High-Resolution Anoscopy: Clinical Features of Anal Intraepithelial Neoplasia in HIV-positive Men

    NARCIS (Netherlands)

    Richel, Olivier; Hallensleben, Nora D. L.; Kreuter, Alexander; van Noesel, Carel J. M.; Prins, Jan M.; de Vries, Henry J. C.

    2013-01-01

    BACKGROUND: High-resolution anoscopy is increasingly advocated to screen HIV+ men who have sex with men for anal cancer and its precursor lesions, anal intraepithelial neoplasia. A systematic comparison between clinical features and the histopathology of suspect lesions is lacking. OBJECTIVE: This

  1. Evaluation of pulmonary embolism in a pediatric population with high clinical suspicion

    International Nuclear Information System (INIS)

    Victoria, Teresa; Mong, Andrew; Altes, Talissa; Hernandez, Andrea; Gonzalez, Leonardo; Kramer, Sandra S.; Jawad, Abbas F.; Raffini, Leslie

    2009-01-01

    Pulmonary embolism (PE) is an underdiagnosed entity in the pediatric population, in part because of the low level of suspicion and awareness in the clinical world. Our objective was to examine its relative prevalence, associated risk factors and imaging features in our pediatric population. A total of 92 patients aged 21 years and younger with a high clinical suspicion of PE and who had available radiographic studies were identified from January 2003 to September 2006. Patients with a positive CT scan or a high-probability ventilation/perfusion scan formed the case group; patients with a high clinical suspicion of PE and no radiographic evidence of PE or deep venous thrombosis (DVT), randomly matched in age and sex, became the matched control group. We reviewed the charts of both groups and analyzed the imaging studies. In our hospital, the prevalence of PE in patients with a strong suspicion of PE was 14%. The overall prevalence of thromboembolic disease (PE and/or DVT) was 25%. Recent surgery or orthopedic procedure, blood dyscrasias and contraceptive use were more common in patients with PE. No child died of PE in our study. The youngest child with PE in our study was 13 years old. Girls were twice as likely to develop PE as boys. PE is a relatively common diagnosis in our tertiary care pediatric population when the clinical suspicion is high. We suggest increased awareness and a higher index of suspicion in order to initiate prompt diagnostic imaging and treatment. (orig.)

  2. The Strauss and Carpenter Prognostic Scale in subjects clinically at high risk of psychosis

    NARCIS (Netherlands)

    Nieman, D. H.; Velthorst, E.; Becker, H. E.; de Haan, L.; Dingemans, P. M.; Linszen, D. H.; Birchwood, M.; Patterson, P.; Salokangas, R. K. R.; Heinimaa, M.; Heinz, A.; Juckel, G.; von Reventlow, H. G.; Morrison, A.; Schultze-Lutter, F.; Klosterkötter, J.; Ruhrmann, S.; McGorry, Patrick D.; McGlashan, Thomas H.; Knapp, Martin; van de Fliert, Reinaud; Klaassen, Rianne; Picker, Heinz; Neumann, Meike; Brockhaus-Dumke, Anke; Pukrop, Ralf; Svirskis, Tanja; Huttunen, Jukka; Laine, Tiina; Ilonen, Tuula; Ristkari, Terja; Hietala, Jarmo; Skeate, Amanda; Gudlowski, Yehonala; Ozgürdal, Seza; French, Paul; Stevens, Helen

    2013-01-01

    To investigate the predictive value of the Strauss and Carpenter Prognostic Scale (SCPS) for transition to a first psychotic episode in subjects clinically at high risk (CHR) of psychosis. Two hundred and forty-four CHR subjects participating in the European Prediction of Psychosis Study were

  3. Clinical approaches involving thrombopoietin to shorten the period of thrombocytopenia after high-dose chemotherapy

    NARCIS (Netherlands)

    Tijssen, Marloes R.; van der Schoot, C. Ellen; Voermans, Carlijn; Zwaginga, Jaap Jan

    2006-01-01

    High-dose chemotherapy followed by a peripheral blood stem cell transplant is successfully used for a wide variety of malignancies. A major drawback, however, is the delay in platelet recovery. Several clinical strategies using thrombopoietin (Tpo) have been developed in an attempt to speed up platelet recovery.

  4. Clinical Application of Esophageal High-resolution Manometry in the Diagnosis of Esophageal Motility Disorders

    NARCIS (Netherlands)

    van Hoeij, Froukje B.; Bredenoord, Albert J.

    2016-01-01

    Esophageal high-resolution manometry (HRM) is replacing conventional manometry in the clinical evaluation of patients with esophageal symptoms, especially dysphagia. The introduction of HRM gave rise to new objective metrics and recognizable patterns of esophageal motor function, requiring a new

  5. High frequency audiometry in prospective clinical research of ototoxicity due to platinum derivatives

    NARCIS (Netherlands)

    van der Hulst, R. J.; Dreschler, W. A.; Urbanus, N. A.

    1988-01-01

    The results of clinical use of routine high frequency audiometry in monitoring the ototoxic side effects of platinum and its derivatives are described in this prospective study. After demonstrating the reproducibility of the technique, we discuss the first results of an analysis of ototoxic side effects.

  6. Philosophy and phylogenetic inference: a comparison of likelihood and parsimony methods in the context of Karl Popper's writings on corroboration.

    Science.gov (United States)

    de Queiroz, K; Poe, S

    2001-06-01

    Advocates of cladistic parsimony methods have invoked the philosophy of Karl Popper in an attempt to argue for the superiority of those methods over phylogenetic methods based on Ronald Fisher's statistical principle of likelihood. We argue that the concept of likelihood in general, and its application to problems of phylogenetic inference in particular, are highly compatible with Popper's philosophy. Examination of Popper's writings reveals that his concept of corroboration is, in fact, based on likelihood. Moreover, because probabilistic assumptions are necessary for calculating the probabilities that define Popper's corroboration, likelihood methods of phylogenetic inference--with their explicit probabilistic basis--are easily reconciled with his concept. In contrast, cladistic parsimony methods, at least as described by certain advocates of those methods, are less easily reconciled with Popper's concept of corroboration. If those methods are interpreted as lacking probabilistic assumptions, then they are incompatible with corroboration. Conversely, if parsimony methods are to be considered compatible with corroboration, then they must be interpreted as carrying implicit probabilistic assumptions. Thus, the non-probabilistic interpretation of cladistic parsimony favored by some advocates of those methods is contradicted by an attempt by the same authors to justify parsimony methods in terms of Popper's concept of corroboration. In addition to being compatible with Popperian corroboration, the likelihood approach to phylogenetic inference permits researchers to test the assumptions of their analytical methods (models) in a way that is consistent with Popper's ideas about the provisional nature of background knowledge.

  7. Likelihood Approximation With Parallel Hierarchical Matrices For Large Spatial Datasets

    KAUST Repository

    Litvinenko, Alexander

    2017-11-01

    The main goal of this article is to introduce the parallel hierarchical matrix library HLIBpro to the statistical community. We describe the HLIBCov package, which is an extension of the HLIBpro library for approximating large covariance matrices and maximizing likelihood functions. We show that an approximate Cholesky factorization of a dense matrix of size $2M \times 2M$ can be computed on a modern multi-core desktop in a few minutes. Further, HLIBCov is used for estimating the unknown parameters such as the covariance length, variance and smoothness parameter of a Matérn covariance function by maximizing the joint Gaussian log-likelihood function. The computational bottleneck here is the expensive linear algebra arising from large and dense covariance matrices. Therefore covariance matrices are approximated in the hierarchical ($\mathcal{H}$-) matrix format with computational cost $\mathcal{O}(k^2 n \log^2 n/p)$ and storage $\mathcal{O}(kn \log n)$, where the rank $k$ is a small integer (typically $k<25$), $p$ is the number of cores and $n$ is the number of locations on a fairly general mesh. We demonstrate a synthetic example in which the true parameter values are known. For reproducibility we provide the C++ code, the documentation, and the synthetic data.
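    As a small-scale, dense-matrix analogue of that workflow (in Python rather than the HLIBpro C++ code), the sketch below profiles the Gaussian log-likelihood of a Matérn-3/2 field over the covariance length using an O(n³) Cholesky; HLIBCov's contribution is precisely to replace this dense factorization with hierarchical-matrix arithmetic so that much larger n become feasible.

    import numpy as np
    from scipy.optimize import minimize_scalar
    from scipy.spatial.distance import cdist

    rng = np.random.default_rng(6)
    n = 400
    pts = rng.random((n, 2))
    r = cdist(pts, pts)

    def matern32(r, ell, sigma2=1.0):
        # Matérn covariance with smoothness 3/2 (closed form, no Bessel call)
        a = np.sqrt(3.0) * r / ell
        return sigma2 * (1.0 + a) * np.exp(-a)

    C_true = matern32(r, 0.3) + 1e-8 * np.eye(n)
    z = np.linalg.cholesky(C_true) @ rng.standard_normal(n)   # synthetic field

    def neg_loglik(ell):
        C = matern32(r, ell) + 1e-8 * np.eye(n)
        L = np.linalg.cholesky(C)
        alpha = np.linalg.solve(L, z)
        # -log L = sum(log diag L) + 0.5 * z' C^{-1} z (up to a constant)
        return np.log(np.diag(L)).sum() + 0.5 * alpha @ alpha

    est = minimize_scalar(neg_loglik, bounds=(0.05, 1.0), method="bounded")
    print("ML covariance length:", round(est.x, 3), "(true 0.3)")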

  9. Superfast maximum-likelihood reconstruction for quantum tomography

    Science.gov (United States)

    Shang, Jiangwei; Zhang, Zhengyun; Ng, Hui Khoon

    2017-06-01

    Conventional methods for computing maximum-likelihood estimators (MLE) often converge slowly in practical situations, leading to a search for simplifying methods that rely on additional assumptions for their validity. In this work, we provide a fast and reliable algorithm for maximum-likelihood reconstruction that avoids this slow convergence. Our method utilizes the state-of-the-art convex optimization scheme, an accelerated projected-gradient method, that allows one to accommodate the quantum nature of the problem in a different way than in the standard methods. We demonstrate the power of our approach by comparing its performance with other algorithms for n-qubit state tomography. In particular, an eight-qubit situation that purportedly took weeks of computation time in 2005 can now be completed in under a minute for a single set of data, with far higher accuracy than previously possible. This refutes the common claim that MLE reconstruction is slow and reduces the need for alternative methods that often come with difficult-to-verify assumptions. In fact, recent methods assuming Gaussian statistics or relying on compressed sensing ideas are demonstrably inapplicable for the situation under consideration here. Our algorithm can be applied to general optimization problems over the quantum state space; the philosophy of projected gradients can further be utilized for optimization contexts with general constraints.
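    The projected-gradient idea transfers to a classical toy problem that fits in a few lines. The Python sketch below runs plain (non-accelerated) projected gradient ascent on a multinomial log-likelihood, using the standard sort-based Euclidean projection onto the probability simplex; the quantum version replaces this with projection onto the set of density matrices and adds acceleration.

    import numpy as np

    def project_simplex(v):
        # Euclidean projection onto the probability simplex (sort-based algorithm)
        u = np.sort(v)[::-1]
        css = np.cumsum(u)
        rho = np.nonzero(u - (css - 1.0) / (np.arange(v.size) + 1) > 0)[0][-1]
        tau = (css[rho] - 1.0) / (rho + 1)
        return np.maximum(v - tau, 0.0)

    rng = np.random.default_rng(7)
    p_true = rng.dirichlet(np.ones(16))          # "true state" (classical analogue)
    counts = rng.multinomial(10_000, p_true)     # measurement outcomes

    p = np.full(16, 1.0 / 16)                    # start from the maximally mixed point
    for _ in range(500):
        grad = counts / np.maximum(p, 1e-12)     # gradient of the multinomial log-likelihood
        p = project_simplex(p + 1e-5 * grad)
    print("total-variation error:", 0.5 * np.abs(p - p_true).sum())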

  10. Likelihood inference for a fractionally cointegrated vector autoregressive model

    DEFF Research Database (Denmark)

    Johansen, Søren; Ørregård Nielsen, Morten

    2012-01-01

    such that the process X_{t} is fractional of order d and cofractional of order d-b; that is, there exist vectors β for which β'X_{t} is fractional of order d-b, and no other fractionality order is possible. We define the statistical model by 0 < b ≤ d, but conduct inference when the true values satisfy b0 ≥ 1/2 and d0-b0......We consider model-based inference in a fractionally cointegrated (or cofractional) vector autoregressive model with a restricted constant term, ρ, based on the Gaussian likelihood conditional on initial values. The model nests the I(d) VAR model. We give conditions on the parameters...... process in the parameters when errors are i.i.d. with suitable moment conditions and initial values are bounded. When the limit is deterministic this implies uniform convergence in probability of the conditional likelihood function. If the true value b0>1/2, we prove that the limit distribution of (β...

  11. Simulation-based marginal likelihood for cluster strong lensing cosmology

    Science.gov (United States)

    Killedar, M.; Borgani, S.; Fabjan, D.; Dolag, K.; Granato, G.; Meneghetti, M.; Planelles, S.; Ragone-Figueroa, C.

    2018-01-01

    Comparisons between observed and predicted strong lensing properties of galaxy clusters have been routinely used to claim either tension or consistency with Λ cold dark matter cosmology. However, standard approaches to such cosmological tests are unable to quantify the preference for one cosmology over another. We advocate approximating the relevant Bayes factor using a marginal likelihood that is based on the following summary statistic: the posterior probability distribution function for the parameters of the scaling relation between Einstein radii and cluster mass, α and β. We demonstrate, for the first time, a method of estimating the marginal likelihood using the X-ray selected z > 0.5 Massive Cluster Survey clusters as a case in point and employing both N-body and hydrodynamic simulations of clusters. We investigate the uncertainty in this estimate, and the consequent ability to compare competing cosmologies, which arises from incomplete descriptions of baryonic processes, discrepancies in cluster selection criteria, redshift distribution and dynamical state. The relation between triaxial cluster masses at various overdensities provides a promising alternative to the strong lensing test.

  12. Risk factors and likelihood of Campylobacter colonization in broiler flocks

    Directory of Open Access Journals (Sweden)

    SL Kuana

    2007-09-01

    Full Text Available Campylobacter was investigated in cecal droppings, feces, and cloacal swabs of 22 flocks of 3 to 5 week-old broilers. Risk factors and the likelihood of the presence of this agent in these flocks were determined. Management practices, such as cleaning and disinfection, feeding, drinkers, and litter treatments, were assessed. Results were evaluated using the Odds Ratio (OR) test, and their significance was tested by Fisher's test (p<0.05). A Campylobacter prevalence of 81.8% was found in the broiler flocks (18/22), and within positive flocks it varied between 85 and 100%. Campylobacter incidence among sample types was homogeneous, being 81.8% in cecal droppings, 80.9% in feces, and 80.4% in cloacal swabs (230). Flocks fed by automatic feeding systems presented a higher incidence of Campylobacter as compared to those fed by tube feeders. Litter was reused on 63.6% of the farms, and, despite the lack of statistical significance, there was a higher likelihood of Campylobacter incidence when litter was reused. Foot baths were not used in 45.5% of the flocks, and the use of foot baths associated with deficient lime management increased the number of positive flocks, although without statistical significance. The evaluated parameters were not significantly associated with Campylobacter colonization in the assessed broiler flocks.
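    The statistical evaluation described above, an odds ratio with Fisher's exact test, is reproduced in the Python sketch below on an invented 2x2 table (positive/negative flocks by feeder type); the counts are illustrative, not the study's.

    import numpy as np
    from scipy.stats import fisher_exact

    #                  positive  negative
    table = np.array([[12, 2],    # automatic feeding system
                      [ 6, 2]])   # tube feeders
    odds_ratio, p = fisher_exact(table)
    print(f"OR = {odds_ratio:.2f}, p = {p:.3f}")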

  13. Menyoal Elaboration Likelihood Model (ELM) dan Teori Retorika

    Directory of Open Access Journals (Sweden)

    Yudi Perbawaningsih

    2012-06-01

    Full Text Available Abstract: Persuasion is a communication process to establish or change attitudes, which can be understood through the theory of Rhetoric and the theory of the Elaboration Likelihood Model (ELM). This study elaborates these theories in a Public Lecture series held to persuade students in choosing their concentration of study. The result shows that, in terms of persuasion effectiveness, it is not quite relevant to separate the message and its source. The quality of the source is determined by the quality of the message, and vice versa. Separating the two routes of the persuasion process as described in ELM theory would not be relevant. Abstrak (translated): Persuasion is a communication process to form or change attitudes, which can be understood through the theory of Rhetoric and the Elaboration Likelihood Model (ELM). This study elaborates these theories in a Public Lecture used as a means of persuading students to choose a concentration of study, based on information processing. Using a survey method, we found that it is not quite relevant to separate the message from its source when judging the effectiveness of persuasion. The two are unified, meaning that the quality of the source is determined by the quality of the message it conveys, and vice versa. Separating the persuasion process into two routes, as described in ELM theory, becomes irrelevant.

  14. Corporate brand extensions based on the purchase likelihood: governance implications

    Directory of Open Access Journals (Sweden)

    Spyridon Goumas

    2018-03-01

    Full Text Available This paper examines the purchase likelihood of hypothetical service brand extensions from product companies in consumer electronics, based on sector categorization and perceptions of fit between the existing product category and the image of the company. Prior research has recognized that levels of brand knowledge ease the transference of associations and affect to the new products. Similarity to the existing products of the parent company and perceived image also influence the success of brand extensions. However, sector categorization may interfere with this relationship. The purpose of this study is to examine Greek consumers' attitudes towards hypothetical brand extensions, and how these are affected by consumers' existing knowledge about the brand, sector categorization and perceptions of image and category fit of cross-sector extensions. This aim is examined in the context of technological categories, where less-known companies showed significant purchase likelihood and, contrary to the existing literature, service companies did not perform as positively as expected. Additional insights to the existing literature about sector categorization are provided. The effect of both image and category fit is also examined and predictions regarding the effect of each are made.

  15. Gauging the likelihood of stable cavitation from ultrasound contrast agents.

    Science.gov (United States)

    Bader, Kenneth B; Holland, Christy K

    2013-01-07

    The mechanical index (MI) was formulated to gauge the likelihood of adverse bioeffects from inertial cavitation. However, the MI formulation did not consider bubble activity from stable cavitation. This type of bubble activity can be readily nucleated from ultrasound contrast agents (UCAs) and has the potential to promote beneficial bioeffects. Here, the presence of stable cavitation is determined numerically by tracking the onset of subharmonic oscillations within a population of bubbles for frequencies up to 7 MHz and peak rarefactional pressures up to 3 MPa. In addition, the acoustic pressure rupture threshold of a UCA population was determined using the Marmottant model. The threshold for subharmonic emissions of optimally sized bubbles was found to be lower than the inertial cavitation threshold for all frequencies studied. The rupture thresholds of optimally sized UCAs were found to be lower than the threshold for subharmonic emissions for either single-cycle or steady-state acoustic excitations. Because the thresholds of both subharmonic emissions and UCA rupture are linearly dependent on frequency, an index of the form I_CAV = P_r/f (where P_r is the peak rarefactional pressure in MPa and f is the frequency in MHz) was derived to gauge the likelihood of subharmonic emissions due to stable cavitation activity nucleated from UCAs.
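
    As a quick illustration of the arithmetic behind these two indices, the sketch below (plain Python, with hypothetical input values) contrasts the mechanical index, MI = P_r/sqrt(f), with the proposed cavitation index, I_CAV = P_r/f:

```python
import math

def mechanical_index(p_r_mpa, f_mhz):
    """MI = peak rarefactional pressure (MPa) / sqrt(frequency in MHz)."""
    return p_r_mpa / math.sqrt(f_mhz)

def cavitation_index(p_r_mpa, f_mhz):
    """I_CAV = P_r / f, reflecting the linear frequency dependence of the
    subharmonic-emission and UCA-rupture thresholds described above."""
    return p_r_mpa / f_mhz

# Hypothetical example: a 1 MPa rarefactional pulse at 2 MHz
print(mechanical_index(1.0, 2.0))  # ~0.71
print(cavitation_index(1.0, 2.0))  # 0.5
```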

  16. Safe semi-supervised learning based on weighted likelihood.

    Science.gov (United States)

    Kawakita, Masanori; Takeuchi, Jun'ichi

    2014-05-01

    We are interested in developing a safe semi-supervised learning method that works in any situation. Semi-supervised learning postulates that n′ unlabeled data are available in addition to n labeled data. However, almost all of the previous semi-supervised methods require additional assumptions (not only unlabeled data) to make improvements on supervised learning. If such assumptions are not met, then the methods possibly perform worse than supervised learning. Sokolovska, Cappé, and Yvon (2008) proposed a semi-supervised method based on a weighted likelihood approach. They proved that this method asymptotically never performs worse than supervised learning (i.e., it is safe) without any assumption. Their method is attractive because it is easy to implement and is potentially general. Moreover, it is deeply related to a certain statistical paradox. However, the method of Sokolovska et al. (2008) assumes a very limited situation, i.e., classification, discrete covariates, n′→∞, and a maximum likelihood estimator. In this paper, we extend their method by modifying the weight. We prove that our proposal is safe in a significantly wide range of situations as long as n≤n′. Further, we give a geometrical interpretation of the proof of safety through the relationship with the above-mentioned statistical paradox. Finally, we show that the above proposal is asymptotically safe even when n′

  17. Applying the elaboration likelihood model of persuasion to a videotape-based eating disorders primary prevention program for adolescent girls.

    Science.gov (United States)

    Withers, Giselle F; Wertheim, Eleanor H

    2004-01-01

    This study applied principles from the Elaboration Likelihood Model of Persuasion to the prevention of disordered eating. Early adolescent girls watched either a preventive videotape only (n=114) or the video plus a post-video activity (verbal discussion, written exercises, or control discussion) (n=187), or had no intervention (n=104). Significantly more body-image and knowledge improvements occurred at post-video and follow-up in the intervention groups compared to no intervention. There were no outcome differences among intervention groups, or between girls with high or low elaboration likelihood. Further research is needed on integrating the videotape into a broader prevention package.

  18. The likelihood of Latino women to seek help in response to interpersonal victimization: An examination of individual, interpersonal and sociocultural influences

    Directory of Open Access Journals (Sweden)

    Chiara Sabina

    2014-07-01

    Full Text Available Help-seeking is a process that is influenced by individual, interpersonal, and sociocultural factors. The current study examined these influences on the likelihood of seeking help (police, pressing charges, medical services, social services, and informal help) for interpersonal violence among a national sample of Latino women. Women living in high-density Latino neighborhoods in the USA were interviewed by phone in their preferred language. Women reported being, on average, between "somewhat likely" and "very likely" to seek help should they experience interpersonal victimization. Sequential linear regression results indicated that individual (age, depression), interpersonal (having children, past victimization), and sociocultural factors (immigrant status, acculturation) were associated with the self-reported likelihood of seeking help for interpersonal violence. Having children was consistently related to a greater likelihood to seek all forms of help. Overall, women appear to respond to violence in ways that reflect their ecological context. Help-seeking is best understood within a multi-layered and dynamic context.

  19. Multidetector computed tomographic pulmonary angiography in patients with a high clinical probability of pulmonary embolism.

    Science.gov (United States)

    Moores, L; Kline, J; Portillo, A K; Resano, S; Vicente, A; Arrieta, P; Corres, J; Tapson, V; Yusen, R D; Jiménez, D

    2016-01-01

    ESSENTIALS: When the clinical probability of pulmonary embolism (PE) is high, the sensitivity of computed tomography (CT) is unclear. We investigated the sensitivity of multidetector CT among 134 patients with a high probability of PE. A normal CT alone may not safely exclude PE in patients with a high clinical pretest probability. In patients with no clear alternative diagnosis after CTPA, further testing should be strongly considered. Whether patients with a negative multidetector computed tomographic pulmonary angiography (CTPA) result and a high clinical pretest probability of pulmonary embolism (PE) should be further investigated is controversial. This was a prospective investigation of the sensitivity of multidetector CTPA among patients with an a priori clinical assessment of a high probability of PE according to the Wells criteria. Among patients with a negative CTPA result, the diagnosis of PE required at least one of the following conditions: a ventilation/perfusion lung scan showing a high probability of PE in a patient with no history of PE, abnormal findings on venous ultrasonography in a patient without previous deep vein thrombosis at that site, or the occurrence of venous thromboembolism (VTE) in a 3-month follow-up period after anticoagulation was withheld because of a negative multidetector CTPA result. We identified 498 patients with an a priori clinical assessment of a high probability of PE and a completed CTPA study. CTPA excluded PE in 134 patients; in these patients, the pooled incidence of VTE was 5.2% (seven of 134 patients; 95% confidence interval [CI] 1.5-9.0). Five patients had VTEs that were confirmed by an additional imaging test despite a negative CTPA result (five of 48 patients; 10.4%; 95% CI 1.8-19.1), and two patients had objectively confirmed VTEs that occurred during clinical follow-up of at least 3 months (two of 86 patients; 2.3%; 95% CI 0-5.5). None of the patients had a fatal PE during follow-up. A normal multidetector CTPA result alone may not safely exclude PE in patients with a high clinical pretest probability.

  20. Specialized surveillance for individuals at high risk for melanoma: a cost analysis of a high-risk clinic.

    Science.gov (United States)

    Watts, Caroline G; Cust, Anne E; Menzies, Scott W; Coates, Elliot; Mann, Graham J; Morton, Rachael L

    2015-02-01

    Regular surveillance of individuals at high risk for cutaneous melanoma improves early detection and reduces unnecessary excisions; however, a cost analysis of this specialized service has not been undertaken. To determine the mean cost per patient of surveillance in a high-risk clinic from the health service and societal perspectives. We used a bottom-up microcosting method to measure resource use in a consecutive sample of 102 patients treated in a high-risk hospital-based clinic in Australia during a 12-month period. Surveillance and treatment of melanoma. All surveillance and treatment procedures were identified through direct observation, review of medical records, and interviews with staff, and were valued using scheduled fees from the Australian government. Societal costs included transportation and loss of productivity. The mean number of clinic visits per year was 2.7 (95% CI, 2.5-2.8) for surveillance and 3.8 (95% CI, 3.4-4.1) for patients requiring surgical excisions. The mean annual cost per patient to the health system was A $882 (95% CI, A $783-$982) (US $599 [95% CI, US $532-$665]); the cost discounted across 20 years was A $11,546 (95% CI, A $10,263-$12,829) (US $7839 [95% CI, US $6969-$8710]). The mean annual societal cost per patient (excluding health system costs) was A $972 (95% CI, A $899-$1045) (US $660 [95% CI, US $611-$710]); the cost discounted across 20 years was A $12,721 (95% CI, A $12,554-$14,463) (US $8637 [95% CI, US $8523-$9820]). Diagnosis of melanoma or nonmelanoma skin cancer and frequent excisions for benign lesions in a relatively small number of patients were responsible for positively skewed health system costs. Microcosting techniques provide an accurate cost estimate for the provision of a specialized service. The high societal cost reflects the time that patients are willing to invest to attend the high-risk clinic. This alternative model of care for a high-risk population has relevance for decision making about health policy.

  1. Influencing Attitudes Regarding Special Class Placement Using a Psychoeducational Report: An Investigation of the Elaboration Likelihood Model.

    Science.gov (United States)

    Andrews, Lester W.; Gutkin, Terry B.

    1994-01-01

    Investigates variables drawn from the Elaboration Likelihood Model (ELM) that might be manipulated to enhance the persuasiveness of a psychoeducational report. Results showed teachers in training were more persuaded by reports with high message quality. Findings are discussed in terms of the ELM and professional school psychology practice. (RJM)

  2. Failed refutations: further comments on parsimony and likelihood methods and their relationship to Popper's degree of corroboration.

    Science.gov (United States)

    de Queiroz, Kevin; Poe, Steven

    2003-06-01

    Kluge's (2001, Syst. Biol. 50:322-330) continued arguments that phylogenetic methods based on the statistical principle of likelihood are incompatible with the philosophy of science described by Karl Popper are based on false premises related to Kluge's misrepresentations of Popper's philosophy. Contrary to Kluge's conjectures, likelihood methods are not inherently verificationist; they do not treat every instance of a hypothesis as confirmation of that hypothesis. The historical nature of phylogeny does not preclude phylogenetic hypotheses from being evaluated using the probability of evidence. The low absolute probabilities of hypotheses are irrelevant to the correct interpretation of Popper's concept termed degree of corroboration, which is defined entirely in terms of relative probabilities. Popper did not advocate minimizing background knowledge; in any case, the background knowledge of both parsimony and likelihood methods consists of the general assumption of descent with modification and additional assumptions that are deterministic, concerning which tree is considered most highly corroborated. Although parsimony methods do not assume (in the sense of entailing) that homoplasy is rare, they do assume (in the sense of requiring to obtain a correct phylogenetic inference) certain things about patterns of homoplasy. Both parsimony and likelihood methods assume (in the sense of implying by the manner in which they operate) various things about evolutionary processes, although violation of those assumptions does not always cause the methods to yield incorrect phylogenetic inferences. Test severity is increased by sampling additional relevant characters rather than by character reanalysis, although either interpretation is compatible with the use of phylogenetic likelihood methods. Neither parsimony nor likelihood methods assess test severity (critical evidence) when used to identify a most highly corroborated tree(s) based on a single method or model and a

  3. FPGA Acceleration of the phylogenetic likelihood function for Bayesian MCMC inference methods

    Directory of Open Access Journals (Sweden)

    Bakos Jason D

    2010-04-01

    Full Text Available Abstract Background Maximum likelihood (ML)-based phylogenetic inference has become a popular method for estimating the evolutionary relationships among species based on genomic sequence data. This method is used in applications such as RAxML, GARLI, MrBayes, PAML, and PAUP. The Phylogenetic Likelihood Function (PLF) is an important kernel computation for this method. The PLF consists of a loop with no conditional behavior or dependencies between iterations. As such, it contains a high potential for exploiting parallelism using micro-architectural techniques. In this paper, we describe a technique for mapping the PLF and supporting logic onto a Field Programmable Gate Array (FPGA)-based co-processor. By leveraging the FPGA's on-chip DSP modules and the high-bandwidth local memory attached to the FPGA, the resultant co-processor can accelerate ML-based methods and outperform state-of-the-art multi-core processors. Results We use the MrBayes 3 tool as a framework for designing our co-processor. For large datasets, we estimate that our accelerated MrBayes, if run on a current-generation FPGA, achieves a 10× speedup relative to software running on a state-of-the-art server-class microprocessor. The FPGA-based implementation achieves its performance by deeply pipelining the likelihood computations, performing multiple floating-point operations in parallel, and through a natural log approximation that is chosen specifically to leverage a deeply pipelined custom architecture. Conclusions Heterogeneous computing, which combines general-purpose processors with special-purpose co-processors such as FPGAs and GPUs, is a promising approach for high-performance phylogeny inference, as shown by the growing body of literature in this field. FPGAs in particular are well-suited for this task because of their low power consumption as compared to many-core processors and Graphics Processor Units (GPUs).
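
    For readers unfamiliar with the kernel being accelerated, the following minimal sketch (plain Python/NumPy, not the MrBayes or FPGA implementation) shows one PLF step of Felsenstein's pruning algorithm; note that the inner computation is a branch-free loop over sites, which is what makes it attractive for deep pipelining:

```python
import numpy as np

def plf_node(cl_left, cl_right, P_left, P_right):
    """One phylogenetic-likelihood-function (PLF) step: conditional likelihoods
    at a parent node from its two children (Felsenstein pruning).

    cl_left, cl_right: (num_sites, 4) conditional likelihoods for A, C, G, T
    P_left, P_right:   (4, 4) substitution-probability matrices per branch
    """
    # For each site s and parent state i:
    #   CL_parent[s, i] = (sum_j P_left[i, j]  * cl_left[s, j])
    #                   * (sum_j P_right[i, j] * cl_right[s, j])
    return (cl_left @ P_left.T) * (cl_right @ P_right.T)

# Toy example: 3 sites, a Jukes-Cantor-like transition matrix (rows sum to 1)
P = np.full((4, 4), 0.05) + np.eye(4) * 0.80
left = np.eye(4)[[0, 1, 2]]   # observed tip states: A, C, G
right = np.eye(4)[[0, 3, 2]]  # observed tip states: A, T, G
print(plf_node(left, right, P, P))
```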

  4. Microarray background correction: maximum likelihood estimation for the normal-exponential convolution

    DEFF Research Database (Denmark)

    Silver, Jeremy D; Ritchie, Matthew E; Smyth, Gordon K

    2009-01-01

    exponentially distributed, representing background noise and signal, respectively. Using a saddle-point approximation, Ritchie and others (2007) found normexp to be the best background correction method for 2-color microarray data. This article develops the normexp method further by improving the estimation … is developed for exact maximum likelihood estimation (MLE) using high-quality optimization software and using the saddle-point estimates as starting values. "MLE" is shown to outperform heuristic estimators proposed by other authors, both in terms of estimation accuracy and in terms of performance on real data…
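
    The normal-exponential convolution density has a closed form, so exact MLE can be set up directly. The sketch below (assuming the standard normal-plus-exponential parameterization; this is not the authors' code, and the crude moment-based starting values stand in for their saddle-point estimates) maximizes the log-likelihood with a general-purpose optimizer:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def normexp_loglik(params, x):
    """Log-likelihood of x = background + signal, where
    background ~ N(mu, sigma^2) and signal ~ Exp(mean alpha)."""
    mu, log_sigma, log_alpha = params
    sigma, alpha = np.exp(log_sigma), np.exp(log_alpha)
    z = x - mu
    # Closed-form density of the convolution, evaluated on the log scale
    ll = (-np.log(alpha)
          - z / alpha
          + sigma**2 / (2 * alpha**2)
          + norm.logcdf(z / sigma - sigma / alpha))
    return ll.sum()

# Simulated intensities: background N(100, 15^2) plus signal Exp(mean 200)
rng = np.random.default_rng(0)
x = rng.normal(100, 15, 5000) + rng.exponential(200, 5000)

start = [x.min(), np.log(0.1 * x.std()), np.log(x.mean() - x.min())]
res = minimize(lambda p: -normexp_loglik(p, x), start, method="Nelder-Mead")
mu, sigma, alpha = res.x[0], np.exp(res.x[1]), np.exp(res.x[2])
print(mu, sigma, alpha)  # should land near 100, 15, 200
```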

  5. A likelihood ratio test for species membership based on DNA sequence data

    DEFF Research Database (Denmark)

    Matz, Mikhail V.; Nielsen, Rasmus

    2005-01-01

    DNA barcoding as an approach for species identification is rapidly increasing in popularity. However, it remains unclear which statistical procedures should accompany the technique to provide a measure of uncertainty. Here we describe a likelihood ratio test which can be used to test if a sampled sequence is a member of an a priori specified species. We investigate the performance of the test using coalescence simulations, as well as using real data from butterflies and frogs representing two kinds of challenge for DNA barcoding: extremely low and extremely high levels of sequence variability.

  6. Inference for the Sharpe Ratio Using a Likelihood-Based Approach

    Directory of Open Access Journals (Sweden)

    Ying Liu

    2012-01-01

    Full Text Available The Sharpe ratio is the prominent risk-adjusted performance measure used by practitioners. Statistical testing of this ratio using its asymptotic distribution has lagged behind its use. In this paper, highly accurate likelihood analysis is applied for inference on the Sharpe ratio. Both the one- and two-sample problems are considered. The methodology has O(n^(-3/2)) distributional accuracy and can be implemented using any parametric return distribution structure. Simulations are provided to demonstrate the method's superior accuracy over existing methods used for testing in the literature.
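
    As a baseline for comparison, the first-order asymptotic inference that the likelihood analysis improves upon can be sketched as follows. The standard-error formula assumes iid, approximately normal returns (Lo, 2002); the monthly return series is simulated, not real data:

```python
import numpy as np

def sharpe_with_se(returns, rf=0.0):
    """Sample Sharpe ratio and its large-sample standard error.

    Under iid returns, var(SR_hat) is approximately (1 + SR^2/2)/n; the
    likelihood-based analysis refines this first-order approximation."""
    ex = np.asarray(returns) - rf
    n = ex.size
    sr = ex.mean() / ex.std(ddof=1)
    se = np.sqrt((1 + 0.5 * sr**2) / n)
    return sr, se

rng = np.random.default_rng(1)
monthly = rng.normal(0.01, 0.04, 120)        # 10 years of simulated monthly returns
sr, se = sharpe_with_se(monthly)
print(f"SR = {sr:.3f} +/- {1.96 * se:.3f}")  # approximate 95% interval
```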

  7. Characterization of HIV Recent Infection Among High-Risk Men at Public STI Clinics in Mumbai.

    Science.gov (United States)

    Truong, Hong-Ha M; Fatch, Robin; Grant, Robert M; Mathur, Meenakshi; Kumta, Sameer; Jerajani, Hemangi; Kellogg, Timothy A; Lindan, Christina P

    2018-02-16

    We examined associations with HIV recent infection and estimated transmitted drug resistance (TDR) prevalence among 3345 men at sexually transmitted infection clinics in Mumbai (2002-2005). HIV seroincidence was 7.92% by the BED-CEIA and was higher at a clinic located near brothels (12.39%) than at a hospital-based clinic (3.94%). HIV recent infection was associated with a lifetime history of female sex worker (FSW) partners, HSV-2, genital warts, and gonorrhea. TDR prevalence among recent infection cases was 5.7%. HIV testing services near sex venues may enhance case detection among high-risk men who represent a bridging population between FSWs and the men's other sexual partners.

  8. Endocrine therapy for breast cancer prevention in high-risk women: clinical and economic considerations.

    Science.gov (United States)

    Groom, Amy G; Younis, Tallal

    2016-01-01

    The global burden of breast cancer highlights the need for primary prevention strategies that demonstrate both a favorable clinical benefit/risk profile and good value for money. Endocrine therapy with selective estrogen-receptor modulators (SERMs) or aromatase inhibitors (AIs) has been associated with a favorable clinical benefit/risk profile in the prevention of breast cancer in women at high risk of developing the disease. The available endocrine therapy strategies differ in terms of their relative reductions of breast cancer risk, potential side effects, and upfront drug acquisition costs, among others. This review highlights the clinical trials of SERMs and AIs for the primary prevention of breast cancer, and the cost-effectiveness/cost-utility studies that have examined their "value for money" in various health care jurisdictions.

  9. The asymptotic behaviour of the maximum likelihood function of Kriging approximations using the Gaussian correlation function

    CSIR Research Space (South Africa)

    Kok, S

    2012-07-01

    Full Text Available continuously as the correlation function hyper-parameters approach zero. Since the global minimizer of the maximum likelihood function is an asymptote in this case, it is unclear if maximum likelihood estimation (MLE) remains valid. Numerical ill...
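
    A minimal sketch of the quantity under study: the concentrated (profile) log-likelihood of an ordinary-Kriging-style model with the Gaussian correlation function, evaluated for hyper-parameter values approaching zero, where the correlation matrix degenerates toward the singular all-ones matrix. This is a generic illustration on made-up data, not the paper's analysis:

```python
import numpy as np

def concentrated_loglik(theta, X, y):
    """Profile log-likelihood of a constant-mean Kriging model with
    Gaussian correlation R_ij = exp(-theta * ||x_i - x_j||^2)."""
    n = len(y)
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    R = np.exp(-theta * d2) + 1e-8 * np.eye(n)   # jitter for conditioning
    L = np.linalg.cholesky(R)
    solve = lambda b: np.linalg.solve(L.T, np.linalg.solve(L, b))
    ones = np.ones(n)
    beta = ones @ solve(y) / (ones @ solve(ones))   # GLS mean
    r = y - beta
    sigma2 = r @ solve(r) / n                       # profiled variance
    logdetR = 2 * np.log(np.diag(L)).sum()
    return -0.5 * (n * np.log(sigma2) + logdetR)

rng = np.random.default_rng(2)
X = rng.uniform(size=(30, 1))
y = np.sin(6 * X[:, 0]) + 0.05 * rng.normal(size=30)
for theta in [1e-6, 1e-3, 1.0, 10.0, 100.0]:        # toward zero: R -> all-ones
    print(theta, concentrated_loglik(theta, X, y))
```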

  10. High purity of human oligodendrocyte progenitor cells obtained from neural stem cells: suitable for clinical application.

    Science.gov (United States)

    Wang, Caiying; Luan, Zuo; Yang, Yinxiang; Wang, Zhaoyan; Wang, Qian; Lu, Yabin; Du, Qingan

    2015-01-30

    Recent studies have suggested that the transplantation of oligodendrocyte progenitor cells (OPCs) may be a promising potential therapeutic strategy for a broad range of diseases affecting myelin, such as multiple sclerosis, periventricular leukomalacia, and spinal cord injury. Clinical interest arose from the potential of human stem cells to be directed to OPCs for the clinical application of treating these diseases, since large quantities of high-quality OPCs are needed. However, to date, there have been precious few studies about OPC induction from human neural stem cells (NSCs). Here we successfully directed human fetal NSCs into highly pure OPCs using a cocktail of basic fibroblast growth factor, platelet-derived growth factor, and neurotrophic factor-3. These cells had the typical morphology of OPCs, and 80-90% of them expressed specific OPC markers such as A2B5, O4, Sox10 and PDGF-αR. When exposed to differentiation medium, 90% of the cells differentiated into oligodendrocytes. The OPCs could be amplified in our culture medium and passaged at least 10 times. Compared to a recently published method, this protocol had much higher stability and repeatability, and OPCs could be obtained from NSCs from passage 5 to 38. It also obtained more highly pure OPCs (80-90%) via simpler and more convenient manipulation. This study provided an easy and efficient method to obtain large quantities of high-quality human OPCs to meet clinical demand. Copyright © 2014 Elsevier B.V. All rights reserved.

  11. Transfer Entropy as a Log-Likelihood Ratio

    Science.gov (United States)

    Barnett, Lionel; Bossomaier, Terry

    2012-09-01

    Transfer entropy, an information-theoretic measure of time-directed information transfer between joint processes, has steadily gained popularity in the analysis of complex stochastic dynamics in diverse fields, including the neurosciences, ecology, climatology, and econometrics. We show that for a broad class of predictive models, the log-likelihood ratio test statistic for the null hypothesis of zero transfer entropy is a consistent estimator for the transfer entropy itself. For finite Markov chains, furthermore, no explicit model is required. In the general case, an asymptotic χ2 distribution is established for the transfer entropy estimator. The result generalizes the equivalence in the Gaussian case of transfer entropy and Granger causality, a statistical notion of causal influence based on prediction via vector autoregression, and establishes a fundamental connection between directed information transfer and causality in the Wiener-Granger sense.
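
    In the Gaussian case this equivalence is easy to demonstrate: the transfer entropy equals half the log ratio of residual variances from nested autoregressions, i.e., a scaled log-likelihood ratio. A minimal sketch on simulated data, using first-order lags only:

```python
import numpy as np

def gaussian_te(x, y):
    """Transfer entropy x -> y for jointly Gaussian processes, computed as a
    log-likelihood ratio of nested first-order autoregressions (equivalently,
    Granger causality): TE = 0.5 * ln(var_restricted / var_full)."""
    Y = y[1:]
    Yp = np.column_stack([y[:-1], np.ones(len(Y))])           # y's own past
    XYp = np.column_stack([y[:-1], x[:-1], np.ones(len(Y))])  # plus x's past
    rss_r = np.sum((Y - Yp  @ np.linalg.lstsq(Yp,  Y, rcond=None)[0])**2)
    rss_f = np.sum((Y - XYp @ np.linalg.lstsq(XYp, Y, rcond=None)[0])**2)
    return 0.5 * np.log(rss_r / rss_f)

rng = np.random.default_rng(3)
n = 20000
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):                 # y is driven by lagged x
    y[t] = 0.5 * y[t-1] + 0.4 * x[t-1] + rng.normal()
print(gaussian_te(x, y))   # clearly > 0
print(gaussian_te(y, x))   # near 0; 2n*TE is ~chi-square(1) under the null
```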

  12. A Non-standard Empirical Likelihood for Time Series

    DEFF Research Database (Denmark)

    Nordman, Daniel J.; Bunzel, Helle; Lahiri, Soumendra N.

    Standard blockwise empirical likelihood (BEL) for stationary, weakly dependent time series requires specifying a fixed block length as a tuning parameter for setting confidence regions. This aspect can be difficult and impacts coverage accuracy. As an alternative, this paper proposes a new version of BEL based on a simple, though non-standard, data-blocking rule which uses a data block of every possible length. Consequently, the method involves no block selection and is also anticipated to exhibit better coverage performance. Its non-standard blocking scheme, however, induces non-standard asymptotics and requires a significantly different development compared to standard BEL. We establish the large-sample distribution of log-ratio statistics from the new BEL method for calibrating confidence regions for mean or smooth function parameters of time series. This limit law is not the usual chi-square…

  13. Neutron spectra unfolding with maximum entropy and maximum likelihood

    International Nuclear Information System (INIS)

    Itoh, Shikoh; Tsunoda, Toshiharu

    1989-01-01

    A new unfolding theory has been established on the basis of the maximum entropy principle and the maximum likelihood method. This theory correctly embodies the Poisson statistics of neutron detection, and always yields a positive solution over the whole energy range. Moreover, the theory unifies both the overdetermined and the underdetermined problems. For the latter, the ambiguity in assigning a prior probability, i.e., the initial guess in the Bayesian sense, is eliminated by virtue of the principle. An approximate expression of the covariance matrix for the resultant spectra is also presented. An efficient algorithm to solve the nonlinear system that appears in the present study has been established. Results of computer simulation showed the effectiveness of the present theory. (author)
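
    The abstract's two key properties, a Poisson detection likelihood and a solution that stays positive over the whole range, are shared by the standard ML-EM (Richardson-Lucy) unfolding iteration, sketched below as a generic illustration rather than the authors' algorithm (response matrix and spectrum are made up):

```python
import numpy as np

def mlem_unfold(R, counts, iters=500):
    """Generic ML-EM (Richardson-Lucy) Poisson unfolding: each multiplicative
    update increases the Poisson likelihood of the counts and preserves
    positivity of the spectrum.

    R:      (num_detectors, num_bins) response matrix
    counts: (num_detectors,) measured counts
    """
    phi = np.full(R.shape[1], counts.sum() / R.sum())  # flat positive start
    sens = R.sum(axis=0)                               # per-bin sensitivity
    for _ in range(iters):
        phi *= (R.T @ (counts / (R @ phi))) / sens
    return phi

rng = np.random.default_rng(4)
true_phi = np.array([0.0, 200.0, 500.0, 300.0, 100.0])
R = rng.uniform(0.1, 1.0, size=(8, 5))
counts = rng.poisson(R @ true_phi)
print(mlem_unfold(R, counts))   # roughly recovers true_phi, all entries >= 0
```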

  14. Narrow band interference cancelation in OFDM: A structured maximum likelihood approach

    KAUST Repository

    Sohail, Muhammad Sadiq

    2012-06-01

    This paper presents a maximum likelihood (ML) approach to mitigate the effect of narrow band interference (NBI) in a zero padded orthogonal frequency division multiplexing (ZP-OFDM) system. The NBI is assumed to be time variant and asynchronous with the frequency grid of the ZP-OFDM system. The proposed structure based technique uses the fact that the NBI signal is sparse as compared to the ZP-OFDM signal in the frequency domain. The structure is also useful in reducing the computational complexity of the proposed method. The paper also presents a data aided approach for improved NBI estimation. The suitability of the proposed method is demonstrated through simulations. © 2012 IEEE.

  15. H.264 SVC Complexity Reduction Based on Likelihood Mode Decision

    Directory of Open Access Journals (Sweden)

    L. Balaji

    2015-01-01

    Full Text Available H.264 Advanced Video Coding (AVC) was extended to Scalable Video Coding (SVC). SVC runs on different electronic gadgets such as personal computers, HDTV, SDTV, IPTV, and full-HDTV, in which users demand various scalings of the same content. The scaling dimensions include resolution, frame rate, quality, heterogeneous networks, bandwidth, and so forth. Scaling consumes more encoding time and computational complexity during mode selection. In this paper, to reduce encoding time and computational complexity, a fast mode decision algorithm based on likelihood mode decision (LMD) is proposed. LMD is evaluated in both temporal and spatial scaling. From the results, we conclude that LMD performs well compared to previous fast mode decision algorithms. The comparison parameters are time, PSNR, and bit rate. LMD achieves a time saving of 66.65% with a 0.05% detriment in PSNR and a 0.17% increment in bit rate compared with the full search method.

  16. H.264 SVC Complexity Reduction Based on Likelihood Mode Decision.

    Science.gov (United States)

    Balaji, L; Thyagharajan, K K

    2015-01-01

    H.264 Advanced Video Coding (AVC) was extended to Scalable Video Coding (SVC). SVC runs on different electronic gadgets such as personal computers, HDTV, SDTV, IPTV, and full-HDTV, in which users demand various scalings of the same content. The scaling dimensions include resolution, frame rate, quality, heterogeneous networks, bandwidth, and so forth. Scaling consumes more encoding time and computational complexity during mode selection. In this paper, to reduce encoding time and computational complexity, a fast mode decision algorithm based on likelihood mode decision (LMD) is proposed. LMD is evaluated in both temporal and spatial scaling. From the results, we conclude that LMD performs well compared to previous fast mode decision algorithms. The comparison parameters are time, PSNR, and bit rate. LMD achieves a time saving of 66.65% with a 0.05% detriment in PSNR and a 0.17% increment in bit rate compared with the full search method.

  17. Likelihood Approximation With Hierarchical Matrices For Large Spatial Datasets

    KAUST Repository

    Litvinenko, Alexander

    2017-09-03

    We use available measurements to estimate the unknown parameters (variance, smoothness parameter, and covariance length) of a covariance function by maximizing the joint Gaussian log-likelihood function. To overcome cubic complexity in the linear algebra, we approximate the discretized covariance function in the hierarchical (H-) matrix format. The H-matrix format has a log-linear computational cost and storage O(kn log n), where the rank k is a small integer and n is the number of locations. The H-matrix technique allows us to work with general covariance matrices in an efficient way, since H-matrices can approximate inhomogeneous covariance functions, with a fairly general mesh that is not necessarily axes-parallel, and neither the covariance matrix itself nor its inverse have to be sparse. We demonstrate our method with Monte Carlo simulations and an application to soil moisture data. The C, C++ codes and data are freely available.
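
    For orientation, here is the dense-linear-algebra baseline that the H-matrix format replaces: a Cholesky-based evaluation of the joint Gaussian log-likelihood, which costs O(n^3) and becomes infeasible for large n. The exponential covariance used here is a stand-in for the paper's covariance family (which has variance, smoothness, and length parameters), and the data are simulated:

```python
import numpy as np
from scipy.spatial.distance import cdist

def gaussian_loglik(params, X, z):
    """Joint Gaussian log-likelihood with an exponential covariance
    C(h) = variance * exp(-h / length) plus a nugget. The dense Cholesky
    factorization below is the cubic-cost step H-matrices approximate."""
    variance, length, nugget = params
    C = variance * np.exp(-cdist(X, X) / length) + nugget * np.eye(len(z))
    L = np.linalg.cholesky(C)
    alpha = np.linalg.solve(L, z)
    return (-0.5 * alpha @ alpha
            - np.log(np.diag(L)).sum()
            - 0.5 * len(z) * np.log(2 * np.pi))

rng = np.random.default_rng(5)
X = rng.uniform(size=(500, 2))                      # 500 spatial locations
C_true = 1.0 * np.exp(-cdist(X, X) / 0.3) + 0.01 * np.eye(500)
z = np.linalg.cholesky(C_true) @ rng.normal(size=500)
for length in [0.1, 0.3, 0.9]:                      # likelihood peaks near 0.3
    print(length, gaussian_loglik((1.0, length, 0.01), X, z))
```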

  18. Music genre classification via likelihood fusion from multiple feature models

    Science.gov (United States)

    Shiu, Yu; Kuo, C.-C. J.

    2005-01-01

    Music genre provides an efficient way to index songs in a music database, and can be used as an effective means to retrieve music of a similar type, i.e., content-based music retrieval. A new two-stage scheme for music genre classification is proposed in this work. At the first stage, we examine a couple of different features, construct their corresponding parametric models (e.g., GMM and HMM) and compute their likelihood functions to yield soft classification results. In particular, the timbre, rhythm and temporal variation features are considered. Then, at the second stage, these soft classification results are integrated to reach a hard decision for final music genre classification. Experimental results are given to demonstrate the performance of the proposed scheme.
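
    A hypothetical sketch of the two-stage idea (not the authors' implementation): train one GMM per genre and per feature type, sum the per-feature log-likelihoods at the first stage, then take the arg-max genre as the hard decision at the second. Feature extraction is assumed to have happened already; scikit-learn's GaussianMixture stands in for the parametric models:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def train(features_by_genre, n_components=4):
    # features_by_genre: {genre: {feature_name: (n_frames, dim) array}}
    return {g: {f: GaussianMixture(n_components, random_state=0).fit(X)
                for f, X in feats.items()}
            for g, feats in features_by_genre.items()}

def classify(models, song_feats):
    # Stage 1: per-feature log-likelihoods, fused by summation (soft scores)
    scores = {g: sum(m[f].score_samples(X).sum() for f, X in song_feats.items())
              for g, m in models.items()}
    # Stage 2: hard decision
    return max(scores, key=scores.get)

rng = np.random.default_rng(6)
train_data = {   # made-up "timbre" and "rhythm" frame features per genre
    "jazz": {"timbre": rng.normal(0, 1, (300, 8)), "rhythm": rng.normal(0, 1, (300, 4))},
    "rock": {"timbre": rng.normal(2, 1, (300, 8)), "rhythm": rng.normal(2, 1, (300, 4))},
}
models = train(train_data)
song = {"timbre": rng.normal(2, 1, (50, 8)), "rhythm": rng.normal(2, 1, (50, 4))}
print(classify(models, song))  # expected: "rock"
```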

  19. Marginal Maximum Likelihood Estimation of Item Response Models in R

    Directory of Open Access Journals (Sweden)

    Matthew S. Johnson

    2007-02-01

    Full Text Available Item response theory (IRT) models are a class of statistical models used by researchers to describe the response behaviors of individuals to a set of categorically scored items. The most common IRT models can be classified as generalized linear fixed- and/or mixed-effect models. Although IRT models appear most often in the psychological testing literature, researchers in other fields have successfully utilized IRT-like models in a wide variety of applications. This paper discusses the three major methods of estimation in IRT and develops R functions utilizing the built-in capabilities of the R environment to find the marginal maximum likelihood estimates of the generalized partial credit model. The currently available R package ltm is also discussed.
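
    Although the paper develops R functions, the core of marginal maximum likelihood is compact enough to sketch in Python for a Rasch model (a simple special case, not the generalized partial credit model): integrate the ability distribution out with Gauss-Hermite quadrature and maximize over item difficulties. Data below are simulated:

```python
import numpy as np
from scipy.optimize import minimize

def rasch_mml_nll(b, data, n_quad=21):
    """Negative marginal log-likelihood of the Rasch model with a standard
    normal ability distribution, integrated by Gauss-Hermite quadrature."""
    nodes, weights = np.polynomial.hermite.hermgauss(n_quad)
    theta = np.sqrt(2) * nodes             # change of variables for N(0, 1)
    w = weights / np.sqrt(np.pi)
    p = 1 / (1 + np.exp(-(theta[:, None] - b[None, :])))   # (quad, items)
    # P(response pattern | theta) for every person at every quadrature node
    like = np.prod(np.where(data[:, None, :] == 1, p, 1 - p), axis=2)
    return -np.log(like @ w).sum()

rng = np.random.default_rng(7)
true_b = np.array([-1.0, 0.0, 1.0, 2.0])               # item difficulties
theta = rng.normal(size=500)                           # person abilities
data = (rng.uniform(size=(500, 4)) <
        1 / (1 + np.exp(-(theta[:, None] - true_b)))).astype(int)

res = minimize(rasch_mml_nll, x0=np.zeros(4), args=(data,), method="BFGS")
print(res.x)   # marginal MLEs, close to true_b
```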

  20. Maximum likelihood estimation of phase-type distributions

    DEFF Research Database (Denmark)

    Esparza, Luz Judith R

    This work is concerned with the statistical inference of phase-type distributions and the analysis of distributions with rational Laplace transform, known as matrix-exponential distributions. The thesis is focused on the estimation of the maximum likelihood parameters of phase-type distributions for both univariate and multivariate cases. Methods like the EM algorithm and Markov chain Monte Carlo are applied for this purpose. Furthermore, this thesis provides explicit formulae for computing the Fisher information matrix for discrete and continuous phase-type distributions, which is needed to find confidence regions for their estimated parameters. Finally, a new general class of distributions, called bilateral matrix-exponential distributions, is defined. These distributions have the entire real line as domain and can be used, for instance, for modelling. In addition, this class of distributions…

  1. The elaboration likelihood model and communication about food risks.

    Science.gov (United States)

    Frewer, L J; Howard, C; Hedderley, D; Shepherd, R

    1997-12-01

    Factors such as hazard type and source credibility have been identified as important in the establishment of effective strategies for risk communication. The elaboration likelihood model was adapted to investigate the potential impact of hazard type, information source, and persuasive content of information on individual engagement in elaborative, or thoughtful, cognitions about risk messages. One hundred sixty respondents were allocated to one of eight experimental groups, and the effects of source credibility, persuasive content of information, and hazard type were systematically varied. The impact of the different factors on beliefs about the information and on elaborative processing was examined. Low credibility was particularly important in reducing risk perceptions, although persuasive content and hazard type were also influential in determining whether elaborative processing occurred.

  2. Maximum Likelihood Blood Velocity Estimator Incorporating Properties of Flow Physics

    DEFF Research Database (Denmark)

    Schlaikjer, Malene; Jensen, Jørgen Arendt

    2004-01-01

    …(RF)-data under investigation. The flow-physics properties are exploited in the second term, as the range of velocity values investigated in the cross-correlation analysis is compared to the velocity estimates in the temporal and spatial neighborhood of the signal segment under investigation. The new estimator has been compared to the cross-correlation (CC) estimator and the previously developed maximum likelihood estimator (MLE). The results show that the CMLE can handle a larger velocity search range and is capable of estimating even low velocity levels from tissue motion. The CC and the MLE produce … for the CC and the MLE. When the velocity search range is set to twice the limit of the CC and the MLE, the number of incorrect velocity estimates is 0, 19.1, and 7.2% for the CMLE, CC, and MLE, respectively. The ability to handle a larger search range and to estimate low velocity levels was confirmed…
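
    A hypothetical sketch of the cross-correlation stage on simulated RF lines (all parameter values are made up): estimate the inter-pulse time shift of a signal segment and convert it to axial velocity. In the constrained estimator described above, the plain arg-max over all lags would be replaced by a search informed by neighboring velocity estimates:

```python
import numpy as np

def cc_velocity(rf1, rf2, fs, t_prf, c=1540.0):
    """rf1, rf2: RF segments from consecutive emissions; fs: sampling rate [Hz];
    t_prf: pulse repetition interval [s]; c: speed of sound [m/s]."""
    xc = np.correlate(rf2 - rf2.mean(), rf1 - rf1.mean(), mode="full")
    shift = np.argmax(xc) - (len(rf1) - 1)   # best-matching lag in samples
    t_shift = shift / fs                     # lag in seconds
    return c * t_shift / (2 * t_prf)         # axial velocity [m/s]

fs, t_prf = 40e6, 1 / 5e3                    # 40 MHz sampling, 5 kHz PRF
t = np.arange(200) / fs
rf1 = np.sin(2 * np.pi * 5e6 * t)            # 5 MHz pulse segment
v_true = 0.3                                 # m/s toward the transducer
delay = 2 * v_true * t_prf / 1540.0          # round-trip time shift per pulse
rf2 = np.sin(2 * np.pi * 5e6 * (t - delay))
print(cc_velocity(rf1, rf2, fs, t_prf))      # ~0.3
```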

  3. Transferring Aviation Practices into Clinical Medicine for the Promotion of High Reliability.

    Science.gov (United States)

    Powell-Dunford, Nicole; McPherson, Mark K; Pina, Joseph S; Gaydos, Steven J

    2017-05-01

    Aviation is a classic example of a high reliability organization (HRO)-an organization in which catastrophic events are expected to occur without control measures. As health care systems transition toward high reliability, aviation practices are increasingly transferred for clinical implementation. A PubMed search using the terms aviation, crew resource management, and patient safety was undertaken. Manuscripts authored by physician pilots and accident investigation regulations were analyzed. Subject matter experts involved in adoption of aviation practices into the medical field were interviewed. A PubMed search yielded 621 results with 22 relevant for inclusion. Improved clinical outcomes were noted in five research trials in which aviation practices were adopted, particularly with regard to checklist usage and crew resource-management training. Effectiveness of interventions was influenced by intensity of application, leadership involvement, and provision of staff training. The usefulness of incorporating mishap investigation techniques has not been established. Whereas aviation accident investigation is highly standardized, the investigation of medical error is characterized by variation. The adoption of aviation practices into clinical medicine facilitates an evolution toward high reliability. Evidence for the efficacy of the checklist and crew resource-management training is robust. Transference of aviation accident investigation practices is preliminary. A standardized, independent investigation process could facilitate the development of a safety culture commensurate with that achieved in the aviation industry.Powell-Dunford N, McPherson MK, Pina JS, Gaydos SJ. Transferring aviation practices into clinical medicine for the promotion of high reliability. Aerosp Med Hum Perform. 2017; 88(5):487-491.

  4. A novel high resolution and high efficiency dual head detector for molecular breast imaging: New results from clinical trials

    Energy Technology Data Exchange (ETDEWEB)

    Garibaldi, F., E-mail: franco.garibaldi@iss.infn.i [ISS and INFN Roma, gr. Sanita, Rome (Italy); Cisbani, E.; Colilli, S.; Cusanno, F.; Fratoni, R.; Giuliani, F.; Gricia, M.; Lucentini, M.; Magliozzi, M.L.; Santavenere, F.; Torrioli, S. [ISS and INFN Roma, gr. Sanita, Rome (Italy); Musico, P. [INFN Genova, Genova (Italy); Argentieri, A. [INFN Bari, Bari (Italy); Cossu, E.; Padovano, F.; Simonetti, G. [ISS and INFN Roma, gr. Sanita, Rome (Italy); Schillaci, O. [University of Tor Vergata, Rome (Italy); Majewski, S. [West Virginia University, Morgantown, West Virginia (United States)

    2010-05-21

    Detecting small breast tumors is a challenging task. Molecular breast imaging with radionuclides has a central role to play in this respect. Our group has recently designed and implemented a dual-detector setup that allows spot compression and significantly improves the performance of the system. The single-head detector has been successfully used in clinical trials with 10 patients, in comparison with a commercial high-resolution detector. The dual-head system was then shown to have significant advantages for the detection of small tumors.

  5. Accelerated maximum likelihood parameter estimation for stochastic biochemical systems

    Directory of Open Access Journals (Sweden)

    Daigle Bernie J

    2012-05-01

    Full Text Available Abstract Background A prerequisite for the mechanistic simulation of a biochemical system is detailed knowledge of its kinetic parameters. Despite recent experimental advances, the estimation of unknown parameter values from observed data is still a bottleneck for obtaining accurate simulation results. Many methods exist for parameter estimation in deterministic biochemical systems; methods for discrete stochastic systems are less well developed. Given the probabilistic nature of stochastic biochemical models, a natural approach is to choose parameter values that maximize the probability of the observed data with respect to the unknown parameters, a.k.a. the maximum likelihood parameter estimates (MLEs). MLE computation for all but the simplest models requires the simulation of many system trajectories that are consistent with experimental data. For models with unknown parameters, this presents a computational challenge, as the generation of consistent trajectories can be an extremely rare occurrence. Results We have developed Monte Carlo Expectation-Maximization with Modified Cross-Entropy Method (MCEM2): an accelerated method for calculating MLEs that combines advances in rare event simulation with a computationally efficient version of the Monte Carlo expectation-maximization (MCEM) algorithm. Our method requires no prior knowledge regarding parameter values, and it automatically provides a multivariate parameter uncertainty estimate. We applied the method to five stochastic systems of increasing complexity, progressing from an analytically tractable pure-birth model to a computationally demanding model of yeast-polarization. Our results demonstrate that MCEM2 substantially accelerates MLE computation on all tested models when compared to a stand-alone version of MCEM. Additionally, we show how our method identifies parameter values for certain classes of models more accurately than two recently proposed computationally efficient methods

  6. Likelihood of illegal alcohol sales at professional sport stadiums.

    Science.gov (United States)

    Toomey, Traci L; Erickson, Darin J; Lenk, Kathleen M; Kilian, Gunna R

    2008-11-01

    Several studies have assessed the propensity for illegal alcohol sales at licensed alcohol establishments and community festivals, but no previous studies examined the propensity for these sales at professional sport stadiums. In this study, we assessed the likelihood of alcohol sales to both underage youth and obviously intoxicated patrons at professional sports stadiums across the United States, and assessed the factors related to the likelihood of both types of alcohol sales. We conducted pseudo-underage (i.e., persons age 21 or older who appear under 21) and pseudo-intoxicated (i.e., persons feigning intoxication) alcohol purchase attempts at stadiums that house professional hockey, basketball, baseball, and football teams. We conducted the purchase attempts at 16 sport stadiums located in 5 states. We measured 2 outcome variables: pseudo-underage sale (yes, no) and pseudo-intoxicated sale (yes, no), and 3 types of independent variables: (1) seller characteristics, (2) purchase attempt characteristics, and (3) event characteristics. Following univariate and bivariate analyses, we fit a separate series of logistic generalized mixed regression models for each outcome variable. The overall sales rates to the pseudo-underage and pseudo-intoxicated buyers were 18% and 74%, respectively. In the multivariate logistic analyses, we found that the odds of a sale to a pseudo-underage buyer in the stands were 2.9 times as large as the odds of a sale at the concession booths (30% vs. 13%; p = 0.01). The odds of a sale to an obviously intoxicated buyer in the stands were 2.9 times as large as the odds of a sale at the concession booths (89% vs. 73%; p = 0.02). Similar to studies assessing illegal alcohol sales at licensed alcohol establishments and community festivals, findings from this study show the need for interventions specifically focused on illegal alcohol sales at professional sporting events.
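
    The reported effect sizes are plain odds-ratio arithmetic, reproduced below; note these raw-rate calculations only approximate the study's model-based estimates, and the 2x2-table counts in the confidence-interval example are hypothetical, not the study's data:

```python
import math

def odds_ratio(p1, p2):
    """Odds ratio from two proportions."""
    return (p1 / (1 - p1)) / (p2 / (1 - p2))

print(round(odds_ratio(0.30, 0.13), 1))   # pseudo-underage, stands vs booths: ~2.9
print(round(odds_ratio(0.89, 0.73), 1))   # pseudo-intoxicated, stands vs booths: ~3.0

def or_ci(a, b, c, d, z=1.96):
    """2x2 table counts: exposed sale/no-sale (a, b), unexposed sale/no-sale (c, d).
    Wald confidence interval on the log-odds-ratio scale."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, (lo, hi)

print(or_ci(30, 70, 13, 87))   # hypothetical counts per 100 attempts
```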

  7. Targeted maximum likelihood estimation for a binary treatment: A tutorial.

    Science.gov (United States)

    Luque-Fernandez, Miguel Angel; Schomaker, Michael; Rachet, Bernard; Schnitzer, Mireille E

    2018-04-23

    When estimating the average effect of a binary treatment (or exposure) on an outcome, methods that incorporate propensity scores, the G-formula, or targeted maximum likelihood estimation (TMLE) are preferred over naïve regression approaches, which are biased under misspecification of a parametric outcome model. In contrast, propensity score methods require the correct specification of an exposure model. Double-robust methods only require correct specification of either the outcome or the exposure model. Targeted maximum likelihood estimation is a semiparametric double-robust method that improves the chances of correct model specification by allowing for flexible estimation using (nonparametric) machine-learning methods. It therefore requires weaker assumptions than its competitors. We provide a step-by-step guided implementation of TMLE and illustrate it in a realistic scenario based on cancer epidemiology where assumptions about correct model specification and positivity (i.e., when a study participant has zero probability of receiving the treatment) are nearly violated. This article provides a concise and reproducible educational introduction to TMLE for a binary outcome and exposure. The reader should gain sufficient understanding of TMLE from this introductory tutorial to be able to apply the method in practice. Extensive R-code is provided in easy-to-read boxes throughout the article for replicability. Stata users will find a testing implementation of TMLE and additional material in the Appendix S1 and at the following GitHub repository: https://github.com/migariane/SIM-TMLE-tutorial. © 2018 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
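
    TMLE itself involves a targeting step beyond the scope of a short sketch, but the ingredients it combines, an outcome model Q, a propensity model g, and a bias-correction term, can be conveyed with the closely related augmented-IPW double-robust estimator. This is not TMLE and not the tutorial's R code; the data are simulated:

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

rng = np.random.default_rng(8)
n = 5000
w = rng.normal(size=(n, 2))                          # confounders
g_true = 1 / (1 + np.exp(-0.5 * w[:, 0]))            # P(A=1 | W)
a = rng.binomial(1, g_true)                          # treatment
y = 1.0 * a + w @ np.array([0.8, -0.4]) + rng.normal(size=n)  # true ATE = 1.0

# Outcome model Q: predicted outcomes under treatment and control
Q = LinearRegression().fit(np.column_stack([a, w]), y)
q1 = Q.predict(np.column_stack([np.ones(n), w]))
q0 = Q.predict(np.column_stack([np.zeros(n), w]))

# Exposure (propensity) model g
g = LogisticRegression().fit(w, a).predict_proba(w)[:, 1]

# Augmented-IPW: substitution estimate plus inverse-weighted residual correction
psi = np.mean(q1 - q0
              + a / g * (y - q1)
              - (1 - a) / (1 - g) * (y - q0))
print(psi)   # average treatment effect, close to 1.0
```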

  8. Low versus high volume of culture medium during embryo transfer: a randomized clinical trial.

    Science.gov (United States)

    Sigalos, George Α; Michalopoulos, Yannis; Kastoras, Athanasios G; Triantafyllidou, Olga; Vlahos, Nikos F

    2018-04-01

    The aim of this prospective randomized controlled trial was to evaluate whether the use of two different volumes (20-25 vs 40-45 μl) of media used for embryo transfer affects clinical outcomes in fresh in vitro fertilization (IVF) cycles. In total, 236 patients were randomized into two groups: a "low volume" group (n = 118), in which embryos were transferred with 20-25 μl of medium, and a "high volume" group (n = 118), in which embryos were transferred with 40-45 μl of medium. The clinical pregnancy, implantation, and ongoing pregnancy rates were compared between the two groups. No statistically significant differences were observed in clinical pregnancy (46.8 vs 54.3%, p = 0.27), implantation (23.7 vs 27.8%, p = 0.30), or ongoing pregnancy (33.3 vs 40.0%, p = 0.31) rates between the low and high volume groups, respectively. A higher volume of culture medium used to load the embryo into the catheter during embryo transfer does not influence the clinical outcome in fresh IVF cycles. NCT03350646.

  9. Axis I diagnoses and transition to psychosis in clinical high-risk patients EPOS project: Prospective follow-up of 245 clinical high-risk outpatients in four countries

    NARCIS (Netherlands)

    Salokangas, Raimo K. R.; Ruhrmann, Stephan; von Reventlow, Heinrich Graf; Heinimaa, Markus; Svirskis, Tanja; From, Tiina; Luutonen, Sinikka; Juckel, Georg; Linszen, Don; Dingemans, Peter; Birchwood, Max; Patterson, Paul; Schultze-Lutter, Frauke; Klosterkötter, Joachim; Picke, Heinz; Neumann, Meike; Brockhaus-Dumke, Anke; Pukrop, Ralf; Huttunen, Jukka; Laine, Tiina; Ilonen, Tuula; Ristkari, Terja; Hietala, Jarmo; Becker, Hiske; Nieman, Dorien; Skeate, Amanda; Gudlowski, Yehonala; Ozgürdal, Seza; Witthaus, Henning; French, Paul; Stevens, Helen

    2012-01-01

    Background: In selected samples, a considerable number of patients at clinical high risk of psychosis (CHR) are found to meet criteria for co-morbid clinical psychiatric disorders. It is not known how clinical diagnoses correspond to or even predict transitions to psychosis (TTP). Our aim was to

  10. Maximum-likelihood estimation of the hyperbolic parameters from grouped observations

    DEFF Research Database (Denmark)

    Jensen, Jens Ledet

    1988-01-01

    a least-squares problem. The second procedure, Hypesti, first approaches the maximum-likelihood estimate by iterating in the profile log-likelihood function for the scale parameter. Close to the maximum of the likelihood function, the estimation is brought to an end by iteration, using all four parameters…

  11. A short proof that phylogenetic tree reconstruction by maximum likelihood is hard.

    Science.gov (United States)

    Roch, Sebastien

    2006-01-01

    Maximum likelihood is one of the most widely used techniques to infer evolutionary histories. Although it is thought to be intractable, a proof of its hardness has been lacking. Here, we give a short proof that computing the maximum likelihood tree is NP-hard by exploiting a connection between likelihood and parsimony observed by Tuffley and Steel.

  12. A Short Proof that Phylogenetic Tree Reconstruction by Maximum Likelihood is Hard

    OpenAIRE

    Roch, S.

    2005-01-01

    Maximum likelihood is one of the most widely used techniques to infer evolutionary histories. Although it is thought to be intractable, a proof of its hardness has been lacking. Here, we give a short proof that computing the maximum likelihood tree is NP-hard by exploiting a connection between likelihood and parsimony observed by Tuffley and Steel.

  13. 103PD brachytherapy and external beam irradiation for clinically localized, high-risk prostatic carcinoma

    International Nuclear Information System (INIS)

    Dattoli, Michael; Wallner, Kent; Sorace, Richard; Koval, John; Cash, Jennifer; Acosta, Rudolph; Brown, Charles; Etheridge, James; Binder, Michael; Brunelle, Richard; Kirwan, Novelle; Sanchez, Servando; Stein, Douglas; Wasserman, Stuart

    1996-01-01

    Purpose: To summarize biochemical failure rates and morbidity of external beam irradiation (EBRT) combined with palladium (103Pd) boost for clinically localized high-risk prostate carcinoma. Methods and Materials: Seventy-three consecutive patients with stage T2a-T3 prostatic carcinoma were treated from 1991 through 1994. Each patient had at least one of the following risk factors for extracapsular disease extension: stage T2b or greater (71 patients), Gleason score 7-10 (40 patients), prostate specific antigen (PSA) >15 (32 patients), or elevated prostatic acid phosphatase (PAP) (17 patients). Patients received 41 Gy EBRT to a limited pelvic field, followed 4 weeks later by a 103Pd boost (prescription dose: 80 Gy). Biochemical failure was defined as a PSA greater than 1.0 ng/ml. … Biochemical failure rates after EBRT combined with 103Pd brachytherapy for clinically localized, high-risk prostate cancer compare favorably with those reported after conventional dose EBRT alone. Morbidity has been acceptable

  14. Further Evaluation of Covariate Analysis using Empirical Bayes Estimates in Population Pharmacokinetics: the Perception of Shrinkage and Likelihood Ratio Test.

    Science.gov (United States)

    Xu, Xu Steven; Yuan, Min; Yang, Haitao; Feng, Yan; Xu, Jinfeng; Pinheiro, Jose

    2017-01-01

    Covariate analysis based on population pharmacokinetics (PPK) is used to identify clinically relevant factors. The likelihood ratio test (LRT) based on nonlinear mixed effect model fits is currently recommended for covariate identification, whereas individual empirical Bayesian estimates (EBEs) are considered unreliable due to the presence of shrinkage. The objectives of this research were to investigate the type I error for LRT and EBE approaches, to confirm the similarity of power between the LRT and EBE approaches from a previous report and to explore the influence of shrinkage on LRT and EBE inferences. Using an oral one-compartment PK model with a single covariate impacting on clearance, we conducted a wide range of simulations according to a two-way factorial design. The results revealed that the EBE-based regression not only provided almost identical power for detecting a covariate effect, but also controlled the false positive rate better than the LRT approach. Shrinkage of EBEs is likely not the root cause for decrease in power or inflated false positive rate although the size of the covariate effect tends to be underestimated at high shrinkage. In summary, contrary to the current recommendations, EBEs may be a better choice for statistical tests in PPK covariate analysis compared to LRT. We proposed a three-step covariate modeling approach for population PK analysis to utilize the advantages of EBEs while overcoming their shortcomings, which allows not only markedly reducing the run time for population PK analysis, but also providing more accurate covariate tests.

  15. Clinical Application of Esophageal High-resolution Manometry in the Diagnosis of Esophageal Motility Disorders

    OpenAIRE

    van Hoeij, Froukje B; Bredenoord, Albert J

    2016-01-01

    Esophageal high-resolution manometry (HRM) is replacing conventional manometry in the clinical evaluation of patients with esophageal symptoms, especially dysphagia. The introduction of HRM gave rise to new objective metrics and recognizable patterns of esophageal motor function, requiring a new classification scheme: the Chicago classification. HRM measurements are more detailed and more easily performed compared to conventional manometry. The visual presentation of acquired data improved the…

  16. Thyroid hormones, interpersonal violence and personality traits : clinical studies in high-risk psychiatric cohorts

    OpenAIRE

    Sinai, Cave

    2015-01-01

    Suicidal and violent behaviors as well as early life adversity are prevalent in clinical high-risk populations. Early life adversity is related to developmental dysregulation of behavioral and emotional traits. The neuroendocrine systems involved in the development of dysfunctional behavior and impulsive aggressive traits are not fully known. The overall aim of this thesis was to investigate the relationship between thyroid hormones and personality traits, as well as to exposure…

  17. Changes in patellofemoral alignment do not cause clinical impact after open-wedge high tibial osteotomy.

    Science.gov (United States)

    Lee, Yong Seuk; Lee, Sang Bok; Oh, Won Seok; Kwon, Yong Eok; Lee, Beom Koo

    2016-01-01

    The objectives of this study were (1) to evaluate the clinical and radiologic outcomes of open-wedge high tibial osteotomy, focusing on patellofemoral alignment, and (2) to search for correlations between variables and patellofemoral malalignment. A total of 46 knees (46 patients) from 32 females and 14 males who underwent open-wedge high tibial osteotomy were included in this retrospective case series. Outcomes were evaluated using clinical scales and radiologic parameters at the last follow-up. Pre-operative and final follow-up values were compared for the outcome analysis. For the focused analysis of the patellofemoral joint, correlation analyses between patellofemoral variables and pre- and post-operative weight-bearing line (WBL), clinical score, posterior slope, Blackburn-Peel ratio, lateral patellar tilt, lateral patellar shift, and congruence angle were performed. The minimum follow-up period was 2 years and the median follow-up period was 44 months (range 24-88 months). The percentage of the weight-bearing line shifted from 17.2 ± 11.1 to 56.7 ± 12.7%, and the change was statistically significant (p…). In the correlation analysis of patellofemoral malalignment, the pre-operative weight-bearing line showed an association with the change in lateral patellar tilt and lateral patellar shift (correlation coefficient: 0.3). After open-wedge high tibial osteotomy, clinical results showed improvement compared to pre-operative values. The patellar tilt and lateral patellar shift were not changed; however, descent of the patella was observed. Therefore, mild patellofemoral problems should not be a contraindication to open-wedge high tibial osteotomy. Case series, Level IV.

  18. High intensity focused ultrasound treatment of small renal masses: Clinical effectiveness and technological advances

    Science.gov (United States)

    Nabi, G.; Goodman, C.; Melzer, A.

    2010-01-01

    The review summarises the technological advances in the application of high-intensity focused ultrasound for small renal masses presumed to be cancer including the systematic review of its clinical application. Current progress in the area of magnetic resonance image guided ultrasound ablation is also appraised. Specifically, organ tracking and real time monitoring of temperature changes during the treatment are discussed. Finally, areas of future research interest are outlined. PMID:21116349

  19. High intensity focused ultrasound treatment of small renal masses: Clinical effectiveness and technological advances

    OpenAIRE

    Nabi, G.; Goodman, C.; Melzer, A.

    2010-01-01

    The review summarises the technological advances in the application of high-intensity focused ultrasound for small renal masses presumed to be cancer including the systematic review of its clinical application. Current progress in the area of magnetic resonance image guided ultrasound ablation is also appraised. Specifically, organ tracking and real time monitoring of temperature changes during the treatment are discussed. Finally, areas of future research interest are outlined.

  20. Effects of deceptive packaging and product involvement on purchase intention: an elaboration likelihood model perspective.

    Science.gov (United States)

    Lammers, H B

    2000-04-01

    From an Elaboration Likelihood Model perspective, it was hypothesized that postexposure awareness of deceptive packaging claims would have a greater negative effect on scores for purchase intention by consumers lowly involved rather than highly involved with a product (n = 40). Undergraduates who were classified as either highly or lowly (ns = 20 and 20) involved with M&Ms examined either a deceptive or non-deceptive package design for M&Ms candy and were subsequently informed of the deception employed in the packaging before finally rating their intention to purchase. As anticipated, highly deceived subjects who were low in involvement rated intention to purchase lower than their highly involved peers. Overall, the results attest to the robustness of the model and suggest that the model has implications beyond advertising effects and into packaging effects.

  1. High mercury seafood consumption associated with fatigue at specialty medical clinics on Long Island, NY

    Directory of Open Access Journals (Sweden)

    Shivam Kothari

    2015-01-01

    Full Text Available We investigated the association between seafood consumption and symptoms related to potential mercury toxicity in patients presenting to specialty medical clinics at Stony Brook Medical Center on Long Island, New York. We surveyed 118 patients from April–August 2012 about their seafood consumption patterns, specifically how frequently they were eating each type of fish, to assess mercury exposure. We also asked about symptoms associated with mercury toxicity, including depression, fatigue, balance difficulties, or tingling around the mouth. Of the 118 adults surveyed, 14 consumed high mercury seafood (tuna steak, marlin, swordfish, or shark) at least weekly. This group was more likely to suffer from fatigue than other patients (p = 0.02). Logistic regression confirmed this association of fatigue with frequent high mercury fish consumption in both unadjusted analysis (OR = 5.53; 95% CI: 1.40–21.90) and analysis adjusted for age, race, sex, income, and clinic type (OR = 7.89; 95% CI: 1.63–38.15). No associations were observed between fish intake and depression, balance difficulties, or tingling around the mouth. Findings suggest that fatigue may be associated with eating high mercury fish, but the sample size is small. Larger studies are needed to determine whether fish intake patterns or blood mercury tests warrant consideration as part of the clinical work-up in coastal regions.

  2. High HIV prevalence among a high-risk subgroup of women attending sexually transmitted infection clinics in Pune, India.

    Science.gov (United States)

    Mehta, Shruti H; Gupta, Amita; Sahay, Seema; Godbole, Sheela V; Joshi, Smita N; Reynolds, Steven J; Celentano, David D; Risbud, Arun; Mehendale, Sanjay M; Bollinger, Robert C

    2006-01-01

    To investigate changes over a decade in prevalence and correlates of HIV among high-risk women attending sexually transmitted infection (STI) clinics in Pune, India, who deny a history of commercial sex work (CSW). Cross-sectional. From 1993 to 2002, 2376 women attending 3 STI clinics in Pune were offered HIV screening. Women who denied CSW were included (n = 1020). Of 1020 women, 21% were HIV infected. The annual HIV prevalence increased from 14% in 1993 to 29% in 2001-2002. HIV-infected women were older, more often employed, less likely to be currently married, and more likely to report condom use. In multivariate analysis, factors independently associated with HIV were calendar period (adjusted odds ratio [AOR], 1.9 for 1997-1999 vs. 1993-1996; 95% CI, 1.2-3.0; AOR, 2.3 for 2000-2002 vs. 1993-1996; 95% CI, 1.5-3.6), lack of formal education (AOR, 2.0; 95% CI, 1.4-2.9), having been widowed (AOR, 3.1; 95% CI, 1.6-6.1), current employment (AOR, 1.8; 95% CI, 1.2-2.6), and genital ulcer disease on examination (AOR, 1.8; 95% CI, 1.2-2.7). Women attending STI clinics in India who deny a history of CSW represent a small, hidden subgroup, likely put at risk for HIV because of high-risk behavior of their male partners, generally their husbands. Educational and awareness efforts that have targeted other subgroups in India (men and CSWs) should also focus on these hard-to-reach women. Risk reduction in this subgroup of Indian women would also be expected to reduce perinatal infections in India.

  3. Clinical evaluation of low vision and central foveal thickness in highly myopic cataract eyes after phacoemulsification

    Directory of Open Access Journals (Sweden)

    Ji-Li Chen

    2015-07-01

    Full Text Available AIM: To retrospectively evaluate low vision and central foveal thickness in highly myopic cataract eyes after phacoemulsification. METHODS: In this retrospective clinical study, we consecutively recruited 70 highly myopic cataract subjects (70 eyes) with low best corrected visual acuity (BCVA) who underwent phacoemulsification. Postoperative visits were performed at 1wk, 1 and 3mo. Postoperative BCVA was recorded, and eyes were further divided into 2 groups by BCVA. RESULTS: Postoperative BCVA was significantly correlated with central foveal thickness (r = -0.716). CONCLUSION: BCVA improved over the 3mo follow-up, and there was a significant correlation between postoperative BCVA and central foveal thickness.

  4. Bronchial asthma: correlation of high resolution computerized tomography findings with clinical data

    International Nuclear Information System (INIS)

    Mogami, Roberto; Marchiori, Edson; Kirk, Kennedy; Capone, Domenico; Daltro, Pedro

    1999-01-01

    In this work, we performed a cross-sectional study of 31 asthmatic patients with several levels of disease severity, who underwent high resolution computed tomography of the thorax and spirometry between July 1995 and August 1997. The tomographic findings were correlated with the clinical classification of the patients, and the most frequent tomographic findings were bronchial wall thickening, bronchial dilatation, air trapping, centrilobular opacities, cicatricial linear shadows, mucoid impaction, emphysema and atelectasis. In asthmatic patients of long duration, small airway disease and irreversible lesions were the predominant findings. Among smoking patients, emphysema was not frequent. (author)

  5. PRE-MARKET CLINICAL EVALUATIONS OF INNOVATIVE HIGH-RISK MEDICAL DEVICES IN EUROPE

    DEFF Research Database (Denmark)

    Hulstaert, F.; Neyt, M.; Vinck, I.

    2012-01-01

    data are available? We studied the premarket clinical evaluation of innovative high-risk medical devices in Europe compared with the US, and with medicines, where appropriate. Methods: The literature and regulatory documents were checked. Representatives from industry, Competent Authorities, Notified...... of premarket trials in Europe and number of patients exposed, but failed as this information is not made public. Furthermore, the Helsinki Declaration is not followed with respect to the registration and publication of premarket trials. Conclusions: For innovative high-risk devices, new EU legislation should...

  6. Association between traditional clinical high-risk features and gene expression profile classification in uveal melanoma.

    Science.gov (United States)

    Nguyen, Brandon T; Kim, Ryan S; Bretana, Maria E; Kegley, Eric; Schefler, Amy C

    2018-02-01

    To evaluate the association between traditional clinical high-risk features of uveal melanoma patients and gene expression profile (GEP). This was a retrospective, single-center, case series of patients with uveal melanoma. Eighty-three patients met inclusion criteria for the study. Patients were examined for the following clinical risk factors: drusen/retinal pigment epithelium (RPE) changes, vascularity on B-scan, internal reflectivity on A-scan, subretinal fluid (SRF), orange pigment, apical tumor height/thickness, and largest basal dimensions (LBD). A novel point system was created to grade the high-risk clinical features of each tumor. Further analyses were performed to assess the degree of association between GEP and each individual risk factor, total clinical risk score, vascularity, internal reflectivity, American Joint Committee on Cancer (AJCC) tumor stage classification, apical tumor height/thickness, and LBD. Of the 83 total patients, 41 were classified as GEP class 1A, 17 as class 1B, and 25 as class 2. The presence of orange pigment, SRF, low internal reflectivity and vascularity on ultrasound, and apical tumor height/thickness ≥ 2 mm were not statistically significantly associated with GEP class. Lack of drusen/RPE changes demonstrated a trend toward statistical association with GEP class 2 compared to class 1A/1B. LBD and advancing AJCC stage were statistically associated with higher GEP class. In this cohort, AJCC stage classification and LBD were the only clinical features statistically associated with GEP class. Clinicians should use caution when inferring the growth potential of melanocytic lesions solely from traditional funduscopic and ultrasonographic risk factors without GEP data.

  7. The clinical impact of high resolution computed tomography in patients with respiratory disease

    International Nuclear Information System (INIS)

    Screaton, Nicholas J.; Tasker, Angela D.; Flower, Christopher D.R.; Miller, Fiona N.A.C.; Patel, Bipen D.; Groves, Ashley; Lomas, David A.

    2011-01-01

    High resolution computed tomography is widely used to investigate patients with suspected diffuse lung disease. Numerous studies have assessed the diagnostic performance of this investigation, but the diagnostic and therapeutic impacts have received little attention. The diagnostic and therapeutic impacts of high resolution computed tomography in routine clinical practice were evaluated prospectively. All 507 referrals for high-resolution computed tomography over 12 months in two centres were included. Requesting clinicians completed questionnaires before and after the investigation detailing clinical indications, working diagnoses, confidence level in each diagnosis, planned investigations and treatments. Three hundred and fifty-four studies on 347 patients had complete data and were available for analysis. Following high-resolution computed tomography, a new leading diagnosis (the diagnosis with the highest confidence level) emerged in 204 (58%) studies; in 166 (47%) studies the new leading diagnosis was not in the original differential diagnosis. Mean confidence in the leading diagnosis increased from 6.7 to 8.5 out of 10 (p < 0.001). The invasiveness of planned investigations increased in 23 (7%) studies and decreased in 124 (35%) studies. The treatment plan was modified after 319 (90%) studies. Thoracic high-resolution computed tomography alters leading diagnosis, increases diagnostic confidence, and frequently changes investigation and management plans. (orig.)

  8. Clinical signs of hypoxia with high-Dk soft lens extended wear: is the cornea convinced?

    Science.gov (United States)

    Sweeney, Deborah F

    2003-01-01

    To assess the effectiveness of high-Dk soft contact lenses with oxygen transmissibility (Dk/L) beyond the critical level required to avoid corneal edema during overnight wear. The most up-to-date data available on clinical signs of hypoxia with high-Dk contact lenses are reviewed. Chronic corneal edema associated with hypoxia is responsible for the development of large numbers of microcysts, limbal hyperemia, neovascularization, and small increases in myopia. Silicone hydrogel lenses worn continuously for up to 30 nights prevent corneal edema during overnight wear and do not induce a microcyst response. Long-term clinical trials indicate that the mean level of limbal redness for patients wearing high-Dk lenses during continuous wear is equivalent to that of non-lens wearers. No changes in refractive error are associated with continuous wear of high-Dk lenses. High-Dk silicone hydrogel lenses can be worn for up to 3 years with virtual elimination of the hypoxic consequences observed with low-Dk lenses made from conventional lens materials.

  9. HLA Match Likelihoods for Hematopoietic Stem-Cell Grafts in the U.S. Registry

    Science.gov (United States)

    Gragert, Loren; Eapen, Mary; Williams, Eric; Freeman, John; Spellman, Stephen; Baitty, Robert; Hartzman, Robert; Rizzo, J. Douglas; Horowitz, Mary; Confer, Dennis; Maiers, Martin

    2018-01-01

    Background Hematopoietic stem-cell transplantation (HSCT) is a potentially lifesaving therapy for several blood cancers and other diseases. For patients without a suitable related HLA-matched donor, unrelated-donor registries of adult volunteers and banked umbilical cord–blood units, such as the Be the Match Registry operated by the National Marrow Donor Program (NMDP), provide potential sources of donors. Our goal in the present study was to measure the likelihood of finding a suitable donor in the U.S. registry. Methods Using human HLA data from the NMDP donor and cord-blood-unit registry, we built population-based genetic models for 21 U.S. racial and ethnic groups to predict the likelihood of identifying a suitable donor (either an adult donor or a cord-blood unit) for patients in each group. The models incorporated the degree of HLA matching, adult-donor availability (i.e., ability to donate), and cord-blood-unit cell dose. Results Our models indicated that most candidates for HSCT will have a suitable (HLA-matched or minimally mismatched) adult donor. However, many patients will not have an optimal adult donor — that is, a donor who is matched at high resolution at HLA-A, HLA-B, HLA-C, and HLA-DRB1. The likelihood of finding an optimal donor varies among racial and ethnic groups, with the highest probability among whites of European descent, at 75%, and the lowest probability among blacks of South or Central American descent, at 16%. Likelihoods for other groups are intermediate. Few patients will have an optimal cord-blood unit — that is, one matched at the antigen level at HLA-A and HLA-B and matched at high resolution at HLA-DRB1. However, cord-blood units mismatched at one or two HLA loci are available for almost all patients younger than 20 years of age and for more than 80% of patients 20 years of age or older, regardless of racial and ethnic background. Conclusions Most patients likely to benefit from HSCT will have a donor. Public investment in

  10. Gender and motor competence affects perceived likelihood and importance of physical activity outcomes among 14 year olds.

    Science.gov (United States)

    Hands, B; Parker, H E; Rose, E; Larkin, D

    2016-03-01

    Perceptions of the effects of physical activity could facilitate or deter future participation. This study explored the differences between gender and motor competence at 14 years of age in the perceptions of likelihood and importance of physical activity outcomes. The sample comprised 1582 14-year-old adolescents (769 girls) from the Western Australian Pregnancy Cohort (Raine) Study. Four motor competence groups were formed from a standardized Neuromuscular Developmental Index score (McCarron 1997). Perceptions of the likelihood and the importance of 15 physical activity outcomes were measured by a questionnaire developed for the NSW Schools Fitness and Physical Activity Survey (Booth et al. 1997). Gender (two) × motor competence (four) analyses of variance and Tukey post hoc tests were conducted on outcome scores. Gender differences emerged in the perceived importance of physical activity outcomes within the competition, social friendship and injury domains. Motor competence was significant in the perceived likelihood of physical health outcomes, and lower importance was perceived for academic outcomes by 14 year olds categorized with low compared with high motor competence. Although level of motor competence at 14 years affected the perceived likelihood of health, social and fun outcomes from future participation in physical activity, adolescents highly valued these outcomes, whereas gender affected competition and winning, outcomes that were less valued. Physical activity that promotes these key and valued outcomes may encourage young people's ongoing involvement in physical activity, especially for those at risk of low participation. © 2015 John Wiley & Sons Ltd.

  11. Penalized likelihood and multi-objective spatial scans for the detection and inference of irregular clusters

    Directory of Open Access Journals (Sweden)

    Fonseca Carlos M

    2010-10-01

    Full Text Available Abstract Background Irregularly shaped spatial clusters are difficult to delineate. A cluster found by an algorithm often spreads through large portions of the map, impacting its geographical meaning. Penalized likelihood methods for Kulldorff's spatial scan statistics have been used to control the excessive freedom of the shape of clusters. Penalty functions based on cluster geometry and non-connectivity have been proposed recently. Another approach involves the use of a multi-objective algorithm to maximize two objectives: the spatial scan statistics and the geometric penalty function. Results & Discussion We present a novel scan statistic algorithm employing a function based on the graph topology to penalize the presence of under-populated disconnection nodes in candidate clusters, the disconnection nodes cohesion function. A disconnection node is defined as a region within a cluster, such that its removal disconnects the cluster. By applying this function, the most geographically meaningful clusters are sifted through the immense set of possible irregularly shaped candidate cluster solutions. To evaluate the statistical significance of solutions for multi-objective scans, a statistical approach based on the concept of attainment function is used. In this paper we compared different penalized likelihoods employing the geometric and non-connectivity regularity functions and the novel disconnection nodes cohesion function. We also build multi-objective scans using those three functions and compare them with the previous penalized likelihood scans. An application is presented using comprehensive state-wide data for Chagas' disease in puerperal women in Minas Gerais state, Brazil. Conclusions We show that, compared to the other single-objective algorithms, multi-objective scans present better performance, regarding power, sensitivity and positive predicted value. The multi-objective non-connectivity scan is faster and better suited for the

  12. Generalized Likelihood Uncertainty Estimation (GLUE) Using Multi-Optimization Algorithm as Sampling Method

    Science.gov (United States)

    Wang, Z.

    2015-12-01

    For decades, distributed and lumped hydrological models have furthered our understanding of hydrological systems. The development of large-scale, high-precision hydrological simulation has refined spatial descriptions of hydrological behavior. This trend has been accompanied by growth in model complexity and in the number of parameters, which brings new challenges for uncertainty quantification. Generalized Likelihood Uncertainty Estimation (GLUE), a Monte Carlo method coupled with Bayesian estimation, has been widely used in uncertainty analysis for hydrological models. However, the stochastic sampling of prior parameters adopted by GLUE is inefficient, especially in high dimensional parameter spaces. Heuristic optimization algorithms based on iterative evolution offer better convergence speed and optimality-searching performance. In light of these features, this study adopted a genetic algorithm, differential evolution, and the shuffled complex evolution algorithm to search the parameter space and obtain parameter sets with large likelihoods. Based on this multi-algorithm sampling, hydrological model uncertainty analysis is conducted within the typical GLUE framework. To demonstrate the superiority of the new method, two hydrological models of different complexity are examined. The results show that the adaptive method tends to be efficient in sampling and effective in uncertainty analysis, providing an alternative path for uncertainty quantification.
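
The GLUE procedure this record builds on reduces to three steps: sample the prior parameter space, score each parameter set with an informal likelihood, and keep the "behavioural" sets to form prediction bounds. Below is a minimal sketch using a toy model and Nash-Sutcliffe efficiency as the informal likelihood; the model, priors, and acceptance threshold are all illustrative assumptions, not those of the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def model(params, t):
    """Toy 'hydrological' model: exponential recession with a gain and a rate."""
    gain, rate = params
    return gain * np.exp(-rate * t)

t = np.linspace(0, 10, 50)
obs = model((2.0, 0.3), t) + rng.normal(0, 0.05, t.size)  # synthetic observations

# GLUE step 1: Monte Carlo sampling of the prior parameter space
samples = rng.uniform([0.5, 0.05], [5.0, 1.0], size=(5000, 2))
sims = np.array([model(p, t) for p in samples])

# GLUE step 2: informal likelihood (Nash-Sutcliffe efficiency) per parameter set
nse = 1 - ((sims - obs) ** 2).sum(axis=1) / ((obs - obs.mean()) ** 2).sum()

# GLUE step 3: retain 'behavioural' sets above a subjective threshold and
# form simple 5-95% prediction bounds from their simulations
behavioural = sims[nse > 0.7]
lower, upper = np.percentile(behavioural, [5, 95], axis=0)
print(f"{behavioural.shape[0]} behavioural sets; "
      f"bound width at t=0: {upper[0] - lower[0]:.3f}")
```

A heuristic sampler, as the abstract proposes, would replace step 1 with an evolutionary search that concentrates samples in high-likelihood regions.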

  13. Maximum-Likelihood Sequence Detection of Multiple Antenna Systems over Dispersive Channels via Sphere Decoding

    Directory of Open Access Journals (Sweden)

    Hassibi Babak

    2002-01-01

    Full Text Available Multiple antenna systems are capable of providing high data rate transmissions over wireless channels. When the channels are dispersive, the signal at each receive antenna is a combination of both the current and past symbols sent from all transmit antennas, corrupted by noise. The optimal receiver is a maximum-likelihood sequence detector and is often considered to be practically infeasible due to high computational complexity (exponential in the number of antennas and channel memory). Therefore, in practice, one often settles for a less complex suboptimal receiver structure, typically with an equalizer meant to suppress both the intersymbol and interuser interference, followed by the decoder. We propose sphere decoding for sequence detection in multiple antenna communication systems over dispersive channels. The sphere decoding provides the maximum-likelihood estimate with computational complexity comparable to the standard space-time decision-feedback equalizing (DFE) algorithms. The performance and complexity of the sphere decoding are compared with the DFE algorithm by means of simulations.
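
Sphere decoding finds the ML solution by a depth-first search over candidate symbol vectors, pruning any branch whose partial distance already exceeds the best radius found so far. The sketch below shows the idea for a flat (non-dispersive) real-valued channel via QR decomposition; it omits the space-time equalization details of the paper, and the channel and alphabet are arbitrary choices.

```python
import numpy as np

def sphere_decode(y, H, alphabet, radius=np.inf):
    """ML detection of y = H x + n with x from a finite alphabet,
    via depth-first sphere decoding on the QR-decomposed system."""
    m, n = H.shape
    Q, R = np.linalg.qr(H)          # H = Q R, R upper triangular
    z = Q.T @ y
    best = {"x": None, "d2": radius}

    def search(level, x_partial, d2):
        if d2 >= best["d2"]:        # prune: outside the current sphere
            return
        if level < 0:               # full candidate enumerated
            best["x"], best["d2"] = x_partial.copy(), d2
            return
        for s in alphabet:          # extend the candidate at this level
            x_partial[level] = s
            resid = z[level] - R[level, level:] @ x_partial[level:]
            search(level - 1, x_partial, d2 + resid ** 2)

    search(n - 1, np.zeros(n), 0.0)
    return best["x"]

# Example: 4x4 real channel, BPSK symbols
rng = np.random.default_rng(1)
H = rng.normal(size=(4, 4))
x_true = rng.choice([-1.0, 1.0], size=4)
y = H @ x_true + 0.1 * rng.normal(size=4)
print(sphere_decode(y, H, [-1.0, 1.0]), x_true)
```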

  14. Advanced MR methods at ultra-high field (7 Tesla) for clinical musculoskeletal applications

    Energy Technology Data Exchange (ETDEWEB)

    Trattnig, Siegfried [Medical University of Vienna/Vienna General Hospital, MR Centre - High Field MR, Department of Radiology, Vienna (Austria); Ludwig Boltzmann Institute for Experimental and Clinical Traumatology, Austrian Cluster for Tissue Regeneration, Vienna (Austria); Zbyn, Stefan; Schmitt, Benjamin; Friedrich, Klaus; Bogner, Wolfgang [Medical University of Vienna/Vienna General Hospital, MR Centre - High Field MR, Department of Radiology, Vienna (Austria); Juras, Vladimir; Szomolanyi, Pavol [Medical University of Vienna/Vienna General Hospital, MR Centre - High Field MR, Department of Radiology, Vienna (Austria); Slovak Academy of Sciences, Department of Imaging Methods, Institute of Measurement Science, Bratislava (Slovakia)

    2012-11-15

    This article provides an overview of the initial clinical results of musculoskeletal studies performed at 7 Tesla, with special focus on sodium imaging, new techniques such as chemical exchange saturation transfer (CEST) and T2* imaging, and multinuclear MR spectroscopy. Sodium imaging was clinically used at 7 T in the evaluation of patients after cartilage repair procedures because it enables the GAG content to be monitored over time. Sodium imaging and T2* mapping allow insights into the ultra-structural composition of the Achilles tendon and help detect early disease. Chemical exchange saturation transfer was, for the first time, successfully applied in the clinical set-up at 7 T in patients after cartilage repair surgery. The potential of phosphorus MR spectroscopy in muscle was demonstrated in a comparison study between 3 and 7 T, with higher spectral resolution and significantly shorter data acquisition times at 7 T. These initial clinical studies demonstrate the potential of ultra-high field MR at 7 T, with the advantage of significantly improved sensitivity for other nuclei, such as ²³Na (sodium) and ³¹P (phosphorus). The application of non-proton imaging and spectroscopy provides new insights into normal and abnormal physiology of musculoskeletal tissues, particularly cartilage, tendons, and muscles. (orig.)

  15. High-throughput monitoring of integration site clonality in preclinical and clinical gene therapy studies

    Directory of Open Access Journals (Sweden)

    Frank A Giordano

    Full Text Available Gene transfer to hematopoietic stem cells with integrating vectors not only allows sustained correction of monogenic diseases but also tracking of individual clones in vivo. Quantitative real-time PCR (qPCR) has been shown to be an accurate method to quantify individual stem cell clones, yet due to frequently limited amounts of target material (especially in clinical studies), it is not useful for large-scale analyses. To explore whether vector integration site (IS) recovery techniques may be suitable to describe clonal contributions if combined with next-generation sequencing techniques, we designed artificial ISs of different sizes which were mixed to simulate defined clonal situations in clinical settings. We subjected all mixes to either linear amplification-mediated PCR (LAM-PCR) or nonrestrictive LAM-PCR (nrLAM-PCR), both combined with 454 sequencing. We showed that nrLAM-PCR/454-detected clonality allows estimating qPCR-detected clonality in vitro. We then followed the kinetics of two clones detected in a patient enrolled in a clinical gene therapy trial using both nrLAM-PCR/454 and qPCR, and found that nrLAM-PCR/454 correlated with qPCR-measured clonal contributions. The method presented here provides a feasible high-throughput strategy for monitoring clonality in clinical gene therapy trials.

  16. Exploring the use of high-fidelity simulation training to enhance clinical skills.

    Science.gov (United States)

    Ann Kirkham, Lucy

    2018-02-07

    The use of interprofessional simulation training to enhance nursing students' performance of technical and non-technical clinical skills is becoming increasingly common. Simulation training can involve the use of role play, virtual reality or patient simulator manikins to replicate clinical scenarios and assess the nursing student's ability to, for example, undertake clinical observations or work as part of a team. Simulation training enables nursing students to practise clinical skills in a safe environment. Effective simulation training requires extensive preparation, and debriefing is necessary following a simulated training session to review any positive or negative aspects of the learning experience. This article discusses a high-fidelity simulated training session that was used to assess a group of third-year nursing students and foundation level 1 medical students. This involved the use of a patient simulator manikin in a scenario that required the collaborative management of a deteriorating patient. ©2018 RCN Publishing Company Ltd. All rights reserved. Not to be copied, transmitted or recorded in any way, in whole or part, without prior permission of the publishers.

  17. Clinical dosimetry with plastic scintillators - Almost energy independent, direct absorbed dose reading with high resolution

    Energy Technology Data Exchange (ETDEWEB)

    Quast, U; Fluehs, D [Department of Radiotherapy, Essen (Germany). Div. of Clinical Radiation Physics; Fluehs, D; Kolanoski, H [Dortmund Univ. (Germany). Inst. fuer Physik

    1996-08-01

    Clinical dosimetry is still far behind the goal of measuring any spatial or temporal distribution of absorbed dose quickly and precisely without disturbing the physical situation by the dosimetry procedure. NE 102A plastic scintillators overcome this barrier. These tissue-substituting dosemeter probes open a wide range of new clinical applications of dosimetry. This versatile new dosimetry system enables fast measurement of the absorbed dose to water in water, even in regions with a steep dose gradient, close to interfaces, or in partly shielded regions. It allows direct-reading dosimetry in the energy range of all clinically used external photon and electron beams, or around all brachytherapy sources. Thin detector arrays permit fast and high resolution measurements in quality assurance, such as in-vivo dosimetry or even afterloading dose monitoring. A main field of application is dosimetric treatment planning, the individual optimization of brachytherapy applicators. Thus, plastic scintillator dosemeters cover optimally all difficult fields of clinical dosimetry. An overview of their characteristics and applications is given here. 20 refs, 1 fig.

  18. A simulation study of likelihood inference procedures in rayleigh distribution with censored data

    International Nuclear Information System (INIS)

    Baklizi, S. A.; Baker, H. M.

    2001-01-01

    Inference procedures based on the likelihood function are considered for the one-parameter Rayleigh distribution with type 1 and type 2 censored data. Using simulation techniques, the finite sample performances of the maximum likelihood estimator and the large sample likelihood interval estimation procedures based on the Wald, the Rao, and the likelihood ratio statistics are investigated. It appears that the maximum likelihood estimator is unbiased. The approximate variance estimates obtained from the asymptotic normal distribution of the maximum likelihood estimator are accurate under type 2 censored data, while they tend to be smaller than the actual variances for type 1 censored data of small size. It also appears that interval estimation based on the Wald and Rao statistics requires much larger sample sizes than interval estimation based on the likelihood ratio statistic to attain reasonable accuracy. (authors). 15 refs., 4 tabs
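
For the type 2 censored case, the Rayleigh MLE has a closed form: with only the r smallest of n observations in hand, the log-likelihood for θ = σ² gives θ̂ = [Σ x_i² + (n−r)x_(r)²]/(2r), and the observed Fisher information −ℓ''(θ̂) = r/θ̂² yields the Wald interval θ̂ ± 1.96·θ̂/√r. A small simulation sketch (the sample size, censoring fraction, and σ are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(42)

def rayleigh_mle_type2(x, n):
    """MLE and Wald interval for theta = sigma^2 of a Rayleigh sample
    under type 2 censoring: only the r smallest of n values observed."""
    x = np.sort(x)
    r = x.size
    S = (x ** 2).sum() + (n - r) * x[-1] ** 2  # censored units contribute x_(r)^2
    theta_hat = S / (2 * r)                    # closed-form MLE
    se = theta_hat / np.sqrt(r)                # from observed Fisher information
    return theta_hat, (theta_hat - 1.96 * se, theta_hat + 1.96 * se)

# Simulate: n = 50 Rayleigh(sigma=2) lifetimes, stop after r = 30 failures
n, r, sigma = 50, 30, 2.0
sample = np.sort(sigma * np.sqrt(-2 * np.log(rng.uniform(size=n))))[:r]
print(rayleigh_mle_type2(sample, n))  # true theta = sigma^2 = 4
```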

  19. Highly accelerated cardiovascular MR imaging using many channel technology: concepts and clinical applications

    International Nuclear Information System (INIS)

    Niendorf, Thoralf; Sodickson, Daniel K.

    2008-01-01

    Cardiovascular magnetic resonance imaging (CVMRI) is of proven clinical value in the non-invasive imaging of cardiovascular diseases. CVMRI requires rapid image acquisition, but acquisition speed is fundamentally limited in conventional MRI. Parallel imaging provides a means for increasing acquisition speed and efficiency. However, signal-to-noise (SNR) limitations and the limited number of receiver channels available on most MR systems have in the past imposed practical constraints, which dictated the use of moderate accelerations in CVMRI. High levels of acceleration, which were unattainable previously, have become possible with many-receiver MR systems and many-element, cardiac-optimized RF-coil arrays. The resulting imaging speed improvements can be exploited in a number of ways, ranging from enhancement of spatial and temporal resolution to efficient whole heart coverage to streamlining of CVMRI work flow. In this review, examples of these strategies are provided, following an outline of the fundamentals of the highly accelerated imaging approaches employed in CVMRI. Topics discussed include basic principles of parallel imaging; key requirements for MR systems and RF-coil design; practical considerations of SNR management, supported by multi-dimensional accelerations, 3D noise averaging and high field imaging; highly accelerated clinical state-of-the art cardiovascular imaging applications spanning the range from SNR-rich to SNR-limited; and current trends and future directions. (orig.)

  20. High-resolution typing of Chlamydia trachomatis: epidemiological and clinical uses.

    Science.gov (United States)

    de Vries, Henry J C; Schim van der Loeff, Maarten F; Bruisten, Sylvia M

    2015-02-01

    A state-of-the-art overview of molecular Chlamydia trachomatis typing methods that are used for routine diagnostics and scientific studies. Molecular epidemiology uses high-resolution typing techniques such as multilocus sequence typing, multilocus variable number of tandem repeats analysis, and whole-genome sequencing to identify strains based on their DNA sequence. These data can be used for cluster, network and phylogenetic analyses, and are used to unveil transmission networks, risk groups, and evolutionary pathways. High-resolution typing of C. trachomatis strains is applied to monitor treatment efficacy and re-infections, and to study the recent emergence of lymphogranuloma venereum (LGV) amongst men who have sex with men in high-income countries. Chlamydia strain typing has clinical relevance in disease management, as LGV needs longer treatment than non-LGV C. trachomatis. It has also led to the discovery of a new variant Chlamydia strain in Sweden, which was not detected by some commercial C. trachomatis diagnostic platforms. After a brief history and comparison of the various Chlamydia typing methods, the applications of the current techniques are described and future endeavors to extend scientific understanding are formulated. High-resolution typing will likely help to further unravel the pathophysiological mechanisms behind the wide clinical spectrum of chlamydial disease.

  1. High Intrapatient Variability of Tacrolimus Levels and Outpatient Clinic Nonattendance Are Associated With Inferior Outcomes in Renal Transplant Patients

    Directory of Open Access Journals (Sweden)

    Dawn L. Goodall, MSc

    2017-08-01

    Conclusions. This study shows that high tacrolimus IPV and clinic nonattendance are associated with inferior allograft survival. Interventions to minimize the causes of high variability, particularly nonadherence, are essential to improve long-term allograft outcomes.

  2. Clinical Application of Esophageal High-resolution Manometry in the Diagnosis of Esophageal Motility Disorders.

    Science.gov (United States)

    van Hoeij, Froukje B; Bredenoord, Albert J

    2016-01-31

    Esophageal high-resolution manometry (HRM) is replacing conventional manometry in the clinical evaluation of patients with esophageal symptoms, especially dysphagia. The introduction of HRM gave rise to new objective metrics and recognizable patterns of esophageal motor function, requiring a new classification scheme: the Chicago classification. HRM measurements are more detailed and more easily performed compared to conventional manometry. The visual presentation of acquired data improved the analysis and interpretation of esophageal motor function. This led to a more sensitive, accurate, and objective analysis of esophageal motility. In this review we discuss how HRM changed the way we define and categorize esophageal motility disorders. Moreover, we discuss the clinical applications of HRM for each esophageal motility disorder separately.

  3. Efficient algorithms for maximum likelihood decoding in the surface code

    Science.gov (United States)

    Bravyi, Sergey; Suchara, Martin; Vargo, Alexander

    2014-09-01

    We describe two implementations of the optimal error correction algorithm known as the maximum likelihood decoder (MLD) for the two-dimensional surface code with a noiseless syndrome extraction. First, we show how to implement MLD exactly in time O(n^2), where n is the number of code qubits. Our implementation uses a reduction from MLD to simulation of matchgate quantum circuits. This reduction however requires a special noise model with independent bit-flip and phase-flip errors. Secondly, we show how to implement MLD approximately for more general noise models using matrix product states (MPS). Our implementation has running time O(nχ^3), where χ is a parameter that controls the approximation precision. The key step of our algorithm, borrowed from the density matrix renormalization-group method, is a subroutine for contracting a tensor network on the two-dimensional grid. The subroutine uses MPS with a bond dimension χ to approximate the sequence of tensors arising in the course of contraction. We benchmark the MPS-based decoder against the standard minimum weight matching decoder observing a significant reduction of the logical error probability for χ ≥ 4.

  4. Maximum likelihood sequence estimation for optical complex direct modulation.

    Science.gov (United States)

    Che, Di; Yuan, Feng; Shieh, William

    2017-04-17

    Semiconductor lasers are versatile optical transmitters in nature. Through the direct modulation (DM), the intensity modulation is realized by the linear mapping between the injection current and the light power, while various angle modulations are enabled by the frequency chirp. Limited by the direct detection, DM lasers used to be exploited only as 1-D (intensity or angle) transmitters by suppressing or simply ignoring the other modulation. Nevertheless, through the digital coherent detection, simultaneous intensity and angle modulations (namely, 2-D complex DM, CDM) can be realized by a single laser diode. The crucial technique of CDM is the joint demodulation of intensity and differential phase with the maximum likelihood sequence estimation (MLSE), supported by a closed-form discrete signal approximation of frequency chirp to characterize the MLSE transition probability. This paper proposes a statistical method for the transition probability to significantly enhance the accuracy of the chirp model. Using the statistical estimation, we demonstrate the first single-channel 100-Gb/s PAM-4 transmission over 1600-km fiber with only 10G-class DM lasers.
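
MLSE of the kind described here is typically realized with the Viterbi algorithm: path metrics accumulate branch (transition) costs over a trellis of channel states. The sketch below applies generic Viterbi MLSE to a simple two-tap ISI channel with squared-distance branch metrics; it is a stand-in for, not a reproduction of, the paper's chirp-based transition-probability model, and all settings are illustrative.

```python
import numpy as np

def viterbi_mlse(y, h, symbols):
    """Generic Viterbi MLSE for y[k] = h[0]*x[k] + h[1]*x[k-1] + noise.
    Trellis state = previous symbol; branch metric = squared distance.
    (The unknown initial symbol x[-1] is approximated by a free state.)"""
    n_sym, K = len(symbols), len(y)
    cost = np.zeros(n_sym)                   # path metric per state
    back = np.zeros((K, n_sym), dtype=int)   # backpointers
    for k in range(K):
        new_cost = np.full(n_sym, np.inf)
        for s in range(n_sym):               # current symbol (new state)
            for p in range(n_sym):           # previous state
                pred = h[0] * symbols[s] + h[1] * symbols[p]
                c = cost[p] + (y[k] - pred) ** 2
                if c < new_cost[s]:
                    new_cost[s], back[k, s] = c, p
        cost = new_cost
    s = int(np.argmin(cost))                 # trace back the ML sequence
    seq = [symbols[s]]
    for k in range(K - 1, 0, -1):
        s = back[k, s]
        seq.append(symbols[s])
    return np.array(seq[::-1])

rng = np.random.default_rng(3)
x = rng.choice([-1.0, 1.0], size=20)
h = [1.0, 0.5]
y = h[0] * x + h[1] * np.concatenate(([0.0], x[:-1])) + 0.2 * rng.normal(size=20)
print(np.mean(viterbi_mlse(y, h, [-1.0, 1.0]) == x))  # fraction of symbols correct
```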

  5. Maximum likelihood estimation for cytogenetic dose-response curves

    International Nuclear Information System (INIS)

    Frome, E.L; DuFrain, R.J.

    1983-10-01

    In vitro dose-response curves are used to describe the relation between the yield of dicentric chromosome aberrations and radiation dose for human lymphocytes. The dicentric yields follow the Poisson distribution, and the expected yield depends on both the magnitude and the temporal distribution of the dose for low LET radiation. A general dose-response model that describes this relation has been obtained by Kellerer and Rossi using the theory of dual radiation action. The yield of elementary lesions is κ[γd + g(t, τ)d^2], where t is the time and d is dose. The coefficient of the d^2 term is determined by the recovery function and the temporal mode of irradiation. Two special cases of practical interest are split-dose and continuous exposure experiments, and the resulting models are intrinsically nonlinear in the parameters. A general purpose maximum likelihood estimation procedure is described and illustrated with numerical examples from both experimental designs. Poisson regression analysis is used for estimation, hypothesis testing, and regression diagnostics. Results are discussed in the context of exposure assessment procedures for both acute and chronic human radiation exposure.
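
To make the estimation step concrete, the sketch below fits the acute-exposure special case λ(d) = αd + βd² (where the time factor g(t, τ) reduces to a constant absorbed into β) by direct minimization of the Poisson negative log-likelihood. The dose-response data are invented for illustration, not taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical acute-exposure data: dose (Gy), cells scored, dicentrics observed
dose  = np.array([0.5, 1.0, 2.0, 3.0, 4.0])
cells = np.array([2000, 1500, 1000, 800, 600])
dics  = np.array([15, 40, 110, 210, 330])

def neg_log_lik(params):
    """Poisson negative log-likelihood for the linear-quadratic yield
    lambda(d) = alpha*d + beta*d^2 per cell (constants dropped)."""
    alpha, beta = params
    lam = cells * (alpha * dose + beta * dose ** 2)  # expected dicentric counts
    if np.any(lam <= 0):
        return np.inf
    return np.sum(lam - dics * np.log(lam))

fit = minimize(neg_log_lik, x0=[0.01, 0.05], method="Nelder-Mead")
alpha_hat, beta_hat = fit.x
print(f"alpha = {alpha_hat:.4f}, beta = {beta_hat:.4f}")
```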

  6. Scale invariant for one-sided multivariate likelihood ratio tests

    Directory of Open Access Journals (Sweden)

    Samruam Chongcharoen

    2010-07-01

    Full Text Available Suppose X_1, X_2, ..., X_n is a random sample from a N_p(μ, V) distribution. Consider H_0: μ_1 = μ_2 = ... = μ_p = 0 and H_1: μ_i ≥ 0 for i = 1, 2, ..., p; let H_1 − H_0 denote the hypothesis that H_1 holds but H_0 does not, and let ~H_0 denote the hypothesis that H_0 does not hold. Because the likelihood ratio test (LRT) of H_0 versus H_1 − H_0 is complicated, several ad hoc tests have been proposed. Tang, Gnecco and Geller (1989) proposed an approximate LRT, Follmann (1996) suggested rejecting H_0 if the usual test of H_0 versus ~H_0 rejects H_0 with significance level 2α and a weighted sum of the sample means is positive, and Chongcharoen, Singh and Wright (2002) modified Follmann's test to include information about the correlation structure in the sum of the sample means. Chongcharoen and Wright (2007, 2006) give versions of the Tang-Gnecco-Geller tests and Follmann-type tests, respectively, with invariance properties. Since the LRT has the desired scale-invariance property, we investigate its power by using Monte Carlo techniques and compare it with the tests recommended in Chongcharoen and Wright (2007, 2006).

  7. Quantifying uncertainty, variability and likelihood for ordinary differential equation models

    LENUS (Irish Health Repository)

    Weisse, Andrea Y

    2010-10-28

    Background: In many applications, ordinary differential equation (ODE) models are subject to uncertainty or variability in initial conditions and parameters. Both uncertainty and variability can be quantified in terms of a probability density function on the state and parameter space. Results: The partial differential equation that describes the evolution of this probability density function has a form that is particularly amenable to application of the well-known method of characteristics. The value of the density at some point in time is directly accessible by the solution of the original ODE extended by a single extra dimension (for the value of the density). This leads to simple methods for studying uncertainty, variability and likelihood, with significant advantages over more traditional Monte Carlo and related approaches, especially when studying regions with low probability. Conclusions: While such approaches based on the method of characteristics are common practice in other disciplines, their advantages for the study of biological systems have so far remained unrecognized. Several examples illustrate performance and accuracy of the approach and its limitations.
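
The core trick is to append one extra state carrying log p along each trajectory, so the Liouville relation d(log p)/dt = −∇·f is integrated by the same ODE solver as the model itself. A minimal sketch for a 1-D logistic growth model (the model, parameters, and initial density are arbitrary choices for illustration):

```python
import numpy as np
from scipy.integrate import solve_ivp

r, K = 1.0, 10.0

def f(x):
    return r * x * (1 - x / K)     # logistic growth ODE

def div_f(x):
    return r * (1 - 2 * x / K)     # df/dx, the divergence in 1-D

def augmented(t, y):
    """Original ODE extended by one dimension carrying log p along the
    characteristic: d(log p)/dt = -div f (Liouville equation)."""
    x, logp = y
    return [f(x), -div_f(x)]

# Propagate an initial (unnormalized) Gaussian log-density along characteristics
x0_grid = np.linspace(2.0, 6.0, 9)
logp0 = -0.5 * ((x0_grid - 4.0) / 0.5) ** 2   # log N(4, 0.5^2), up to a constant

for x0, lp0 in zip(x0_grid, logp0):
    sol = solve_ivp(augmented, (0, 2), [x0, lp0], rtol=1e-8)
    print(f"x(2) = {sol.y[0, -1]:6.3f}   log p = {sol.y[1, -1]:7.3f}")
```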

  8. Affective mapping: An activation likelihood estimation (ALE) meta-analysis.

    Science.gov (United States)

    Kirby, Lauren A J; Robinson, Jennifer L

    2017-11-01

    Functional neuroimaging has the spatial resolution to explain the neural basis of emotions. Activation likelihood estimation (ALE), as opposed to traditional qualitative meta-analysis, quantifies convergence of activation across studies within affective categories. Others have used ALE to investigate a broad range of emotions, but without the convenience of the BrainMap database. We used the BrainMap database and analysis resources to run separate meta-analyses on coordinates reported for anger, anxiety, disgust, fear, happiness, humor, and sadness. Resultant ALE maps were compared to determine areas of convergence between emotions, as well as to identify affect-specific networks. Five out of the seven emotions demonstrated consistent activation within the amygdala, whereas all emotions consistently activated the right inferior frontal gyrus, which has been implicated as an integration hub for affective and cognitive processes. These data provide the framework for models of affect-specific networks, as well as emotional processing hubs, which can be used for future studies of functional or effective connectivity. Copyright © 2015 Elsevier Inc. All rights reserved.

  9. Dark matter CMB constraints and likelihoods for poor particle physicists

    Energy Technology Data Exchange (ETDEWEB)

    Cline, James M.; Scott, Pat, E-mail: jcline@physics.mcgill.ca, E-mail: patscott@physics.mcgill.ca [Department of Physics, McGill University, 3600 rue University, Montréal, QC, H3A 2T8 (Canada)

    2013-03-01

    The cosmic microwave background provides constraints on the annihilation and decay of light dark matter at redshifts between 100 and 1000, the strength of which depends upon the fraction of energy ending up in the form of electrons and photons. The resulting constraints are usually presented for a limited selection of annihilation and decay channels. Here we provide constraints on the annihilation cross section and decay rate, at discrete values of the dark matter mass m_χ, for all the annihilation and decay channels whose secondary spectra have been computed using PYTHIA in arXiv:1012.4515 (''PPPC 4 DM ID: a poor particle physicist cookbook for dark matter indirect detection''), namely e, μ, τ, V → e, V → μ, V → τ, u, d, s, c, b, t, γ, g, W, Z and h. By interpolating in mass, these can be used to find the CMB constraints and likelihood functions from WMAP7 and Planck for a wide range of dark matter models, including those with annihilation or decay into a linear combination of different channels.
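
The interpolation-in-mass usage the authors describe might look like the following: store the tabulated per-channel limits and interpolate in log-log space. The mass grid and limit values below are placeholders, not the paper's numbers, and the channel-combination rule in the closing comment is a common approximation, not a quote of the paper's prescription.

```python
import numpy as np

# Hypothetical tabulated upper limits on <sigma v> (cm^3/s) at discrete
# dark-matter masses (GeV) for one annihilation channel; values illustrative.
m_grid     = np.array([5.0, 10.0, 50.0, 100.0, 500.0, 1000.0])
sigv_limit = np.array([1e-27, 2e-27, 1e-26, 2e-26, 1e-25, 2e-25])

def limit_at(m):
    """Interpolate the tabulated CMB limit in (log10 mass, log10 limit)."""
    return 10 ** np.interp(np.log10(m), np.log10(m_grid), np.log10(sigv_limit))

print(limit_at(70.0))   # interpolated limit for a 70 GeV candidate

# A model annihilating into a mix of channels could combine per-channel limits
# weighted by branching fractions, e.g. 1 / sum(BR_i / limit_i(m)).
```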

  10. Dark matter CMB constraints and likelihoods for poor particle physicists

    International Nuclear Information System (INIS)

    Cline, James M.; Scott, Pat

    2013-01-01

    The cosmic microwave background provides constraints on the annihilation and decay of light dark matter at redshifts between 100 and 1000, the strength of which depends upon the fraction of energy ending up in the form of electrons and photons. The resulting constraints are usually presented for a limited selection of annihilation and decay channels. Here we provide constraints on the annihilation cross section and decay rate, at discrete values of the dark matter mass m_χ, for all the annihilation and decay channels whose secondary spectra have been computed using PYTHIA in arXiv:1012.4515 (''PPPC 4 DM ID: a poor particle physicist cookbook for dark matter indirect detection''), namely e, μ, τ, V → e, V → μ, V → τ, u, d, s, c, b, t, γ, g, W, Z and h. By interpolating in mass, these can be used to find the CMB constraints and likelihood functions from WMAP7 and Planck for a wide range of dark matter models, including those with annihilation or decay into a linear combination of different channels.

  11. Maximum likelihood estimation for cytogenetic dose-response curves

    International Nuclear Information System (INIS)

    Frome, E.L.; DuFrain, R.J.

    1986-01-01

    In vitro dose-response curves are used to describe the relation between chromosome aberrations and radiation dose for human lymphocytes. The lymphocytes are exposed to low-LET radiation, and the resulting dicentric chromosome aberrations follow the Poisson distribution. The expected yield depends on both the magnitude and the temporal distribution of the dose. A general dose-response model that describes this relation has been presented by Kellerer and Rossi (1972, Current Topics on Radiation Research Quarterly 8, 85-158; 1978, Radiation Research 75, 471-488) using the theory of dual radiation action. Two special cases of practical interest are split-dose and continuous exposure experiments, and the resulting dose-time-response models are intrinsically nonlinear in the parameters. A general-purpose maximum likelihood estimation procedure is described, and estimation for the nonlinear models is illustrated with numerical examples from both experimental designs. Poisson regression analysis is used for estimation, hypothesis testing, and regression diagnostics. Results are discussed in the context of exposure assessment procedures for both acute and chronic human radiation exposure

  12. Maximum likelihood estimation for cytogenetic dose-response curves

    Energy Technology Data Exchange (ETDEWEB)

    Frome, E.L; DuFrain, R.J.

    1983-10-01

    In vitro dose-response curves are used to describe the relation between the yield of dicentric chromosome aberrations and radiation dose for human lymphocytes. The dicentric yields follow the Poisson distribution, and the expected yield depends on both the magnitude and the temporal distribution of the dose for low LET radiation. A general dose-response model that describes this relation has been obtained by Kellerer and Rossi using the theory of dual radiation action. The yield of elementary lesions is κ[γd + g(t, τ)d^2], where t is the time and d is dose. The coefficient of the d^2 term is determined by the recovery function and the temporal mode of irradiation. Two special cases of practical interest are split-dose and continuous exposure experiments, and the resulting models are intrinsically nonlinear in the parameters. A general purpose maximum likelihood estimation procedure is described and illustrated with numerical examples from both experimental designs. Poisson regression analysis is used for estimation, hypothesis testing, and regression diagnostics. Results are discussed in the context of exposure assessment procedures for both acute and chronic human radiation exposure.

  13. Maximum likelihood approach for several stochastic volatility models

    International Nuclear Information System (INIS)

    Camprodon, Jordi; Perelló, Josep

    2012-01-01

    Volatility measures the amplitude of price fluctuations. Despite being one of the most important quantities in finance, volatility is not directly observable. Here we apply a maximum likelihood method which assumes that price and volatility follow a two-dimensional diffusion process where volatility is the stochastic diffusion coefficient of the log-price dynamics. We apply this method to the simplest versions of the expOU, the OU and the Heston stochastic volatility models and we study their performance in terms of the log-price probability, the volatility probability, and its Mean First-Passage Time. The approach has some predictive power on the future returns amplitude by only knowing the current volatility. The assumed models do not consider long-range volatility autocorrelation and the asymmetric return-volatility cross-correlation, but the method still yields very naturally these two important stylized facts. We apply the method to different market indices, with good performance in all cases. (paper)

  14. Clinical, Endoscopic, and Radiologic Features of Three Subtypes of Achalasia, Classified Using High-Resolution Manometry

    Science.gov (United States)

    Khan, Mohammed Q.; AlQaraawi, Abdullah; Al-Sohaibani, Fahad; Al-Kahtani, Khalid; Al-Ashgar, Hamad I.

    2015-01-01

    Background/Aims: High-resolution manometry (HRM) has improved the accuracy of manometry in detecting achalasia and determining its subtypes. However, the correlation of achalasia subtypes with clinical, endoscopic, and radiologic findings has not been assessed. We aimed to evaluate and compare the clinical, endoscopic, and fluoroscopy findings associated with three subtypes of achalasia using HRM. Patients and Methods: The retrospective clinical data, HRM, endoscopy, and radiologic findings were obtained from the medical records of untreated achalasia patients. Results: From 2011 to 2013, 374 patients underwent HRM. Fifty-two patients (14%) were diagnosed with achalasia, but only 32 (8.5%) of these patients had not received treatment and were therefore included in this study. The endoscopy results were normal in 28% of the patients, and a barium swallow was inconclusive in 31% of the achalasia patients. Ten patients (31%) were classified as having type I achalasia, 17 (53%) were classified as type II, and 5 (16%) were classified as type III. Among the three subtypes, type I patients were on average the youngest and had the longest history of dysphagia, mildest chest pain, most significant weight loss, and most dilated esophagus with residual food. Chest pain was most common in type III patients, and frequently had normal fluoroscopic and endoscopic results. Conclusion: The clinical, radiologic, and endoscopic findings were not significantly different between patients with type I and type II untreated achalasia. Type III patients had the most severe symptoms and were the most difficult to diagnose based on varied clinical, radiologic, and endoscopic findings. PMID:26021774

  15. Scoring clinical signs can help diagnose canine visceral leishmaniasis in a highly endemic area in Brazil

    Directory of Open Access Journals (Sweden)

    Kleverton Ribeiro da Silva

    Full Text Available Canine visceral leishmaniasis (CVL) diagnosis is still a challenge in endemic areas with limited diagnostic resources. This study proposes a score with the potential to distinguish positive CVL cases from negative ones. We studied 265 dogs that tested positive for CVL on ELISA and parasitological tests. A score ranging between 0 and 19 was recorded on the basis of clinical signs. Dogs with CVL had an overall higher positivity of the majority of clinical signs than did dogs without CVL or with ehrlichiosis. Clinical signs such as enlarged lymph nodes (83.93%), muzzle/ear lesions (55.36%), nutritional status (51.79%), bristle condition (57.14%), pale mucosal colour (48.21%), onychogryphosis (58.93%), skin lesion (39.28%), bleeding (12.50%), muzzle depigmentation (41.07%), alopecia (39.29%), blepharitis (21.43%), and keratoconjunctivitis (42.86%) were more frequent in dogs with CVL than in dogs with ehrlichiosis or without CVL. Moreover, the clinical score increased according to the positivity of all diagnostic tests (ELISA, p < 0.001; parasite culture, p = 0.0021; and smear, p = 0.0003). Onychogryphosis (long nails) [odds ratio (OR): 3.529; 95% confidence interval (CI): 1.832-6.796; p < 0.001], muzzle depigmentation (OR: 4.651; 95% CI: 2.218-9.750; p < 0.001), and keratoconjunctivitis (OR: 5.400; 95% CI: 2.549-11.441; p < 0.001) were highly associated with CVL. Interestingly, a score cut-off value ≥ 6 had an area under the curve of 0.717 (p < 0.0001), sensitivity of 60.71%, and specificity of 73.64% for CVL diagnosis. The clinical sign-based score for CVL diagnosis suggested herein can help veterinarians reliably identify dogs with CVL in endemic areas with limited diagnostic resources.
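
The cut-off analysis reported above (sensitivity, specificity, and AUC at score ≥ 6) is mechanical once per-dog scores and CVL status are available. A sketch with simulated score distributions follows; the distributions and counts are invented, and only the cut-off logic mirrors the study.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)

# Hypothetical clinical scores (0-19) for CVL-positive and CVL-negative dogs;
# the distributions are chosen only to illustrate the analysis.
pos = rng.binomial(19, 0.40, size=265)   # dogs with CVL
neg = rng.binomial(19, 0.25, size=220)   # dogs without CVL

scores = np.concatenate([pos, neg])
labels = np.concatenate([np.ones(pos.size), np.zeros(neg.size)])

cutoff = 6                               # score >= 6 flags CVL
sens = np.mean(pos >= cutoff)            # sensitivity among true positives
spec = np.mean(neg < cutoff)             # specificity among true negatives
auc = roc_auc_score(labels, scores)      # area under the ROC curve
print(f"sensitivity={sens:.2f} specificity={spec:.2f} AUC={auc:.3f}")
```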

  16. Record High US Measles Cases: Patient Vaccination, Clinical Assessment and Management

    Centers for Disease Control (CDC) Podcasts

    This podcast is an overview of the Clinician Outreach and Communication Activity (COCA) Call: Record High US Measles Cases: Patient Vaccination, Clinical Assessment and Management. In May 2014, the United States recorded the largest number of reported measles cases since 1994 and the number continues to rise. Most cases reported have been acquired in the U.S. and are associated with importations from countries where measles is still common. This highly contagious, acute viral illness spreads quickly in unvaccinated populations once reaching the U.S. The recent measles outbreaks highlight the importance of maintaining high vaccination coverage in the U.S. and ensuring age-appropriate vaccination for international travelers. During this COCA call, clinicians will learn the status of measles in the U.S. and CDC vaccination recommendations and guidelines for patient assessment and management.

  17. Neuroanatomical and Symptomatic Sex Differences in Individuals at Clinical High Risk for Psychosis

    Directory of Open Access Journals (Sweden)

    Elisa Guma

    2017-12-01

    Full Text Available Sex differences have been widely observed in clinical presentation, functional outcome and neuroanatomy in individuals with a first-episode of psychosis, and chronic patients suffering from schizophrenia. However, little is known about sex differences in the high-risk stages for psychosis. The present study investigated sex differences in cortical and subcortical neuroanatomy in individuals at clinical high risk (CHR) for psychosis and healthy controls (CTL), and the relationship between anatomy and clinical symptoms in males at CHR. Magnetic resonance images were collected in 26 individuals at CHR (13 men) and 29 CTLs (15 men) to determine total and regional brain volumes and morphology, cortical thickness, and surface area (SA). Clinical symptoms were assessed with the brief psychiatric rating scale. Significant sex-by-diagnosis interactions were observed with opposite directions of effect in male and female CHR subjects relative to their same-sex controls in multiple cortical and subcortical areas. The right postcentral, left superior parietal, inferior parietal supramarginal, and angular gyri [<5% false discovery rate (FDR)] were thicker in male and thinner in female CHR subjects compared with their same-sex CTLs. The same pattern was observed in the right superior parietal gyrus SA at the regional and vertex level. Using a recently developed surface-based morphology pipeline, we observed sex-specific shape differences in the left hippocampus (<5% FDR) and amygdala (<10% FDR). Negative symptom burden was significantly higher in male compared with female CHR subjects (p = 0.04) and was positively associated with areal expansion of the left amygdala in males (<5% FDR). Some limitations of the study include the sample size, and data acquisition at 1.5 T. This study demonstrates neuroanatomical sex differences in CHR subjects, which may be associated with variations in symptomatology in men and women with psychotic symptoms.

  18. [Diagnostic imaging of high-grade astrocytoma: heterogeneity of clinical manifestation, image characteristics, and histopathological findings].

    Science.gov (United States)

    Okajima, Kaoru; Ohta, Yoshio

    2012-10-01

    Recent developments in diagnostic radiology, which have enabled accurate differential diagnoses of brain tumors, have been well described in the last three decades. MR and PET imaging can also provide information to predict histological grades and prognoses that might influence treatment strategies. However, high-grade astrocytomas consist of many different subtypes that are associated with different imaging and histological characteristics. Hemorrhage and necrosis result in a variety of imaging features, and infiltrative tumor growth entrapping normal neurons may cause different clinical manifestations. We reviewed patients with high-grade astrocytomas that showed various imaging characteristics, with special emphasis on initial symptoms and histological features. Clinicopathological characteristics of astrocytomas were also compared with other malignant tumors. Neurological deficits were not notable in patients with grade 3-4 astrocytomas when they showed infiltrative tumor growth, while brain metastases with compact cellular proliferation caused more neurological symptoms. Infiltrative tumors did not show any enhancing masses on MR imaging, but these tumors may show intratumor heterogeneity. Seizures were reported to be more frequent in low-grade glioma and in secondary glioblastoma. Tumor heterogeneity was also reported in molecular genetic profiles, and investigators have identified some subsets of astrocytomas by examining IDH1/2 mutation, EGFR amplification, TP53 mutation, Ki-67 index, etc. In summary, high-grade astrocytomas are not a homogeneous group of tumors, and this is associated with the heterogeneity of clinical manifestation, image characteristics, and histopathological findings. Molecular studies may explain this tumor heterogeneity in the near future.

  19. Observation Likelihood Model Design and Failure Recovery Scheme toward Reliable Localization of Mobile Robots

    Directory of Open Access Journals (Sweden)

    Chang-bae Moon

    2011-01-01

    Full Text Available Although there has been much research on mobile robot localization, it is still difficult to obtain reliable localization performance in a human co-existing real environment. Reliability of localization is highly dependent upon the developer's experience because uncertainty arises for a variety of reasons. We have developed a range sensor based integrated localization scheme for various indoor service robots. Through this experience, we found that there are several significant experimental issues. In this paper, we provide useful solutions for the following questions, which are frequently faced in practical applications: 1) How to design an observation likelihood model? 2) How to detect localization failure? 3) How to recover from localization failure? We present design guidelines for the observation likelihood model. Localization failure detection and recovery schemes are presented, focusing on abrupt wheel slippage. Experiments were carried out in a typical office building environment. The proposed scheme to identify the localizer status is useful in practical environments. Moreover, the semi-global localization is a computationally efficient recovery scheme from localization failure. The results of experiments and analysis clearly present the usefulness of the proposed solutions.

  1. Likelihood of women vs. men to receive bachelor's degrees in physics at Stanford, 1900-1929.

    Science.gov (United States)

    Nero, Anthony

    2005-04-01

    Work by K. Tolley indicates that girls in mid to late 19th century U.S. high schools were more likely to study mathematics and natural philosophy (i.e., physics and astronomy) than were boys (who pursued the classics).* She also found that after the turn of the century women were more likely than men to receive bachelor's degrees in math and biological sciences at Stanford, but her sampling of every fifth year yielded too few data to be conclusive about physics. Reexamination of graduation lists at Stanford, yielding data for each year from 1900 to 1929, shows that, while absolute numbers were small, women were as likely as men to receive bachelor's degrees in physics during the first decade of the century, in the second decade they were notably more likely, and in the third their likelihood decreased substantially, while that of men rose to exceed that of women. (Women were much more likely to receive bachelor's degrees in math, exceeding the likelihood for men by an order of magnitude during the second and third decades.) *K. Tolley, The Science Education of American Girls: A Historical Perspective (Routledge, N.Y.), 2003.

  2. Extending the Applicability of the Generalized Likelihood Function for Zero-Inflated Data Series

    Science.gov (United States)

    Oliveira, Debora Y.; Chaffe, Pedro L. B.; Sá, João. H. M.

    2018-03-01

    Proper uncertainty estimation for data series with a high proportion of zero and near zero observations has been a challenge in hydrologic studies. This technical note proposes a modification to the Generalized Likelihood function that accounts for zero inflation of the error distribution (ZI-GL). We compare the performance of the proposed ZI-GL with the original Generalized Likelihood function using the entire data series (GL) and by simply suppressing zero observations (GLy>0). These approaches were applied to two interception modeling examples characterized by data series with a significant number of zeros. The ZI-GL produced better uncertainty ranges than the GL as measured by the precision, reliability and volumetric bias metrics. The comparison between ZI-GL and GLy>0 highlights the need for further improvement in the treatment of residuals from near zero simulations when a linear heteroscedastic error model is considered. Aside from the interception modeling examples illustrated herein, the proposed ZI-GL may be useful for other hydrologic studies, such as for the modeling of the runoff generation in hillslopes and ephemeral catchments.
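
    The note's central idea, treating zero observations with a point mass and nonzero residuals with a continuous error density, can be sketched as follows. This is a minimal illustration with a plain Gaussian error model; the published ZI-GL builds on the more flexible Generalized Likelihood error distribution, so the names and parameters here are placeholders.

```python
import numpy as np
from scipy import stats

def zi_log_likelihood(obs, sim, pi0, sigma):
    """Zero-inflated log-likelihood: zero observations contribute a point
    mass pi0; nonzero observations contribute a continuous error density
    (a plain Gaussian here, standing in for the generalized model)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    zero = obs == 0.0
    ll = np.count_nonzero(zero) * np.log(pi0)
    resid = obs[~zero] - sim[~zero]
    ll += np.sum(np.log1p(-pi0) + stats.norm.logpdf(resid, scale=sigma))
    return ll

obs = np.array([0.0, 0.0, 0.4, 1.2, 0.0, 2.0])   # e.g. observed depths
sim = np.array([0.1, 0.0, 0.5, 1.0, 0.2, 1.8])   # model simulations
print(zi_log_likelihood(obs, sim, pi0=0.45, sigma=0.3))
```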

  3. PIRPLE: a penalized-likelihood framework for incorporation of prior images in CT reconstruction

    International Nuclear Information System (INIS)

    Stayman, J Webster; Dang, Hao; Ding, Yifu; Siewerdsen, Jeffrey H

    2013-01-01

    Over the course of diagnosis and treatment, it is common for a number of imaging studies to be acquired. Such imaging sequences can provide substantial patient-specific prior knowledge about the anatomy that can be incorporated into a prior-image-based tomographic reconstruction for improved image quality and better dose utilization. We present a general methodology using a model-based reconstruction approach including formulations of the measurement noise that also integrates prior images. This penalized-likelihood technique adopts a sparsity enforcing penalty that incorporates prior information yet allows for change between the current reconstruction and the prior image. Moreover, since prior images are generally not registered with the current image volume, we present a modified model-based approach that seeks a joint registration of the prior image in addition to the reconstruction of projection data. We demonstrate that the combined prior-image- and model-based technique outperforms methods that ignore the prior data or lack a noise model. Moreover, we demonstrate the importance of registration for prior-image-based reconstruction methods and show that the prior-image-registered penalized-likelihood estimation (PIRPLE) approach can maintain a high level of image quality in the presence of noisy and undersampled projection data. (paper)
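
    A hedged sketch of the kind of objective PIRPLE minimizes: a weighted quadratic surrogate for the measurement log-likelihood plus sparsity-enforcing penalties on image roughness and on deviation from the prior image. The actual method also jointly estimates a registration of the prior; that step is omitted here and every symbol is illustrative.

```python
import numpy as np

def pirple_style_objective(x, A, y, w, x_prior, beta_r, beta_p):
    """Illustrative objective in the spirit of PIRPLE (1-D toy):
    weighted least-squares data fit (a quadratic surrogate for the Poisson
    log-likelihood) + a roughness penalty + a sparse-change penalty that
    pulls the estimate toward the (already registered) prior image."""
    data_fit = 0.5 * np.sum(w * (A @ x - y) ** 2)
    roughness = np.sum(np.abs(np.diff(x)))      # total-variation-like term
    prior_dev = np.sum(np.abs(x - x_prior))     # sparse deviation from prior
    return data_fit + beta_r * roughness + beta_p * prior_dev

rng = np.random.default_rng(0)
A = rng.random((30, 16))                   # toy system matrix (rays x pixels)
x_true = np.concatenate([np.zeros(8), np.ones(8)])
y = A @ x_true + rng.normal(0, 0.05, 30)   # noisy projections
print(pirple_style_objective(x_true, A, y, np.ones(30), x_true, 0.1, 0.1))
```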

  4. [Clinical Characteristics of Rhegmatogenous Retinal Detachment in Highly Myopic and Phakic Eyes].

    Science.gov (United States)

    Orihara, Tadashi; Hirota, Kazunari; Yokota, Reiji; Kunita, Daisuke; Itoh, Yuji; Rii, Tosho; Koto, Takashi; Hiraoka, Tomoyuki; Inoue, Makoto; Hirakata, Akito

    2016-05-01

    To evaluate the clinical characteristics of rhegmatogenous retinal detachment in highly myopic, phakic eyes. The subjects were 1174 phakic eyes with rhegmatogenous retinal detachment among 1199 eyes that underwent initial vitreoretinal surgery between April 2006 and March 2011. Eyes with macular hole retinal detachment or secondary retinal detachment were excluded. The 486 eyes with high myopia (spherical equivalent ≤ -6.0 D or axial length ≥ 26.5 mm) were compared with the 688 eyes without high myopia. The mean age was significantly younger in the high myopia group (42.7 ± 14.2 years), and retinal detachment and lattice degeneration in the fellow eyes were more frequent (16.7% and 20.4%, respectively). The incidences of retinal detachment at a younger age and of detachments caused by retinal holes were higher in the high myopia group. The higher incidence of retinal detachment and lattice degeneration in the fellow eyes of the high myopia group indicates that careful observation of the fellow eyes is also recommended.

  5. Comparative behaviour of the Dynamically Penalized Likelihood algorithm in inverse radiation therapy planning

    Energy Technology Data Exchange (ETDEWEB)

    Llacer, Jorge [EC Engineering Consultants, LLC, Los Gatos, CA (United States)]. E-mail: jllacer@home.com; Solberg, Timothy D. [Department of Radiation Oncology, University of California, Los Angeles, CA (United States)]. E-mail: Solberg@radonc.ucla.edu; Promberger, Claus [BrainLAB AG, Heimstetten (Germany)]. E-mail: promberg@brainlab.com

    2001-10-01

    This paper presents a description of tests carried out to compare the behaviour of five algorithms in inverse radiation therapy planning: (1) The Dynamically Penalized Likelihood (DPL), an algorithm based on statistical estimation theory; (2) an accelerated version of the same algorithm; (3) a new fast adaptive simulated annealing (ASA) algorithm; (4) a conjugate gradient method; and (5) a Newton gradient method. A three-dimensional mathematical phantom and two clinical cases have been studied in detail. The phantom consisted of a U-shaped tumour with a partially enclosed 'spinal cord'. The clinical examples were a cavernous sinus meningioma and a prostate case. The algorithms have been tested in carefully selected and controlled conditions so as to ensure fairness in the assessment of results. It has been found that all five methods can yield relatively similar optimizations, except when a very demanding optimization is carried out. For the easier cases, the differences are principally in robustness, ease of use and optimization speed. In the more demanding case, there are significant differences in the resulting dose distributions. The accelerated DPL emerges as possibly the algorithm of choice for clinical practice. An appendix describes the differences in behaviour between the new ASA method and the one based on a patent by the Nomos Corporation. (author)
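
    For readers unfamiliar with inverse planning, the toy problem below shows the general shape of what all five algorithms optimize: nonnegative beamlet weights chosen so the delivered dose matches a prescription while overdose to an organ at risk is penalized. It uses an off-the-shelf conjugate-gradient optimizer; it is not the DPL algorithm, and every matrix and constant is fabricated for illustration.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
A = rng.random((40, 8))             # dose-influence matrix (voxels x beamlets)
target = np.ones(40)                # prescribed dose per voxel
oar = rng.random(40) < 0.25         # mask of organ-at-risk voxels

def objective(w):
    dose = A @ np.maximum(w, 0.0)   # crude nonnegativity of beamlet weights
    fit = np.sum((dose - target) ** 2)
    overdose = np.sum(np.maximum(dose[oar] - 0.5, 0.0) ** 2)
    return fit + 10.0 * overdose

res = minimize(objective, np.full(8, 0.1), method="CG")  # conjugate gradient
print(round(res.fun, 3), np.round(np.maximum(res.x, 0.0), 3))
```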

  6. Ultra-sensitive high performance liquid chromatography-laser-induced fluorescence based proteomics for clinical applications.

    Science.gov (United States)

    Patil, Ajeetkumar; Bhat, Sujatha; Pai, Keerthilatha M; Rai, Lavanya; Kartha, V B; Chidangil, Santhosh

    2015-09-08

    In recent years, proteomics techniques have advanced tremendously in the life and medical sciences for the detection and identification of proteins in body fluids, tissue homogenates and cellular samples, with the aim of understanding the biochemical mechanisms leading to different diseases; methods include high performance liquid chromatography, 2D-gel electrophoresis, MALDI-TOF-MS, SELDI-TOF-MS, CE-MS and LC-MS. An ultra-sensitive high performance liquid chromatography-laser-induced fluorescence (HPLC-LIF) based technique has been developed by our group at Manipal for screening, early detection, and staging of various cancers, using protein profiling of clinical samples such as body fluids, cellular specimens, and biopsy tissue. More than 300 protein profiles of different clinical samples (serum, saliva, cellular samples and tissue homogenates) from volunteers (normal, and different pre-malignant/malignant conditions) were recorded using this set-up. The protein profiles were analyzed using principal component analysis (PCA) to achieve objective detection and classification of malignant, premalignant and healthy conditions with high sensitivity and specificity. HPLC-LIF protein profiling combined with PCA, as a routine method for screening, diagnosis, and staging of cervical cancer and oral cancer, is discussed in this paper.
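
    The profiling-plus-PCA pipeline described here is straightforward to prototype. The sketch below, on synthetic stand-in chromatograms, reduces each profile with PCA and classifies the scores; the feature dimensions, class structure, and classifier choice are assumptions, not details from the paper.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for HPLC-LIF protein profiles: one intensity trace per
# sample; class 1 gets an extra peak in an arbitrary retention-time window.
rng = np.random.default_rng(1)
X = rng.normal(size=(60, 500))
X[30:, 120:140] += 1.5
y = np.repeat([0, 1], 30)           # 0 = healthy, 1 = malignant (synthetic)

clf = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
print(cross_val_score(clf, X, y, cv=5).mean())   # cross-validated accuracy
```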

  7. Clinical potentials of methylator phenotype in stage 4 high-risk neuroblastoma: an open challenge.

    Directory of Open Access Journals (Sweden)

    Barbara Banelli

    Approximately 20% of stage 4 high-risk neuroblastoma patients are alive and disease-free 5 years after disease onset, while the remainder experience rapid and fatal progression. Numerous findings underline the prognostic role of methylation of defined target genes in neuroblastoma, without taking into account the clinical and biological heterogeneity of this disease. In this report we investigated methylation of the PCDHB cluster, the most informative member of the "methylator phenotype" in neuroblastoma, hypothesizing that if this epigenetic mark can predict overall and progression-free survival in high-risk stage 4 neuroblastoma, it could be utilized to improve risk stratification of the patients, alone or in conjunction with the previously identified methylation of the SFN gene (14-3-3sigma), which can accurately predict outcome in these patients. We utilized univariate and multivariate models to compare the prognostic power, in terms of overall and progression-free survival, of PCDHB methylation quantitatively determined by pyrosequencing with that of other markers utilized for patient stratification, using methylation thresholds calculated on neuroblastoma at stages 1-4 and on stage 4, high-risk patients only. Our results indicate that PCDHB accurately distinguishes between high- and intermediate/low-risk stage 4 neuroblastoma, in agreement with the established risk stratification criteria. However, PCDHB cannot predict outcome in the subgroup of stage 4 patients at high risk, whereas methylation levels of SFN are suggestive of a "methylation gradient" associated with tumor aggressiveness, as suggested by the finding of a higher threshold that defines a subset of patients with extremely severe disease (OS <24 months). Because of the heterogeneity of neuroblastoma, we believe that clinically relevant methylation markers should be selected and tested on homogeneous groups of patients rather than on patients at all stages.

  8. Risk factors for high blood pressure in women attending menopause clinics in Italy.

    Science.gov (United States)

    2006-01-10

    We analysed risk factors for high blood pressure (BP) among women around menopause. Eligible women were consecutively attending first-level outpatient menopause clinics in Italy for general counseling or treatment of menopausal symptoms. During the visit, BP was measured three times. Women whose mean of the second and third diastolic BP measurements was >90 mmHg, and/or who reported any current pharmacological treatment for high BP, were considered hypertensive. Of the 45,204 women who entered the study with information on blood pressure, 12,150 had high BP. The odds ratios (OR) of high BP increased with age, and women with high BP were less educated than those without. In comparison with women reporting no regular physical activity, the multivariate OR of high BP was 0.93 (95% CI, 0.87-0.99) for women reporting regular activity. In comparison with peri-menopausal women, post-menopausal women were at increased risk (OR 1.14, 95% CI, 1.03-1.24), and the risk tended to increase with age at menopause. Current use of hormone replacement therapy (HRT) was associated with a lower risk of high BP (OR 0.88, 95% CI, 0.84-0.94). This large cross-sectional study suggests that, after taking into account the effect of age, post-menopausal women are at greater risk of high BP, but current HRT use slightly lowers the risk. Other determinants of high BP were low level of education, overweight, and low level of physical activity.

  9. The Intersection of Care Seeking and Clinical Capacity for Patients With Highly Pathogenic Avian Influenza A (H5N1) Virus in Indonesia: Knowledge and Treatment Practices of the Public and Physicians.

    Science.gov (United States)

    Kreslake, Jennifer M; Wahyuningrum, Yunita; Iuliano, Angela D; Storms, Aaron D; Lafond, Kathryn E; Mangiri, Amalya; Praptiningsih, Catharina Y; Safi, Basil; Uyeki, Timothy M; Storey, J Douglas

    2016-12-01

    Indonesia has the highest human mortality from highly pathogenic avian influenza (HPAI) A (H5N1) virus infection in the world. A survey of households (N=2520) measured treatment sources and beliefs among symptomatic household members. A survey of physicians (N=554) in various types of health care facilities measured knowledge, assessment and testing behaviors, and perceived clinical capacity. Households reported confidence in health care system capacity but infrequently sought treatment for potential HPAI H5N1 signs/symptoms. More clinicians were confident in their knowledge of diagnosis and treatment than in the adequacy of related equipment and resources at their facilities. Physicians expressed awareness of the HPAI H5N1 suspect case definition, yet expressed only moderate knowledge about questioning symptomatic patients regarding exposures. Self-reported likelihood of testing for HPAI H5N1 virus was high after learning of certain exposures. Knowledge of antiviral treatment was moderate, but it was higher among clinicians in puskesmas (community health centers). Physicians in private outpatient clinics, the most heavily used facilities, reported the lowest confidence in their diagnostic and treatment capabilities. Educational campaigns can encourage recall of possible poultry exposure when patients are experiencing signs/symptoms and can raise awareness of the effectiveness of antivirals to encourage people to seek health care. Clinicians may benefit from training regarding exposure assessment and referral procedures, particularly in private clinics. (Disaster Med Public Health Preparedness. 2016;10:838-847).

  10. Refractory coeliac disease in a country with a high prevalence of clinically-diagnosed coeliac disease.

    Science.gov (United States)

    Ilus, T; Kaukinen, K; Virta, L J; Huhtala, H; Mäki, M; Kurppa, K; Heikkinen, M; Heikura, M; Hirsi, E; Jantunen, K; Moilanen, V; Nielsen, C; Puhto, M; Pölkki, H; Vihriälä, I; Collin, P

    2014-02-01

    Refractory coeliac disease (RCD) is thought to be a rare disorder, but its accurate prevalence is unknown. We aimed to identify the prevalence of, and the risk factors for developing, RCD in a Finnish population where the clinical detection rate of coeliac disease is high. The study involved 11 hospital districts in Finland where the number of treated RCD patients (n = 44), clinically diagnosed coeliac disease patients (n = 12,243) and adult inhabitants (n = 1.7 million) was known. Clinical characteristics at the diagnosis of coeliac disease were compared between the RCD patients and patients with uncomplicated disease. The prevalence of RCD was 0.31% among diagnosed coeliac disease patients and 0.002% in the general population. Of the 44 enrolled RCD patients, 68% had type I and 23% type II; in 9% the type was undetermined. Compared with 886 patients with uncomplicated coeliac disease, the patients who later developed RCD were significantly older at the diagnosis of coeliac disease (median 56 vs 44 years). Patients with evolving RCD also had more severe symptoms at the diagnosis of coeliac disease, including weight loss in 36% (vs 16%, P = 0.001) and diarrhoea in 54% (vs 38%, P = 0.050). Refractory coeliac disease is very rare in the general population. Patients of male gender, older age, severe symptoms or seronegativity at the diagnosis of coeliac disease are at risk of future refractory coeliac disease and should be followed up carefully. © 2014 John Wiley & Sons Ltd.

  11. Smoking increases the likelihood of Helicobacter pylori treatment failure.

    Science.gov (United States)

    Itskoviz, David; Boltin, Doron; Leibovitzh, Haim; Tsadok Perets, Tsachi; Comaneshter, Doron; Cohen, Arnon; Niv, Yaron; Levi, Zohar

    2017-07-01

    Data regarding the impact of smoking on the success of Helicobacter pylori (H. pylori) eradication are conflicting, partially because sociodemographic status is associated with both smoking and H. pylori treatment success. We aimed to assess the effect of smoking on H. pylori eradication rates after controlling for sociodemographic confounders. Included were subjects aged 15 years or older with a first-time positive 13C-urea breath test (13C-UBT) between 2007 and 2014 who underwent a second 13C-UBT after receiving clarithromycin-based triple therapy. Data regarding age, gender, socioeconomic status (SES), smoking (current smokers or "never smoked"), and drug use were extracted from the Clalit health maintenance organization database. Out of 120,914 subjects with a positive first-time 13C-UBT, 50,836 (42.0%) underwent a second 13C-UBT. After excluding former smokers, 48,130 subjects remained eligible for analysis. The mean age was 44.3 ± 18.2 years; 69.2% were female, 87.8% were Jewish and 12.2% Arab, and 25.5% were current smokers. The overall eradication failure rate was 33.3%: 34.8% in current smokers and 32.8% in subjects who had never smoked. In multivariate analysis, eradication failure was positively associated with current smoking (odds ratio [OR] 1.15, 95% CI 1.10-1.20). In conclusion, smoking was found to significantly increase the likelihood of unsuccessful first-line treatment for H. pylori infection. Copyright © 2017 Editrice Gastroenterologica Italiana S.r.l. Published by Elsevier Ltd. All rights reserved.
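
    Adjusted odds ratios of the kind reported here come from multivariate logistic regression. A minimal sketch on synthetic data, assuming a made-up adjustment set and effect sizes:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 2000
smoking = rng.integers(0, 2, n)
age = rng.normal(44, 18, n)
female = rng.integers(0, 2, n)
# Synthetic outcome with a built-in smoking effect (log-odds 0.14 ~ OR 1.15).
logit = -0.7 + 0.14 * smoking + 0.005 * (age - 44) - 0.05 * female
failure = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

X = sm.add_constant(np.column_stack([smoking, age, female]))
fit = sm.Logit(failure.astype(float), X).fit(disp=0)
print(np.exp(fit.params[1]))         # adjusted OR for current smoking
print(np.exp(fit.conf_int()[1]))     # its 95% confidence interval
```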

  12. Obstetric History and Likelihood of Preterm Birth of Twins.

    Science.gov (United States)

    Easter, Sarah Rae; Little, Sarah E; Robinson, Julian N; Mendez-Figueroa, Hector; Chauhan, Suneet P

    2018-01-05

     The objective of this study was to investigate the relationship between preterm birth in a prior pregnancy and preterm birth in a twin pregnancy.  We performed a secondary analysis of a randomized controlled trial evaluating 17-α-hydroxyprogesterone caproate in twins. Women were classified as nulliparous, multiparous with a prior term birth, or multiparous with a prior preterm birth. We used logistic regression to examine the odds of spontaneous preterm birth of twins before 35 weeks according to past obstetric history.  Of the 653 women analyzed, 294 were nulliparas, 310 had a prior term birth, and 49 had a prior preterm birth. Prior preterm birth increased the likelihood of spontaneous delivery before 35 weeks (adjusted odds ratio [aOR]: 2.44, 95% confidence interval [CI]: 1.28-4.66), whereas prior term delivery decreased these odds (aOR: 0.55, 95% CI: 0.38-0.78) in the current twin pregnancy compared with the nulliparous reference group. This translated into a lower odds of composite neonatal morbidity (aOR: 0.38, 95% CI: 0.27-0.53) for women with a prior term delivery.  For women carrying twins, a history of preterm birth increases the odds of spontaneous preterm birth, whereas a prior term birth decreases odds of spontaneous preterm birth and neonatal morbidity for the current twin pregnancy. These results offer risk stratification and reassurance for clinicians. Thieme Medical Publishers 333 Seventh Avenue, New York, NY 10001, USA.

  13. High-resolution melt PCR analysis for genotyping of Ureaplasma parvum isolates directly from clinical samples.

    Science.gov (United States)

    Payne, Matthew S; Tabone, Tania; Kemp, Matthew W; Keelan, Jeffrey A; Spiller, O Brad; Newnham, John P

    2014-02-01

    Ureaplasma sp. infection in neonates and adults underlies a variety of disease pathologies. Of the two human Ureaplasma spp., Ureaplasma parvum is clinically the more common. We have developed a high-resolution melt (HRM) PCR assay for the differentiation of the four serovars of U. parvum in a single step. Currently, U. parvum strains are separated into four serovars by sequencing the promoter and coding region of the multiple-banded antigen (MBA) gene. We designed primers to conserved sequences within this region for PCR amplification and HRM analysis, generating reproducible and distinct melt profiles that distinguish clonal representatives of serovars 1, 3, 6, and 14. Furthermore, our HRM PCR assay classified DNA extracted from 74 known (MBA-sequenced) test strains with 100% accuracy. Importantly, HRM PCR was also able to identify U. parvum serovars directly from 16 clinical swabs. HRM PCR performed with DNA consisting of mixtures of known serovars yielded profiles that were easily distinguished from those of single-serovar controls; these profiles mirrored clinical samples that contained mixed serovars. Melt curve analysis software is not yet robust enough to identify the composition of mixed-serovar samples, only that more than one serovar is present. HRM PCR provides a single-step, rapid, cost-effective means to differentiate the four serovars of U. parvum, and the assay did not amplify any of the 10 known serovars of Ureaplasma urealyticum tested in parallel. The choice of reaction reagents was crucial to achieving sufficient sensitivity to differentiate U. parvum serovars directly from clinical swabs, rather than requiring cell enrichment using microbial culture techniques.

  14. Clinical significance of pontine high signals identified on magnetic resonance imaging

    International Nuclear Information System (INIS)

    Watanabe, Masaki; Takahashi, Akira; Arahata, Yutaka; Motegi, Yoshimasa; Furuse, Masahiro.

    1993-01-01

    Spin-echo magnetic resonance imaging (MRI) was evaluated in 530 cases in order to investigate the clinical significance of pontine high signals. The subjects comprised 109 cases of pontine infarction with high signal on T2-weighted images and low signal on T1-weighted images (PI group), 145 cases of pontine high signal with high signal on T2-weighted images but normal signal on T1-weighted images (PH group), and 276 age-matched controls without abnormality on either T1- or T2-weighted images (AC group). Subjective complaints such as vertigo and dizziness were more frequent in the PH group than in the PI group. In both the PI and PH groups, periventricular hyperintensity as well as subcortical high signals in the supratentorium were more severe than in the AC group, and these changes were more marked in the PI group than in the PH group. In conclusion, PH as well as PI may result from diffuse arteriosclerosis, and PH is considered to be an early finding of pontine ischemia. (author)

  16. A Clinical Drug Library Screen Identifies Tosufloxacin as Being Highly Active against Staphylococcus aureus Persisters

    Directory of Open Access Journals (Sweden)

    Hongxia Niu

    2015-07-01

    To identify effective compounds that are active against Staphylococcus aureus (S. aureus) persisters, we screened a clinical drug library consisting of 1524 compounds and identified six drug candidates with anti-persister activity: tosufloxacin, clinafloxacin, sarafloxacin, doxycycline, thiostrepton, and chlorosalicylanilide. Among them, tosufloxacin had the highest anti-persister activity and could completely eradicate S. aureus persisters within 2 days in vitro. Clinafloxacin ranked second, with very few persisters surviving drug exposure. Interestingly, we found that both tosufloxacin and trovafloxacin, which had high activity against persisters, contain the 2,4-difluorophenyl group at the N-1 position; this group is absent in other, less active quinolones and may be associated with the high anti-persister activity. Further studies are needed to evaluate tosufloxacin in animal models and to explain its unique activity against bacterial persisters. Our findings may have implications for improved treatment of persistent bacterial infections.

  17. EPA guidance on the early detection of clinical high risk states of psychoses

    DEFF Research Database (Denmark)

    Schultze-Lutter, F; Michel, C; Schmidt, S J

    2015-01-01

    The aim of this guidance paper of the European Psychiatric Association is to provide evidence-based recommendations on the early detection of a clinical high risk (CHR) for psychosis in patients with mental problems. To this aim, we conducted a meta-analysis of studies reporting conversion rates to psychosis in non-overlapping samples meeting at least one of the main CHR criteria: ultra-high risk (UHR) and/or basic symptoms criteria. Further, the effects of potential moderators (different UHR criteria definitions, single UHR criteria and age) on conversion rates were examined. Conversion rates in the 42 identified samples, with altogether more than 4000 CHR patients who had mainly been identified by UHR criteria and/or the basic symptom criterion 'cognitive disturbances' (COGDIS), showed considerable heterogeneity. While UHR criteria and COGDIS were related to similar conversion rates until 2-year follow-up...

  18. Analysis of allergen immunotherapy studies shows increased clinical efficacy in highly symptomatic patients

    DEFF Research Database (Denmark)

    Howarth, P; Malling, Hans-Jørgen; Molimard, M

    2011-01-01

    Clinical studies of AIT can neither establish baseline symptom levels nor limit the enrolment of patients to those with the most severe symptoms. Allergen immunotherapy treatment effects are therefore diluted by patients with low symptoms for a particular pollen season. The objective of this analysis was to assess the effect possible to achieve with AIT in the groups of patients presenting the most severe allergic symptoms. Methods: Study centres were grouped into tertiles categorized according to the symptom severity scores observed in the placebo patients in each centre (low, middle and high tertiles). The difference observed in the average score in each tertile in active- vs placebo-treated patients was assessed. This allowed an estimation of the efficacy that could be achieved in patients from sites where symptoms were high during the pollen season. Results: An increased treatment effect...

  19. Physical and clinical evaluation of new high-strip-density radiographic grids

    International Nuclear Information System (INIS)

    Doi, K.; Frank, P.H.; Chan, H.P.; Vyborny, C.J.; Makino, S.; Iida, N.; Carlin, M.

    1983-01-01

    The imaging performance of new high-strip-density (HSD) grids having 57 lines/cm was compared with that of conventional low-strip-density (LSD) grids having 33 or 40 lines/cm. The unique advantage of HSD grids is that, under most standard radiographic conditions, the grid lines are not noticeable on the final image, even if the grid is stationary. This is due to the combined effect of the high fundamental spatial frequency of HSD grids, the modulation transfer function of screen-film systems and of the human visual system, and scattered radiation. Monte Carlo simulation studies, phantom images, and clinical evaluation indicate that HSD grids can provide contrast improvement factors and Bucky factors that are comparable to or slightly better than those obtained with LSD grids. Therefore, it may now be possible to eliminate moving Bucky trays from radiographic tables and fluoroscopic devices

  20. A highly invasive human glioblastoma pre-clinical model for testing therapeutics

    Directory of Open Access Journals (Sweden)

    Cao Brian

    2008-12-01

    Animal models greatly facilitate the understanding of cancer and, importantly, serve pre-clinically for evaluating potential anti-cancer therapies. We developed an invasive orthotopic human glioblastoma multiforme (GBM) mouse model that enables real-time tumor ultrasound imaging and pre-clinical evaluation of anti-neoplastic drugs such as 17-(allylamino)-17-demethoxygeldanamycin (17AAG). Clinically, GBM metastasis rarely happens, but unexpectedly most human GBM tumor cell lines intrinsically possess metastatic potential. We used an experimental lung metastasis assay (ELM) to enrich for metastatic cells, and three of four commonly used GBM lines were highly metastatic after repeated ELM selection (M2). These GBM-M2 lines grew more aggressively orthotopically, and all showed dramatic multifold increases in the expression of IL6, IL8, MCP-1 and GM-CSF, cytokines and factors that are associated with GBM and poor prognosis. DBM2 cells, which were derived from the DBTRG-05MG cell line, were used to test the efficacy of 17AAG for the treatment of intracranial tumors. The DBM2 orthotopic xenografts form highly invasive tumors with areas of central necrosis, vascular hyperplasia and intracranial dissemination. In addition, the orthotopic tumors caused osteolysis, and the skull opening correlated with tumor size, permitting the use of real-time ultrasound imaging to evaluate antitumor drug activity. We show that 17AAG significantly inhibits DBM2 tumor growth, with significant drug responses in subcutaneous, lung and orthotopic tumor locations. This model has multiple unique features for investigating the pathobiology of intracranial tumor growth and for monitoring systemic and intracranial responses to antitumor agents.

  1. Using a Malcolm Baldrige framework to understand high-performing clinical microsystems.

    Science.gov (United States)

    Foster, Tina C; Johnson, Julie K; Nelson, Eugene C; Batalden, Paul B

    2007-10-01

    BACKGROUND, OBJECTIVES AND METHOD: The Malcolm Baldrige National Quality Award (MBNQA) provides a set of criteria for organisational quality assessment and improvement that has been used by thousands of business, healthcare and educational organisations for more than a decade. The criteria can be used as a tool for self-evaluation, and are widely recognised as a robust framework for design and evaluation of healthcare systems. The clinical microsystem, as an organisational construct, is a systems approach for providing clinical care based on theories from organisational development, leadership and improvement. This study compared the MBNQA criteria for healthcare and the success factors of high-performing clinical microsystems to (1) determine whether microsystem success characteristics cover the same range of issues addressed by the Baldrige criteria and (2) examine whether this comparison might better inform our understanding of either framework. Both Baldrige criteria and microsystem success characteristics cover a wide range of areas crucial to high performance. Those particularly called out by this analysis are organisational leadership, work systems and service processes from a Baldrige standpoint, and leadership, performance results, process improvement, and information and information technology from the microsystem success characteristics view. Although in many cases the relationship between Baldrige criteria and microsystem success characteristics are obvious, in others the analysis points to ways in which the Baldrige criteria might be better understood and worked with by a microsystem through the design of work systems and a deep understanding of processes. Several tools are available for those who wish to engage in self-assessment based on MBNQA criteria and microsystem characteristics.

  2. Clinical evaluation of unadapted sheep submitted to sudden intake of melon with high levels of sugar

    Directory of Open Access Journals (Sweden)

    Francisco Leonardo Costa Oliveira

    2015-12-01

    This study evaluated the clinical effects of two different amounts of melon with a high sugar content suddenly offered to unadapted sheep. Twelve rumen-cannulated, crossbred, 8-month-old sheep, weighing 25 kg each, were used. These sheep had never been fed sugar-rich concentrate or fruits. The animals were kept in collective pens on a basal roughage diet and then randomly divided into two equal groups, which received crushed melon at 25% (G25%) and 75% (G75%) of dietary dry matter (DM), administered through the rumen cannula. Physical examination and measurement of rumen fluid pH were performed at the following times: 0, 3, 6, 12, 18 and 24 h. The animals of G25% did not present clinical signs, despite the subacute acidosis expected after administration of the melon. In G75%, however, the sheep developed clinical manifestations indicative of lactic acidosis, with rumen fluid pH lower than 5.0 from 6 h onwards, but without dehydration. In sheep from G75%, tachycardia was observed at 3 h and continued until the end of the study; tachypnea, caused by the increased abdominal circumference, was also observed at 3 h. Based on these results, supplementation with high amounts of melon (75% of dietary DM) is not recommended for sheep, although the use of 25% DM is safe; greater amounts of this fruit could be used in the diet of sheep with gradual adaptation to the substrate.

  3. How to Identify High-Risk APS Patients: Clinical Utility and Predictive Values of Validated Scores.

    Science.gov (United States)

    Oku, Kenji; Amengual, Olga; Yasuda, Shinsuke; Atsumi, Tatsuya

    2017-08-01

    Antiphospholipid syndrome (APS) is a clinical disorder characterised by thrombosis and/or pregnancy morbidity in the persistence of antiphospholipid (aPL) antibodies that are pathogenic and have pro-coagulant activities. Thrombosis in APS tends to recur and requires prophylaxis; however, stereotyped treatment of APS patients is inadequate, and stratification of thrombotic risk is important, as aPL are prevalently observed in various diseases and in the elderly population. It is known that multiple positive aPL or high-titre aPL correlate with thrombotic events. To advance the stratification of thrombotic risk in APS patients and to analyse those risks quantitatively, the antiphospholipid score (aPL-S) and the Global Anti-phospholipid Syndrome Score (GAPSS) were defined. Both scores were derived from large patient cohort data: the aPL-S incorporates a detailed classification of the aPL profile, while the GAPSS combines a simplified aPL profile with classical thrombotic risk factors. Both the aPL-S and GAPSS have shown a degree of accuracy in identifying high-risk APS patients, especially those at a high risk of thrombosis. However, there are several areas requiring improvement, or at least that clinicians should be aware of, before these instruments are applied in clinical practice. One such issue is standardisation of the aPL tests, including general testing of phosphatidylserine-dependent antiprothrombin antibodies (aPS/PT). Additionally, clinicians may need to be aware of the patient's medical history, particularly with respect to the incidence of SLE, which influences the cutoff value for identifying high-risk patients.
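
    A GAPSS-style score is a simple weighted sum over positive findings. The point values below are those commonly cited for GAPSS in the literature; treat them as an assumption for illustration and verify against the primary publication before any real use.

```python
# Point values as commonly cited for GAPSS (verify against the primary
# source); each positive finding contributes its points to the total.
GAPSS_POINTS = {
    "aCL_IgG_or_IgM": 5,          # anticardiolipin antibodies
    "anti_b2GPI_IgG_or_IgM": 4,   # anti-beta2-glycoprotein I antibodies
    "lupus_anticoagulant": 4,
    "aPS_PT": 3,                  # anti-phosphatidylserine/prothrombin
    "hyperlipidemia": 3,
    "arterial_hypertension": 1,
}

def gapss(findings):
    """Sum the points of all positive findings; higher = higher risk."""
    return sum(GAPSS_POINTS[k] for k, positive in findings.items() if positive)

patient = {"aCL_IgG_or_IgM": True, "lupus_anticoagulant": True,
           "aPS_PT": True, "arterial_hypertension": True,
           "anti_b2GPI_IgG_or_IgM": False, "hyperlipidemia": False}
print(gapss(patient))   # 5 + 4 + 3 + 1 = 13
```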

  4. Pretreatment data is highly predictive of liver chemistry signals in clinical trials.

    Science.gov (United States)

    Cai, Zhaohui; Bresell, Anders; Steinberg, Mark H; Silberg, Debra G; Furlong, Stephen T

    2012-01-01

    The goal of this retrospective analysis was to assess how well predictive models could determine which patients would develop liver chemistry signals during clinical trials based on their pretreatment (baseline) information. Based on data from 24 late-stage clinical trials, classification models were developed to predict liver chemistry outcomes using baseline information, which included demographics, medical history, concomitant medications, and baseline laboratory results. Predictive models using baseline data predicted which patients would develop liver signals during the trials with average validation accuracy around 80%. Baseline levels of individual liver chemistry tests were most important for predicting their own elevations during the trials. High bilirubin levels at baseline were not uncommon and were associated with a high risk of developing biochemical Hy's law cases. Baseline γ-glutamyltransferase (GGT) level appeared to have some predictive value, but did not increase predictability beyond using established liver chemistry tests. It is possible to predict which patients are at a higher risk of developing liver chemistry signals using pretreatment (baseline) data. Derived knowledge from such predictions may allow proactive and targeted risk management, and the type of analysis described here could help determine whether new biomarkers offer improved performance over established ones.
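
    The modeling setup (baseline laboratory values predicting on-trial liver signals with roughly 80% validated accuracy) can be prototyped in a few lines. Everything below, including features, effect structure and classifier, is a synthetic stand-in, not the paper's data or model.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n = 1000
# Columns stand in for baseline labs (ALT, AST, bilirubin, GGT, ...).
baseline = rng.lognormal(mean=0.0, sigma=0.4, size=(n, 6))
# Build in the paper's key observation: a test's own baseline level is the
# strongest predictor of its later elevation (synthetic effect sizes).
risk = baseline[:, 0] + 0.5 * baseline[:, 2] + rng.normal(0, 0.3, n)
signal = risk > np.quantile(risk, 0.8)

model = RandomForestClassifier(n_estimators=200, random_state=0)
print(cross_val_score(model, baseline, signal, cv=5).mean())
```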

  5. Clinical and epidemiological features of extrapulmonary tuberculosis in a high incidence region.

    Directory of Open Access Journals (Sweden)

    Carlos Pérez-Guzmán

    2014-03-01

    Objective. To describe the clinical features of extrapulmonary tuberculosis (EXPTB) and to evaluate epidemiological data in search of potential explanations for its high frequency in the state of Aguascalientes, Mexico. Materials and methods. Clinical records of all patients with tuberculosis seen in Aguascalientes in 2008 were reviewed, and official databases were analyzed. Results. EXPTB comprised 60.5% of the 86 cases evaluated, with lymph nodes being the main site affected. Patients with EXPTB were younger and more obese than subjects with pulmonary tuberculosis (PTB). One third of cases in either group had diabetes, a frequency much higher than expected. Epidemiological analysis showed that PTB incidence, but not EXPTB incidence, decreases as geographical altitude increases, and had a descending trend from 1997 to 2011. Conclusions. The lower frequency of PTB (due to its inverse relationship with altitude and its descending trend in recent years) might explain the high frequency of EXPTB. Obesity appeared to protect against developing pulmonary involvement, and diabetes was more frequent than expected among PTB and EXPTB cases.

  6. Clinical and epidemiological features of extrapulmonary tuberculosis in a high incidence region.

    Science.gov (United States)

    Pérez-Guzmán, Carlos; Vargas, Mario H; Arellano-Macías, María del Rosario; Hernández-Cobos, Silvia; García-Ituarte, Aurea Zelindabeth; Serna-Vela, Francisco Javier

    2014-04-01

    To describe the clinical features of extrapulmonary tuberculosis (EXPTB) and to evaluate epidemiological data in search of potential explanations for its high frequency in the state of Aguascalientes, Mexico. Clinical records of all patients with tuberculosis seen in Aguascalientes in 2008 were reviewed, and official databases were analyzed. EXPTB comprised 60.5% of the 86 cases evaluated, with lymph nodes being the main site affected. Patients with EXPTB were younger and more obese than subjects with pulmonary tuberculosis (PTB). One third of cases in either group had diabetes, a frequency much higher than expected. Epidemiological analysis showed that PTB incidence, but not EXPTB incidence, decreases as geographical altitude increases, and had a descending trend from 1997 to 2011. The lower frequency of PTB (due to its inverse relationship with altitude and its descending trend in recent years) might explain the high frequency of EXPTB. Obesity appeared to protect against developing pulmonary involvement, and diabetes was more frequent than expected among PTB and EXPTB cases.

  7. Reinforcement of high-risk anastomoses using laser-activated protein solders: a clinical study

    Science.gov (United States)

    Libutti, Steven K.; Bessler, Marc; Chabot, J.; Bass, Lawrence S.; Oz, Mehmet C.; Auteri, Joseph S.; Kirsch, Andrew J.; Nowygrod, Roman; Treat, Michael R.

    1993-07-01

    Anastomotic leakage or breakdown can result in catastrophic complications and significantly increased post-operative morbidity and mortality. Certain anastomoses are subject to a higher incidence of disruption and are therefore termed high risk. In an attempt to decrease the risk of anastomotic leaks, we reinforced sutured anastomoses with a laser-activated protein solder in patients undergoing esophagojejunostomies (n = 2), lung transplantation (n = 2), and pancreaticojejunostomies (Whipple procedure, n = 5). The protein solder was composed of 1.0 ml of a 25% human albumin solution, 1.0 ml of sodium hyaluronate, and 0.1 ml of Cardiogreen dye. This composition was applied to the sutured anastomosis and activated with an 860 nm pulsed diode laser. Drains were placed when appropriate, and patients were followed for up to 10 months post-operatively and assessed for clinical signs of anastomotic leaks. Results to date demonstrate that there were no immediate complications as a result of the procedure. Operative time was not significantly lengthened. There were no cases of clinically significant leakage from any of the reinforced anastomoses. Laser-activated protein solders may help to reduce the incidence of leakage in high-risk anastomoses. Larger numbers of patients and longer follow-up are needed, however, to draw significant conclusions.

  8. Are high doses of carbidopa a concern? A randomized clinical trial in Parkinson’s disease

    Science.gov (United States)

    Brod, Lissa S.; Aldred, Jason L.; Nutt, John G.

    2013-01-01

    Background: Recommended doses of carbidopa are 75–200 mg/day. Higher doses could inhibit brain aromatic amino acid decarboxylase and reduce clinical effects. Methods: We compared 4-week outpatient treatments with carbidopa 75 mg and 450 mg/day administered with levodopa on the subjects' normal schedule. After each treatment phase, subjects had two 2-hour levodopa infusions. The first infusion examined the effects of the carbidopa doses administered over the preceding four weeks, and the second infusion determined the acute effects of the two dosages of carbidopa. The antiparkinsonian effects and levodopa and carbidopa plasma concentrations were monitored during the infusions. Results: Twelve subjects completed the study. Carbidopa concentrations were eight times higher after the high-carbidopa phase. Area under the curve (AUC) for clinical ratings did not differ across the four levodopa infusions, although the AUC for plasma levodopa was modestly increased with 450 mg of carbidopa. Nine subjects reported that the high-carbidopa outpatient phase was associated with a greater response to levodopa. Conclusion: Doses of 450 mg/day of carbidopa did not reduce the responses to levodopa infusion, extending the safe range of carbidopa to 450 mg/day. PMID:22508376
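
    AUC comparisons of this kind are ordinarily computed with the trapezoidal rule over the infusion time course. A minimal sketch with invented sample times and concentrations:

```python
import numpy as np

# Trapezoidal AUC over a 2-hour infusion; times and values are invented.
t = np.array([0.0, 15.0, 30.0, 60.0, 90.0, 120.0])   # minutes
conc = np.array([0.2, 0.9, 1.6, 2.1, 1.8, 1.2])      # plasma levodopa, ug/mL

auc = np.sum((conc[1:] + conc[:-1]) / 2.0 * np.diff(t))
print(auc)   # area under the curve, ug*min/mL
```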

  9. Does knowledge of coronary artery calcium affect cardiovascular risk perception, likelihood of taking action, and health-promoting behavior change?

    Science.gov (United States)

    Johnson, Jennie E; Gulanick, Meg; Penckofer, Sue; Kouba, Joanne

    2015-01-01

    Evidence indicates that a healthy lifestyle can reduce cardiovascular disease risk, yet many people engage in unhealthy behaviors. New technologies such as coronary artery calcium (CAC) screening detect atherosclerosis before clinical disease is manifested. Knowledge of an abnormal finding could provide the "teachable moment" to enhance motivation for change. The aim of this study was to examine how knowledge of CAC score affects risk perception, likelihood of taking action, and health-promoting behavior change in persons at high risk for cardiovascular disease. This study used a descriptive prospective design with 174 high-risk adults (≥3 major risk factors) recruited at a radiology center offering CAC scans. Baseline self-report surveys using the Perception of Risk of Heart Disease Scale, the Benefits and Barriers Scale, the Quality of Life Index, and the Health-Promoting Lifestyle Profile II were completed immediately after a screening CAC scan but before results were known. Follow-up occurred 3 months later using mailed packets. Participants' mean age was 58 years; 62% were men, 89% were white, and most were well educated. There was no significant change in risk perception scores over time or between groups, except for a positive interaction in the moderate-risk group (CAC scores of 101-400) (P = .004). Quality of life remained unchanged. Health-promoting behavior changes increased in all groups over time, and the strongest predictor of behavior change was perceived barriers (β = -.41). Knowledge of CAC score does impact risk perception for some at-risk groups and does enhance motivation for behavior change, but it does not impact quality of life. It is hoped that, through improved understanding of the effect of CAC scoring on behavior change, nurses can better assist patients to modify behaviors during teachable moments.

  10. Likelihood of cesarean delivery after applying leading active labor diagnostic guidelines.

    Science.gov (United States)

    Neal, Jeremy L; Lowe, Nancy K; Phillippi, Julia C; Ryan, Sharon L; Knupp, Amy M; Dietrich, Mary S; Thung, Stephen F

    2017-06-01

    Friedman, the United Kingdom's National Institute for Health and Care Excellence (NICE), and the American College of Obstetricians and Gynecologists/Society for Maternal-Fetal Medicine (ACOG/SMFM) support different active labor diagnostic guidelines. Our aims were to compare likelihoods of cesarean delivery among women admitted before vs in active labor by diagnostic guideline (within-guideline comparisons) and between women admitted in active labor per one or more of the guidelines (between-guideline comparisons). Active labor diagnostic guidelines were retrospectively applied to cervical examination data from nulliparous women with spontaneous labor onset (n = 2573). Generalized linear models were used to determine outcome likelihoods within and between guideline groups. At admission, 15.7%, 48.3%, and 10.1% of nulliparous women were in active labor per the Friedman, NICE, and ACOG/SMFM diagnostic guidelines, respectively. Cesarean delivery was more likely among women admitted before vs in active labor per the Friedman (AOR 1.75 [95% CI 1.08-2.82]) or NICE guideline (AOR 2.55 [95% CI 1.84-3.53]). Between guidelines, cesarean delivery was less likely among women admitted in active labor per the NICE guideline, as compared with the ACOG/SMFM guideline (AOR 0.55 [95% CI 0.35-0.88]). Many nulliparous women are admitted to the hospital before active labor onset, and these women are significantly more likely to have a cesarean delivery. Diagnosing active labor before admission or before intervention to speed labor may be one component of a multi-faceted approach to decreasing the primary cesarean rate in the United States. The NICE diagnostic guideline is more inclusive than the Friedman or ACOG/SMFM guidelines, and its use may be the most clinically useful for safely lowering cesarean rates. © 2017 Wiley Periodicals, Inc.

  11. Supervised maximum-likelihood weighting of composite protein networks for complex prediction

    Directory of Open Access Journals (Sweden)

    Yong Chern Han

    2012-12-01

    Background: Protein complexes participate in many important cellular functions, so finding the set of existent complexes is essential for understanding the organization and regulation of processes in the cell. With the availability of large amounts of high-throughput protein-protein interaction (PPI) data, many algorithms have been proposed to discover protein complexes from PPI networks. However, such approaches are hindered by the high rate of noise in high-throughput PPI data, including spurious and missing interactions. Furthermore, many transient interactions are detected between proteins that are not from the same complex, while not all proteins from the same complex actually interact. As a result, predicted complexes often do not match true complexes well, and many true complexes go undetected. Results: We address these challenges by integrating PPI data with other heterogeneous data sources to construct a composite protein network, and by using a supervised maximum-likelihood approach to weight each edge based on its posterior probability of belonging to a complex. We then use six different clustering algorithms, and an aggregative clustering strategy, to discover complexes in the weighted network. We tested our method on Saccharomyces cerevisiae and Homo sapiens and show that complex discovery is improved: compared to previously proposed supervised and unsupervised weighting approaches, our method recalls more known complexes, achieves higher precision at all recall levels, and generates novel complexes of greater functional similarity. Furthermore, our maximum-likelihood approach allows learned parameters to be used to visualize and evaluate the evidence behind novel predictions, aiding human judgment of their credibility. Conclusions: Our approach integrates multiple data sources with supervised learning to create a weighted composite protein network, and uses six clustering algorithms with an aggregative clustering strategy to discover complexes in the weighted network.
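
    The core step, converting heterogeneous evidence on each edge into a posterior probability of co-complex membership and then using that probability as the edge weight, can be sketched with a plain logistic model standing in for the paper's supervised maximum-likelihood formulation. The features, labels and coefficients below are synthetic.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
n_edges = 500
# Evidence features per edge, e.g. PPI confidence, co-expression, GO overlap.
features = rng.random((n_edges, 3))
# Synthetic "co-complex" labels driven by the features plus noise.
labels = features @ np.array([2.0, 1.0, 1.5]) + rng.normal(0, 0.5, n_edges) > 2.4

model = LogisticRegression().fit(features, labels)
edge_weights = model.predict_proba(features)[:, 1]   # posterior per edge
print(edge_weights[:5].round(3))  # weights to feed the clustering algorithms
```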

  12. High-Level Disinfection of Otorhinolaryngology Clinical Instruments: An Evaluation of the Efficacy and Cost-effectiveness of Instrument Storage.

    Science.gov (United States)

    Yalamanchi, Pratyusha; Yu, Jason; Chandler, Laura; Mirza, Natasha

    2018-01-01

    Objectives: Despite increasing interest in individual instrument storage, the risk of bacterial cross-contamination of otorhinolaryngology clinic instruments has not been assessed. This study is the first to determine the clinical efficacy and cost-effectiveness of standard high-level disinfection and clinic instrument storage. Methods: To assess for cross-contamination, surveillance cultures of otorhinolaryngology clinic instruments subjected to standard high-level disinfection and storage were obtained at the start and end of the outpatient clinical workday. The rate of microorganism recovery was compared with cultures of instruments stored in individual peel packs and with control cultures of contaminated instruments. Based on historical clinic data, the direct allocation method of cost accounting was used to determine the aggregate raw material cost and the additional labor hours required to process and restock peel-packed instruments. Results: Among 150 cultures of standard high-level disinfected, co-located clinic instruments, 3 positive bacterial cultures occurred, whereas 100% of control cultures were positive for bacterial species. The cost analysis placed the additional cost of individual semicritical instrument storage at $97,852.50 per year. Discussion: With in vitro inoculation of >200 otorhinolaryngology clinic instruments, this study demonstrates that standard high-level disinfection and storage are as efficacious as more time-consuming and expensive individual instrument storage protocols, such as peel packing, with regard to bacterial contamination. Implications for Practice: Standard high-level disinfection and storage are as effective as labor-intensive and costly individual instrument storage protocols.

  13. Optoacoustic diagnostic modality: from idea to clinical studies with highly compact laser diode-based systems

    Science.gov (United States)

    Esenaliev, Rinat O.

    2017-09-01

    Optoacoustic (photoacoustic) diagnostic modality is a technique that combines high optical contrast and ultrasound spatial resolution. We proposed using the optoacoustic technique for a number of applications, including cancer detection, monitoring of thermotherapy (hyperthermia, coagulation, and freezing), monitoring of cerebral blood oxygenation in patients with traumatic brain injury, neonatal patients, fetuses during late-stage labor, central venous oxygenation monitoring, and total hemoglobin concentration monitoring as well as hematoma detection and characterization. We developed and built optical parametric oscillator-based systems and multiwavelength, fiber-coupled highly compact, laser diode-based systems for optoacoustic imaging, monitoring, and sensing. To provide sufficient output pulse energy, a specially designed fiber-optic system was built and incorporated in ultrasensitive, wideband optoacoustic probes. We performed preclinical and clinical tests of the systems and the optoacoustic probes in backward mode for most of the applications and in forward mode for the breast cancer and cerebral applications. The high pulse energy and repetition rate allowed for rapid data acquisition with high signal-to-noise ratio from cerebral blood vessels, such as the superior sagittal sinus, central veins, and peripheral veins and arteries, as well as from intracranial hematomas. The optoacoustic systems were capable of automatic, real-time, continuous measurements of blood oxygenation in these blood vessels.

  14. Clinical heterogeneity among people with high functioning autism spectrum conditions: evidence favouring a continuous severity gradient

    Directory of Open Access Journals (Sweden)

    Woodbury-Smith Marc

    2008-02-01

    Abstract Background Autism Spectrum Conditions (ASCs) are characterized by a high degree of clinical heterogeneity, but the extent to which this variation represents a severity gradient versus discrete phenotypes is unclear. This issue has complicated genetic studies seeking to investigate the genetic basis of the high heritability observed clinically in those with an ASC. The aim of this study was to examine the possible clustering of symptoms associated with ASCs to determine whether the observed distribution of symptom type and severity supported either a severity or a symptom-subgroup model to account for the phenotypic variation observed within the ASCs. Methods We investigated the responses of a group of adults with higher functioning ASCs on the fifty clinical features examined in the Autism Spectrum Quotient, a screening questionnaire used in the diagnosis of higher functioning ASCs. In contrast to previous studies, we used this instrument with no a priori assumptions about any underlying factor structure of constituent items. The responses obtained were analyzed using complete-linkage hierarchical cluster analysis. For the members of each cluster identified, the mean score on each Autism Spectrum Quotient question was calculated. Results Autism Spectrum Quotient responses from a total of 333 individuals between the ages of 16.6 and 78.0 years were entered into the hierarchical cluster analysis. The four-cluster solution was the one that generated the largest number of clusters that did not also include very small cluster sizes, defined as a membership comprising 10 individuals or fewer. Examination of these clusters demonstrated that they varied in total Autism Spectrum Quotient score but that the profiles across the symptoms comprising the Autism Spectrum Quotient did not differ independently of this severity factor. Conclusion These results are consistent with a unitary spectrum model, suggesting that the clinical heterogeneity observed within the ASCs reflects a continuous severity gradient rather than discrete phenotypes.

  15. Clinical application of percutaneous cholecystostomy in the treatment of high-risk patients with acute cholecystitis

    International Nuclear Information System (INIS)

    Qiao Delin; Zhou Bing; Chen Shiwei; Dong Jiangnan; Hua Yingxue; Chen Bo

    2009-01-01

    Objective: To discuss the therapeutic strategy and the clinical efficacy of percutaneous cholecystostomy in treating high-risk patients with acute cholecystitis. Methods: During the period of Jan. 2006-June 2008, percutaneous cholecystostomy was performed in 27 high-risk patients with acute cholecystitis, consisting of lithic cholecystitis (n = 21) and non-lithic cholecystitis (n = 6). Of the 27 patients, 22 underwent percutaneous cholecystostomy via a transhepatic approach and 5 via a transperitoneal approach. A 7 F drainage catheter was used. Cholecystography was conducted before the drainage catheter was removed. Results: Percutaneous cholecystostomy was successfully accomplished in all 27 cases, with a technical success rate of 100%. Postoperative patency of gallbladder drainage was obtained in 25 patients, with relief or subsidence of abdominal pain and restoration of temperature and leukocyte count to the normal range within 72 hours. In one patient, as abdominal pain relief was not obvious 72 hours after the procedure, cholecystography was employed and revealed obstruction of the drainage catheter. After the drainage catheter was reopened, the abdominal pain was relieved. In another case, cholecystography was carried out because the abdominal pain became worse after the procedure, and a minor bile leak was demonstrated. After intensive anti-infective and symptomatic medication, the abdominal pain was alleviated. The drainage catheter was removed in 25 patients 6-7 weeks after the treatment. Of these 25 patients, 12 accepted elective cholecystectomy, 7 received percutaneous cholecystolithotomy, and 6 with non-lithic cholecystitis did not undergo any additional surgery. The remaining two patients continued with long-term retention of the indwelling drainage catheter. Conclusion: Percutaneous cholecystostomy is a simple, safe and effective treatment for acute cholecystitis in high-risk patients. This technique is of great value in clinical practice.

  16. Echo planar perfusion imaging with high spatial and temporal resolution: methodology and clinical aspects

    International Nuclear Information System (INIS)

    Bitzer, M.; Klose, U.; Naegele, T.; Friese, S.; Kuntz, R.; Voigt, K.; Fetter, M.; Opitz, H.

    1999-01-01

    The purpose of the present study was to analyse specific advantages of calculated parameter images and their limitations using an optimized echo-planar imaging (EPI) technique with high spatial and temporal resolution. Dynamic susceptibility contrast magnetic resonance imaging (DSC-MRI) was performed in 12 patients with cerebrovascular disease and in 13 patients with brain tumours. For MR imaging of cerebral perfusion, an EPI sequence was developed which provides a temporal resolution of 0.68 s for three slices with a 128 x 128 image matrix. To evaluate DSC-MRI, the following parameter images were calculated pixelwise: (1) maximum signal reduction (MSR); (2) maximum signal difference (ΔSR); (3) time-to-peak (T_p); and (4) integral of the signal-intensity-time curve until T_p (S_Int). The MSR maps were superior in the detection of acute infarctions and the ΔSR maps in the delineation of vasogenic brain oedema. The T_p maps seemed to be highly sensitive in the detection of poststenotic malperfused brain areas (sensitivity 90%). Hyperperfused areas of brain tumours were detectable down to a diameter of 1 cm with high sensitivity (>90%). Distinct clinical and neuroradiological conditions revealed different suitabilities for the parameter images. The T_p maps may be an important advantage in the detection of poststenotic "areas at risk", due to the improved temporal resolution of the EPI technique. With regard to spatial resolution, a matrix size of 128 x 128 is sufficient for all clinical conditions. According to our results, a further increase in matrix size would not improve the spatial resolution in DSC-MRI, since the degree of vascularization of lesions and the susceptibility effect itself seem to be the limiting factors. (orig.)

  17. Geisinger high-risk osteoporosis clinic (HiROC): 2013-2015 FLS performance analysis.

    Science.gov (United States)

    Dunn, P; Webb, D; Olenginski, T P

    2018-02-01

    Geisinger Health System (GHS) high-risk osteoporosis clinic (HiROC), which treats patients with low-trauma, fragility fractures, reports its 2013-2015 performance measures in secondary fracture prevention. This fracture liaison service (FLS) pathway treats 75% of high-risk, drug-eligible patients, compared to 13.8% in GHS primary care. This performance points to the need for more FLS programs throughout the world. The purpose of this study is to analyze and report ongoing performance measures in an outpatient and inpatient high-risk osteoporosis clinic (HiROC) program designed for patients with low-trauma, fragility fractures. A retrospective chart review of outpatient HiROC (511 patients) and inpatient HiROC (1279 patients) performance from 2013 to 2015 within Geisinger Health System (GHS) is reported. Similar to a prior report, we document that Geisinger's branded outpatient and inpatient HiROC pathways continue to function as an all-fracture FLS. Importantly, this analysis underscores the value of FLS care: HiROC's treatment rate of 75% was markedly superior to the 13.8% rate under GHS-PCP care. However, a large percentage of patients (37.8%) were lost to follow-up care. This led to the identification of multiple care gaps and barriers to ideal best practice. FLS programs use case-finding strategies and address secondary fracture prevention. GHS HiROC's performance in initiating drug therapy in this fracture patient population contrasts with the much lower treatment rate under GHS-PCP care, documenting the need for ongoing FLS care. Importantly, the results of this analysis have prompted the beginnings of GHS programmatic changes designed to narrow the reported care gaps in this mature FLS.

  18. Maximum Likelihood Estimation and Inference With Examples in R, SAS and ADMB

    CERN Document Server

    Millar, Russell B

    2011-01-01

    This book takes a fresh look at the popular and well-established method of maximum likelihood for statistical estimation and inference. It begins with an intuitive introduction to the concepts and background of likelihood, and moves through to the latest developments in maximum likelihood methodology, including general latent variable models and new material for the practical implementation of integrated likelihood using the free ADMB software. Fundamental issues of statistical inference are also examined, with a presentation of some of the philosophical debates underlying the choice of statistical paradigm.
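
    The workflow the book teaches, writing down a log-likelihood and maximizing it numerically, fits in a few lines of code. Below is a minimal illustrative sketch in Python rather than the book's own R, SAS or ADMB; the exponential model and simulated data are hypothetical stand-ins.

        import numpy as np
        from scipy.optimize import minimize_scalar

        # Hypothetical data: 200 waiting times assumed to be exponentially distributed.
        rng = np.random.default_rng(0)
        data = rng.exponential(scale=2.0, size=200)

        def neg_log_likelihood(rate):
            # Exponential log-likelihood: n*log(rate) - rate*sum(x), negated for minimization.
            return -(len(data) * np.log(rate) - rate * data.sum())

        # Maximizing the likelihood is minimizing its negative over rate > 0.
        fit = minimize_scalar(neg_log_likelihood, bounds=(1e-6, 10.0), method="bounded")
        print("numerical MLE of rate:", fit.x)
        print("analytic MLE (1/mean):", 1.0 / data.mean())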

  19. What is the optimal management of high risk, clinically localized prostate cancer?

    Science.gov (United States)

    Eastham, James A; Evans, Christopher P; Zietman, Anthony

    2010-01-01

    To summarize the presentations and debate regarding the optimal treatment of localized high-risk prostate cancer as presented at the 2009 Spring Meeting of the Society of Urologic Oncology. The debate was centered on presentations arguing for radical prostatectomy (RP) or radiotherapy as the optimal treatment for this condition. The meeting presentations are summarized by their respective presenters herein. Dr. James Eastham presents the varied definitions for "high-risk" prostate cancer as strongly influencing which patients end up in this cohort. Based upon this, between 3% and 38% of patients with high-risk features could be defined as "high-risk". Despite that, these men do not have a uniformly poor prognosis after RP, and attention to surgical principles as outlined improve outcomes. Disease-specific survival at 12 years is excellent and up to one-half of these men may not need adjuvant or salvage therapies, depending on their specific disease characteristics. Adjuvant or salvage radiotherapies improve outcomes and are part of a sequential approach to treating these patients. Dr. Anthony Zietman presented radiotherapy as the gold-standard based upon large, randomized clinical trials of intermediate- and high-risk prostate cancer patients. Compared with androgen deprivation alone, the addition of radiotherapy provided a 12% cancer-specific survival advantage and 10% overall survival advantage. Dose escalation seems to confer further improvements in cancer control without significant escalation of toxicities, with more data forthcoming. There are no randomized trials comparing RP to radiotherapy for any risk category. In high-risk prostate cancer patients, both approaches have potential benefits and cumulative toxicities that must be matched to disease characteristics and patient expectations in selecting a treatment course. Copyright (c) 2010 Elsevier Inc. All rights reserved.

  20. Is Primary Prostate Cancer Treatment Influenced by Likelihood of Extraprostatic Disease? A Surveillance, Epidemiology and End Results Patterns of Care Study

    International Nuclear Information System (INIS)

    Holmes, Jordan A.; Wang, Andrew Z.; Hoffman, Karen E.; Hendrix, Laura H.; Rosenman, Julian G.; Carpenter, William R.; Godley, Paul A.; Chen, Ronald C.

    2012-01-01

    Purpose: To examine the patterns of primary treatment in a recent population-based cohort of prostate cancer patients, stratified by the likelihood of extraprostatic cancer as predicted by disease characteristics available at diagnosis. Methods and Materials: A total of 157,371 patients diagnosed from 2004 to 2008 with clinically localized and potentially curable (node-negative, nonmetastatic) prostate cancer, who had complete information on prostate-specific antigen, Gleason score, and clinical stage, were included. Patients with clinical T1/T2 disease were grouped into categories of predicted likelihood of having extraprostatic disease (the highest being >50%) using the Partin nomogram. Clinical T3/T4 patients were examined separately as the highest-risk group. Logistic regression was used to examine the association between patient group and receipt of each primary treatment, adjusting for age, race, year of diagnosis, marital status, Surveillance, Epidemiology and End Results database region, and county-level education. Separate models were constructed for primary surgery, external-beam radiotherapy (RT), and conservative management. Results: On multivariable analysis, increasing likelihood of extraprostatic disease was significantly associated with increasing use of RT and decreased conservative management; use of surgery also increased. Patients with >50% likelihood of extraprostatic cancer had almost twice the odds of receiving prostatectomy as those in the lowest-likelihood group. RT was received by only a minority of patients with >50% likelihood of extraprostatic cancer (34%) and those with clinical T3-T4 disease (24%). The proportion of patients who received prostatectomy or conservative management was approximately 50% or slightly higher in all groups. Conclusions: There may be underutilization of RT in older prostate cancer patients and those with likely extraprostatic disease. Because more than half of prostate cancer patients do not consult with a radiation oncologist, a multidisciplinary consultation may affect the treatment decision-making process.
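
    As a rough illustration of the study's statistical design (not the authors' code), the sketch below groups a synthetic cohort by a nomogram-style predicted probability of extraprostatic disease and fits an age-adjusted logistic regression for treatment receipt. All column names, cut-points and data are invented stand-ins for the SEER variables.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        # Synthetic cohort; p_ecd stands in for the Partin-nomogram probability
        # of extraprostatic disease that the study derived for each patient.
        rng = np.random.default_rng(1)
        n = 5000
        df = pd.DataFrame({"p_ecd": rng.uniform(0, 1, n),
                           "age": rng.integers(45, 85, n)})
        df["risk_group"] = pd.cut(df["p_ecd"], bins=[0.0, 0.25, 0.5, 1.0],
                                  labels=["low", "mid", "high"])

        # Fabricated treatment indicator, for illustration only.
        lin = -2.0 + 2.5 * df["p_ecd"] - 0.02 * (df["age"] - 65)
        df["got_rt"] = (rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-lin))).astype(int)

        # Logistic regression of treatment receipt on risk group, adjusted for age.
        fit = smf.logit("got_rt ~ C(risk_group) + age", data=df).fit(disp=0)
        print(np.exp(fit.params))  # adjusted odds ratios relative to the 'low' group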

  1. A strategy to reduce cross-cultural miscommunication and increase the likelihood of improving health outcomes.

    Science.gov (United States)

    Kagawa-Singer, Marjorie; Kassim-Lakha, Shaheen

    2003-06-01

    Encounters between physicians and patients from different cultural backgrounds are becoming commonplace. Physicians strive to improve health outcomes and increase quality of life for every patient, yet these discordant encounters appear to be a significant factor, beyond socioeconomic barriers, in creating the unequal and avoidable excess burden of disease borne by members of ethnic minority populations in the United States. Most clinicians lack the information to understand how culture influences the clinical encounter and the skills to effectively bridge potential differences. New strategies are required to expand medical training to adequately address culturally discordant encounters among the physicians, their patients, and the families, for all three may have different concepts regarding the nature of the disease, expectations about treatment, and modes of appropriate communication beyond language. The authors provide an anthropological perspective of the fundamental relationship between culture and health, and outline systemic changes needed within the social and legal structures of the health care system to reduce the risk of cross-cultural miscommunication and increase the likelihood of improving health outcomes for all populations within the multicultural U.S. society. The authors define the strengths inherent within every culture, provide a guideline for the clinician to evaluate disease and illness within its cultural context, and outline the clinical skills required to negotiate among potential differences to reach mutually desired goals for care. Last, they indicate the structural changes required in the health care setting to enable and support such practice.

  2. Consistent high clinical pregnancy rates and low ovarian hyperstimulation syndrome rates in high-risk patients after GnRH agonist triggering and modified luteal support

    DEFF Research Database (Denmark)

    Iliodromiti, Stamatina; Blockeel, Christophe; Tremellen, Kelton P

    2013-01-01

    Are clinical pregnancy rates satisfactory and the incidence of OHSS low after GnRH agonist trigger and modified intensive luteal support in patients with a high risk of ovarian hyperstimulation syndrome (OHSS)?

  3. A scaling transformation for classifier output based on likelihood ratio: Applications to a CAD workstation for diagnosis of breast cancer

    International Nuclear Information System (INIS)

    Horsch, Karla; Pesce, Lorenzo L.; Giger, Maryellen L.; Metz, Charles E.; Jiang Yulei

    2012-01-01

    Purpose: The authors developed scaling methods that monotonically transform the output of one classifier to the "scale" of another. Such transformations affect the distribution of classifier output while leaving the ROC curve unchanged. In particular, they investigated transformations between radiologists and computer classifiers, with the goal of addressing the problem of comparing and interpreting case-specific values of output from two classifiers. Methods: Using both simulated and radiologists' rating data of breast imaging cases, the authors investigated a likelihood-ratio-scaling transformation, based on "matching" classifier likelihood ratios. For comparison, three other scaling transformations were investigated that were based on matching classifier true positive fraction, false positive fraction, or cumulative distribution function, respectively. The authors explored modifying the computer output to reflect the scale of the radiologist, as well as modifying the radiologist's ratings to reflect the scale of the computer. They also evaluated how dataset size affects the transformations. Results: When the ROC curves of two classifiers differed substantially, the four transformations were found to be quite different. The likelihood-ratio-scaling transformation was found to vary widely from radiologist to radiologist. Similar results were found for the other transformations. Our simulations explored the effect of database size on the accuracy of the estimation of our scaling transformations. Conclusions: The likelihood-ratio-scaling transformation that the authors have developed and evaluated was shown to be capable of transforming computer and radiologist outputs to a common scale reliably, thereby allowing the comparison of the computer and radiologist outputs on the basis of a clinically relevant statistic.
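
    Of the transformations compared, the one based on matching cumulative distribution functions is the easiest to sketch, and it illustrates the shared idea: a monotone rescaling that leaves the ROC curve unchanged. The Python below is a schematic with invented score distributions; the likelihood-ratio variant would additionally require density (likelihood-ratio) estimates.

        import numpy as np

        def cdf_scaling(source_scores, target_scores):
            """Monotone map from the source classifier's scale to the target
            classifier's scale by matching empirical CDFs (quantile mapping)."""
            source_sorted = np.sort(source_scores)
            target_sorted = np.sort(target_scores)

            def transform(x):
                # Empirical CDF of x under the source distribution...
                u = np.searchsorted(source_sorted, x, side="right") / len(source_sorted)
                # ...pushed through the target distribution's quantile function.
                return np.quantile(target_sorted, np.clip(u, 0.0, 1.0))

            return transform

        # Example: map computer outputs onto a radiologist's (hypothetical) rating scale.
        rng = np.random.default_rng(2)
        computer = rng.beta(2, 5, size=1000)
        radiologist = rng.normal(3.0, 1.0, size=300)
        to_radiologist_scale = cdf_scaling(computer, radiologist)
        print(to_radiologist_scale(0.4))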

  4. arXiv FlavBit: A GAMBIT module for computing flavour observables and likelihoods

    CERN Document Server

    Bernlochner, Florian U.; Dal, Lars A.; Farmer, Ben; Jackson, Paul; Kvellestad, Anders; Mahmoudi, Farvah; Putze, Antje; Rogan, Christopher; Scott, Pat; Serra, Nicola; Weniger, Christoph; White, Martin

    2017-11-21

    Flavour physics observables are excellent probes of new physics up to very high energy scales. Here we present FlavBit, the dedicated flavour physics module of the global-fitting package GAMBIT. FlavBit includes custom implementations of various likelihood routines for a wide range of flavour observables, including detailed uncertainties and correlations associated with LHCb measurements of rare, leptonic and semileptonic decays of B and D mesons, kaons and pions. It provides a generalised interface to external theory codes such as SuperIso, allowing users to calculate flavour observables in and beyond the Standard Model, and then test them in detail against all relevant experimental data. We describe FlavBit and its constituent physics in some detail, then give examples from supersymmetry and effective field theory illustrating how it can be used both as a standalone library for flavour physics, and within GAMBIT.
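
    The core of such a likelihood routine, comparing theory predictions to measurements with correlated experimental uncertainties, reduces to a correlated Gaussian (chi-square) log-likelihood. The Python sketch below is generic and uses made-up numbers; it is not FlavBit's actual interface, and a real analysis would also fold in theory uncertainties and non-Gaussian terms.

        import numpy as np

        def gaussian_loglike(predicted, measured, covariance):
            """Log-likelihood of observables under correlated Gaussian
            uncertainties (a chi-square, up to the normalization constant)."""
            residual = np.asarray(predicted) - np.asarray(measured)
            chi2 = residual @ np.linalg.solve(covariance, residual)
            _, logdet = np.linalg.slogdet(2.0 * np.pi * covariance)
            return -0.5 * (chi2 + logdet)

        # Hypothetical values for two correlated branching-ratio measurements.
        measured = np.array([3.0e-9, 1.1e-10])
        cov = np.array([[3.6e-19, 5.0e-21],
                        [5.0e-21, 9.0e-22]])
        predicted = np.array([3.5e-9, 1.0e-10])  # from some theory code
        print(gaussian_loglike(predicted, measured, cov))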

  5. Pippi — Painless parsing, post-processing and plotting of posterior and likelihood samples

    Science.gov (United States)

    Scott, Pat

    2012-11-01

    Interpreting samples from likelihood or posterior probability density functions is rarely as straightforward as it seems it should be. Producing publication-quality graphics of these distributions is often similarly painful. In this short note I describe pippi, a simple, publicly available package for parsing and post-processing such samples, as well as generating high-quality PDF graphics of the results. Pippi is easily and extensively configurable and customisable, both in its options for parsing and post-processing samples, and in the visual aspects of the figures it produces. I illustrate some of these using an existing supersymmetric global fit, performed in the context of a gamma-ray search for dark matter. Pippi can be downloaded and followed at http://github.com/patscott/pippi.
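
    The kind of post-processing pippi automates can be sketched generically: read in weighted samples, bin a one-dimensional marginal, and extract a credible interval. The Python below is illustrative only and mirrors neither pippi's file format nor its options; the samples and weights are fabricated.

        import numpy as np

        # Stand-ins for one parameter's chain: posterior samples and weights.
        rng = np.random.default_rng(3)
        samples = rng.normal(0.0, 1.0, size=20000)
        weights = rng.uniform(0.5, 1.5, size=20000)

        # 1D marginal posterior as a weighted, normalized histogram.
        hist, edges = np.histogram(samples, bins=60, weights=weights, density=True)

        # Central 68% credible interval from the weighted empirical CDF.
        order = np.argsort(samples)
        cdf = np.cumsum(weights[order]) / weights.sum()
        lo = samples[order][np.searchsorted(cdf, 0.16)]
        hi = samples[order][np.searchsorted(cdf, 0.84)]
        print(f"68% credible interval: [{lo:.2f}, {hi:.2f}]")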

  6. Parallel implementation of D-Phylo algorithm for maximum likelihood clusters.

    Science.gov (United States)

    Malik, Shamita; Sharma, Dolly; Khatri, Sunil Kumar

    2017-03-01

    This study explains a newly developed parallel algorithm for phylogenetic analysis of DNA sequences. The newly designed D-Phylo is a more advanced algorithm for phylogenetic analysis using the maximum likelihood approach. D-Phylo exploits the searching capacity of k-means while avoiding its main limitation of getting stuck at locally conserved motifs. The authors tested the behaviour of D-Phylo on an Amazon Linux Amazon Machine Image (hardware virtual machine) i2.4xlarge instance with six central processing units, 122 GiB of memory, 8 × 800 GB solid-state-drive Elastic Block Store volumes and high network performance, using up to 15 processors, for several real-life datasets. Distributing the clusters evenly across all the processors provides the capacity to achieve near-linear speed-up when a large number of processors is used, as the sketch below illustrates.
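
    The reported speed-up rests on the fact that candidate clusters can be scored independently, so the work parallelizes cleanly. A generic Python pattern for farming per-cluster likelihood evaluations out to a process pool is sketched below; the scoring function is a toy stand-in, not the authors' phylogenetic likelihood.

        import multiprocessing as mp
        import numpy as np

        def cluster_score(args):
            """Score one candidate cluster (toy placeholder: D-Phylo's real
            score would come from a maximum likelihood substitution model)."""
            cluster_id, distances = args
            return cluster_id, -np.sum(distances ** 2)

        if __name__ == "__main__":
            rng = np.random.default_rng(4)
            work = [(k, rng.random(100)) for k in range(32)]  # 32 candidate clusters
            with mp.Pool(processes=8) as pool:
                # An even split of clusters across workers gives near-linear
                # speed-up, since evaluations do not depend on one another.
                scores = dict(pool.map(cluster_score, work))
            print("best cluster:", max(scores, key=scores.get))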

  7. FlavBit. A GAMBIT module for computing flavour observables and likelihoods

    Energy Technology Data Exchange (ETDEWEB)

    Bernlochner, Florian U. [Physikalisches Institut der Rheinischen Friedrich-Wilhelms-Universitaet Bonn (Germany); Chrzaszcz, Marcin [Universitaet Zuerich, Physik-Institut, Zurich (Switzerland); Polish Academy of Sciences, H. Niewodniczanski Institute of Nuclear Physics, Krakow (Poland); Dal, Lars A. [University of Oslo, Department of Physics, Oslo (Norway); Farmer, Ben [Oskar Klein Centre for Cosmoparticle Physics, AlbaNova University Centre, Stockholm (Sweden); Stockholm University, Department of Physics, Stockholm (Sweden); Jackson, Paul; White, Martin [University of Adelaide, Department of Physics, Adelaide, SA (Australia); Australian Research Council Centre of Excellence for Particle Physics at the Tera-scale (Australia); Kvellestad, Anders [NORDITA, Stockholm (Sweden); Mahmoudi, Farvah [Univ Lyon, Univ Lyon 1, ENS de Lyon, CNRS, Centre de Recherche Astrophysique de Lyon UMR5574, Saint-Genis-Laval (France); CERN, Theoretical Physics Department, Geneva (Switzerland); Putze, Antje [LAPTh, Universite de Savoie, CNRS, Annecy-le-Vieux (France); Rogan, Christopher [Harvard University, Department of Physics, Cambridge, MA (United States); Scott, Pat [Imperial College London, Department of Physics, Blackett Laboratory, London (United Kingdom); Serra, Nicola [Universitaet Zuerich, Physik-Institut, Zurich (Switzerland); Weniger, Christoph [University of Amsterdam, GRAPPA, Institute of Physics, Amsterdam (Netherlands); Collaboration: The GAMBIT Flavour Workgroup

    2017-11-15

    Flavour physics observables are excellent probes of new physics up to very high energy scales. Here we present FlavBit, the dedicated flavour physics module of the global-fitting package GAMBIT. FlavBit includes custom implementations of various likelihood routines for a wide range of flavour observables, including detailed uncertainties and correlations associated with LHCb measurements of rare, leptonic and semileptonic decays of B and D mesons, kaons and pions. It provides a generalised interface to external theory codes such as SuperIso, allowing users to calculate flavour observables in and beyond the Standard Model, and then test them in detail against all relevant experimental data. We describe FlavBit and its constituent physics in some detail, then give examples from supersymmetry and effective field theory illustrating how it can be used both as a standalone library for flavour physics, and within GAMBIT. (orig.)

  8. Systemic Therapy for Youth at Clinical High Risk for Psychosis: A Pilot Study

    Directory of Open Access Journals (Sweden)

    Jingyu Shi

    2017-10-01

    Psychosocial intervention trials for youth at clinical high risk (CHR) for psychosis have shown promising effects on treating psychotic symptoms but have not focused on psychosocial functional outcomes, and those studies have been conducted among help-seeking patients; there is a lack of research on non-clinical young CHR individuals. Systemic therapy (ST) is grounded in systemic-constructivist and psychosocial resilience theories. It has a number of advantages that make it attractive for use with CHR individuals in a non-clinical context. The present study evaluated the effect of ST for students at CHR on reducing symptoms and enhancing psychosocial function. This was a single-blind randomized controlled trial for CHR young people comparing ST to supportive therapy over a 6-month treatment. Psychotic and depressive symptoms (DS) as well as self-esteem and social support (SS) were assessed at pre- and posttreatment. 26 CHR individuals were randomly divided into an intervention group (n = 13) and a control group (n = 13). There were no significant differences in severity of symptoms or level of SS and self-esteem at baseline between the two groups (P > 0.05). At posttreatment, significant improvements in positive symptoms and DS as well as SS and self-esteem were observed in the ST group (P < 0.05); in the control group, these improvements were not significant (P > 0.05). The findings indicated that systemic intervention for university students at CHR for psychosis may have a positive effect on symptoms, self-esteem, and SS in the short term. More long-term research is needed to further evaluate this intervention.

  9. High throughput static and dynamic small animal imaging using clinical PET/CT: potential preclinical applications

    International Nuclear Information System (INIS)

    Aide, Nicolas; Desmonts, Cedric; Agostini, Denis; Bardet, Stephane; Bouvard, Gerard; Beauregard, Jean-Mathieu; Roselt, Peter; Neels, Oliver; Beyer, Thomas; Kinross, Kathryn; Hicks, Rodney J.

    2010-01-01

    The objective of the study was to evaluate state-of-the-art clinical PET/CT technology in performing static and dynamic imaging of several mice simultaneously. A mouse-sized phantom was imaged mimicking simultaneous imaging of three mice, with computation of recovery coefficients (RCs) and spillover ratios (SORs). Fifteen mice harbouring abdominal or subcutaneous tumours were imaged on clinical PET/CT with point spread function (PSF) reconstruction after injection of [18F]fluorodeoxyglucose or [18F]fluorothymidine. Three of these mice were imaged alone and simultaneously at radial positions -5, 0 and 5 cm. The remaining 12 tumour-bearing mice were imaged in groups of 3 to establish the quantitative accuracy of PET data using ex vivo gamma counting as the reference. Finally, a dynamic scan was performed in three mice simultaneously after the injection of 68Ga-ethylenediaminetetraacetic acid (EDTA). For typical lesion sizes of 7-8 mm, phantom experiments indicated RCs of 0.42 and 0.76 for ordered subsets expectation maximization (OSEM) and PSF reconstruction, respectively. For PSF reconstruction, SOR_air and SOR_water were 5.3 and 7.5%, respectively. A strong correlation (r² = 0.97) was found between measurements from mice imaged alone and simultaneously, and PET quantitation agreed with ex vivo gamma counting for both PSF reconstruction (r² = 0.98; slope = 0.89) and OSEM (r² = 0.96; slope = 0.62); time-activity curves were obtained from the 68Ga-EDTA dynamic acquisition. New-generation clinical PET/CT can be used for simultaneous imaging of multiple small animals in experiments requiring high throughput and where a dedicated small-animal PET system is not available. (orig.)

  10. High rate of smoking in female patients with Mondor's disease in an outpatient clinic in Japan

    Directory of Open Access Journals (Sweden)

    Okumura T

    2012-09-01

    Toshikatsu Okumura,1 Masumi Ohhira,1 Tsukasa Nozu2; 1Department of General Medicine, 2Department of Regional Medicine and Education, Asahikawa Medical University, Asahikawa, Hokkaido, Japan. Purpose: Little is known about the epidemiology of Mondor's disease. The aim of this study was to analyze the clinical features of Mondor's disease in an outpatient clinic where primary care physicians work in Japan, to better understand the epidemiological characteristics of the disease. Patients and methods: The data for consecutive new outpatients visiting the Department of General Medicine at the teaching hospital of Asahikawa Medical University (Asahikawa Medical University Hospital), Asahikawa, Hokkaido, Japan, between April 2004 and March 2012 were analyzed. Parameters such as age, sex, diagnosis, and clinical presentation were investigated. Results: During the 8-year period covered in this study, six (0.07%) of 8767 patients were diagnosed as having Mondor's disease. All of these patients were female, with a mean age of 41 ± 12 years; the overall rate of Mondor's disease among all female patients in this study was 0.12%. The patients complained of pain and a cord-like structure in the anterolateral thoracoabdominal wall. The painful mass had persisted for 1-4 weeks before presentation at the Department of General Medicine and disappeared within a couple of weeks. Current smoking was significantly more frequent in the patients with Mondor's disease than in the age-matched female patients without Mondor's disease who were also evaluated in this study. Conclusion: These results suggest that a high rate of smoking in middle-aged females may be a characteristic feature of Mondor's disease. These epidemiological data may be useful in detection of the disease in the primary care setting in Japan. Keywords: primary care, epidemiology, current smoking, women

  11. CT Image Contrast of High-Z Elements: Phantom Imaging Studies and Clinical Implications.

    Science.gov (United States)

    FitzGerald, Paul F; Colborn, Robert E; Edic, Peter M; Lambert, Jack W; Torres, Andrew S; Bonitatibus, Peter J; Yeh, Benjamin M

    2016-03-01

    To quantify the computed tomographic (CT) image contrast produced by potentially useful contrast material elements in clinically relevant imaging conditions. Equal mass concentrations (grams of active element per milliliter of solution) of seven radiodense elements, including iodine, barium, gadolinium, tantalum, ytterbium, gold, and bismuth, were formulated as compounds in aqueous solutions. The compounds were chosen such that the active element dominated the x-ray attenuation of the solution. The solutions were imaged within a modified 32-cm CT dose index phantom at 80, 100, 120, and 140 kVp. To simulate larger body sizes, 0.2-, 0.5-, and 1.0-mm-thick copper filters were applied. CT image contrast was measured and corrected for measured concentrations and the presence of chlorine in some compounds. Each element tested provided higher image contrast than iodine at some tube potential levels. Over the range of tube potentials that are clinically practical for average-sized and larger adults (100 kVp and higher), barium, gadolinium, ytterbium, and tantalum provided consistently increased image contrast compared with iodine, respectively demonstrating 39%, 56%, 34%, and 24% increases at 100 kVp; 39%, 66%, 53%, and 46% increases at 120 kVp; and 40%, 72%, 65%, and 60% increases at 140 kVp, with no added x-ray filter. The consistently high image contrast produced at 100-140 kVp by tantalum compared with bismuth and iodine at equal mass concentration suggests that tantalum could potentially be favorable for use as a clinical CT contrast agent.

  12. Feasibility of using ultra-high field (7 T) MRI for clinical surgical targeting.

    Directory of Open Access Journals (Sweden)

    Yuval Duchin

    The advantages of ultra-high magnetic field (7 Tesla) MRI for basic science research and neuroscience applications have proven invaluable. Structural and functional MR images of the human brain acquired at 7 T exhibit rich information content with potential utility for clinical applications. However, (1) substantial increases in susceptibility artifacts and (2) geometrical distortions at 7 T would be detrimental for stereotactic surgeries such as deep brain stimulation (DBS), which typically use 1.5 T images for surgical planning. Here, we explore whether these issues can be addressed, making feasible the use of 7 T MRI to guide surgical planning. Twelve patients with Parkinson's disease, candidates for DBS, were scanned on a standard clinical 1.5 T MRI scanner and a 7 T MRI scanner. Qualitative and quantitative assessments of global and regional distortion were evaluated based on anatomical landmarks and transformation matrix values. Our analyses show that distances between identical landmarks on 1.5 T vs. 7 T images, in the mid-brain region, were less than one voxel, indicating successful co-registration between the 1.5 T and 7 T images under these specific imaging parameter sets. On regional analysis, the central part of the brain showed minimal distortion, while inferior and frontal areas exhibited larger distortion due to proximity to air-filled cavities. We conclude that 7 T MR images of the central brain regions have distortions comparable to those observed on 1.5 T MRI, and that clinical applications targeting structures such as the STN are feasible with information-rich 7 T imaging.

  13. Clinical high risk for psychosis in children and adolescents: a systematic review.

    Science.gov (United States)

    Tor, Jordina; Dolz, Montserrat; Sintes, Anna; Muñoz, Daniel; Pardo, Marta; de la Serna, Elena; Puig, Olga; Sugranyes, Gisela; Baeza, Inmaculada

    2017-09-15

    The concept of being at risk for psychosis has been introduced both for adults and for children and adolescents, but fewer studies have been conducted in the latter population. The aim of this study is to systematically review the articles associated with clinical description, interventions, outcome and other areas in children and adolescents at risk for psychosis. We searched the MEDLINE/PubMed and PsycINFO databases for articles published up to 30/06/16. Reviewed articles were prospective studies, written in English, original articles with Clinical High Risk (CHR) for psychosis samples, and with a mean sample age younger than 18 years. From 103 studies initially selected, 48 met inclusion criteria and were systematically reviewed. Studies show that CHR children and adolescents present several clinical characteristics at baseline: attenuated positive-symptom inclusion criteria were observed most often, with perceptual abnormalities and suspiciousness reported most frequently, and comorbid conditions such as depressive and anxiety disorders were common. CHR children and adolescents show lower general intelligence and no structural brain changes compared with controls. The original articles reviewed show rates of conversion to psychosis between 17 and 20% at 1-year follow-up and between 7 and 21% at 2 years. While 36% of patients recovered from their CHR status at 6-year follow-up, 40% still met CHR criteria. Studies in children and adolescents with CHR were conducted with different methodologies, different assessment tools, and small samples. It is important to conduct studies on psychopharmacological and psychological treatment, as well as replication of the few studies found.

  14. Evaluation of robustness of maximum likelihood cone-beam CT reconstruction with total variation regularization

    International Nuclear Information System (INIS)

    Stsepankou, D; Arns, A; Hesser, J; Ng, S K; Zygmanski, P

    2012-01-01

    The objective of this paper is to evaluate an iterative maximum likelihood (ML) cone-beam computed tomography (CBCT) reconstruction with total variation (TV) regularization with respect to the robustness of the algorithm to data inconsistencies. Three different and (for clinical application) typical classes of errors are considered for simulated phantom and measured projection data: quantum noise, defect detector pixels and projection matrix errors. To quantify those errors, we apply error measures like mean square error, signal-to-noise ratio, contrast-to-noise ratio and streak indicator. These measures are derived from linear signal theory and generalized and applied for nonlinear signal reconstruction. For the quality check, we focus on resolution and CT-number linearity based on a Catphan phantom. All comparisons are made versus the clinical standard, the filtered backprojection algorithm (FBP). In our results, we confirm and substantially extend previous results on iterative reconstruction, such as massive undersampling of the number of projections. Projection matrix errors of up to 1° deviation in projection angle remain within the tolerance level. Single defect pixels produce ring artifacts for each method; however, defect pixel compensation allows up to 40% of defect pixels while still passing the standard clinical quality check. Further, the iterative algorithm is extraordinarily robust in the low-photon regime (down to 0.05 mAs) when compared to FBP, allowing for extremely low-dose image acquisitions, a substantial issue when considering daily CBCT imaging for position correction in radiotherapy. We conclude that the ML method studied herein is robust under clinical quality assurance conditions. Consequently, low-dose imaging, especially for daily patient localization in radiation therapy, is possible without changing the current hardware of the imaging system. (paper)
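
    The objective such a reconstruction optimizes, a data-fidelity (likelihood) term plus a total variation penalty, can be demonstrated on a one-dimensional toy problem; the actual algorithm reconstructs CBCT volumes from cone-beam projections. A minimal Python sketch with an invented random projector and a smoothed TV gradient:

        import numpy as np

        # Toy analogue of ML + TV reconstruction: recover a piecewise-constant
        # signal from noisy, undersampled linear measurements.
        rng = np.random.default_rng(5)
        n = 100
        x_true = np.zeros(n)
        x_true[30:60] = 1.0                          # piecewise-constant object
        A = rng.normal(size=(60, n)) / np.sqrt(n)    # underdetermined "projector"
        y = A @ x_true + 0.01 * rng.normal(size=60)  # noisy measurements

        lam, eps, step = 0.02, 1e-3, 0.3
        x = np.zeros(n)
        for _ in range(2000):
            grad_fid = A.T @ (A @ x - y)             # Gaussian-likelihood gradient
            d = np.diff(x)
            flux = d / np.sqrt(d * d + eps)          # gradient of smoothed |d|
            grad_tv = np.concatenate(([-flux[0]], flux[:-1] - flux[1:], [flux[-1]]))
            x -= step * (grad_fid + lam * grad_tv)

        print("RMSE vs. truth:", np.sqrt(np.mean((x - x_true) ** 2)))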

  15. Radiotherapy and androgen ablation for clinically localized high-risk prostate cancer

    Energy Technology Data Exchange (ETDEWEB)

    Pollack, Alan; Zagars, Gunar K; Kopplin, Susan

    1995-04-30

    Purpose: The response of patients with clinical stages T1-4 prostate cancer to radiotherapy is variable. A particularly poor prognostic group has been found to be comprised of those with pretreatment prostate specific antigen (PSA) levels above 30 ng/ml with any tumor grade, or PSA levels > 10 and ≤ 30 ng/ml with tumors of grade 3 or 4. These patients have over an 80% actuarial risk of biochemical failure 3 years after definitive external beam radiotherapy. Thus, patients with these high-risk features require more aggressive therapy. During the last 3-4 years, the policy to treat such patients with radiotherapy and androgen ablation (XRT/HORM) was instituted. A retrospective comparison was made between high-risk patients treated with radiotherapy alone (XRT) vs. XRT/HORM. Methods and Materials: Between 1987 and 1991, 81 high-risk patients were treated with XRT. Thirty-eight high-risk patients were treated with XRT/HORM between 1990 and 1992. The median follow-up was 37 months for the XRT group and 22 months for the XRT/HORM group. No patient had clinical, radiographic, or pathologic evidence of lymph node involvement. The median dose to the prostate was 66 Gy for the XRT group and 68 Gy for the XRT/HORM group. Results: The distributions of several potential prognostic factors were analyzed. Significant differences between the groups were observed for tumor grade, pretreatment prostatic acid phosphatase, and age. The XRT/HORM group was composed of patients with worse features, including a greater proportion of patients with grade 4 tumors, more with abnormal acid phosphatase levels, and more under 60 years of age. The actuarial incidence of a rising PSA at 3 years was 81% for the XRT group vs. 15% for the XRT/HORM group (p < 0.0001). In addition, local relapse at 3 years was 34% for the XRT group and 15% for the XRT/HORM group (p < 0.02). There was no difference between the groups in terms of survival. Cox proportional hazards analyses were performed using several potential prognostic factors as covariates.

  16. Effect of vaccination against sub-clinical Porcine Circovirus type 2 infection in a high-health finishing pig herd

    DEFF Research Database (Denmark)

    Nielsen, Gitte Blach; Nielsen, Jens Peter; Haugegaard, John

    2017-01-01

    During the last decade, the clinical manifestation of Porcine Circovirus type 2 (PCV2) infections has mostly changed from postweaning multisystemic wasting syndrome and high mortality to sub-clinical infections manifested only through impaired production parameters. However, co-infection with other...

  17. Pediatric High Grade Glioma: a Review and Update on Tumor Clinical Characteristics and Biology

    International Nuclear Information System (INIS)

    Fangusaro, Jason

    2012-01-01

    High grade gliomas (HGG) are one of the most common central nervous system (CNS) tumors encountered in adults, but they only represent approximately 8–12% of all pediatric CNS tumors. Historically, pediatric HGG were thought to be similar to adult HGG since they appear histologically identical; however, molecular, genetic, and biologic data reveal that they are distinct. Similar to adults, pediatric HGG are very aggressive and malignant lesions with few patients achieving long-term survival despite a variety of therapies. Initial treatment strategies typically consist of a gross total resection (GTR) when feasible followed by focal radiotherapy combined with chemotherapy. Over the last few decades, a wealth of data has emerged from basic science and pre-clinical animal models helping to better define the common biologic, genetic, and molecular make-up of these tumors. These data have not only provided a better understanding of tumor biology, but they have also provided new areas of research targeting molecular and genetic pathways with the potential for novel treatment strategies and improved patient outcomes. Here we provide a review of pediatric non-brainstem HGG, including epidemiology, presentation, histology, imaging characteristics, treatments, survival outcomes, and an overview of both basic and translational research. An understanding of all relevant pre-clinical tumor models, including their strengths and pitfalls is essential in realizing improved patient outcomes in this population.

  18. High Level Expression and Purification of the Clinically Active Antimicrobial Peptide P-113 in Escherichia coli

    Directory of Open Access Journals (Sweden)

    Kuang-Ting Cheng

    2018-03-01

    P-113, which was originally derived from the human saliva protein histatin 5, is a histidine-rich antimicrobial peptide with the sequence AKRHHGYKRKFH. P-113 is currently undergoing a phase II clinical trial as a pharmaceutical agent to fight fungal infections in HIV patients with oral candidiasis. Previously, we developed a new procedure for the high-yield expression and purification of hG31P, an analogue and antagonist of human CXCL8. Moreover, we have successfully removed lipopolysaccharide (LPS), the endotoxin associated with hG31P in expression with Escherichia coli. In this paper, we have used hG31P as a novel fusion protein for the expression and purification of P-113. The purity of the expressed P-113 is more than 95% and the yield is 4 mg P-113 per liter of E. coli cell culture in Luria-Bertani (LB) medium. The antimicrobial activity of the purified P-113 was tested. Furthermore, we used circular dichroism (CD) and nuclear magnetic resonance (NMR) spectroscopy to study the structural properties of P-113. Our results indicate that using hG31P as a fusion protein to obtain large quantities of P-113 is feasible and easy to scale up for commercial production. An effective way of producing enough P-113 for future clinical studies is evident in this study.

  19. The High-Density Lipoprotein Puzzle: Why Classic Epidemiology, Genetic Epidemiology, and Clinical Trials Conflict?

    Science.gov (United States)

    Rosenson, Robert S

    2016-05-01

    Classical epidemiology has established the incremental contribution of the high-density lipoprotein (HDL) cholesterol measure in the assessment of atherosclerotic cardiovascular disease risk; yet, genetic epidemiology does not support a causal relationship between HDL cholesterol and the future risk of myocardial infarction. Therapeutic interventions directed toward cholesterol loading of the HDL particle have been based on epidemiological studies that have established HDL cholesterol as a biomarker of atherosclerotic cardiovascular risk. However, therapeutic interventions such as niacin and cholesteryl ester transfer protein inhibitors increase HDL cholesterol in patients treated with statins, but have repeatedly failed to reduce cardiovascular events. Statin therapy interferes with ATP-binding cassette transporter-mediated macrophage cholesterol efflux via miR33 and thus may diminish certain HDL functional properties. Unraveling the HDL puzzle will require continued technical advances in the characterization and quantification of multiple HDL subclasses and their functional properties. Key mechanistic criteria for clinical outcomes trials with HDL-based therapies include formation of HDL subclasses that improve the efficiency of macrophage cholesterol efflux and compositional changes in the proteome and lipidome of the HDL particle that are associated with improved antioxidant and anti-inflammatory properties. These measures require validation in genetic studies and clinical trials of HDL-based therapies on the background of statins. © 2016 American Heart Association, Inc.

  20. Pediatric High Grade Glioma: a Review and Update on Tumor Clinical Characteristics and Biology

    Energy Technology Data Exchange (ETDEWEB)

    Fangusaro, Jason, E-mail: jfangusaro@luriechildrens.org [Pediatric Neuro-Oncology, The Ann & Robert H. Lurie Children’s Hospital of Chicago, Feinberg School of Medicine, Northwestern University, Chicago, IL (United States)

    2012-08-24

    High grade gliomas (HGG) are one of the most common central nervous system (CNS) tumors encountered in adults, but they only represent approximately 8–12% of all pediatric CNS tumors. Historically, pediatric HGG were thought to be similar to adult HGG since they appear histologically identical; however, molecular, genetic, and biologic data reveal that they are distinct. Similar to adults, pediatric HGG are very aggressive and malignant lesions with few patients achieving long-term survival despite a variety of therapies. Initial treatment strategies typically consist of a gross total resection (GTR) when feasible followed by focal radiotherapy combined with chemotherapy. Over the last few decades, a wealth of data has emerged from basic science and pre-clinical animal models helping to better define the common biologic, genetic, and molecular make-up of these tumors. These data have not only provided a better understanding of tumor biology, but they have also provided new areas of research targeting molecular and genetic pathways with the potential for novel treatment strategies and improved patient outcomes. Here we provide a review of pediatric non-brainstem HGG, including epidemiology, presentation, histology, imaging characteristics, treatments, survival outcomes, and an overview of both basic and translational research. An understanding of all relevant pre-clinical tumor models, including their strengths and pitfalls is essential in realizing improved patient outcomes in this population.

  1. Low-cost, high-resolution scanning laser ophthalmoscope for the clinical environment

    Science.gov (United States)

    Soliz, P.; Larichev, A.; Zamora, G.; Murillo, S.; Barriga, E. S.

    2010-02-01

    Researchers have sought to gain greater insight into the mechanisms of the retina and the optic disc at high spatial resolutions that would enable the visualization of small structures such as photoreceptors and nerve fiber bundles. The sources of retinal image quality degradation are aberrations within the human eye, which limit the achievable resolution and the contrast of small image details. To overcome these fundamental limitations, researchers have been applying adaptive optics (AO) techniques to correct for the aberrations. Today, deformable mirror based adaptive optics devices have been developed to overcome the limitations of standard fundus cameras, but at prices that are typically unaffordable for most clinics. In this paper we demonstrate a clinically viable fundus camera with auto-focus and astigmatism correction that is easy to use and has improved resolution. We have shown that removal of low-order aberrations results in significantly better resolution and quality images. Additionally, through the application of image restoration and super-resolution techniques, the images present considerably improved quality. The improvements lead to enhanced visualization of retinal structures associated with pathology.

  2. Clinical Presentation, Aetiology, and Outcomes of Meningitis in a Setting of High HIV and TB Prevalence

    Directory of Open Access Journals (Sweden)

    Keneuoe Hycianth Thinyane

    2015-01-01

    Meningitis causes significant morbidity and mortality globally. The aim of this study was to describe the clinical presentation, aetiology, and outcomes of meningitis among adult patients admitted to Queen Mamohato Memorial Hospital in Maseru, Lesotho, with a diagnosis of meningitis. A cross-sectional study was conducted between February and April 2014; data collected included presenting signs and symptoms, laboratory results, and clinical outcomes. Descriptive statistics were used to summarise data; associations between variables were analysed using Fisher's exact test. 56 patients were enrolled; the HIV coinfection rate was 79%. The most common presenting symptoms were altered mental status, neck stiffness, headache, and fever. TB meningitis was the most frequent diagnosis (39%), followed by bacterial (27%), viral (18%), and cryptococcal meningitis (16%). In-hospital mortality was 43%, with case fatalities of 23%, 40%, 44%, and 90% for TB, bacterial, cryptococcal, and viral meningitis, respectively. Severe renal impairment was significantly associated with mortality. In conclusion, the causes of meningitis in this study reflect the high prevalence of HIV and TB in our setting. Strategies to reduce morbidity and mortality due to meningitis should include improving diagnostic services to facilitate early detection and treatment of meningitis and timely initiation of antiretroviral therapy in HIV-infected patients.

  3. Clinical outcome of high-dose-rate interstitial brachytherapy in patients with oral cavity cancer

    International Nuclear Information System (INIS)

    Lee, Sung Uk; Cho, Kwan Ho; Moon, Sung Ho; Choi, Sung Weon; Park, Joo Yong; Yun, Tak; Lee, Sang Hyun; Lim, Young Kyung; Jeong, Chi Young

    2014-01-01

    To evaluate the clinical outcome of high-dose-rate (HDR) interstitial brachytherapy (IBT) in patients with oral cavity cancer. Sixteen patients with oral cavity cancer treated with HDR remote-control afterloading brachytherapy using 192Ir between 2001 and 2013 were analyzed retrospectively. Brachytherapy was administered in 11 patients as the primary treatment and in five patients as salvage treatment for recurrence after the initial surgery. In 12 patients, external beam radiotherapy (50-55 Gy/25 fractions) was combined with IBT of 21 Gy/7 fractions. In addition, IBT was administered as the sole treatment in three patients with a total dose of 50 Gy/10 fractions and as postoperative adjuvant treatment in one patient with a total of 35 Gy/7 fractions. The 5-year overall survival of the entire group was 70%. The actuarial local control rate after 3 years was 84%. All five recurrent cases after initial surgery were successfully salvaged using IBT ± external beam radiotherapy. Two patients developed local recurrence at 3 and 5 months, respectively, after IBT. The acute complications were acceptable (≤ grade 2). Three patients developed major late complications, such as radio-osteonecrosis, in which one patient was treated by conservative therapy and two required surgical intervention. HDR IBT for oral cavity cancer was effective and acceptable in diverse clinical settings, such as in the cases of primary or salvage treatment.

  4. Differences between opening versus closing high tibial osteotomy on clinical outcomes and gait analysis.

    Science.gov (United States)

    Deie, Masataka; Hoso, Takayuki; Shimada, Noboru; Iwaki, Daisuke; Nakamae, Atsuo; Adachi, Nobuo; Ochi, Mitsuo

    2014-12-01

    High tibial osteotomy (HTO) for medial knee osteoarthritis (OA) is mainly performed via two procedures: closing wedge HTO (CW) and opening wedge HTO (OW). In this study, differences between these procedures were assessed by serial clinical evaluation and gait analysis before and after surgery. Twenty-one patients underwent HTO for medial knee OA in 2011 and 2012, with 12 patients undergoing CW and nine undergoing OW. The severity of OA was classified according to the Kellgren-Lawrence classification. The Japanese Orthopedic Association score for assessment of knee OA (JOA score), the Numeric Rating Scale (NRS), and the femoral tibial angle (FTA) on X-ray were evaluated. For gait analysis, gait speed, varus moment, varus angle and lateral thrust were calculated. The JOA score and NRS were improved significantly one year postoperatively in both groups. The FTA was maintained in both groups at one year. Varus angle and varus moment were significantly improved in both groups at each postoperative follow-up, when compared preoperatively. Lateral thrust was significantly improved at three months postoperatively in both groups. However, the significant improvement in lateral thrust had disappeared in the CW group six months postoperatively, whereas it was maintained for at least one year in the OW group. This study found that clinical outcomes were well maintained after HTO. OW reduced knee varus moment and lateral thrust, whereas CW had little effect on reducing lateral thrust. Level IV. Copyright © 2014 Elsevier B.V. All rights reserved.

  5. MAX to MYCN intracellular ratio drives the aggressive phenotype and clinical outcome of high risk neuroblastoma.

    Science.gov (United States)

    Ferrucci, Francesca; Ciaccio, Roberto; Monticelli, Sara; Pigini, Paolo; di Giacomo, Simone; Purgato, Stefania; Erriquez, Daniela; Bernardoni, Roberto; Norris, Murray; Haber, Michelle; Milazzo, Giorgio; Perini, Giovanni

    2018-03-01

    Childhood neuroblastoma, a disease of the sympathetic nervous system, is the most common solid tumour of infancy, remarkably refractory to therapeutic treatments. One of the most powerful independent prognostic indicators for this disease is the amplification of the MYCN oncogene, which occurs at high levels in approximately 25% of neuroblastomas. Interestingly, amplification and not just expression of MYCN has a strong prognostic value, although this fact appears quite surprising as MYCN is a transcription factor that requires dimerising with its partner MAX, to exert its function. This observation greatly suggests that the role of MYCN in neuroblastoma should be examined in the context of MAX expression. In this report, we show that, in contrast to what is found in normal cells, MAX expression is significantly different among primary NBs, and that its level appears to correlate with the clinical outcome of the disease. Importantly, controlled modulation of MAX expression in neuroblastoma cells with different extents of MYCN amplification, demonstrates that MAX can instruct gene transcription programs that either reinforce or weaken the oncogenic process enacted by MYCN. In general, our work illustrates that it is the MAX to MYCN ratio that can account for tumour progression and clinical outcome in neuroblastoma and proposes that such a ratio should be considered as an important criterion to the design and development of anti-MYCN therapies. Copyright © 2018 Elsevier B.V. All rights reserved.

  6. High-definition optical coherence tomography - an aid to clinical practice and research in dermatology.

    Science.gov (United States)

    Cao, Taige; Tey, Hong Liang

    2015-09-01

    At present, beyond clinical assessment, the diagnosis of skin diseases is primarily made histologically. However, skin biopsies have many disadvantages, including pain, scarring, risk of infection, and sampling error. With recent advances in skin imaging technology, the clinical use of imaging methods for the practical management of skin diseases has become an option. The in vivo high-definition optical coherence tomography (HD-OCT) has recently been developed and commercialized (Skintell; Agfa, Belgium). Compared with conventional OCT, it has a higher resolution; compared with reflectance confocal microscopy, it has a shorter time for image acquisition as well as a greater penetration depth and a larger field of view. HD-OCT is promising but much work is still required to develop it from a research tool to a valuable adjunct for the noninvasive diagnosis of skin lesions. Substantial work has been done to identify HD-OCT features in various diseases but interpretation can be time-consuming and tedious. Projects aimed at automating these processes and improving image quality are currently under way. © 2015 Deutsche Dermatologische Gesellschaft (DDG). Published by John Wiley & Sons Ltd.

  7. DREAM3: network inference using dynamic context likelihood of relatedness and the inferelator.

    Directory of Open Access Journals (Sweden)

    Aviv Madar

    2010-03-01

    Full Text Available Many current works aiming to learn regulatory networks from systems biology data must balance model complexity with respect to data availability and quality. Methods that learn regulatory associations based on unit-less metrics, such as Mutual Information, are attractive in that they scale well and reduce the number of free parameters (model complexity) per interaction to a minimum. In contrast, methods for learning regulatory networks based on explicit dynamical models are more complex and scale less gracefully, but are attractive as they may allow direct prediction of transcriptional dynamics and resolve the directionality of many regulatory interactions. We aim to investigate whether scalable information-based methods (like the Context Likelihood of Relatedness method) and more explicit dynamical models (like Inferelator 1.0) prove synergistic when combined. We test a pipeline where a novel modification of the Context Likelihood of Relatedness (mixed-CLR, modified to use time-series data) is first used to define likely regulatory interactions, and Inferelator 1.0 is then used for final model selection and to build an explicit dynamical model. Our method ranked 2nd out of 22 in the DREAM3 100-gene in silico networks challenge. Mixed-CLR and Inferelator 1.0 are complementary, demonstrating a large performance gain relative to any single tested method, with precision being especially high at low recall values. Partitioning the provided data set into four groups (knock-down, knock-out, time-series, and combined) revealed that using comprehensive knock-out data alone provides optimal performance. Inferelator 1.0 proved particularly powerful at resolving the directionality of regulatory interactions, i.e. "who regulates who" (approximately … of identified true positives were correctly resolved). Performance drops for high in-degree genes, i.e. as the number of regulators per target gene increases, but not with out-degree, i.e. performance is not affected by the number of targets per regulator.
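
    The core step CLR contributes here can be sketched compactly: estimate mutual information (MI) between every pair of expression profiles, then z-score each pair's MI against the background MI distributions of both genes. A minimal Python illustration of plain CLR (the paper's mixed-CLR additionally exploits time-series structure, and the histogram MI estimator below is a simplification):

        import numpy as np

        def mutual_info_hist(x, y, bins=8):
            # Histogram-based MI estimate between two expression profiles.
            pxy, _, _ = np.histogram2d(x, y, bins=bins)
            pxy = pxy / pxy.sum()
            px = pxy.sum(axis=1, keepdims=True)   # marginal of x
            py = pxy.sum(axis=0, keepdims=True)   # marginal of y
            nz = pxy > 0
            return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

        def clr_scores(expr):
            # expr: genes x samples matrix. Returns the CLR z-score matrix.
            n = expr.shape[0]
            mi = np.zeros((n, n))
            for i in range(n):
                for j in range(i + 1, n):
                    mi[i, j] = mi[j, i] = mutual_info_hist(expr[i], expr[j])
            mu = mi.mean(axis=1, keepdims=True)
            sd = mi.std(axis=1, keepdims=True) + 1e-12
            z = np.clip((mi - mu) / sd, 0.0, None)    # z-score within each gene's background
            return np.sqrt(z**2 + z.T**2)             # combine the two genes' z-scores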

  8. Clinical analysis of modified trabeculectomy in glaucoma surgery with high elevated intraocular pressure

    Directory of Open Access Journals (Sweden)

    Cang-Xia Zhang

    2013-10-01

    Full Text Available AIM: To retrospectively analyze the clinical data of modified trabeculectomy in treating glaucoma with persistently elevated intraocular pressure and to evaluate the effect of the modified procedure. METHODS: One hundred acute angle-closure glaucoma patients (100 eyes) with persistent high intraocular pressure were divided into a treatment group (45 eyes) and a control group (55 eyes). Patients in the treatment group were treated with conventional trabeculectomy, while those in the control group received modified trabeculectomy. The modifications included preoperative stellate ganglion block, topical anesthesia and local anesthesia with 20 g/L lidocaine cotton pieces, fashioning the scleral flap with a sclerotome, slow release of aqueous humor after paracentesis of the anterior chamber, and use of mydriatic and cycloplegic agents during and after surgery. RESULTS: The incidence of operative complications in the control group was lower than that in the treatment group, and the differences were statistically significant (t=9.1535, 39.8010 and 11.3219; P<0.05). CONCLUSION: Modified trabeculectomy applied in the treatment of glaucoma with persistent high intraocular pressure can not only preserve visual function to a certain extent but also reduce the incidence of serious complications. It can achieve better intraocular pressure control, shorten the average hospital stay, decrease expenses and increase patient satisfaction.

  9. Record High US Measles Cases: Patient Vaccination, Clinical Assessment and Management

    Centers for Disease Control (CDC) Podcasts

    2014-06-30

    This podcast is an overview of the Clinician Outreach and Communication Activity (COCA) Call: Record High US Measles Cases: Patient Vaccination, Clinical Assessment and Management. In May 2014, the United States recorded the largest number of reported measles cases since 1994, and the number continues to rise. Most cases reported have been acquired in the U.S. and are associated with importations from countries where measles is still common. This highly contagious, acute viral illness spreads quickly in unvaccinated populations once reaching the U.S. The recent measles outbreaks highlight the importance of maintaining high vaccination coverage in the U.S. and ensuring age-appropriate vaccination for international travelers. During this COCA call, clinicians will learn the status of measles in the U.S. and CDC vaccination recommendations and guidelines for patient assessment and management. Created: 6/30/2014 by the National Center for Immunization and Respiratory Diseases; Division of Viral Diseases; Healthcare Preparedness Activity (HPA); Office of Public Health Preparedness and Response (OPHPR). Date Released: 6/30/2014.

  10. Clinical Features of Patients with Basedow's Disease and High Serum IgG4 Levels.

    Science.gov (United States)

    Torimoto, Keiichi; Okada, Yosuke; Kurozumi, Akira; Narisawa, Manabu; Arao, Tadashi; Tanaka, Yoshiya

    2017-01-01

    Objective IgG4-related disease is a recently characterized condition presenting with high blood IgG4 levels, swelling of organs, and hypertrophic lesions. This disease is associated with thyroid disease, Hashimoto's disease, and Riedel's thyroiditis. However, there is little information on the association between IgG4-related disease and Basedow's disease. We herein defined the clinical features of patients with Basedow's disease and high IgG4 levels. Methods We compared two groups of patients with Basedow's disease (n=72) who had either normal IgG4 levels (<135 mg/dL) or high IgG4 levels (≥135 mg/dL; n=5 [6.9%], mean IgG4: 206±116 mg/dL, IgG4/IgG ratio: 10.6%±3.3%). Patients Seventy-two newly diagnosed, untreated patients with Basedow's disease. Results Compared to the normal IgG4 group, patients in the high IgG4 group were predominantly male and showed a significantly higher thyroid low-echo score (1.8±0.4 vs. 1.2±0.5) and eosinophil count (363±354/mm³ vs. 136±122/mm³). Five patients had high IgG4 levels: one had a pancreatic lesion, and four had thyroid lesions. Conclusion Patients with Basedow's disease and high IgG4 levels may represent a new subtype of Basedow's disease. Further studies with larger sample sizes are needed.

  11. The main challenges that remain in applying high-throughput sequencing to clinical diagnostics.

    Science.gov (United States)

    Loeffelholz, Michael; Fofanov, Yuriy

    2015-01-01

    Over the last 10 years, the quality, price and availability of high-throughput sequencing instruments have improved to the point that this technology may be close to becoming a routine tool in the diagnostic microbiology laboratory. Two groups of challenges, however, have to be resolved in order to move this powerful research technology into routine use in the clinical microbiology laboratory. The computational/bioinformatics challenges include data storage cost and privacy concerns, requiring analysis to be performed without access to cloud storage or expensive computational infrastructure. The logistical challenges include interpretation of complex results and acceptance and understanding of the advantages and limitations of this technology by the medical community. This article focuses on the approaches to address these challenges, such as file formats, algorithms, data collection, reporting and good laboratory practices.

  12. Clinical evaluation of high-intensity focused ultrasound in treating uterus myomas

    International Nuclear Information System (INIS)

    Peng Jingjing; Tan Yan; Wei Dong; Li Yan; Zhao Zhengguo; Gao Hui; Zhang Tao

    2010-01-01

    Objective: To explore the safety and efficacy of high-intensity focused ultrasound (HIFU) for the treatment of uterine myomas. Methods: HIFU was performed in 47 patients with symptomatic hysteromyoma, who had a childbearing history and were 26-59 years old. Postoperative follow-up was carried out. Clinical symptoms and tumor size were observed before and after the HIFU treatment and compared. Results: After HIFU treatment, symptoms such as dysmenorrhea and hypermenorrhea were markedly improved. Some patients developed hematuria or lower limb pain, which was relieved after symptomatic management. The average volume of myoma before the treatment was (47.6 ± 24.1) cm³ and it was reduced to (17.7 ± 13.1) cm³ at 6 months after the treatment; the difference was statistically significant (P < 0.05). Conclusion: HIFU is a safe and effective treatment for uterine myomas. (authors)

  13. Fractal-like Distributions over the Rational Numbers in High-throughput Biological and Clinical Data

    Science.gov (United States)

    Trifonov, Vladimir; Pasqualucci, Laura; Dalla-Favera, Riccardo; Rabadan, Raul

    2011-12-01

    Recent developments in extracting and processing biological and clinical data are allowing quantitative approaches to studying living systems. High-throughput sequencing (HTS), expression profiles, proteomics, and electronic health records (EHR) are some examples of such technologies. Extracting meaningful information from these technologies requires careful analysis of the large volumes of data they produce. In this note, we present a set of fractal-like distributions that commonly appear in the analysis of such data. The first set of examples is drawn from an HTS experiment. Here, the distributions appear as part of the evaluation of the error rate of the sequencing and the identification of tumorigenic genomic alterations. The other examples are obtained from risk factor evaluation and analysis of relative disease prevalence and co-morbidity as these appear in EHR. The distributions are also relevant to identification of subclonal populations in tumors and the study of quasi-species and intrahost diversity of viral populations.

  14. High lipoprotein(a) as a possible cause of clinical familial hypercholesterolaemia

    DEFF Research Database (Denmark)

    Langsted, Anne; Kamstrup, Pia Rørbæk; Benn, Marianne

    2016-01-01

    , and that individuals with both high lipoprotein(a) concentrations and clinical familial hypercholesterolaemia have the highest risk of myocardial infarction. METHODS: We did a prospective cohort study that included data from 46 200 individuals from the Copenhagen General Population Study who had lipoprotein...... cholesterol, mean lipoprotein(a) concentrations were 23 mg/dL in individuals unlikely to have familial hypercholesterolaemia, 32 mg/dL in those with possible familial hypercholesterolaemia, and 35 mg/dL in those with probable or definite familial hypercholesterolaemia (ptrend... LDL cholesterol for lipoprotein(a) cholesterol content the corresponding values were 24 mg/dL for individuals unlikely to have familial hypercholesterolaemia, 22 mg/dL for those with possible familial hypercholesterolaemia, and 21 mg/dL for those with probable or definite familial...

  15. Treatment of esophageal tumors using high intensity intraluminal ultrasound: first clinical results

    Directory of Open Access Journals (Sweden)

    Prat Frederic

    2008-06-01

    Full Text Available Abstract Background Esophageal tumors generally bear a poor prognosis. Radical surgery is generally the only curative method available but is not feasible in the majority of patients; palliative therapy with stent placement is generally performed. It has been demonstrated that high intensity ultrasound can induce rapid, complete and well-defined coagulation necrosis. Thus, for the treatment of esophageal tumors, we have designed an ultrasound applicator that uses an intraluminal approach to fill this therapeutic gap. Methods Thermal ablation is performed with water-cooled ultrasound transducers operating at a frequency of 10 MHz. Single lesions extend from the transducer surface up to 10 mm in depth when applying an intensity of 14 W/cm² for 10 s. A lumen inside the therapy applicator provides a path for an endoscopic ultrasound imaging probe operating at a frequency of 12 MHz. The mechanical rotation of the applicator around its axis enables treatment of sectorial or cylindrical volumes. This method is thus particularly suitable for esophageal tumors that may develop only on a portion of the esophageal circumference. Previous experiments progressed from bench studies to in vivo studies on pig esophagi. Results Here we report clinical results obtained on four patients included in a pilot study. The treatment of esophageal tumors was performed under fluoroscopic guidance and ultrasound imaging. Objective tumor response was obtained in all cases and complete necrosis of a tumor was obtained in one case. All patients recovered uneventfully and dysphagia improved significantly within 15 days, allowing a solid diet to be resumed in three cases. Conclusion This clinical work demonstrated the efficacy of intraluminal high intensity ultrasound therapy for local tumor destruction in the esophagus.

  16. Pretreatment data is highly predictive of liver chemistry signals in clinical trials

    Directory of Open Access Journals (Sweden)

    Cai Z

    2012-11-01

    Full Text Available Zhaohui Cai,1,* Anders Bresell,2,* Mark H Steinberg,1 Debra G Silberg,1 Stephen T Furlong1; 1AstraZeneca Pharmaceuticals, Wilmington, DE, USA; 2AstraZeneca Pharmaceuticals, Södertälje, Sweden. *These authors contributed equally to this work. Purpose: The goal of this retrospective analysis was to assess how well predictive models could determine which patients would develop liver chemistry signals during clinical trials based on their pretreatment (baseline) information. Patients and methods: Based on data from 24 late-stage clinical trials, classification models were developed to predict liver chemistry outcomes using baseline information, which included demographics, medical history, concomitant medications, and baseline laboratory results. Results: Predictive models using baseline data predicted which patients would develop liver signals during the trials with average validation accuracy around 80%. Baseline levels of individual liver chemistry tests were most important for predicting their own elevations during the trials. High bilirubin levels at baseline were not uncommon and were associated with a high risk of developing biochemical Hy’s law cases. Baseline γ-glutamyltransferase (GGT) level appeared to have some predictive value, but did not increase predictability beyond using established liver chemistry tests. Conclusion: It is possible to predict which patients are at a higher risk of developing liver chemistry signals using pretreatment (baseline) data. Derived knowledge from such predictions may allow proactive and targeted risk management, and the type of analysis described here could help determine whether new biomarkers offer improved performance over established ones. Keywords: bilirubin, Hy’s Law, ALT, GGT, baseline, prediction
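
    As a rough illustration of this kind of analysis (not the authors' actual models, whose algorithms are not specified in the abstract), a baseline-only classifier with cross-validated accuracy and feature importances can be sketched as follows; all data below are synthetic placeholders:

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        # Placeholder baseline features: demographics, history flags, baseline
        # ALT/AST/bilirubin/GGT values, etc., one row per patient.
        X = rng.normal(size=(1000, 12))
        # Synthetic "developed a liver chemistry signal" label, driven by two
        # of the baseline columns to mimic the paper's main finding.
        y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(size=1000)) > 1.5

        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        acc = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
        print("cross-validated accuracy: %.2f +/- %.2f" % (acc.mean(), acc.std()))

        # Feature importances indicate which baseline variables drive the
        # prediction, analogous to a test's own baseline level best predicting
        # its elevation on treatment.
        clf.fit(X, y)
        print(np.argsort(clf.feature_importances_)[::-1][:3])  # top-3 features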

  17. Economic impact of and satisfaction with a high resolution thyroid nodule clinic at the endocrinology department.

    Science.gov (United States)

    Díaz-Soto, Gonzalo; Torres, Beatriz; López Gómez, Juan Jose; Gómez Hoyos, Emilia; Villar, Aurelia; Romero, Enrique; de Luis, Daniel A

    2016-10-01

    No conclusive data exist on the value of a high resolution thyroid nodule clinic for management of nodular thyroid disease. The aim of this study was to evaluate the economic impact of and user satisfaction with a high resolution thyroid nodule clinic (HRTNC) in coordination with primary care. A prospective, observational, descriptive study was conducted to analyze data from 3,726 patients (mean age 61±12 years; 85% women) evaluated at an HRTNC during 2014 and 2015. Demographic data (sex and age), number of ultrasound examinations and fine needle aspiration cytologies (FNAC), referral center and consultation type were assessed. In 2014 and 2015, 3,726 neck ultrasound examinations and 926 FNACs (3.8% rated as non-diagnostic) were performed. Among the 1,227 patients evaluated for the first time, 21.5% did not require a second endocrine appointment, which resulted in mean estimated savings of 14,354.55 euros. Of all patients, 41.1% were referred from primary care, 33.4% from endocrinology, and 26.5% from other specialties. As compared to 2013, the number of thyroid ultrasound examinations requested decreased by 65.3% and 59.7% in 2014 and 2015 respectively, with mean estimated savings of 137,563.92 euros. Mean user satisfaction was 4.0 points (95% confidence interval, 3.7-4.3) on a 5-point scale. HRTNCs at endocrinology departments, coordinated with primary care, are a viable, cost-effective alternative with a positive user perception. Copyright © 2016 SEEN. Publicado por Elsevier España, S.L.U. All rights reserved.

  18. High SRPX2 protein expression predicts unfavorable clinical outcome in patients with prostate cancer

    Science.gov (United States)

    Zhang, Meng; Li, Xiaoli; Fan, Zhirui; Zhao, Jing; Liu, Shuzheng; Zhang, Mingzhi; Li, Huixiang; Goscinski, Mariusz Adam; Fan, Huijie; Suo, Zhenhe

    2018-01-01

    Background Sushi repeat-containing protein X-linked 2 (SRPX2) is overexpressed in a variety of different tumor tissues and correlated with poor prognosis in patients. Little research has focused on the role of SRPX2 expression in prostate cancer (PCa), and the clinicopathological significance of the protein's expression in this tumor is relatively unknown. However, our previous transcriptome data from cancer stem-like cells indicated a role for SRPX2 in PCa. Materials and methods In this study, RT-PCR and Western blotting were first used to examine SRPX2 expression in three PCa cell lines, LNCaP, DU145, and PC3, and then SRPX2 protein expression was immunohistochemically investigated and statistically analyzed in a series of 106 paraffin-embedded PCa tissue specimens. Results Significantly lower levels of SRPX2 expression were verified in the LNCaP cells, compared with the expression in the aggressive DU145 and PC3 cells, at both the mRNA and protein levels. Immunohistochemically, SRPX2 protein expression was variable in the clinical samples. Moreover, high levels of SRPX2 expression in the PCa tissues were significantly associated with Gleason score (P=0.008), lymph node metastasis (P=0.009), and distant metastasis (P=0.021). Furthermore, higher levels of SRPX2 expression in the PCa tissues were significantly associated with shorter overall survival (OS) (P<0.001). Conclusion Our results demonstrate that SRPX2 is highly expressed in aggressive PCa cells in vitro, and that its protein expression in PCa is significantly associated with malignant clinical features and shorter OS, strongly indicating its prognostic value in prostate cancers. PMID:29881288

  19. A clinical comparison of high dose and low dose of Suxamethonium

    Directory of Open Access Journals (Sweden)

    RK Yadav

    2014-01-01

    Full Text Available Background: Suxamethonium's rapid onset and short duration of action make it unique among the neuromuscular blocking drugs described so far. However, use of suxamethonium is associated with a large number of undesirable side effects. Objective: To evaluate the clinical effects of high and low doses of suxamethonium and to determine whether a lower dose of suxamethonium can be used for any beneficial effect in terms of its various adverse effects, e.g. cardiovascular responses, post-operative muscle pains and intraocular pressure. Methods: A total of 100 patients were included in this prospective study. All these patients on preoperative clinical evaluation were assessed to have an adequate airway. The patients were randomly divided into two groups of 50 each: a low-dose group (group I) and a high-dose group (group II). A standard anesthetic technique was adhered to for all patients, and the following parameters were observed on a comparative basis: (a) fasciculation and post-operative myalgia; (b) cardiovascular effects; (c) intraocular pressure. Observation: The incidence of post-suxamethonium pain was significantly greater in group II. The increase in heart rate from baseline was significant in both groups. There was no significant difference between the two groups in diastolic pressure, but the rise in systolic blood pressure from control was statistically significant at all assessment times in both groups. Conclusion: Suxamethonium can be used in lower doses (0.5 mg/kg) in elective cases without airway compromise. It offers the benefits of reduced muscle pain, attenuated cardiovascular responses and a smaller rise in intraocular pressure. Journal of College of Medical Sciences-Nepal, 2013, Vol-9, No-2, 1-8 DOI: http://dx.doi.org/10.3126/jcmsn.v9i2.9677

  20. Longer duration of homelessness is associated with a lower likelihood of non-detectable plasma HIV-1 RNA viral load among people who use illicit drugs in a Canadian setting.

    Science.gov (United States)

    Loh, Jane; Kennedy, Mary Clare; Wood, Evan; Kerr, Thomas; Marshall, Brandon; Parashar, Surita; Montaner, Julio; Milloy, M-J

    2016-11-01

    Homelessness is common among people who use drugs (PWUD) and, for those living with HIV/AIDS, an important contributor to sub-optimal HIV treatment outcomes. This study aims to investigate the relationship between the duration of homelessness and the likelihood of plasma HIV-1 RNA viral load (VL) non-detectability among a cohort of HIV-positive PWUD. We used data from the ACCESS study, a long-running prospective cohort study of HIV-positive PWUD linked to comprehensive HIV clinical records including systematic plasma HIV-1 RNA VL monitoring. We estimated the longitudinal relationship between the duration of homelessness and the likelihood of exhibiting a non-detectable VL (i.e., below the assay detection limit) using mixed-effects modelling. Between May 1996 and June 2014, 922 highly active antiretroviral therapy-exposed participants were recruited and contributed 8188 observations. Of these, 4800 (59%) were characterized by non-detectable VL. Participants reported they were homeless in 910 (11%) interviews (median: six months, interquartile range: 6-12 months). A longer duration of homelessness was associated with lower odds of VL non-detectability (adjusted odds ratio = 0.71 per six-month period of homelessness, 95% confidence interval: 0.60-0.83) after adjustment for age, ancestry, drug use patterns, engagement in addiction treatment, and other potential confounders. Longer durations of episodes of homelessness in this cohort of HIV-positive illicit drug users were associated with a lower likelihood of plasma VL non-detectability. Our findings suggest that interventions that seek to promptly house homeless individuals, such as Housing First approaches, might assist in maximizing the clinical and public health benefits of antiretroviral therapy among people living with HIV/AIDS.

  1. Robot-assisted biopsies in a high-field MRI system. First clinical results

    International Nuclear Information System (INIS)

    Schell, B.; Eichler, K.; Mack, M.G.; Mueller, C.; Kerl, J.M.; Beeres, M.; Thalhammer, A.; Vogl, T.J.; Zangos, S.; Czerny, C.

    2012-01-01

    Purpose: The purpose of this study was to examine the clinical use of MR-guided biopsies in patients with suspicious lesions using a new MR-compatible assistance system in a high-field MR system. Materials and Methods: Six patients with suspicious focal lesions in various anatomic regions underwent percutaneous biopsy in a high-field MR system (1.5 T, Magnetom Espree, Siemens) using a new MR-compatible assistance system (Innomotion). The procedures were planned and guided using T1-weighted FLASH and TrueFISP sequences. A servopneumatic drive then moved the guiding arm automatically to the insertion point. An MRI-compatible 15G biopsy system (Somatex) was introduced by a physician guided by the needle holder, and multiple biopsies were performed using the coaxial technique. The feasibility, duration of the intervention and biopsy findings were analyzed. Results: The proposed new system allows accurate punctures in a high-field MR system. The assistance device did not interfere with the image quality and guided the needle virtually exactly as planned. Histological examination could be conducted for every patient. The lesion was malignant in four cases, and an infectious etiology was diagnosed for the two remaining lesions. TrueFISP images were preferable for differentiating anatomical and pathological structures and for monitoring the position of the insertion needle. The average intervention time was 41 minutes. Lesions up to 15.4 cm beneath the skin surface were punctured. Conclusion: The proposed MR-guided assistance system can be successfully utilized in a high-field MR system for accurate punctures of even deep lesions in various anatomic regions. (orig.)

  2. High-throughput genome sequencing of two Listeria monocytogenes clinical isolates during a large foodborne outbreak

    Directory of Open Access Journals (Sweden)

    Trout-Yakel Keri M

    2010-02-01

    Full Text Available Abstract Background A large, multi-province outbreak of listeriosis associated with ready-to-eat meat products contaminated with Listeria monocytogenes serotype 1/2a occurred in Canada in 2008. Subtyping of outbreak-associated isolates using pulsed-field gel electrophoresis (PFGE) revealed two similar but distinct AscI PFGE patterns. High-throughput pyrosequencing of two L. monocytogenes isolates was used to rapidly provide the genome sequence of the primary outbreak strain and to investigate the extent of genetic diversity associated with a change of a single restriction enzyme fragment during PFGE. Results The chromosomes were collinear, but differences included 28 single nucleotide polymorphisms (SNPs) and three indels, including a 33 kbp prophage that accounted for the observed difference in AscI PFGE patterns. The distribution of these traits was assessed within further clinical, environmental and food isolates associated with the outbreak, and this comparison indicated that three distinct, but highly related, strains may have been involved in this nationwide outbreak. Notably, these two isolates were found to harbor a 50 kbp putative mobile genomic island encoding translocation and efflux functions that has not been observed in other Listeria genomes. Conclusions High-throughput genome sequencing provided a more detailed real-time assessment of genetic traits characteristic of the outbreak strains than could be achieved with routine subtyping methods. This study confirms that the latest generation of DNA sequencing technologies can be applied during high priority public health events, and laboratories need to prepare for this inevitability and assess how to properly analyze and interpret whole genome sequences in the context of molecular epidemiology.

  3. Effect of vaccination against sub-clinical Porcine Circovirus type 2 infection in a high-health finishing pig herd

    DEFF Research Database (Denmark)

    Nielsen, Gitte Blach; Nielsen, Jens Peter; Haugegaard, John

    2017-01-01

    During the last decade, the clinical manifestation of Porcine Circovirus type 2 (PCV2) infections has mostly changed from postweaning multisystemic wasting syndrome and high mortality to sub-clinical infections manifested only through impaired production parameters. However, co-infection with other...... respiratory pathogens often results in a larger effect on production, sometimes with clinical signs. Little is known about the impact of a moderate level PCV2 infection without co-infections, particularly in terms of feed conversion ratio and antimicrobial consumption. The purpose of the study was to evaluate...... the effect of vaccination against PCV2 in a sub-clinically infected, high-health finishing herd in terms of viral load in serum, feed conversion ratio and antimicrobial treatments. The study was conducted as a randomised clinical field trial with a parallel group design. Vaccination against PCV2...

  4. Bias Correction for the Maximum Likelihood Estimate of Ability. Research Report. ETS RR-05-15

    Science.gov (United States)

    Zhang, Jinming

    2005-01-01

    Lord's bias function and the weighted likelihood estimation method are effective in reducing the bias of the maximum likelihood estimate of an examinee's ability under the assumption that the true item parameters are known. This paper presents simulation studies to determine the effectiveness of these two methods in reducing the bias when the item…
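
    The bias at issue is easy to exhibit by simulation: under a Rasch model with known item difficulties, the maximum likelihood estimate (MLE) of ability is biased away from zero, which is what Lord's bias function and the weighted likelihood estimator are designed to correct. A hedged sketch (an illustrative setup, not the paper's simulation design):

        import numpy as np

        rng = np.random.default_rng(1)
        b = np.linspace(-2, 2, 20)      # known Rasch item difficulties
        theta_true = 1.0

        def mle_theta(resp, b, iters=30):
            # Newton-Raphson MLE of ability for one response vector.
            t = 0.0
            for _ in range(iters):
                p = 1 / (1 + np.exp(-(t - b)))   # P(correct) per item
                grad = np.sum(resp - p)          # score function
                hess = -np.sum(p * (1 - p))      # information (expected = observed here)
                t -= grad / hess
            return t

        est = []
        for _ in range(2000):
            resp = (rng.random(b.size) < 1 / (1 + np.exp(-(theta_true - b)))).astype(float)
            if 0 < resp.sum() < b.size:          # MLE is finite only for mixed patterns
                est.append(mle_theta(resp, b))
        print("empirical bias of the MLE:", np.mean(est) - theta_true)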

  5. Analyzing multivariate survival data using composite likelihood and flexible parametric modeling of the hazard functions

    DEFF Research Database (Denmark)

    Nielsen, Jan; Parner, Erik

    2010-01-01

    In this paper, we model multivariate time-to-event data by composite likelihood of pairwise frailty likelihoods and marginal hazards using natural cubic splines. Both right- and interval-censored data are considered. The suggested approach is applied on two types of family studies using the gamma...

  6. Existence and uniqueness of the maximum likelihood estimator for models with a Kronecker product covariance structure

    NARCIS (Netherlands)

    Ros, B.P.; Bijma, F.; de Munck, J.C.; de Gunst, M.C.M.

    2016-01-01

    This paper deals with multivariate Gaussian models for which the covariance matrix is a Kronecker product of two matrices. We consider maximum likelihood estimation of the model parameters, in particular of the covariance matrix. There is no explicit expression for the maximum likelihood estimator
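
    One common way to compute such an estimator numerically is the iterative "flip-flop" scheme, which alternates closed-form updates of the two Kronecker factors; note the factors are identified only up to a scalar. A minimal sketch (a generic textbook recipe, offered for intuition rather than as the construction analyzed by these authors):

        import numpy as np

        def flip_flop(X, iters=50):
            # X: shape (n, p, q); each observation is a p x q matrix whose
            # vectorization is modeled as Gaussian with a Kronecker-product
            # covariance built from a p x p row factor B and a q x q column factor A.
            n, p, q = X.shape
            A, B = np.eye(q), np.eye(p)
            for _ in range(iters):
                Binv = np.linalg.inv(B)
                A = sum(Xi.T @ Binv @ Xi for Xi in X) / (n * p)
                Ainv = np.linalg.inv(A)
                B = sum(Xi @ Ainv @ Xi.T for Xi in X) / (n * q)
            return A, B   # (c*A, B/c) yields the same Kronecker product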

  7. Likelihood functions for the analysis of single-molecule binned photon sequences

    Energy Technology Data Exchange (ETDEWEB)

    Gopich, Irina V., E-mail: irinag@niddk.nih.gov [Laboratory of Chemical Physics, National Institute of Diabetes and Digestive and Kidney Diseases, National Institutes of Health, Bethesda, MD 20892 (United States)

    2012-03-02

    Graphical abstract: Folding of a protein with attached fluorescent dyes, the underlying conformational trajectory of interest, and the observed binned photon trajectory. Highlights: • A sequence of photon counts can be analyzed using a likelihood function. • The exact likelihood function for a two-state kinetic model is provided. • Several approximations are considered for an arbitrary kinetic model. • Improved likelihood functions are obtained to treat sequences of FRET efficiencies. - Abstract: We consider the analysis of a class of experiments in which the number of photons in consecutive time intervals is recorded. Sequences of photon counts or, alternatively, of FRET efficiencies can be studied using likelihood-based methods. For a kinetic model of the conformational dynamics and state-dependent Poisson photon statistics, the formalism to calculate the exact likelihood that this model describes such sequences of photons or FRET efficiencies is developed. Explicit analytic expressions for the likelihood function for a two-state kinetic model are provided. The important special case when conformational dynamics are so slow that at most a single transition occurs in a time bin is considered. By making a series of approximations, we eventually recover the likelihood function used in hidden Markov models. In this way, not only is insight gained into the range of validity of this procedure, but also an improved likelihood function can be obtained.
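
    In the slow-dynamics limit mentioned above, the likelihood of a binned photon trajectory reduces to a hidden-Markov-style forward recursion with Poisson emissions. A sketch of that approximate two-state likelihood (for intuition only; the exact likelihood in the paper also handles transitions within a bin):

        import numpy as np
        from scipy.linalg import expm
        from scipy.stats import poisson

        def log_likelihood(counts, k12, k21, rate1, rate2, dt):
            # counts: photon counts per bin; k12/k21: interconversion rates;
            # rate1/rate2: per-state photon count rates; dt: bin time.
            K = np.array([[-k12,  k21],
                          [ k12, -k21]])            # rate matrix (columns sum to 0)
            T = expm(K * dt)                        # bin-to-bin transition matrix
            p = np.array([k21, k12]) / (k12 + k21)  # equilibrium state populations
            loglik = 0.0
            for n in counts:
                e = poisson.pmf(n, np.array([rate1, rate2]) * dt)  # per-state emission
                p = T @ (e * p)                     # emit in current bin, then evolve
                s = p.sum()
                loglik += np.log(s)
                p /= s                              # renormalize to avoid underflow
            return loglik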

  8. Use of deterministic sampling for exploring likelihoods in linkage analysis for quantitative traits.

    NARCIS (Netherlands)

    Mackinnon, M.J.; Beek, van der S.; Kinghorn, B.P.

    1996-01-01

    Deterministic sampling was used to numerically evaluate the expected log-likelihood surfaces of QTL-marker linkage models in large pedigrees with simple structures. By calculating the expected values of likelihoods, questions of power of experimental designs, bias in parameter estimates, approximate

  9. Likelihood ratio data to report the validation of a forensic fingerprint evaluation method

    NARCIS (Netherlands)

    Ramos, Daniel; Haraksim, Rudolf; Meuwly, Didier

    2017-01-01

    Data to which the authors refer throughout this article are likelihood ratios (LR) computed from the comparison of 5–12 minutiae fingermarks with fingerprints. These LR data are used for the validation of a likelihood ratio (LR) method in forensic evidence evaluation. These data present a

  10. A guideline for the validation of likelihood ratio methods used for forensic evidence evaluation

    NARCIS (Netherlands)

    Meuwly, Didier; Ramos, Daniel; Haraksim, Rudolf

    2017-01-01

    This Guideline proposes a protocol for the validation of forensic evaluation methods at the source level, using the Likelihood Ratio framework as defined within the Bayes' inference model. In the context of the inference of identity of source, the Likelihood Ratio is used to evaluate the strength of the evidence.

  11. Predictors of Self-Reported Likelihood of Working with Older Adults

    Science.gov (United States)

    Eshbaugh, Elaine M.; Gross, Patricia E.; Satrom, Tatum

    2010-01-01

    This study examined the self-reported likelihood of working with older adults in a future career among 237 college undergraduates at a midsized Midwestern university. Although aging anxiety was not significantly related to likelihood of working with older adults, those students who had a greater level of death anxiety were less likely than other…

  12. Organizational Justice and Men's Likelihood to Sexually Harass: The Moderating Role of Sexism and Personality

    Science.gov (United States)

    Krings, Franciska; Facchin, Stephanie

    2009-01-01

    This study demonstrated relations between men's perceptions of organizational justice and increased sexual harassment proclivities. Respondents reported higher likelihood to sexually harass under conditions of low interactional justice, suggesting that sexual harassment likelihood may increase as a response to perceived injustice. Moreover, the…

  13. Sampling variability in forensic likelihood-ratio computation: A simulation study

    NARCIS (Netherlands)

    Ali, Tauseef; Spreeuwers, Lieuwe Jan; Veldhuis, Raymond N.J.; Meuwly, Didier

    2015-01-01

    Recently, in the forensic biometric community, there has been growing interest in computing a metric called “likelihood ratio” when a pair of biometric specimens is compared using a biometric recognition system. Generally, a biometric recognition system outputs a score, and a likelihood ratio therefore has to be computed from that score.
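
    A concrete way to see both the score-to-LR step and the sampling variability the title refers to is to model the same-source and different-source score distributions and bootstrap the training scores. A toy sketch with synthetic scores (the actual study concerns biometric systems and more careful density models):

        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(2)
        same_source = rng.normal(7.0, 1.0, 300)    # scores of mated pairs (synthetic)
        diff_source = rng.normal(3.0, 1.5, 3000)   # scores of non-mated pairs (synthetic)

        def score_to_lr(score, ss, ds):
            # Score-based LR: density of the observed score under the same-source
            # model divided by its density under the different-source model.
            return float(gaussian_kde(ss)(score)[0] / gaussian_kde(ds)(score)[0])

        print("LR at score 5.5:", score_to_lr(5.5, same_source, diff_source))

        # Sampling variability: resample the training scores and watch the LR spread.
        lrs = [score_to_lr(5.5, rng.choice(same_source, 300), rng.choice(diff_source, 3000))
               for _ in range(200)]
        print("bootstrap 5th-95th percentile:", np.percentile(lrs, [5, 95]))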

  14. Biomedicinal implications of high-density lipoprotein: its composition, structure, functions, and clinical applications.

    Science.gov (United States)

    Cho, Kyung-Hyun

    2009-07-31

    High-density lipoprotein (HDL) is a proven biomarker for the monitoring of changes in the antioxidant and anti-inflammatory capability of body fluids. The beneficial properties of HDL are highly dependent on its lipid and protein compositions and their ratios. In the normal state, the HDL particle is enriched with lipids and several HDL-associated enzymes, which are responsible for its antioxidant activity. Lower HDL-cholesterol levels (<40 mg/dL) have been recognized as an independent risk factor for coronary artery disease, as well as being a known component of metabolic syndrome. Functional and structural changes of HDL have been recognized as factors pivotal to the evaluation of HDL quality. In this review, I have elected to focus on the functional and structural correlations of HDL and the roles of HDL-associated apolipoproteins and enzymes. Recent clinical applications of HDL have also been reviewed, particularly the therapeutic targeting of HDL metabolism and reconstituted HDL; these techniques represent promising emerging strategies for the treatment of cardiovascular disease, for drug or gene therapy.

  15. Clinical diagnostic of pleural effusions using a high-speed viscosity measurement method

    Science.gov (United States)

    Hurth, Cedric; Klein, Katherine; van Nimwegen, Lena; Korn, Ronald; Vijayaraghavan, Krishnaswami; Zenhausern, Frederic

    2011-08-01

    We present a novel bio-analytical method to discriminate between transudative and exudative pleural effusions, based on high-speed video analysis of a solid glass sphere impacting a liquid. Since the result depends on the solution viscosity, it can ultimately replace the battery of biochemical assays currently used. We present results obtained on a series of 7 pleural effusions from consenting patients by analyzing both the splash observed after the glass impactor hits the liquid surface and the sphere's motion in a configuration reminiscent of the drop-ball viscometer, with added sensitivity and throughput provided by the high-speed camera. The results demonstrate distinction between the pleural effusions and good correlation with the fluid chemistry analysis to accurately differentiate exudates and transudates for clinical purposes. The exudative effusions displayed a viscosity around 1.39 ± 0.08 cP, whereas the transudative effusion was measured at 0.89 ± 0.09 cP, in good agreement with previous reports.
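
    In the falling-sphere configuration, viscosity follows from Stokes' drag once the sphere's terminal velocity is read off the high-speed video: eta = 2 r^2 (rho_sphere - rho_fluid) g / (9 v). A sketch with illustrative numbers (not values from the paper):

        def viscosity_from_drop(radius_m, rho_sphere, rho_fluid, velocity_m_s, g=9.81):
            # Dynamic viscosity (Pa*s) of the fluid from a falling-sphere measurement;
            # velocity is the terminal speed extracted as displacement per video frame.
            return 2 * radius_m**2 * (rho_sphere - rho_fluid) * g / (9 * velocity_m_s)

        # Hypothetical: a 0.5 mm glass sphere (2500 kg/m^3) falling at 0.34 m/s
        # through an effusion of density 1010 kg/m^3.
        eta = viscosity_from_drop(0.5e-3, 2500.0, 1010.0, 0.34)
        print("eta = %.2f mPa*s (= cP)" % (eta * 1e3))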

  16. 'Sink or swim': an evaluation of the clinical characteristics of individuals with high bone mass.

    LENUS (Irish Health Repository)

    Gregson, C L

    2011-04-01

    High bone mineral density on routine dual energy X-ray absorptiometry (DXA) may indicate an underlying skeletal dysplasia. Two hundred fifty-eight individuals with unexplained high bone mass (HBM), 236 relatives (41% with HBM) and 58 spouses were studied. Cases could not float, and had mandible enlargement, extra bone, broad frames, larger shoe sizes and increased body mass index (BMI). HBM cases may harbour an underlying genetic disorder. INTRODUCTION: High bone mineral density is a sporadic incidental finding on routine DXA scanning of apparently asymptomatic individuals. Such individuals may have an underlying skeletal dysplasia, as seen in LRP5 mutations. We aimed to characterize unexplained HBM and determine the potential for an underlying skeletal dysplasia. METHODS: Two hundred fifty-eight individuals with unexplained HBM (defined as L1 Z-score ≥ +3.2 plus total hip Z-score ≥ +1.2, or total hip Z-score ≥ +3.2) were recruited from 15 UK centres, by screening 335,115 DXA scans. Unexplained HBM affected 0.181% of DXA scans. A further 236 relatives were recruited, of whom 94 (41%) had HBM (defined as L1 Z-score + total hip Z-score ≥ +3.2). Fifty-eight spouses were also recruited, together with the unaffected relatives, as controls. Phenotypes of cases and controls, obtained from clinical assessment, were compared using random-effects linear and logistic regression models, clustered by family, adjusted for confounders, including age and sex. RESULTS: Individuals with unexplained HBM had an excess of sinking when swimming (7.11 [3.65, 13.84], p < 0.001; adjusted odds ratio with 95% confidence interval shown), mandible enlargement (4.16 [2.34, 7.39], p < 0.001), extra bone at tendon/ligament insertions (2.07 [1.13, 3.78], p = 0.018) and broad frame (3.55 [2.12, 5.95], p < 0.001). HBM cases also had a larger shoe size (mean difference 0.4 [0.1, 0.7] UK sizes, p = 0.009) and increased BMI (mean difference 2.2 [1.3, 3.1] kg/m²).

  17. Statistical modelling of survival data with random effects h-likelihood approach

    CERN Document Server

    Ha, Il Do; Lee, Youngjo

    2017-01-01

    This book provides a groundbreaking introduction to likelihood inference for correlated survival data via the hierarchical (or h-) likelihood, in order to obtain the (marginal) likelihood and to address the computational difficulties in inferences and extensions. The approach presented in the book overcomes shortcomings in the traditional likelihood-based methods for clustered survival data, such as intractable integration. The text includes technical materials such as derivations and proofs in each chapter, as well as recently developed software programs in R (“frailtyHL”), while the real-world data examples together with an R package, “frailtyHL” in CRAN, provide readers with useful hands-on tools. Reviewing new developments since the introduction of the h-likelihood to survival analysis (methods for interval estimation of the individual frailty and for variable selection of the fixed effects in the general class of frailty models) and guiding future directions, the book is of interest to researchers.

  18. Adolescent HIV Prevention: An Application of the Elaboration Likelihood Model.

    Science.gov (United States)

    Metzler, April E.; Weiskotten, David; Morgen, Keith J.

    Ninth grade students (n=298) participated in a study to examine the influence of source credibility, message quality, and personal relevance on HIV prevention message efficacy. A pilot study with adolescent focus groups created the high- and low-quality messages, as well as the high-credibility (HIV+) and low-credibility (worried parent) sources. Participants…

  19. Magnitude of effects in clinical trials published in high-impact general medical journals.

    Science.gov (United States)

    Siontis, Konstantinos C M; Evangelou, Evangelos; Ioannidis, John P A

    2011-10-01

    Prestigious journals select for publication studies that are considered most important and informative. We aimed to examine whether high-impact general (HIG) medical journals systematically demonstrate more favourable results for experimental interventions compared with the rest of the literature. We scrutinized systematic reviews of the Cochrane Database (Issue 4, 2009) and meta-analyses published in four general journals (2008-09). Eligible articles included ≥1 binary outcome meta-analysis(es) pertaining to effectiveness with ≥1 clinical trial(s) published in NEJM, JAMA or Lancet. Effect sizes in trials from NEJM, JAMA or Lancet were compared with those from other trials in the same meta-analyses by deriving summary relative odds ratios (sRORs). Additional analyses examined separately early- and late-published trials in HIG journals and journal-specific effects. A total of 79 meta-analyses including 1043 clinical trials were analysed. Trials in HIG journals had similar effects to trials in other journals, when there was large-scale evidence, but showed more favourable results for experimental interventions when they were small. When HIG trials had less than 40 events, the sROR was 1.64 [95% confidence interval (95% CI): 1.23-2.18). The difference was most prominent when small early trials published in HIG journals were compared with subsequent trials [sROR 2.68 (95% CI: 1.33-5.38)]. Late-published HIG trials showed no consistent inflation of effects. The patterns did not differ beyond chance between NEJM, JAMA or Lancet. Small trials published in the most prestigious journals show more favourable effects for experimental interventions, and this is most prominent for early-published trials in such journals. No effect inflation is seen for large trials.
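
    The headline statistic here, a summary relative odds ratio, is a contrast of log odds ratios between trial strata within the same meta-analysis, pooled by inverse-variance weighting. A toy version for a single meta-analysis with made-up 2x2 tables (not data from the paper):

        import numpy as np

        def log_or_and_var(a, b, c, d):
            # 2x2 table: events/non-events in the experimental arm (a, b)
            # and in the control arm (c, d).
            return np.log(a * d / (b * c)), 1/a + 1/b + 1/c + 1/d

        # One small HIG-journal trial vs one larger trial of the same question.
        lor_hig, v_hig = log_or_and_var(12, 88, 20, 80)
        lor_oth, v_oth = log_or_and_var(150, 850, 170, 830)

        log_ror = lor_hig - lor_oth           # within-meta-analysis contrast
        se = np.sqrt(v_hig + v_oth)
        ror, lo, hi = np.exp([log_ror, log_ror - 1.96 * se, log_ror + 1.96 * se])
        print("ROR = %.2f (95%% CI %.2f to %.2f)" % (ror, lo, hi))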

  20. Clinical significance of high-sensitivity C- reactive protein in systemic sclerosis

    Directory of Open Access Journals (Sweden)

    T. A. Nevskaya

    2007-01-01

    Full Text Available Objective. To determine the diagnostic and prognostic significance of high-sensitivity C-reactive protein (hsCRP) in systemic sclerosis (SS), to define the relationship of this factor with activity of the disease and cardiovascular pathology, and to assess the role of pro-inflammatory cytokines in the induction of acute phase protein synthesis in SS. Material and methods. Serum levels of hsCRP, interleukin (IL)-6, IL-1β, IL-1ra, IL-10 and sIL-2r were evaluated by enzyme immunoassay (EIA) in 40 pts with SS and 24 subjects of a control group. The relationship with clinical features of the disease, endothelial dysfunction, changes in capillary structure, sub-clinical atherosclerosis, total coronary risk and some traditional cardiovascular risk factors was analyzed. Instrumental assessment included nailfold capillaroscopy, sonographic duplex examination of the carotids, and brachial artery sonographic examination. HsCRP prognostic significance was assessed in 51 pts with SS. Results. Elevated levels of hsCRP were found in 32% of SS pts and correlated with activity and severity of the disease, HAQ and SHAQ. Direct correlation of hsCRP with skin fibrosis distribution, interstitial lung disease, arthritis, and laboratory indices of SS activity (ESR, sIL-2r and Scl-70) was revealed. HsCRP concentration in SS did not depend on the character and intensity of cardiovascular pathology, subclinical atherosclerosis, endothelial dysfunction or pro-inflammatory cytokine production. Conclusion. HsCRP in SS reflects the intensity of the immuno-inflammatory process, correlates with T-cell activation markers and can be used as an index of disease activity and of the severity of skin and lung fibrosis.

  1. CNV Workshop: an integrated platform for high-throughput copy number variation discovery and clinical diagnostics.

    Science.gov (United States)

    Gai, Xiaowu; Perin, Juan C; Murphy, Kevin; O'Hara, Ryan; D'arcy, Monica; Wenocur, Adam; Xie, Hongbo M; Rappaport, Eric F; Shaikh, Tamim H; White, Peter S

    2010-02-04

    Recent studies have shown that copy number variations (CNVs) are frequent in higher eukaryotes and associated with a substantial portion of inherited and acquired risk for various human diseases. The increasing availability of high-resolution genome surveillance platforms provides opportunity for rapidly assessing research and clinical samples for CNV content, as well as for determining the potential pathogenicity of identified variants. However, few informatics tools for accurate and efficient CNV detection and assessment currently exist. We developed a suite of software tools and resources (CNV Workshop) for automated, genome-wide CNV detection from a variety of SNP array platforms. CNV Workshop includes three major components: detection, annotation, and presentation of structural variants from genome array data. CNV detection utilizes a robust and genotype-specific extension of the Circular Binary Segmentation algorithm, and the use of additional detection algorithms is supported. Predicted CNVs are captured in a MySQL database that supports cohort-based projects and incorporates a secure user authentication layer and user/admin roles. To assist with determination of pathogenicity, detected CNVs are also annotated automatically for gene content, known disease loci, and gene-based literature references. Results are easily queried, sorted, filtered, and visualized via a web-based presentation layer that includes a GBrowse-based graphical representation of CNV content and relevant public data, integration with the UCSC Genome Browser, and tabular displays of genomic attributes for each CNV. To our knowledge, CNV Workshop represents the first cohesive and convenient platform for detection, annotation, and assessment of the biological and clinical significance of structural variants. CNV Workshop has been successfully utilized for assessment of genomic variation in healthy individuals and disease cohorts and is an ideal platform for coordinating multiple associated
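
    For intuition about the change-point idea underlying such detection, a deliberately simplified recursive segmentation of log2-ratio data can be sketched as below. This is a stand-in for illustration only; the platform's genotype-specific extension of Circular Binary Segmentation works on circularized segments and judges splits by permutation testing:

        import numpy as np

        def binary_segment(x, lo=0, hi=None, min_t=4.0, out=None):
            # Recursively split x[lo:hi] at the strongest mean shift;
            # emit (start, end, segment mean) when no convincing split remains.
            if out is None:
                out = []
            if hi is None:
                hi = len(x)
            n = hi - lo
            if n < 10:
                out.append((lo, hi, float(np.mean(x[lo:hi]))))
                return out
            seg = x[lo:hi]
            best_t, best_i = 0.0, None
            for i in range(5, n - 5):
                a, b = seg[:i], seg[i:]
                pooled = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b)) + 1e-12
                t = abs(a.mean() - b.mean()) / pooled
                if t > best_t:
                    best_t, best_i = t, i
            if best_t < min_t:
                out.append((lo, hi, float(seg.mean())))   # no change point found
            else:
                binary_segment(x, lo, lo + best_i, min_t, out)
                binary_segment(x, lo + best_i, hi, min_t, out)
            return out

        rng = np.random.default_rng(4)
        probe = np.concatenate([rng.normal(0.0, 0.2, 120),   # copy-neutral
                                rng.normal(0.8, 0.2, 40),    # simulated duplication
                                rng.normal(0.0, 0.2, 140)])
        print(binary_segment(probe))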

  2. CNV Workshop: an integrated platform for high-throughput copy number variation discovery and clinical diagnostics

    Directory of Open Access Journals (Sweden)

    Rappaport Eric F

    2010-02-01

    Full Text Available Abstract Background Recent studies have shown that copy number variations (CNVs are frequent in higher eukaryotes and associated with a substantial portion of inherited and acquired risk for various human diseases. The increasing availability of high-resolution genome surveillance platforms provides opportunity for rapidly assessing research and clinical samples for CNV content, as well as for determining the potential pathogenicity of identified variants. However, few informatics tools for accurate and efficient CNV detection and assessment currently exist. Results We developed a suite of software tools and resources (CNV Workshop for automated, genome-wide CNV detection from a variety of SNP array platforms. CNV Workshop includes three major components: detection, annotation, and presentation of structural variants from genome array data. CNV detection utilizes a robust and genotype-specific extension of the Circular Binary Segmentation algorithm, and the use of additional detection algorithms is supported. Predicted CNVs are captured in a MySQL database that supports cohort-based projects and incorporates a secure user authentication layer and user/admin roles. To assist with determination of pathogenicity, detected CNVs are also annotated automatically for gene content, known disease loci, and gene-based literature references. Results are easily queried, sorted, filtered, and visualized via a web-based presentation layer that includes a GBrowse-based graphical representation of CNV content and relevant public data, integration with the UCSC Genome Browser, and tabular displays of genomic attributes for each CNV. Conclusions To our knowledge, CNV Workshop represents the first cohesive and convenient platform for detection, annotation, and assessment of the biological and clinical significance of structural variants. CNV Workshop has been successfully utilized for assessment of genomic variation in healthy individuals and disease cohorts and

  3. Sampling of systematic errors to estimate likelihood weights in nuclear data uncertainty propagation

    International Nuclear Information System (INIS)

    Helgesson, P.; Sjöstrand, H.; Koning, A.J.; Rydén, J.; Rochman, D.; Alhassan, E.; Pomp, S.

    2016-01-01

    In methodologies for nuclear data (ND) uncertainty assessment and propagation based on random sampling, likelihood weights can be used to infer experimental information into the distributions for the ND. As the included number of correlated experimental points grows large, the computational time for the matrix inversion involved in obtaining the likelihood can become a practical problem. There are also other problems related to the conventional computation of the likelihood, e.g., the assumption that all experimental uncertainties are Gaussian. In this study, a way to estimate the likelihood which avoids matrix inversion is investigated; instead, the experimental correlations are included by sampling of systematic errors. It is shown that the model underlying the sampling methodology (using univariate normal distributions for random and systematic errors) implies a multivariate Gaussian for the experimental points (i.e., the conventional model). It is also shown that the likelihood estimates obtained through sampling of systematic errors approach the likelihood obtained with matrix inversion as the sample size for the systematic errors grows large. In studied practical cases, it is seen that the estimates for the likelihood weights converge impractically slowly with the sample size, compared to matrix inversion. The computational time is estimated to be greater than for matrix inversion in cases with more experimental points, too. Hence, the sampling of systematic errors has little potential to compete with matrix inversion in cases where the latter is applicable. Nevertheless, the underlying model and the likelihood estimates can be easier to intuitively interpret than the conventional model and the likelihood function involving the inverted covariance matrix. Therefore, this work can both have pedagogical value and be used to help motivating the conventional assumption of a multivariate Gaussian for experimental data. The sampling of systematic errors could also
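
    The contrast drawn in this abstract fits in a few lines: the conventional likelihood treats the shared systematic error through a correlated (here rank-one augmented) covariance matrix, while the sampling approach marginalizes the systematic shift by Monte Carlo. A toy sketch with invented numbers:

        import numpy as np
        from scipy.stats import multivariate_normal, norm

        rng = np.random.default_rng(3)
        e = np.array([1.02, 0.97, 1.05])    # "experimental" points
        t = np.array([1.00, 1.00, 1.00])    # model prediction being weighted
        sig_stat, sig_sys = 0.04, 0.05      # uncorrelated and fully correlated errors

        # Conventional route: multivariate Gaussian with covariance
        # D + sig_sys^2 * 1 1^T, requiring a matrix inversion internally.
        cov = np.diag(np.full(e.size, sig_stat**2)) + sig_sys**2 * np.ones((e.size, e.size))
        exact = multivariate_normal.pdf(e, mean=t, cov=cov)

        # Sampling route: draw the systematic shift and average products of
        # univariate normals; converges to the exact value as samples grow.
        shifts = rng.normal(0.0, sig_sys, size=200_000)
        per_shift = norm.pdf(e[None, :], loc=t[None, :] + shifts[:, None], scale=sig_stat)
        estimate = per_shift.prod(axis=1).mean()
        print(exact, estimate)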

  4. Clinical benefit from pharmacological elevation of high-density lipoprotein cholesterol: meta-regression analysis.

    Science.gov (United States)

    Hourcade-Potelleret, F; Laporte, S; Lehnert, V; Delmar, P; Benghozi, Renée; Torriani, U; Koch, R; Mismetti, P

    2015-06-01

    Epidemiological evidence that the risk of coronary heart disease is inversely associated with the level of high-density lipoprotein cholesterol (HDL-C) has motivated several phase III programmes with cholesteryl ester transfer protein (CETP) inhibitors. To assess alternative methods to predict the clinical response to CETP inhibitors, a meta-regression analysis of HDL-C-raising drugs (statins, fibrates, niacin) in randomised controlled trials was performed: 51 trials in secondary prevention with a total of 167,311 patients and follow-up >1 year, where HDL-C was measured at baseline and during treatment. The meta-regression analysis showed no significant association between change in HDL-C (treatment vs comparator) and the log risk ratio (RR) of the clinical endpoint (non-fatal myocardial infarction or cardiac death). CETP inhibitor data are consistent with this finding (RR: 1.03; P5-P95: 0.99-1.21). A prespecified sensitivity analysis by drug class suggested that the strength of the relationship might differ between pharmacological groups. A significant association for both statins (p<0.02, log RR=-0.169-0.0499*HDL-C change, R(2)=0.21) and niacin (p=0.02, log RR=1.07-0.185*HDL-C change, R(2)=0.61), but not fibrates (p=0.18, log RR=-0.367+0.077*HDL-C change, R(2)=0.40), was shown. However, the association was no longer detectable after adjustment for low-density lipoprotein cholesterol for statins or exclusion of open trials for niacin. Meta-regression suggested that CETP inhibitors might not influence coronary risk. The relation between change in HDL-C level and clinical endpoint may be drug dependent, which limits the use of HDL-C as a surrogate marker of coronary events. Other markers of HDL function may be more relevant. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
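
    The fitted relations quoted above (log RR as a linear function of HDL-C change) come from weighted regression over trials. A fixed-effect simplification with inverse-variance weights and invented trial-level numbers (the paper's analysis is a random-effects meta-regression):

        import numpy as np

        d_hdl  = np.array([2.0, 4.5, 1.0, 6.0, 3.0])            # HDL-C change, mg/dL
        log_rr = np.array([-0.15, -0.22, -0.05, -0.30, -0.12])  # per-trial log risk ratio
        se     = np.array([0.08, 0.10, 0.07, 0.12, 0.09])       # standard errors

        # Weighted least squares: log RR = a + b * (HDL-C change).
        w = 1.0 / se**2
        X = np.column_stack([np.ones_like(d_hdl), d_hdl])
        beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * log_rr))
        print("intercept %.3f, slope %.4f per mg/dL of HDL-C change" % (beta[0], beta[1]))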

  5. Does perioperative high-dose prednisolone have clinical benefits for generalized myasthenia gravis?

    Science.gov (United States)

    Sekine, Yasuo; Kawaguchi, Naoki; Hamada, Chikuma; Sekiguchi, Hiromi; Yasufuku, Kazuhiro; Iyoda, Akira; Shibuya, Kiyoshi; Fujisawa, Takehiko

    2006-06-01

    The purpose of this study was to clarify the clinical benefits of perioperative administration of high-dose prednisolone (PSL) combined with extended thymectomy on the long-term outcomes of 116 consecutive patients with generalized myasthenia gravis (MG). A retrospective review was conducted on 116 patients diagnosed with generalized MG who received alternate-day oral administration of high-dose PSL (100 mg/alternate days) and had undergone transsternal extended thymectomy. Incidences of postoperative myasthenic crisis, adverse effects of steroid, long-term outcomes, such as complete stable remission (CSR), pharmacologic remission (PR) or improvement (Imp), and disease recurrence after CSR were evaluated. Six patients (5.2%) experienced post-thymectomy myasthenic crisis. Crude cumulative CSR and PR + CSR rates were 44.8 and 62.7%, respectively. Life table analysis showed that 41.8, 52.8 and 63.4% of the patients were in CSR at 3, 5 and 10 years, respectively. Multivariate analysis revealed that age and pretreatment classification according to the Myasthenia Gravis Foundation of America (MGFA) criteria tended to be independent predictors of CSR. There were 6.9% with compressive vertebral fracture, 13.8% with cataract, and 5.2% with steroid-induced diabetes. Life table analysis revealed that recurrence rates after CSR were 36.8 and 46.0% at 3 and 5 years, respectively. Patients with thymoma had a significantly higher rate of recurrence than those without thymoma (p = 0.001). Alternate-day administration of high-dose prednisolone reduced the risk of post-thymectomy myasthenic crisis. Presence of thymoma was a risk factor for MG recurrence after CSR.

  6. A high-throughput assay of NK cell activity in whole blood and its clinical application

    International Nuclear Information System (INIS)

    Lee, Saet-byul; Cha, Junhoe; Kim, Im-kyung; Yoon, Joo Chun; Lee, Hyo Joon; Park, Sang Woo; Cho, Sunjung; Youn, Dong-Ye; Lee, Heyja; Lee, Choong Hwan; Lee, Jae Myun; Lee, Kang Young; Kim, Jongsun

    2014-01-01

    Graphical abstract: - Highlights: • We demonstrated a simple assay of NK cell activity from whole blood. • The measurement of secreted IFN-γ from NK cells enables high-throughput screening. • The NKA assay was validated by clinical results of colorectal cancer patients. - Abstract: Natural killer (NK) cells are lymphocytes of the innate immune system and have the ability to kill tumor cells and virus-infected cells without prior sensitization. Malignant tumors and viruses have, however, developed strategies to suppress NK cells to escape from their responses. Thus, the evaluation of NK cell activity (NKA) could be invaluable to estimate the status and the outcome of cancers, viral infections, and immune-mediated diseases. Established methods that measure NKA, such as the 51Cr release assay and the CD107a degranulation assay, may be used to determine NK cell function, but they are complicated and time-consuming because they require isolation of peripheral blood mononuclear cells (PBMC) or NK cells. In some cases these assays require hazardous materials such as radioactive isotopes. To overcome these difficulties, we developed a simple assay that uses whole blood instead of PBMC or isolated NK cells. This novel assay is suitable for high-throughput screening and the monitoring of diseases, because it employs serum of ex vivo stimulated whole blood to detect interferon (IFN)-γ secreted from NK cells as an indicator of NKA. After the stimulation of NK cells, the determination of IFN-γ concentration in serum samples by enzyme-linked immunosorbent assay (ELISA) provided a swift, uncomplicated, and high-throughput assay of NKA ex vivo. The NKA results showed that microsatellite stable (MSS) colorectal cancer patients had significantly lower NKA (263.6 ± 54.5 pg/mL) than healthy subjects (867.5 ± 50.2 pg/mL; p < 0.0001). Therefore, the NKA could be utilized as a supportive diagnostic marker for microsatellite stable (MSS) colorectal cancer.

  7. Incidence Rates of Clinical Mastitis among Canadian Holsteins Classified as High, Average, or Low Immune Responders

    Science.gov (United States)

    Miglior, Filippo; Mallard, Bonnie A.

    2013-01-01

    The objective of this study was to compare the incidence rate of clinical mastitis (IRCM) between cows classified as high, average, or low for antibody-mediated immune responses (AMIR) and cell-mediated immune responses (CMIR). In collaboration with the Canadian Bovine Mastitis Research Network, 458 lactating Holsteins from 41 herds were immunized with a type 1 and a type 2 test antigen to stimulate adaptive immune responses. A delayed-type hypersensitivity test to the type 1 test antigen was used as an indicator of CMIR, and serum antibody of the IgG1 isotype to the type 2 test antigen was used for AMIR determination. By using estimated breeding values for these traits, cows were classified as high, average, or low responders. The IRCM was calculated as the number of cases of mastitis experienced over the total time at risk throughout the 2-year study period. High-AMIR cows had an IRCM of 17.1 cases per 100 cow-years, significantly lower than that of average and low responders, with 27.9 and 30.7 cases per 100 cow-years, respectively. Low-AMIR cows tended to have the most severe mastitis. No differences in the IRCM were noted when cows were classified based on CMIR, likely due to the extracellular nature of mastitis-causing pathogens. The results of this study demonstrate the desirability of breeding dairy cattle for enhanced immune responses to decrease the incidence and severity of mastitis in the Canadian dairy industry. PMID:23175290
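
    The abstract defines the IRCM as cases over total time at risk. A one-function sketch of that calculation, with made-up counts chosen only to land near the reported 17.1 cases per 100 cow-years:

```python
def ircm(cases, cow_years_at_risk):
    """Incidence rate of clinical mastitis per 100 cow-years at risk."""
    return 100.0 * cases / cow_years_at_risk

# Hypothetical example: 35 cases over 205 accumulated cow-years
print(f"{ircm(35, 205):.1f} cases per 100 cow-years")  # -> 17.1
```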

  8. A high-throughput assay of NK cell activity in whole blood and its clinical application

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Saet-byul [Department of Microbiology and Brain Korea 21 Project for Medical Sciences, Yonsei University College of Medicine, Seoul (Korea, Republic of); Cha, Junhoe [ATGen Co. Ltd., Sungnam (Korea, Republic of); Kim, Im-kyung [Department of Surgery, Gangnam Severance Hospital, Yonsei University College of Medicine, Seoul (Korea, Republic of); Yoon, Joo Chun [Department of Microbiology, Ewha Womans University School of Medicine, Seoul (Korea, Republic of); Lee, Hyo Joon [Department of Microbiology and Brain Korea 21 Project for Medical Sciences, Yonsei University College of Medicine, Seoul (Korea, Republic of); Park, Sang Woo; Cho, Sunjung; Youn, Dong-Ye; Lee, Heyja; Lee, Choong Hwan [ATGen Co. Ltd., Sungnam (Korea, Republic of); Lee, Jae Myun [Department of Microbiology and Brain Korea 21 Project for Medical Sciences, Yonsei University College of Medicine, Seoul (Korea, Republic of); Lee, Kang Young, E-mail: kylee117@yuhs.ac [Department of Surgery, Gangnam Severance Hospital, Yonsei University College of Medicine, Seoul (Korea, Republic of); Kim, Jongsun, E-mail: jkim63@yuhs.ac [Department of Microbiology and Brain Korea 21 Project for Medical Sciences, Yonsei University College of Medicine, Seoul (Korea, Republic of)

    2014-03-14

    Highlights: • We demonstrated a simple assay of NK cell activity from whole blood. • The measurement of secreted IFN-γ from NK cells enables high-throughput screening. • The NKA assay was validated by clinical results of colorectal cancer patients. - Abstract: Natural killer (NK) cells are lymphocytes of the innate immune system and have the ability to kill tumor cells and virus-infected cells without prior sensitization. Malignant tumors and viruses, however, have developed strategies to suppress NK cells and escape their responses. Thus, the evaluation of NK cell activity (NKA) could be invaluable for estimating the status and outcome of cancers, viral infections, and immune-mediated diseases. Established methods that measure NKA, such as the 51Cr release assay and the CD107a degranulation assay, may be used to determine NK cell function, but they are complicated and time-consuming because they require isolation of peripheral blood mononuclear cells (PBMC) or NK cells. In some cases these assays require hazardous materials such as radioactive isotopes. To overcome these difficulties, we developed a simple assay that uses whole blood instead of PBMC or isolated NK cells. This novel assay is suitable for high-throughput screening and the monitoring of diseases, because it employs serum of ex vivo stimulated whole blood to detect interferon (IFN)-γ secreted from NK cells as an indicator of NKA. After the stimulation of NK cells, the determination of IFN-γ concentration in serum samples by enzyme-linked immunosorbent assay (ELISA) provided a swift, uncomplicated, and high-throughput assay of NKA ex vivo. The NKA results showed that microsatellite stable (MSS) colorectal cancer patients had significantly lower NKA (263.6 ± 54.5 pg/mL) than healthy subjects (867.5 ± 50.2 pg/mL; p < 0.0001). Therefore, the NKA could be utilized as a supportive diagnostic marker for MSS colorectal cancer.

  9. Genetic Alterations and Their Clinical Implications in High-Recurrence Risk Papillary Thyroid Cancer.

    Science.gov (United States)

    Lee, Min-Young; Ku, Bo Mi; Kim, Hae Su; Lee, Ji Yun; Lim, Sung Hee; Sun, Jong-Mu; Lee, Se-Hoon; Park, Keunchil; Oh, Young Lyun; Hong, Mineui; Jeong, Han-Sin; Son, Young-Ik; Baek, Chung-Hwan; Ahn, Myung-Ju

    2017-10-01

    Papillary thyroid carcinomas (PTCs) frequently involve genetic alterations. The objective of this study was to investigate genetic alterations and further explore the relationships between these alterations and clinicopathological characteristics in a high-recurrence-risk (node-positive, N1) PTC group. Tumor tissue blocks were obtained from 240 surgically resected patients with histologically confirmed stage III/IV (pT3/4 or N1) PTCs. We screened for gene fusions using NanoString's nCounter technology, and mutational analysis was performed by direct DNA sequencing. Data describing the clinicopathological characteristics and clinical courses were retrospectively collected. Of the 240 PTC patients, 207 (86.3%) had at least one genetic alteration, including BRAF mutation in 190 patients (79.2%), PIK3CA mutation in 25 patients (10.4%), NTRK1/3 fusion in six patients (2.5%), and RET fusion in 24 patients (10.0%). Concomitant presence of more than two genetic alterations was seen in 36 patients (15%). PTCs harboring BRAF mutation were associated with RET wild-type expression (p=0.001). RET fusion genes occurred with significantly higher frequency in N1b stage patients (p=0.003) and in patients aged 45 years or older (p=0.031); however, no significant correlations were found for the other genetic alterations. There was no trend toward favorable recurrence-free survival or overall survival among patients lacking genetic alterations. In this selected high-recurrence-risk PTC group, most patients had more than one genetic alteration. However, these known alterations could not entirely account for the clinicopathological features of high-recurrence-risk PTC.

  10. pplacer: linear time maximum-likelihood and Bayesian phylogenetic placement of sequences onto a fixed reference tree

    Directory of Open Access Journals (Sweden)

    Kodner Robin B

    2010-10-01

    Background: Likelihood-based phylogenetic inference is generally considered to be the most reliable classification method for unknown sequences. However, traditional likelihood-based phylogenetic methods cannot be applied to large volumes of short reads from next-generation sequencing due to computational complexity issues and lack of phylogenetic signal. "Phylogenetic placement," where a reference tree is fixed and the unknown query sequences are placed onto the tree via a reference alignment, is a way to bring the inferential power offered by likelihood-based approaches to large data sets. Results: This paper introduces pplacer, a software package for phylogenetic placement and subsequent visualization. The algorithm can place twenty thousand short reads on a reference tree of one thousand taxa per hour per processor, has essentially linear time and memory complexity in the number of reference taxa, and is easy to run in parallel. Pplacer features calculation of the posterior probability of a placement on an edge, which is a statistically rigorous way of quantifying uncertainty on an edge-by-edge basis. It can also inform the user of the positional uncertainty for query sequences by calculating the expected distance between placement locations, which is crucial in the estimation of uncertainty with a well-sampled reference tree. The software provides visualizations using branch thickness and color to represent the number of placements and their uncertainty. A simulation study using reads generated from 631 COG alignments shows a high level of accuracy for phylogenetic placement over a wide range of alignment diversity, and the power of edge uncertainty estimates to measure placement confidence. Conclusions: Pplacer enables efficient phylogenetic placement and subsequent visualization, making likelihood-based phylogenetic methodology practical for large collections of reads; it is freely available as source code, binaries, and a web service.
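
    To make the placement idea concrete, here is a deliberately simplified sketch: each reference edge is represented by a single sequence and a branch length, the query is scored on every edge under a Jukes-Cantor substitution model, and the highest-scoring edge wins. This is a conceptual toy under those stated assumptions, not pplacer's actual algorithm (which evaluates full tree likelihoods against a reference alignment).

```python
import math

def jc_log_likelihood(query, ref, t):
    """Log-likelihood of query given ref at branch length t (Jukes-Cantor model)."""
    p_same = 0.25 + 0.75 * math.exp(-4.0 * t / 3.0)  # P(observing the same base)
    p_diff = 0.25 - 0.25 * math.exp(-4.0 * t / 3.0)  # P(each specific different base)
    return sum(math.log(p_same if q == r else p_diff) for q, r in zip(query, ref))

def place_query(query, edges):
    """edges: list of (edge_id, representative_sequence, branch_length).
    Returns the maximum-likelihood edge and its log-likelihood."""
    best_ll, best_edge = max(
        (jc_log_likelihood(query, seq, t), eid) for eid, seq, t in edges
    )
    return best_edge, best_ll

# Toy reference edges and query (all invented)
edges = [("e1", "ACGTACGT", 0.05), ("e2", "ACGTTCGT", 0.05), ("e3", "AGGTACGA", 0.05)]
print(place_query("ACGTTCGT", edges))  # -> edge with the highest log-likelihood
```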

  11. Improvement and comparison of likelihood functions for model calibration and parameter uncertainty analysis within a Markov chain Monte Carlo scheme

    Science.gov (United States)

    Cheng, Qin-Bo; Chen, Xi; Xu, Chong-Yu; Reinhardt-Imjela, Christian; Schulte, Achim

    2014-11-01

    In this study, the likelihood functions for uncertainty analysis of hydrological models are compared and improved through the following steps: (1) the equivalence between the Nash-Sutcliffe Efficiency coefficient (NSE) and the likelihood function with Gaussian independent and identically distributed residuals is proved; (2) a new estimation method for the Box-Cox transformation (BC) parameter is developed to more effectively eliminate the heteroscedasticity of model residuals; and (3) three likelihood functions-NSE, Generalized Error Distribution with BC (BC-GED) and Skew Generalized Error Distribution with BC (BC-SGED)-are applied for SWAT-WB-VSA (Soil and Water Assessment Tool - Water Balance - Variable Source Area) model calibration in the Baocun watershed, Eastern China. Performances of the calibrated models are compared using the observed river discharges and groundwater levels. The results show that the minimum variance constraint can effectively estimate the BC parameter. The form of the likelihood function significantly impacts the calibrated parameters and the simulated high- and low-flow components. SWAT-WB-VSA with the NSE approach simulates floods well but baseflow poorly, owing to the assumption of a Gaussian error distribution, under which large errors have low probability while small errors near zero are nearly equiprobable. By contrast, SWAT-WB-VSA with the BC-GED or BC-SGED approach mimics baseflow well, as confirmed by the groundwater level simulation. The assumption of skewness in the error distribution may be unnecessary, because all the results of the BC-SGED approach are nearly the same as those of the BC-GED approach.
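
    Step (1) above rests on the fact that, once the residual variance is set to its maximum-likelihood value, the Gaussian i.i.d. log-likelihood is a monotone function of the sum of squared errors, exactly as NSE is, so the two criteria rank parameter sets identically. A small sketch of both quantities plus the Box-Cox transform; the function names are mine, not the paper's:

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe Efficiency: 1 - SSE / total variance of the observations."""
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def gaussian_loglik(obs, sim):
    """Gaussian i.i.d. log-likelihood with the MLE residual variance plugged in.
    Both this and nse() decrease monotonically in SSE, hence rank models alike."""
    n = len(obs)
    sigma2 = np.mean((obs - sim) ** 2)  # MLE of the residual variance
    return -0.5 * n * (np.log(2.0 * np.pi * sigma2) + 1.0)

def boxcox(y, lam):
    """Box-Cox transform, used to stabilize (homoscedasticize) residuals."""
    return np.log(y) if lam == 0 else (y ** lam - 1.0) / lam
```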

  12. An alternative empirical likelihood method in missing response problems and causal inference.

    Science.gov (United States)

    Ren, Kaili; Drummond, Christopher A; Brewster, Pamela S; Haller, Steven T; Tian, Jiang; Cooper, Christopher J; Zhang, Biao

    2016-11-30

    Missing responses are a common problem in medical, social, and economic studies. When responses are missing at random, a complete-case data analysis may result in bias. A popular bias-correction method is the inverse probability weighting proposed by Horvitz and Thompson. To improve efficiency, Robins et al. proposed an augmented inverse probability weighting method. The augmented inverse probability weighting estimator has a double-robustness property and achieves the semiparametric efficiency lower bound when the regression model and propensity score model are both correctly specified. In this paper, we introduce an empirical likelihood-based estimator as an alternative to that of Qin and Zhang (2007). Our proposed estimator is also doubly robust and locally efficient. Simulation results show that the proposed estimator has better performance when the propensity score is correctly modeled. Moreover, the proposed method can be applied to the estimation of the average treatment effect in observational causal inference. Finally, we apply our method to an observational study of smoking, using data from the Cardiovascular Outcomes in Renal Atherosclerotic Lesions clinical trial. Copyright © 2016 John Wiley & Sons, Ltd.
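
    For orientation, the augmented inverse probability weighting (AIPW) estimator against which the paper's empirical-likelihood method positions itself can be sketched as below. The model choices (logistic propensity, linear outcome regression) and the function name are illustrative assumptions, not the paper's specification.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

def aipw_mean(X, y, observed):
    """Doubly robust AIPW estimate of E[Y] under missing-at-random responses.
    X: covariates (n, p); y: outcomes, only trusted where observed is True."""
    # Propensity model: P(response observed | X)
    ps = LogisticRegression().fit(X, observed).predict_proba(X)[:, 1]
    # Outcome regression fit on complete cases, then predicted for everyone
    m = LinearRegression().fit(X[observed], y[observed]).predict(X)
    r = observed.astype(float)
    y0 = np.where(observed, y, 0.0)  # placeholder where y is missing
    # IPW term plus augmentation term; unbiased if either model is correct
    return np.mean(r * y0 / ps - (r - ps) / ps * m)
```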

  13. Statistical analysis of maximum likelihood estimator images of human brain FDG PET studies

    International Nuclear Information System (INIS)

    Llacer, J.; Veklerov, E.; Hoffman, E.J.; Nunez, J.; Coakley, K.J.

    1993-01-01

    The work presented in this paper evaluates the statistical characteristics of regional bias and expected error in reconstructions of real PET data from human brain fluorodeoxyglucose (FDG) studies carried out by the maximum likelihood estimator (MLE) method with a robust stopping rule, and compares them with the results of filtered backprojection (FBP) reconstructions and with the method of sieves. The task that the authors have investigated is that of quantifying radioisotope uptake in regions of interest (ROIs). They first describe a robust methodology for the use of the MLE method with clinical data which contains only one adjustable parameter: the kernel size for a Gaussian filtering operation that determines final resolution and expected regional error. Simulation results are used to establish the fundamental characteristics of the reconstructions obtained by our methodology, corresponding to the case in which the transition matrix is perfectly known. Then, data from 72 independent human brain FDG scans from four patients are used to show that the results obtained from real data are consistent with the simulation, although the quality of the data and of the transition matrix have an effect on the final outcome.
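
    The MLE reconstruction discussed above is conventionally computed with the ML-EM iteration x ← x / (Aᵀ1) · Aᵀ(y / Ax). A bare-bones dense-matrix sketch follows (toy scale; the paper's stopping rule and Gaussian post-filter are separate steps not shown here):

```python
import numpy as np

def mlem(A, y, n_iter=50):
    """Classic ML-EM update for emission tomography:
    x <- x / (A^T 1) * A^T (y / (A x)).
    A: system ("transition") matrix, shape (n_bins, n_voxels); y: measured counts."""
    x = np.ones(A.shape[1])               # flat initial image
    sens = A.T @ np.ones(A.shape[0])      # sensitivity image, A^T 1
    for _ in range(n_iter):
        proj = A @ x                      # forward projection
        ratio = np.divide(y, proj, out=np.zeros_like(y, dtype=float),
                          where=proj > 0)
        x = x / np.maximum(sens, 1e-12) * (A.T @ ratio)
    return x
```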

  14. Theoretical Study of Penalized-Likelihood Image Reconstruction for Region of Interest Quantification

    International Nuclear Information System (INIS)

    Qi, Jinyi; Huesman, Ronald H.

    2006-01-01

    Region of interest (ROI) quantification is an important task in emission tomography (e.g., positron emission tomography and single photon emission computed tomography). It is essential for exploring clinical factors such as tumor activity, growth rate, and the efficacy of therapeutic interventions. Statistical image reconstruction methods based on the penalized maximum-likelihood (PML) or maximum a posteriori principle have been developed for emission tomography to deal with the low signal-to-noise ratio of the emission data. Similar to the filter cut-off frequency in the filtered backprojection method, the regularization parameter in PML reconstruction controls the resolution and noise tradeoff and, hence, affects ROI quantification. In this paper, we theoretically analyze the performance of ROI quantification in PML reconstructions. Building on previous work, we derive simplified theoretical expressions for the bias, variance, and ensemble mean-squared-error (EMSE) of the estimated total activity in an ROI that is surrounded by a uniform background. When the mean and covariance matrix of the activity inside the ROI are known, the theoretical expressions are readily computable and allow for fast evaluation of image quality for ROI quantification with different regularization parameters. The optimum regularization parameter can then be selected to minimize the EMSE. Computer simulations are conducted for small ROIs with variable uniform uptake. The results show that the theoretical predictions match the Monte Carlo results reasonably well.
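
    The selection rule described above (pick the regularization strength minimizing EMSE = bias² + variance) can be illustrated with toy closed forms in which bias grows and variance shrinks as regularization strengthens. The functional forms below are invented stand-ins for the paper's theoretical expressions, chosen only so that the trade-off has an interior minimum.

```python
import numpy as np

# Invented stand-ins for the theoretical bias/variance expressions
def bias(beta, a0=5.0):
    return a0 * beta / (1.0 + beta)        # bias rises with regularization

def variance(beta, s0=4.0):
    return s0 / (1.0 + beta) ** 2          # variance falls with regularization

betas = np.logspace(-3, 2, 200)
emse = bias(betas) ** 2 + variance(betas)  # ensemble MSE = bias^2 + variance
print(f"beta minimizing EMSE: {betas[np.argmin(emse)]:.3g}")
```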

  15. Clinical Outcomes in Men and Women following Total Knee Arthroplasty with a High-Flex Knee: No Clinical Effect of Gender

    OpenAIRE

    Nassif, Jeffrey M.; Pietrzak, William S.

    2015-01-01

    While it is generally recognized that anatomical differences exist between the male and female knee, the literature generally refutes the clinical need for gender-specific total knee prostheses. It has been found that standard, unisex knees perform as well in women as in men, or better. Recently, high-flex knees have become available that mechanically accommodate increased flexion, yet no studies have directly compared the outcomes of these devices in men and women to see if gender-based dif...

  16. The validation and clinical implementation of BRCAplus: a comprehensive high-risk breast cancer diagnostic assay.

    Directory of Open Access Journals (Sweden)

    Hansook Kim Chong

    Breast cancer is the most commonly diagnosed cancer in women, with 10% of disease attributed to hereditary factors. Although BRCA1 and BRCA2 account for a high percentage of hereditary cases, there are more than 25 susceptibility genes that differentially impact the risk for breast cancer. Traditionally, germline testing for breast cancer was performed by Sanger dideoxy terminator sequencing in a reflexive manner, beginning with BRCA1 and BRCA2. The introduction of next-generation sequencing (NGS) has enabled the simultaneous testing of all genes implicated in breast cancer, resulting in diagnostic labs offering large, comprehensive gene panels. However, some physicians prefer to test only for those genes for which established surveillance and treatment protocols exist. The NGS-based BRCAplus test utilizes a custom tiled-PCR-based target enrichment design and bioinformatics pipeline coupled with array comparative genomic hybridization (aCGH) to identify mutations in the six high-risk genes: BRCA1, BRCA2, PTEN, TP53, CDH1, and STK11. Validation of the assay with 250 previously characterized samples resulted in 100% detection of 3,025 known variants and an analytical specificity of 99.99%. Analysis of the clinical performance of the first 3,000 BRCAplus samples referred for testing revealed an average coverage greater than 9,000X per target base pair, resulting in excellent specificity and the sensitivity to detect low-level mosaicism and allele drop-out. The unique design of the assay enabled the detection of pathogenic mutations missed by previous testing. With the abundance of NGS diagnostic tests being released, it is essential that clinicians understand the advantages and limitations of different test designs.

  17. Self-Reported Non-Celiac Wheat Sensitivity in High School Students: Demographic and Clinical Characteristics

    Directory of Open Access Journals (Sweden)

    Antonio Carroccio

    2017-07-01

    Background: Non-Celiac Wheat Sensitivity (NCWS) has recently been included among the gluten-related disorders. As no biomarkers of this disease exist, its frequency has been estimated based on self-reported symptoms, but to date no data are available on self-reported NCWS in teenagers. Aim: To explore the prevalence of self-reported NCWS in a group of high school students and to study their demographic and clinical characteristics. Methods: The study was performed between April 2015 and January 2016 in two high schools of a coastal town in the south of Sicily (Italy). A total of 555 students (mean age 17 years, 191 male, 364 female) completed a modified validated questionnaire for self-reported NCWS. The subjects who self-reported NCWS were then compared with all the others. Results: Seven individuals (1.26%) had an established diagnosis of CD. The prevalence of self-reported NCWS was 12.2%, and 2.9% were following a gluten-free diet (GFD). Only 15 out of 68 (23%) NCWS self-reporters had consulted a doctor for this problem, and only nine (14%) had undergone serological tests for celiac disease. The NCWS self-reporters very often had IBS symptoms (44%). Conclusions: Self-reported NCWS was found to be common in teenagers, with a frequency of 12.2%; the frequency of GFD use was 2.9%, much higher than the percentage of known CD in the same population (1.26%). A greater awareness of the possible implications on the part of the subjects involved, and a more thorough medical approach to the study of self-reported wheat-induced symptoms, are required.
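
    Prevalence figures like the 12.2% above (68 self-reporters among 555 students) are usually quoted with a confidence interval. A small sketch computing a 95% Wilson score interval for that proportion; the function is mine, not from the paper:

```python
import math

def wilson_ci(k, n, z=1.96):
    """95% Wilson score interval for a proportion (k successes out of n)."""
    p = k / n
    denom = 1.0 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

# 68 self-reported NCWS among 555 students -> ~12.2% prevalence
lo, hi = wilson_ci(68, 555)
print(f"prevalence 12.2%, 95% CI {100*lo:.1f}% to {100*hi:.1f}%")
```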

  18. Evaluation of a high resolution genotyping method for Chlamydia trachomatis using routine clinical samples.

    Directory of Open Access Journals (Sweden)

    Yibing Wang

    2011-02-01

    Genital chlamydia infection is the most commonly diagnosed sexually transmitted infection in the UK. C. trachomatis genital infections are usually caused by strains which fall into two pathovars: lymphogranuloma venereum (LGV) and the genitourinary genotypes D-K. Although these genotypes can be discriminated by outer membrane protein gene (ompA) sequencing or multi-locus sequence typing (MLST), neither protocol affords the high-resolution genotyping required for local epidemiology and accurate contact-tracing. We evaluated variable number tandem repeat (VNTR) analysis combined with ompA sequencing (now called multi-locus VNTR analysis and ompA, or "MLVA-ompA") to study local epidemiology in Southampton over a period of six months. One hundred and fifty-seven endocervical swabs that tested positive for C. trachomatis, from both the Southampton genitourinary medicine (GUM) clinic and local GP surgeries, were tested by COBAS Taqman 48 (Roche) PCR for the presence of C. trachomatis. Samples that tested positive by the commercial NAAT were genotyped, where possible, by the MLVA-ompA sequencing technique. Attempts were made to isolate C. trachomatis from all 157 samples in cell culture, and 68 (43%) were successfully recovered by repeatable passage in culture. Of the 157 samples, 93 (59%) were fully genotyped by MLVA-ompA. Only one mixed infection (E & D) in a single sample was confirmed. There were two distinct D genotypes for the ompA gene. The most frequent ompA genotypes were D, E and F, comprising 20%, 41% and 16% of the typeable samples, respectively. Within all genotypes we detected numerous MLVA subtypes. Amongst the common genotypes, there are a significant number of defined MLVA subtypes, which may reflect particular background demographics including age group, geography, high-risk sexual behavior, and sexual networks.

  19. A randomized controlled trial of cognitive behavioral therapy for individuals at clinical high risk of psychosis.

    Science.gov (United States)

    Addington, Jean; Epstein, Irvin; Liu, Lu; French, Paul; Boydell, Katherine M; Zipursky, Robert B

    2011-01-01

    There has been increasing interest in early detection during the prodromal phase of a psychotic disorder. To date, a few treatment studies have been published with some promising results for both pharmacological treatments, using second-generation antipsychotics, and psychological interventions, mainly cognitive behavioral therapy. The purpose of this study was to determine, first, whether cognitive behavioral therapy (CBT) was more effective in reducing rates of conversion compared to supportive therapy and, secondly, whether those who received CBT had improved symptoms compared to those who received supportive therapy. Fifty-one individuals at clinical high risk of developing psychosis were randomized to CBT or supportive therapy for up to 6 months. The sample was assessed at 6, 12 and 18 months post baseline on attenuated positive symptoms, negative symptoms, depression, anxiety and social functioning. Conversions to psychosis occurred only in the group who received supportive therapy, although the difference was not significant. Both groups improved in attenuated positive symptoms, depression and anxiety, and neither improved in social functioning or negative symptoms. There were no differences between the two treatment groups. However, the improvement in attenuated positive symptoms was more rapid for the CBT group. There are limitations to this trial and potential explanations for the lack of differences. However, both the results of this study and the possible explanations have significant implications for early detection and intervention in the pre-psychotic phase and for designing future treatments. Copyright © 2010 Elsevier B.V. All rights reserved.

  20. High rates of relapse in adolescent crack users after inpatient clinic discharge

    Directory of Open Access Journals (Sweden)

    Rosemeri Siqueira Pedroso

    Objective: To evaluate 88 adolescent crack users referred for hospitalization and to follow them up after discharge, investigating relapse and factors associated with treatment. Methods: A cohort was assessed 30 and 90 days after discharge from a psychiatric hospital and a rehabilitation clinic for treatment of chemical dependency in Porto Alegre between 2011 and 2012. Instruments: a semi-structured interview, conducted to evaluate the sociodemographic profile of the sample and describe the pattern of psychoactive substance use; the Crack Use Relapse Scale (CURS); the Questionnaire for Tracking Crack Users (QTUC); and the K-SADS-PL. Results: In the first follow-up period (30 days after discharge), 65.9% of participants had relapsed. In the second follow-up period (90 days after discharge), 86.4% had relapsed. Conclusion: This is one of the first studies to show the extremely high prevalence of early relapse in adolescent crack users after discharge, calling into question the cost/benefit of inpatient treatment for this population. Moreover, these results corroborate studies suggesting that young psychostimulant users may need tailored intensive outpatient treatment with contingency management and other behavioral strategies in order to increase compliance and reduce relapse into drug use or crime; this specific therapeutic modality, however, is still scarce and must be developed in Brazil.