WorldWideScience

Sample records for dynamic context likelihood

  1. DREAM3: network inference using dynamic context likelihood of relatedness and the inferelator.

    Directory of Open Access Journals (Sweden)

    Aviv Madar

    2010-03-01

    Many current works aiming to learn regulatory networks from systems biology data must balance model complexity with respect to data availability and quality. Methods that learn regulatory associations based on unit-less metrics, such as mutual information, are attractive in that they scale well and reduce the number of free parameters (model complexity) per interaction to a minimum. In contrast, methods for learning regulatory networks based on explicit dynamical models are more complex and scale less gracefully, but are attractive as they may allow direct prediction of transcriptional dynamics and resolve the directionality of many regulatory interactions. We aim to investigate whether scalable information-based methods (like the Context Likelihood of Relatedness method) and more explicit dynamical models (like Inferelator 1.0) prove synergistic when combined. We test a pipeline where a novel modification of the Context Likelihood of Relatedness (mixed-CLR), modified to use time-series data, is first used to define likely regulatory interactions, and Inferelator 1.0 is then used for final model selection and to build an explicit dynamical model. Our method ranked 2nd out of 22 in the DREAM3 100-gene in silico networks challenge. Mixed-CLR and Inferelator 1.0 are complementary, demonstrating a large performance gain relative to any single tested method, with precision being especially high at low recall values. Partitioning the provided data set into four groups (knock-down, knock-out, time-series, and combined) revealed that using comprehensive knock-out data alone provides optimal performance. Inferelator 1.0 proved particularly powerful at resolving the directionality of regulatory interactions, i.e. "who regulates whom" (approximately of identified true positives were correctly resolved). Performance drops for high in-degree genes, i.e. as the number of regulators per target gene increases, but not with out-degree, i.e. performance is not affected by…
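
    As a rough illustration of the information-based half of this pipeline, the sketch below implements the basic CLR scoring step: z-scoring each mutual-information value against its per-gene background and combining the two per-gene z-scores into an edge score. The mixed-CLR time-series modification and the Inferelator model-selection stage are not reproduced here, and the MI matrix is a toy stand-in.

    ```python
    # Minimal sketch of the Context Likelihood of Relatedness (CLR) scoring
    # step, assuming a precomputed mutual-information matrix `mi` (gene x gene).
    # The MI estimator and all names are illustrative, not the authors' code.
    import numpy as np

    def clr_scores(mi: np.ndarray) -> np.ndarray:
        """Convert a symmetric mutual-information matrix into CLR edge scores."""
        mu = mi.mean(axis=1, keepdims=True)     # per-gene background mean
        sigma = mi.std(axis=1, keepdims=True)   # per-gene background std
        z = np.maximum((mi - mu) / sigma, 0.0)  # clip negative z-scores at 0
        return np.sqrt(z**2 + z.T**2)           # combine the two per-gene scores

    rng = np.random.default_rng(0)
    mi = np.abs(rng.normal(size=(5, 5)))
    mi = (mi + mi.T) / 2                        # symmetrize the toy MI matrix
    print(clr_scores(mi))
    ```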

  2. Likelihood-based Dynamic Factor Analysis for Measurement and Forecasting

    NARCIS (Netherlands)

    Jungbacker, B.M.J.P.; Koopman, S.J.

    2015-01-01

    We present new results for the likelihood-based analysis of the dynamic factor model. The latent factors are modelled by linear dynamic stochastic processes. The idiosyncratic disturbance series are specified as autoregressive processes with mutually correlated innovations. The new results lead to…
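
    For concreteness, a dynamic factor model of the kind described in this abstract can be written as follows; this is an assumed textbook parameterization, not necessarily the authors' exact specification.

    ```latex
    % Generic dynamic factor model (assumed parameterization):
    \begin{aligned}
      y_t &= \Lambda f_t + \varepsilon_t, \\
      f_t &= \Phi f_{t-1} + \eta_t, \qquad \eta_t \sim \mathcal{N}(0, \Sigma_\eta), \\
      \varepsilon_t &= P \varepsilon_{t-1} + u_t, \qquad u_t \sim \mathcal{N}(0, \Sigma_u),
    \end{aligned}
    ```

    Here y_t are the observed series, f_t the latent factors, P holds the autoregressive coefficients of the idiosyncratic disturbances, and Σ_u need not be diagonal, which is what allows the mutually correlated innovations mentioned above.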

  3. The Prior Can Often Only Be Understood in the Context of the Likelihood

    Directory of Open Access Journals (Sweden)

    Andrew Gelman

    2017-10-01

    A key sticking point of Bayesian analysis is the choice of prior distribution, and there is a vast literature on potential defaults including uniform priors, Jeffreys’ priors, reference priors, maximum entropy priors, and weakly informative priors. These methods, however, often manifest a key conceptual tension in prior modeling: a model encoding true prior information should be chosen without reference to the model of the measurement process, but almost all common prior modeling techniques are implicitly motivated by a reference likelihood. In this paper we resolve this apparent paradox by placing the choice of prior into the context of the entire Bayesian analysis, from inference to prediction to model evaluation.

  4. Maximum Likelihood Dynamic Factor Modeling for Arbitrary "N" and "T" Using SEM

    Science.gov (United States)

    Voelkle, Manuel C.; Oud, Johan H. L.; von Oertzen, Timo; Lindenberger, Ulman

    2012-01-01

    This article has 3 objectives that build on each other. First, we demonstrate how to obtain maximum likelihood estimates for dynamic factor models (the direct autoregressive factor score model) with arbitrary "T" and "N" by means of structural equation modeling (SEM) and compare the approach to existing methods. Second, we go beyond standard time…

  5. Efficient estimators for likelihood ratio sensitivity indices of complex stochastic dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Arampatzis, Georgios; Katsoulakis, Markos A.; Rey-Bellet, Luc [Department of Mathematics and Statistics, University of Massachusetts, Amherst, Massachusetts 01003 (United States)

    2016-03-14

    We demonstrate that centered likelihood ratio estimators for the sensitivity indices of complex stochastic dynamics are highly efficient, with low, constant-in-time variance, and are consequently suitable for sensitivity analysis in long-time and steady-state regimes. These estimators rely on a new covariance formulation of the likelihood ratio that includes as a submatrix a Fisher information matrix for stochastic dynamics, and can also be used for fast screening of insensitive parameters and parameter combinations. The proposed methods are applicable to broad classes of stochastic dynamics, such as chemical reaction networks, Langevin-type equations and stochastic models in finance, including systems with a high-dimensional parameter space and/or disparate decorrelation times between different observables. Furthermore, they are simple to implement as a standard observable in any existing simulation algorithm, without additional modifications.
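
    The underlying likelihood-ratio (score-function) identity that such estimators build on can be sketched as follows, in assumed generic notation.

    ```latex
    % Likelihood-ratio sensitivity identity for an observable f and parameter theta:
    \partial_\theta \, \mathbb{E}_\theta[f(X)]
      = \mathbb{E}_\theta\!\big[ f(X)\,\partial_\theta \log p_\theta(X) \big]
      = \mathrm{Cov}_\theta\!\big( f(X),\, \partial_\theta \log p_\theta(X) \big)
    ```

    Because the score has zero mean, subtracting means (the covariance form) leaves the derivative unchanged while controlling the estimator's variance, which is the centering referred to in the abstract.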

  6. Elaboration Likelihood Model and an Analysis of the Contexts of Its Application

    OpenAIRE

    Aslıhan Kıymalıoğlu

    2014-01-01

    Elaboration Likelihood Model (ELM), which posits two routes to persuasion (central and peripheral), has been one of the major models of persuasion. As the number of studies on ELM in the Turkish literature is limited, a detailed explanation of the model together with a comprehensive literature review was considered a contribution toward filling this gap. The findings of the review reveal that the model was mostly used in marketing and advertising research, that the concept...

  7. Elaboration Likelihood Model and an Analysis of the Contexts of Its Application

    Directory of Open Access Journals (Sweden)

    Aslıhan Kıymalıoğlu

    2014-12-01

    Elaboration Likelihood Model (ELM), which posits two routes to persuasion (central and peripheral), has been one of the major models of persuasion. As the number of studies on ELM in the Turkish literature is limited, a detailed explanation of the model together with a comprehensive literature review was considered a contribution toward filling this gap. The findings of the review reveal that the model was mostly used in marketing and advertising research, that the concept most frequently used in the elaboration process was involvement, and that argument quality and endorser credibility were the factors most often employed in measuring effects on the dependent variables. The review provides valuable insights as it presents a holistic view of the model and the variables used in it.

  8. Laser-Based SLAM with Efficient Occupancy Likelihood Map Learning for Dynamic Indoor Scenes

    Science.gov (United States)

    Li, Li; Yao, Jian; Xie, Renping; Tu, Jinge; Feng, Chen

    2016-06-01

    Location-Based Services (LBS) have attracted growing attention in recent years, especially in indoor environments. The fundamental technique behind LBS is map building for unknown environments, also known as simultaneous localization and mapping (SLAM) in the robotics community. In this paper, we propose a novel approach for SLAM in dynamic indoor scenes based on a 2D laser scanner mounted on a mobile Unmanned Ground Vehicle (UGV), with the help of a grid-based occupancy likelihood map. Instead of applying scan matching to two adjacent scans, we propose to match the current scan against the occupancy likelihood map learned from all previous scans at multiple scales, to avoid the accumulation of matching errors. Because the points in a scan are acquired sequentially rather than simultaneously, scans are unavoidably distorted to varying extents. To compensate for the scan distortion caused by the motion of the UGV, we propose to integrate the velocity of the laser range finder (LRF) into the scan matching optimization framework. Besides, to reduce as much as possible the effect of dynamic objects, such as the walking pedestrians that often appear in indoor scenes, we propose a new occupancy likelihood map learning strategy that increases or decreases the probability of each occupancy grid cell after each scan matching. Experimental results in several challenging indoor scenes demonstrate that our proposed approach is capable of providing high-precision SLAM results.
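
    A minimal sketch of the kind of per-cell occupancy update described here, in log-odds form, so that transient objects such as pedestrians fade out over repeated scans. The constants and the exact learning rule are assumptions, not the paper's implementation.

    ```python
    # After each scan matching step, raise or lower each cell's occupancy
    # evidence; clamping keeps cells responsive to change in dynamic scenes.
    import numpy as np

    HIT, MISS = 0.9, -0.4          # assumed evidence increments (log-odds)
    L_MIN, L_MAX = -4.0, 4.0       # assumed clamp bounds

    def update_map(logodds: np.ndarray, hits: np.ndarray, misses: np.ndarray):
        """hits/misses are boolean masks of cells observed occupied/free."""
        logodds[hits] = np.minimum(logodds[hits] + HIT, L_MAX)
        logodds[misses] = np.maximum(logodds[misses] + MISS, L_MIN)
        return logodds

    def occupancy_prob(logodds: np.ndarray) -> np.ndarray:
        return 1.0 / (1.0 + np.exp(-logodds))   # convert back to probability
    ```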

  9. Evaluation of Dynamic Coastal Response to Sea-level Rise Modifies Inundation Likelihood

    Science.gov (United States)

    Lentz, Erika E.; Thieler, E. Robert; Plant, Nathaniel G.; Stippa, Sawyer R.; Horton, Radley M.; Gesch, Dean B.

    2016-01-01

    Sea-level rise (SLR) poses a range of threats to natural and built environments, making assessments of SLR-induced hazards essential for informed decision making. We develop a probabilistic model that evaluates the likelihood that an area will inundate (flood) or dynamically respond (adapt) to SLR. The broad-area applicability of the approach is demonstrated by producing 30 × 30 m resolution predictions for more than 38,000 square kilometers of diverse coastal landscape in the northeastern United States. Probabilistic SLR projections, coastal elevation and vertical land movement are used to estimate likely future inundation levels. Then, conditioned on future inundation levels and the current land-cover type, we evaluate the likelihood of dynamic response versus inundation. We find that nearly 70% of this coastal landscape has some capacity to respond dynamically to SLR, and we show that inundation models over-predict land likely to submerge. This approach is well suited to guiding coastal resource management decisions that weigh future SLR impacts and uncertainty against ecological targets and economic constraints.
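
    A minimal sketch of how such a two-stage evaluation can be organized: a distribution over future inundation levels is marginalized against a conditional table of dynamic response given level and land cover. All numbers and category names below are illustrative assumptions, not the authors' model.

    ```python
    # P(level): assumed distribution over future inundation levels,
    # derived upstream from SLR projections, elevation and land movement.
    inundation_prob = {0: 0.2, 1: 0.5, 2: 0.3}

    # P(dynamic response | level, land cover) -- assumed values.
    p_dynamic = {
        ("marsh", 0): 0.95, ("marsh", 1): 0.80, ("marsh", 2): 0.50,
        ("developed", 0): 0.30, ("developed", 1): 0.10, ("developed", 2): 0.05,
    }

    def p_dynamic_response(cover: str) -> float:
        """Marginalize the conditional over the inundation-level distribution."""
        return sum(p * p_dynamic[(cover, lvl)]
                   for lvl, p in inundation_prob.items())

    print(p_dynamic_response("marsh"), p_dynamic_response("developed"))
    ```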

  10. Emigration dynamics: the Indian context.

    Science.gov (United States)

    Premi, M K; Mathur, M D

    1995-01-01

    This report on emigration dynamics in India opens by providing background on short- and long-distance migration to and from India in response to such events as the formation of Pakistan as well as to the policies of the British Empire and Commonwealth. Section 2 discusses India's demographic and sociocultural setting in terms of population growth, urbanization, patterns of internal migration, growth of the labor force, economic growth, poverty alleviation, health, and education. The third section describes the lack of data on international migration. Some data are available on emigrants, but the only information on return migration is that gleaned from surveys in Kerala. Section 4 considers emigration to industrialized countries and notes that it is almost exclusively permanent and largely composed of individuals with professional, technical, or managerial skills. The resulting brain drain is described, as is the incidence of illegal migration. India does not create conditions from which citizens must seek asylum; rather, the country has absorbed flows of refugees from Pakistan, Tibet, Bangladesh, Afghanistan, and Sri Lanka. Available data on the characteristics of emigrants and return migrants are reviewed in the next two sections, and section 7 looks at the data on financial flows gathered from macro-level estimates of remittances. Section 8 is devoted to the community, family, and individual factors which influence emigration, including the networks that facilitate migration and means of meeting migration costs. The ninth section summarizes the political setting with an emphasis on the adverse reaction of Nepal to population movement from India. The final section of the report projects future population movements. It is noted that if there were no restrictions on migration, millions of Indians would emigrate to the Americas, Africa, and Australia. Whereas poverty, unemployment, and population growth will likely erode living conditions in India, the government has…

  11. Comparative behaviour of the Dynamically Penalized Likelihood algorithm in inverse radiation therapy planning

    Energy Technology Data Exchange (ETDEWEB)

    Llacer, Jorge [EC Engineering Consultants, LLC, Los Gatos, CA (United States)]. E-mail: jllacer@home.com; Solberg, Timothy D. [Department of Radiation Oncology, University of California, Los Angeles, CA (United States)]. E-mail: Solberg@radonc.ucla.edu; Promberger, Claus [BrainLAB AG, Heimstetten (Germany)]. E-mail: promberg@brainlab.com

    2001-10-01

    This paper presents a description of tests carried out to compare the behaviour of five algorithms in inverse radiation therapy planning: (1) The Dynamically Penalized Likelihood (DPL), an algorithm based on statistical estimation theory; (2) an accelerated version of the same algorithm; (3) a new fast adaptive simulated annealing (ASA) algorithm; (4) a conjugate gradient method; and (5) a Newton gradient method. A three-dimensional mathematical phantom and two clinical cases have been studied in detail. The phantom consisted of a U-shaped tumour with a partially enclosed 'spinal cord'. The clinical examples were a cavernous sinus meningioma and a prostate case. The algorithms have been tested in carefully selected and controlled conditions so as to ensure fairness in the assessment of results. It has been found that all five methods can yield relatively similar optimizations, except when a very demanding optimization is carried out. For the easier cases, the differences are principally in robustness, ease of use and optimization speed. In the more demanding case, there are significant differences in the resulting dose distributions. The accelerated DPL emerges as possibly the algorithm of choice for clinical practice. An appendix describes the differences in behaviour between the new ASA method and the one based on a patent by the Nomos Corporation. (author)

  12. Information behavior in dynamic group work contexts

    DEFF Research Database (Denmark)

    Sonnenwald, Diane H.; Pierce, Linda G.

    2000-01-01

    … personnel and documentation on C2. During data analysis, three important themes that highlight the why, what, how and consequences of information behavior in C2 emerged. The first is the concept of interwoven situational awareness, consisting of individual, intragroup and intergroup shared understanding of the situation. Interwoven situational awareness appears to facilitate response to dynamic, constraint-bound situations. The second theme describes the need for dense social networks or frequent communication between participants about the work context and situation, the work process and domain…

  13. Philosophy and phylogenetic inference: a comparison of likelihood and parsimony methods in the context of Karl Popper's writings on corroboration.

    Science.gov (United States)

    de Queiroz, K; Poe, S

    2001-06-01

    Advocates of cladistic parsimony methods have invoked the philosophy of Karl Popper in an attempt to argue for the superiority of those methods over phylogenetic methods based on Ronald Fisher's statistical principle of likelihood. We argue that the concept of likelihood in general, and its application to problems of phylogenetic inference in particular, are highly compatible with Popper's philosophy. Examination of Popper's writings reveals that his concept of corroboration is, in fact, based on likelihood. Moreover, because probabilistic assumptions are necessary for calculating the probabilities that define Popper's corroboration, likelihood methods of phylogenetic inference--with their explicit probabilistic basis--are easily reconciled with his concept. In contrast, cladistic parsimony methods, at least as described by certain advocates of those methods, are less easily reconciled with Popper's concept of corroboration. If those methods are interpreted as lacking probabilistic assumptions, then they are incompatible with corroboration. Conversely, if parsimony methods are to be considered compatible with corroboration, then they must be interpreted as carrying implicit probabilistic assumptions. Thus, the non-probabilistic interpretation of cladistic parsimony favored by some advocates of those methods is contradicted by an attempt by the same authors to justify parsimony methods in terms of Popper's concept of corroboration. In addition to being compatible with Popperian corroboration, the likelihood approach to phylogenetic inference permits researchers to test the assumptions of their analytical methods (models) in a way that is consistent with Popper's ideas about the provisional nature of background knowledge.

  14. Dynamic Context Bindings, Infrastructural Support for Context-aware Applications

    NARCIS (Netherlands)

    Broens, T.H.F.

    2008-01-01

    The world is increasingly equipped with high-capacity, interconnected, mobile and embedded computing devices. Context-awareness provides an attractive approach to personalize applications such that they better suit the user’s needs in this rich computing environment. Context-aware applications use…

  15. Dynamics of Context-Dependent Recall: An Examination of Internal and External Context Change

    Science.gov (United States)

    Unsworth, Nash; Spillers, Gregory J.; Brewer, Gene A.

    2012-01-01

    Retrieval dynamics in context-dependent recall were explored via manipulations of external and internal context in two experiments. Participants were tested either in the same context in which the material was learned or in a different one, and correct recalls, errors, and recall latency measures were examined. In both experiments changes in context resulted…

  16. The relationship between the neural computations for speech and music perception is context-dependent: an activation likelihood estimate study

    Directory of Open Access Journals (Sweden)

    Arianna eLaCroix

    2015-08-01

    The relationship between the neurobiology of speech and music has been investigated for more than a century. There remains no widespread agreement regarding how (or to what extent) music perception utilizes the neural circuitry that is engaged in speech processing, particularly at the cortical level. Prominent models such as Patel’s Shared Syntactic Integration Resource Hypothesis (SSIRH) and Koelsch’s neurocognitive model of music perception suggest a high degree of overlap, particularly in the frontal lobe, but also perhaps more distinct representations in the temporal lobe with hemispheric asymmetries. The present meta-analysis study used activation likelihood estimate analyses to identify the brain regions consistently activated for music as compared to speech across the functional neuroimaging (fMRI and PET) literature. Eighty music and 91 speech neuroimaging studies of healthy adult control subjects were analyzed. Peak activations reported in the music and speech studies were divided into four paradigm categories: passive listening, discrimination tasks, error/anomaly detection tasks and memory-related tasks. We then compared activation likelihood estimates within each category for music versus speech, and each music condition with passive listening. We found that listening to music and to speech preferentially activate distinct temporo-parietal bilateral cortical networks. We also found music and speech to have shared resources in the left pars opercularis but speech-specific resources in the left pars triangularis. The extent to which music recruited speech-activated frontal resources was modulated by task. While there are certainly limitations to meta-analysis techniques, particularly regarding sensitivity, this work suggests that the extent of shared resources between speech and music may be task-dependent and highlights the need to consider how task effects may be affecting conclusions regarding the neurobiology of speech and music.

  17. The relationship between the neural computations for speech and music perception is context-dependent: an activation likelihood estimate study

    Science.gov (United States)

    LaCroix, Arianna N.; Diaz, Alvaro F.; Rogalsky, Corianne

    2015-01-01

    The relationship between the neurobiology of speech and music has been investigated for more than a century. There remains no widespread agreement regarding how (or to what extent) music perception utilizes the neural circuitry that is engaged in speech processing, particularly at the cortical level. Prominent models such as Patel's Shared Syntactic Integration Resource Hypothesis (SSIRH) and Koelsch's neurocognitive model of music perception suggest a high degree of overlap, particularly in the frontal lobe, but also perhaps more distinct representations in the temporal lobe with hemispheric asymmetries. The present meta-analysis study used activation likelihood estimate analyses to identify the brain regions consistently activated for music as compared to speech across the functional neuroimaging (fMRI and PET) literature. Eighty music and 91 speech neuroimaging studies of healthy adult control subjects were analyzed. Peak activations reported in the music and speech studies were divided into four paradigm categories: passive listening, discrimination tasks, error/anomaly detection tasks and memory-related tasks. We then compared activation likelihood estimates within each category for music vs. speech, and each music condition with passive listening. We found that listening to music and to speech preferentially activate distinct temporo-parietal bilateral cortical networks. We also found music and speech to have shared resources in the left pars opercularis but speech-specific resources in the left pars triangularis. The extent to which music recruited speech-activated frontal resources was modulated by task. While there are certainly limitations to meta-analysis techniques particularly regarding sensitivity, this work suggests that the extent of shared resources between speech and music may be task-dependent and highlights the need to consider how task effects may be affecting conclusions regarding the neurobiology of speech and music. PMID:26321976
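
    For readers unfamiliar with the method, the core of an activation likelihood estimate can be sketched as follows: each reported peak is modeled as a 3D Gaussian blob, per-study maps take the voxelwise union of blobs, and the ALE map takes the union across studies. Grid size and kernel width below are assumptions; this is not the GingerALE implementation.

    ```python
    import numpy as np

    def ma_map(foci, shape=(20, 20, 20), sigma=2.0):
        """Modeled-activation map for one study from its list of peak voxels."""
        grid = np.indices(shape).reshape(3, -1).T   # (n_voxels, 3) coordinates
        ma = np.zeros(grid.shape[0])
        for f in foci:
            d2 = ((grid - np.asarray(f)) ** 2).sum(axis=1)
            p = np.exp(-d2 / (2 * sigma ** 2))      # Gaussian blob around peak
            ma = 1 - (1 - ma) * (1 - p)             # union of overlapping foci
        return ma.reshape(shape)

    def ale(study_foci_lists, shape=(20, 20, 20)):
        out = np.zeros(shape)
        for foci in study_foci_lists:               # union across studies
            out = 1 - (1 - out) * (1 - ma_map(foci, shape))
        return out

    print(ale([[(10, 10, 10)], [(10, 11, 10), (3, 3, 3)]]).max())
    ```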

  18. Adaptive wave filtering for dynamic positioning of marine vessels using maximum likelihood identification: Theory and experiments

    Digital Repository Service at National Institute of Oceanography (India)

    Hassani, V.; Sorensen, A.J.; Pascoal, A.M.

    This paper addresses a filtering problem that arises in the design of dynamic positioning systems for ships and offshore rigs subjected to the influence of sea waves. The dynamic model of the vessel captures explicitly the sea state as an uncertain...

  19. Empirical likelihood

    CERN Document Server

    Owen, Art B

    2001-01-01

    Empirical likelihood provides inferences whose validity does not depend on specifying a parametric model for the data. Because it uses a likelihood, the method has certain inherent advantages over resampling methods: it uses the data to determine the shape of the confidence regions, and it makes it easy to combine data from multiple sources. It also facilitates incorporating side information, and it simplifies accounting for censored, truncated, or biased sampling. One of the first books published on the subject, Empirical Likelihood offers an in-depth treatment of this method for constructing confidence regions and testing hypotheses. The author applies empirical likelihood to a range of problems, from those as simple as setting a confidence region for a univariate mean under IID sampling, to problems defined through smooth functions of means, regression models, generalized linear models, estimating equations, or kernel smooths, and to sampling with non-identically distributed data. Abundant figures offer vi...
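
    The introductory case mentioned above, empirical likelihood for a univariate mean, illustrates the idea compactly.

    ```latex
    % Empirical likelihood ratio for a univariate mean:
    R(\mu) = \max\Big\{ \prod_{i=1}^{n} n w_i \;:\;
      w_i \ge 0,\ \sum_{i=1}^{n} w_i = 1,\ \sum_{i=1}^{n} w_i (X_i - \mu) = 0 \Big\}
    ```

    A confidence region for μ then collects the values with -2 log R(μ) below a chi-squared quantile, mirroring Wilks' theorem without assuming a parametric model.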

  20. Approximate Likelihood

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Most physics results at the LHC end in a likelihood ratio test. This includes discovery and exclusion for searches as well as mass, cross-section, and coupling measurements. The use of Machine Learning (multivariate) algorithms in HEP is mainly restricted to searches, which can be reduced to classification between two fixed distributions: signal vs. background. I will show how we can extend the use of ML classifiers to distributions parameterized by physical quantities like masses and couplings, as well as nuisance parameters associated with systematic uncertainties. This allows one to approximate the likelihood ratio while still using a high-dimensional feature vector for the data. Both the MEM and ABC approaches mentioned above aim to provide inference on model parameters (like cross-sections, masses, couplings, etc.). ABC is fundamentally tied to Bayesian inference and focuses on the “likelihood free” setting where only a simulator is available and one cannot directly compute the likelihood for the dat...
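
    One common way to realize this idea is the classifier "ratio trick": a calibrated probabilistic classifier s(x) trained between samples of two distributions yields r(x) ≈ s(x)/(1 - s(x)). The toy check below, with a Gaussian pair where the true ratio is known, is an illustrative assumption and not the speaker's setup.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    x0 = rng.normal(0.0, 1.0, size=(5000, 1))      # "background" sample, p0
    x1 = rng.normal(0.5, 1.0, size=(5000, 1))      # "signal" sample, p1
    X = np.vstack([x0, x1])
    y = np.concatenate([np.zeros(5000), np.ones(5000)])

    clf = LogisticRegression().fit(X, y)
    s = clf.predict_proba(np.array([[0.25]]))[0, 1]
    ratio_hat = s / (1 - s)                        # approximate p1/p0 at x=0.25
    exact = np.exp(0.25 * 0.5 - 0.5**2 / 2)        # known Gaussian ratio (= 1.0)
    print(ratio_hat, exact)
    ```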

  1. Can Machines Learn Respiratory Virus Epidemiology?: A Comparative Study of Likelihood-Free Methods for the Estimation of Epidemiological Dynamics

    Directory of Open Access Journals (Sweden)

    Heidi L. Tessmer

    2018-03-01

    To estimate and predict the transmission dynamics of respiratory viruses, the estimation of the basic reproduction number, R0, is essential. Recently, approximate Bayesian computation methods have been used as likelihood-free methods to estimate epidemiological model parameters, particularly R0. In this paper, we explore various machine learning approaches, the multi-layer perceptron, convolutional neural network, and long short-term memory, to learn and estimate the parameters. Further, we compare the accuracy of the estimates and time requirements for machine learning and the approximate Bayesian computation methods on both simulated and real-world epidemiological data from outbreaks of influenza A(H1N1)pdm09, mumps, and measles. We find that the machine learning approaches can be verified and tested faster than the approximate Bayesian computation method, but that the approximate Bayesian computation method is more robust across different datasets.
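
    A minimal sketch of the rejection-ABC baseline such comparisons start from: draw R0 from a prior, simulate an epidemic, and keep draws whose simulated summary lies close to the observed one. The deterministic SIR simulator and all settings below are illustrative assumptions, not the study's models.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def simulate_epidemic(r0, gamma=0.5, n=10000, days=60):
        """Deterministic SIR; returns daily incidence as the summary statistic."""
        beta = r0 * gamma
        s, i = 1.0 - 1e-3, 1e-3
        incidence = []
        for _ in range(days):
            new = beta * s * i
            s, i = s - new, i + new - gamma * i
            incidence.append(new * n)
        return np.asarray(incidence)

    observed = simulate_epidemic(1.8)          # pretend these are the data
    accepted = []
    while len(accepted) < 100:
        r0 = rng.uniform(1.0, 3.0)             # prior draw
        if np.linalg.norm(simulate_epidemic(r0) - observed) < 100.0:
            accepted.append(r0)                # keep draws near the data
    print(np.mean(accepted))                   # posterior-mean estimate of R0
    ```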

  2. Theoretical Analysis of Penalized Maximum-Likelihood Patlak Parametric Image Reconstruction in Dynamic PET for Lesion Detection.

    Science.gov (United States)

    Yang, Li; Wang, Guobao; Qi, Jinyi

    2016-04-01

    Detecting cancerous lesions is a major clinical application of emission tomography. In a previous work, we studied penalized maximum-likelihood (PML) image reconstruction for lesion detection in static PET. Here we extend our theoretical analysis of static PET reconstruction to dynamic PET. We study both the conventional indirect reconstruction and direct reconstruction for Patlak parametric image estimation. In indirect reconstruction, Patlak parametric images are generated by first reconstructing a sequence of dynamic PET images, and then performing Patlak analysis on the time activity curves (TACs) pixel-by-pixel. In direct reconstruction, Patlak parametric images are estimated directly from raw sinogram data by incorporating the Patlak model into the image reconstruction procedure. PML reconstruction is used in both the indirect and direct reconstruction methods. We use a channelized Hotelling observer (CHO) to assess lesion detectability in Patlak parametric images. Simplified expressions for evaluating the lesion detectability have been derived and applied to the selection of the regularization parameter value to maximize detection performance. The proposed method is validated using computer-based Monte Carlo simulations. Good agreements between the theoretical predictions and the Monte Carlo results are observed. Both theoretical predictions and Monte Carlo simulation results show the benefit of the indirect and direct methods under optimized regularization parameters in dynamic PET reconstruction for lesion detection, when compared with the conventional static PET reconstruction.
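
    The Patlak graphical model that both reconstruction routes estimate has the standard linear form shown below.

    ```latex
    % Standard Patlak model for an irreversibly trapped tracer (t > t*):
    C_T(t) = K_i \int_0^{t} C_p(\tau)\, d\tau \;+\; V\, C_p(t)
    ```

    Here C_T is the tissue time activity curve, C_p the plasma input function, K_i the net influx rate and V the apparent volume of distribution. The indirect route fits (K_i, V) per pixel after reconstructing the dynamic images, while the direct route embeds this linear model inside the penalized maximum-likelihood reconstruction itself.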

  3. Building a Context World for Dynamic Service Composition

    DEFF Research Database (Denmark)

    Yu, Lian; Glenstrup, Arne John; Su, Shuang

    Dynamic service composition requires responding and adapting to changes in the computing environment when orchestrating existing services into one or more new services that fit better to a composite application. This paper abstracts the changes of the environment as a context world to store the physical contexts of the computing environment, user profiles and computed results of services as well. We use ontology techniques to model the domain concepts of application contexts. Context Condition/Effect Description Language is designed to describe the dynamic semantics of the requirements and capabilities of goals and services in a concise and editable manner. Goal-driven and planning techniques are used to dynamically implement the service composition according to the domain knowledge and facts in the context world.

  4. Plate dynamical mechanisms as constraints on the likelihood of earthquake precursors in the ionosphere

    Science.gov (United States)

    Osmaston, Miles

    2013-04-01

    In my oral(?) contribution to this session [1] I use my studies of the fundamental physics of gravitation to derive a reason for expecting the vertical gradient of electron density (= radial electric field) in the ionosphere to be closely affected by another field, directly associated with the ordinary gravitational potential (g) present at the Earth's surface. I have called that other field the Gravity-Electric (G-E) field. A calibration of this linkage relationship could be provided by noting corresponding co-seismic changes in (g) and in the ionosphere when, for example, a major normal-fault slippage occurs. But we are here concerned with precursory changes. This means we are looking for mechanisms which, on suitably short timescales, would generate pre-quake elastic deformation that changes the local (g). This poster supplements my talk by noting, for more relaxed discussion, what I see as potentially relevant plate dynamical mechanisms. Timescale constraints. If monitoring for ionospheric precursors is on only short timescales, their detectability is limited to correspondingly tectonically active regions. But as our monitoring becomes more precise and over longer terms, this constraint will relax. Most areas of the Earth are undergoing very slow heating or cooling and corresponding volume or epeirogenic change; major earthquakes can result but we won't have detected any accumulating ionospheric precursor. Transcurrent faulting. In principle, slip on a straight fault, even in a stick-slip manner, should produce little vertical deformation, but a kink, such as has caused the Transverse Ranges on the San Andreas Fault, would seem worth monitoring for precursory build-up in the ionosphere. Plate closure - subducting plate downbend. The traditionally presumed elastic flexure downbend mechanism is incorrect. 'Seismic coupling' has long been recognized by seismologists, invoking the repeated occurrence of 'asperities' to temporarily lock subduction and allow stress…

  5. On Sustaining Dynamic Adaptation of Context-Aware Services

    Directory of Open Access Journals (Sweden)

    Boudjemaa Boudaa

    2015-03-01

    The modern human is increasingly mobile, with access to online services from cutting-edge mobile computing devices. In the last decade, the field of context-aware services has given rise to several works. However, most of the proposed approaches do not provide clear adaptation strategies for unforeseen contexts, and dealing with these at runtime is another crucial need that has been ignored in their proposals. This paper proposes a generic dynamic adaptation process, as a phase in a model-driven development life-cycle for context-aware services, using the MAPE-K control loop to meet the runtime adaptation need. The process is validated by implementing an illustrative application on the FraSCAti platform. The main benefit of the proposed process is that it sustains the self-reconfiguration of such services at the model and code levels by enabling successive dynamic adaptations depending on the changing context.
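
    A minimal sketch of one MAPE-K iteration (Monitor, Analyze, Plan, Execute over shared Knowledge); the interfaces below are assumed for illustration and do not reproduce the paper's model-driven process or its FraSCAti integration.

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class Knowledge:                      # K: shared rules and adaptation history
        rules: dict = field(default_factory=lambda: {"offline": "use_cache"})
        history: list = field(default_factory=list)

    def monitor(sensors):                 # M: collect raw context
        return {"connectivity": sensors()}

    def analyze(ctx, k: Knowledge):       # A: detect a relevant context change
        return "offline" if ctx["connectivity"] == "none" else None

    def plan(symptom, k: Knowledge):      # P: pick a strategy (None if unforeseen)
        return k.rules.get(symptom)

    def execute(action, k: Knowledge):    # E: reconfigure the service, record it
        k.history.append(action)
        print("reconfigure:", action)

    def mape_loop(sensors, k: Knowledge):
        symptom = analyze(monitor(sensors), k)
        if symptom and (action := plan(symptom, k)):
            execute(action, k)

    mape_loop(lambda: "none", Knowledge())    # -> reconfigure: use_cache
    ```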

  6. Dynamics of western career attributes in the Russian context

    NARCIS (Netherlands)

    Khapova, S.N.; Korotov, K.

    2007-01-01

    Purpose - The purpose of this article is to raise awareness of the dynamic character of career and its key attributes, and the embeddedness of their definitions and meanings in national social, political and economic contexts. Design/methodology/approach - Features of three recent distinct social, …

  7. Dynamic context logic and its application to norm change

    NARCIS (Netherlands)

    Aucher, G.; Grossi, D.; Herzig, A.; Lorini, E.

    2009-01-01

    Building on a simple modal logic of context, the paper presents a dynamic logic characterizing operations of contraction and expansion on theories. We investigate the mathematical properties of the logic, and use it to develop an axiomatic and semantic analysis of norm change in normative systems.

  8. Measurement of the Top Quark Mass by Dynamical Likelihood Method using the Lepton + Jets Events with the Collider Detector at Fermilab

    Energy Technology Data Exchange (ETDEWEB)

    Kubo, Taichi [Univ. of Tsukuba (Japan)

    2008-02-01

    We have measured the top quark mass with the dynamical likelihood method. The data, corresponding to an integrated luminosity of 1.7 fb⁻¹, were collected in proton-antiproton collisions at a center of mass energy of 1.96 TeV with the CDF detector at the Fermilab Tevatron during the period March 2002-March 2007. We select t$\bar{t}$ pair production candidates by requiring one high energy lepton and four jets, in which at least one of the jets must be tagged as a b-jet. In order to reconstruct the top quark mass, we use the dynamical likelihood method based on the maximum likelihood method, where a likelihood is defined as the differential cross section multiplied by the transfer function from observed quantities to parton quantities, as a function of the top quark mass and the jet energy scale (JES). With this method, we measure the top quark mass to be 171.6 ± 2.0 (stat. + JES) ± 1.3 (syst.) = 171.6 ± 2.4 GeV/c².
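
    Schematically, a dynamical-likelihood per-event likelihood takes the following form; the notation is assumed and the analysis-specific transfer functions are not reproduced here.

    ```latex
    % Schematic per-event dynamical likelihood (notation assumed):
    L_{\mathrm{evt}}(m_t, \mathrm{JES}) \;\propto\;
      \int d\Phi \,\frac{d\sigma(m_t)}{d\Phi}\,
      w\!\left(y \mid x(\Phi); \mathrm{JES}\right)
    ```

    Here dσ/dΦ is the differential cross section for t\bar{t} production and decay over parton-level phase space Φ, and w is the transfer function mapping parton quantities x to the observed lepton and jet quantities y; the per-event likelihoods are multiplied across the sample and the product is maximized jointly in (m_t, JES).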

  9. GROUP DYNAMICS AND TEAM FUNCTIONING IN ORGANIZATIONAL CONTEXT

    Directory of Open Access Journals (Sweden)

    Raluca ZOLTAN

    2015-07-01

    In all kinds of organizations, many activities are carried out by groups and teams. But how are they formed? What factors influence their existence and development? How are the members of groups and teams selected? What are the consequences in an organizational context? In order to answer these questions, in the present paper we describe and analyze the main approaches to the formation of work groups and work teams (the sociometric approach and the group dynamics approach), the main factors that affect group dynamics, and the FIRO model for evaluating team members’ needs.

  10. Nursing students' learning dynamics and influencing factors in clinical contexts.

    Science.gov (United States)

    Lee, Jung Jae; Clarke, Charlotte L; Carson, Maggie N

    2018-03-01

    Clinical placements are essential for students to develop clinical skills to qualify as nurses. However, various difficulties encountered by nursing students during their clinical education detract from developing clinical competencies. This constructivist grounded theory study aims to explore nursing students' experiences in clinical nursing education, and to identify the factors that influence the clinical education students receive. Twenty-one individual and six group semi-structured interviews were conducted with sixteen fourth year nursing students and four registered nurses. This research identified six factors that influence nursing students' clinical education: interpersonal, socio-cultural, instructional, environmental, emotional and physical factors. The research has developed a dynamic model of learning in clinical contexts, which offers opportunities to understand how students' learning is influenced multifactorially during clinical placements. The understanding and application of the model can improve nursing instructional design, and subsequently, nursing students' learning in clinical contexts. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Placing ecosystem sustainability within the context of dynamic earth systems

    Science.gov (United States)

    Sidle, R. C.

    2013-12-01

    Because the concept of ecosystem sustainability and the practice of sustainable land management both have long-term foci, it is necessary to view these from the perspective of dynamic rather than static systems. In addition to the typical static system approach for assessing ecosystem sustainability, three additional perspectives are presented. These are resilient systems, systems where tipping points occur, and systems subject to episodic geophysical resetting. Ecosystem resilience accommodates both natural and anthropogenic stressors and should be considered to properly frame many ecosystem assessments. A more complex problem emerges when stressors push systems to tipping points, causing a regime shift. Both chronic anthropogenic activities (e.g., over-grazing, forest conversion, poor irrigation practices) and natural changes (e.g., climate anomalies, geochemical weathering, tectonic uplift, vegetative succession) can exhaust ecosystem resilience leading to a rapid change in state. Anthropogenic perturbations can also lower the initiation threshold and increase the magnitude and frequency of certain natural disasters, increasing the likelihood of ecosystem change. Furthermore, when major episodic geophysical events (e.g., large earthquakes, tsunami, and floods; widespread volcanic activity and landslides) exceed thresholds of ecosystem resilience they may reset the attributes of entire systems or landscapes. Large disasters can initiate a cascade of linked events, as in the 2011 Great East Japan Earthquake, where tsunami, fires, landslides, artificial fillslope collapses, radioactive releases, and associated health effects occurred. Understanding the potential for natural change (both chronic and episodic) in ecosystems is essential not only to the environmental aspect of sustainability but also to economic and social aspects. Examples are presented for: (1) ecosystems vulnerable to tipping points (Yunnan, China) and (2) ecosystems reset by earthquakes and…

  12. Dynamic context discrimination : psychological evidence for the Sandia Cognitive Framework.

    Energy Technology Data Exchange (ETDEWEB)

    Speed, Ann Elizabeth

    2004-09-01

    Human behavior is a function of an iterative interaction between the stimulus environment and past experience. It is not simply a matter of the current stimulus environment activating the appropriate experience or rule from memory (e.g., if it is dark and I hear a strange noise outside, then I turn on the outside lights and investigate). Rather, it is a dynamic process that takes into account not only things one would generally do in a given situation, but things that have recently become known (e.g., there have recently been coyotes seen in the area and one is known to be rabid), as well as other immediate environmental characteristics (e.g., it is snowing outside, I know my dog is outside, I know the police are already outside, etc.). All of these factors combine to inform me of the most appropriate behavior for the situation. If it were the case that humans had a rule for every possible contingency, the amount of storage that would be required to enable us to fluidly deal with most situations we encounter would rapidly become biologically untenable. We can all deal with contingencies like the one above with fairly little effort, but if it isn't based on rules, what is it based on? The assertion of the Cognitive Systems program at Sandia for the past 5 years is that at the heart of this ability to effectively navigate the world is an ability to discriminate between different contexts (i.e., Dynamic Context Discrimination, or DCD). While this assertion in and of itself might not seem earthshaking, it is compelling that this ability and its components show up in a wide variety of paradigms across different subdisciplines in psychology. We begin by outlining, at a high functional level, the basic ideas of DCD. We then provide evidence from several different literatures and paradigms that support our assertion that DCD is a core aspect of cognitive functioning. Finally, we discuss DCD and the computational model that we have developed as an instantiation of DCD

  13. Dynamic geometry as a context for exploring conjectures

    Science.gov (United States)

    Wares, Arsalan

    2018-01-01

    The purpose of this paper is to provide examples of 'non-traditional' proof-related activities that can be explored in a dynamic geometry environment by university and high school students of mathematics. These propositions were encountered in the dynamic geometry environment. The author believes that teachers can ask their students to construct proofs for these propositions.

  14. Motivational Dynamics in Language Learning: Change, Stability, and Context

    Science.gov (United States)

    Waninge, Freerkien; Dörnyei, Zoltán; De Bot, Kees

    2014-01-01

    Motivation as a variable in L2 development is no longer seen as the stable individual difference factor it was once believed to be: Influenced by process-oriented models and principles, and especially by the growing understanding of how complex dynamic systems work, researchers have been focusing increasingly on the dynamic and changeable nature…

  15. Electroencephalographic brain dynamics of memory encoding in emotionally arousing context

    Directory of Open Access Journals (Sweden)

    Carlos Enrique eUribe

    2011-06-01

    Emotional content/context enhances declarative memory through modulation of encoding and retrieval mechanisms. At encoding, neurophysiological data have consistently demonstrated the subsequent memory effect in theta and gamma oscillations. Yet existing studies have focused on the effect of emotional content and left the effect of emotional context unexplored. We hypothesized that theta and gamma oscillations show higher evoked/induced activity during the encoding of visual stimuli delivered in an emotionally arousing context. Twenty-five healthy volunteers underwent evoked potential recordings using a 21-electrode scalp montage. They attended an audiovisual test of emotional declarative memory and were randomly assigned to either an emotionally arousing or a neutral context. Visual stimulus presentation was used as the time-locking event. Grand averages of the evoked potentials and evoked spectral perturbations were calculated for each volunteer. Evoked potentials showed a higher negative deflection from 80 to 140 ms for the emotional condition. This effect was observed over central, frontal and prefrontal locations bilaterally. Evoked theta power was higher in left parietal, central, frontal and prefrontal electrodes from -50 to 300 ms in the emotional condition. Evoked gamma power was higher in the emotional condition, with a spatial distribution that overlapped at some points with the theta topography. The early theta power increase could be related to expectancy induced by auditory information processing that facilitates visual encoding in emotional contexts. Together, our results suggest that declarative memory enhancement for both emotional content and emotional context is supported by similar neural mechanisms at encoding, and they offer new evidence about the brain's processing of relevant environmental stimuli.

  16. Sensor-Based Activity Recognition with Dynamically Added Context

    Directory of Open Access Journals (Sweden)

    Jiahui Wen

    2015-08-01

    An activity recognition system essentially processes raw sensor data and maps it into latent activity classes. Most previous systems are built with supervised learning techniques and pre-defined data sources, and result in static models. However, in realistic and dynamic environments, original data sources may fail and new data sources may become available; a robust activity recognition system should be able to evolve automatically with dynamic sensor availability in dynamic environments. In this paper, we propose methods that automatically incorporate dynamically available data sources to adapt and refine the recognition system at run-time. The system is built upon ensemble classifiers which can automatically choose the features with the most discriminative power. Extensive experimental results with publicly available datasets demonstrate the effectiveness of our methods.

  17. MXLKID: a maximum likelihood parameter identifier

    International Nuclear Information System (INIS)

    Gavel, D.T.

    1980-07-01

    MXLKID (MaXimum LiKelihood IDentifier) is a computer program designed to identify unknown parameters in a nonlinear dynamic system. Using noisy measurement data from the system, the maximum likelihood identifier computes a likelihood function (LF). Identification of system parameters is accomplished by maximizing the LF with respect to the parameters. The main body of this report briefly summarizes the maximum likelihood technique and gives instructions and examples for running the MXLKID program. MXLKID is implemented in LRLTRAN on the CDC7600 computer at LLNL. A detailed mathematical description of the algorithm is given in the appendices. 24 figures, 6 tables
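
    A minimal sketch of the general technique MXLKID automates: choose the parameter of a nonlinear dynamic model that maximizes a Gaussian log-likelihood of the noisy measurements. The logistic-growth model and all settings below are illustrative assumptions, not MXLKID's algorithm.

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    def simulate(r, x0=0.1, steps=50):
        """Nonlinear dynamic system: logistic growth x' = x + r*x*(1-x)."""
        xs, x = [], x0
        for _ in range(steps):
            x = x + r * x * (1 - x)
            xs.append(x)
        return np.asarray(xs)

    rng = np.random.default_rng(3)
    true_r, noise_sd = 0.3, 0.05
    y = simulate(true_r) + rng.normal(0, noise_sd, 50)   # noisy measurements

    def neg_log_likelihood(r):
        resid = y - simulate(r)
        return 0.5 * np.sum(resid**2) / noise_sd**2      # Gaussian LF, up to const

    fit = minimize_scalar(neg_log_likelihood, bounds=(0.0, 1.0), method="bounded")
    print(fit.x)                                          # estimate close to 0.3
    ```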

  18. Recall dynamics reveal the retrieval of emotional context.

    Science.gov (United States)

    Long, Nicole M; Danoff, Michelle S; Kahana, Michael J

    2015-10-01

    Memory is often better for emotional rather than neutral stimuli. The benefit for emotional items could be the result of an associative mechanism whereby items are associated to a slowly updating context. Through this process, emotional features are integrated with context during study, and are reactivated during test. The presence of emotion in context would both provide a stronger retrieval cue, enhancing memory of emotional items, as well as lead to emotional clustering, whereby emotionally similar items are recalled consecutively. To measure whether associative mechanisms can explain the enhancement for emotional items, we conducted a free recall study in which most items were emotionally neutral to minimize effects of mood induction and to more closely reflect naturalistic settings. We found that emotional items were significantly more likely to be recalled than neutral items and that participants were more likely to transition between emotional items rather than between emotional and neutral items. Together, these results suggest that contextual encoding and retrieval mechanisms may drive the benefit for emotional items both within and outside the laboratory.

  19. Ego involvement increases doping likelihood.

    Science.gov (United States)

    Ring, Christopher; Kavussanu, Maria

    2018-08-01

    Achievement goal theory provides a framework to help understand how individuals behave in achievement contexts, such as sport. Evidence concerning the role of motivation in the decision to use banned performance enhancing substances (i.e., doping) is equivocal on this issue. The extant literature shows that dispositional goal orientation has been weakly and inconsistently associated with doping intention and use. It is possible that goal involvement, which describes the situational motivational state, is a stronger determinant of doping intention. Accordingly, the current study used an experimental design to examine the effects of goal involvement, manipulated using direct instructions and reflective writing, on doping likelihood in hypothetical situations in college athletes. The ego-involving goal increased doping likelihood compared to no goal and a task-involving goal. The present findings provide the first evidence that ego involvement can sway the decision to use doping to improve athletic performance.

  20. The Dynamics of Memory: Context-Dependent Updating

    Science.gov (United States)

    Hupbach, Almut; Hardt, Oliver; Gomez, Rebecca; Nadel, Lynn

    2008-01-01

    Understanding the dynamics of memory change is one of the current challenges facing cognitive neuroscience. Recent animal work on memory reconsolidation shows that memories can be altered long after acquisition. When reactivated, memories can be modified and require a restabilization (reconsolidation) process. We recently extended this finding to…

  1. Privacy context model for dynamic privacy adaptation in ubiquitous computing

    NARCIS (Netherlands)

    Schaub, Florian; Koenings, Bastian; Dietzel, Stefan; Weber, M.; Kargl, Frank

    Ubiquitous computing is characterized by the merger of physical and virtual worlds as physical artifacts gain digital sensing, processing, and communication capabilities. Maintaining an appropriate level of privacy in the face of such complex and often highly dynamic systems is challenging. We argue…

  2. Placing Ecosystem Sustainability Within the Context of Dynamic Earth Systems

    Science.gov (United States)

    Because the concept of ecosystem sustainability and the practice of sustainable land management both have long-term foci, it is necessary to view these from the perspective of dynamic rather than static systems. In addition to the typical static system approach for assessing ecos...

  3. Dynamic Planet Mercury in the Context of Its Environment

    CERN Document Server

    Clark, Pamela Elizabeth

    2007-01-01

    We are in a time of transition in our understanding of Mercury. Of particular interest here is the emerging picture of the planet as a system, with interactions between interior, surface, exosphere, and magnetosphere that have influenced and constrained the evolution of each part of the system. Previous books have emphasized the results of Mariner 10 and current ground-based measurements, with very little discussion of the nature and influence of the magnetosphere. This book will present the planet in the context of its surroundings, thus providing a foundation for the next major influx of information from Mercury and contributing to the planning for future missions.

  4. Logic of likelihood

    International Nuclear Information System (INIS)

    Wall, M.J.W.

    1992-01-01

    The notion of "probability" is generalized to that of "likelihood," and a natural logical structure is shown to exist for any physical theory which predicts likelihoods. Two physically based axioms are given for this logical structure to form an orthomodular poset, with an order-determining set of states. The results strengthen the basis of the quantum logic approach to axiomatic quantum theory. 25 refs

  5. The phylogenetic likelihood library.

    Science.gov (United States)

    Flouri, T; Izquierdo-Carrasco, F; Darriba, D; Aberer, A J; Nguyen, L-T; Minh, B Q; Von Haeseler, A; Stamatakis, A

    2015-03-01

    We introduce the Phylogenetic Likelihood Library (PLL), a highly optimized application programming interface for developing likelihood-based phylogenetic inference and postanalysis software. The PLL implements appropriate data structures and functions that allow users to quickly implement common, error-prone, and labor-intensive tasks, such as likelihood calculations, model parameter as well as branch length optimization, and tree space exploration. The highly optimized and parallelized implementation of the phylogenetic likelihood function and a thorough documentation provide a framework for rapid development of scalable parallel phylogenetic software. By example of two likelihood-based phylogenetic codes we show that the PLL improves the sequential performance of current software by a factor of 2-10 while requiring only 1 month of programming time for integration. We show that, when numerical scaling for preventing floating point underflow is enabled, the double precision likelihood calculations in the PLL are up to 1.9 times faster than those in BEAGLE. On an empirical DNA dataset with 2000 taxa the AVX version of PLL is 4 times faster than BEAGLE (scaling enabled and required). The PLL is available at http://www.libpll.org under the GNU General Public License (GPL). © The Author(s) 2014. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.

  6. Algorithm to illustrate context using dynamic lighting effects

    Science.gov (United States)

    John, Roshy M.; Balasubramanian, T.

    2007-09-01

    With the invention of the ultra-bright LED, solid-state lighting has become much more efficient and energy-saving than conventional incandescent or fluorescent lighting. With proper driver electronics, it is nowadays possible to install solid-state lighting systems at the same cost as any other lighting technology. This paper is part of a research project in our lab that deals with using ultra-bright LEDs of different colors for lighting applications. The driver electronics are made in such a way that the color and brightness of the lights change according to context. For instance, if a user is reading a story or listening to music on a personal computer or a handheld device such as a PDA, the lighting and HVAC (Heating, Ventilation and Air-Conditioning) systems will change dramatically according to the content of the story or the music. The versatility of solid-state lighting helps to accomplish such an effect. We developed complete driver electronics for the system using multiple microcomputers, and a full software suite that uses complex algorithms to decode the context from text or music and synchronize it with lighting and HVAC information. The paper also presents some case-study statistics that show the advantage of using the system to teach kindergarten children, deaf and mute children, and language-learning classes.

  7. Research on dynamic performance design of mobile phone application based on context awareness

    Science.gov (United States)

    Bo, Zhang

    2018-05-01

    This study explores the dynamic performance of different mobile phone applications and users' cognitive differences, with the aim of reducing cognitive burden and enhancing the sense of experience. It analyzes dynamic design performance in four different interactive contexts and constructs a framework for the information service process under interactive context perception, together with two perception principles for the cognitive consensus between designer and user and the two kinds of knowledge that accord with those principles. Analysis of the context helps users sense dynamic performance more intuitively, so that the details of interaction are presented more vividly and smoothly, enhancing the user's experience of the interactive process. A shared perception experience enables designers and users to produce emotional resonance in different interactive contexts and helps them reach a rapid understanding of interactive content and perceive the logic and hierarchy of its content and structure, thereby improving the effectiveness of mobile applications.

  8. Optimizing the Quality of Dynamic Context Subscriptions for Scarce Network Resources

    DEFF Research Database (Denmark)

    Shawky, Ahmed; Olsen, Rasmus Løvenstein; Pedersen, Jens Myrup

    2012-01-01

    Scalable access to dynamic context information is a key challenge for future context-sensitive systems. When the access frequency is increased, information accuracy can improve, but at the same time the additional context management traffic may reduce network performance, which creates the opposite effect on information reliability. In order to understand and control this trade-off, this paper develops a model for calculating context reliability, captured by the so-called mismatch probability, in relation to the network load. The model is subsequently used for a real-time algorithm…

  9. A dynamical context for the origin of Phobos and Deimos

    Science.gov (United States)

    Hansen, Bradley M. S.

    2018-04-01

    We show that a model in which Mars grows near Earth and Venus but is then scattered out of the terrestrial region yields a natural pathway to explain the low masses of the Martian moons Phobos and Deimos. In this scenario, the last giant impact experienced by Mars is followed by an extended period (tens to hundreds of Myr) of close passages by other planetary embryos. These close passages perturb and dynamically heat any system of forming satellites left over by the giant impact and can substantially reduce the mass in the satellite system (sometimes to zero). The close passage of massive perturbing bodies also offers the opportunity to capture small objects by three-body scattering. Both mechanisms lead to low-mass moon systems with a substantially collisional history.

  10. Dynamic Bayesian Networks for Context-Aware Fall Risk Assessment

    Directory of Open Access Journals (Sweden)

    Gregory Koshmak

    2014-05-01

    Full Text Available Fall incidents among the elderly often occur in the home and can cause serious injuries affecting their independent living. This paper presents an approach where data from wearable sensors integrated in a smart home environment are combined using a dynamic Bayesian network. The smart home environment provides contextual data, obtained from environmental sensors, and contributes to assessing a fall risk probability. The evaluation of the developed system is performed through simulation. Each time step is represented by a single user activity and an interaction with a fall sensor located on a mobile device. A posterior probability is calculated for each recognized activity or piece of contextual information. The output of the system provides a total risk assessment of falling given a response from the fall sensor.
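
    The network structure used in the paper is not given in the abstract, so the sketch below reduces the idea to a single Bayesian update step: an activity-dependent prior fall probability is combined with the likelihood of the wearable fall sensor's response. All numbers (priors, sensor sensitivity, false-alarm rate) are invented for illustration:

```python
# Minimal Bayesian fall-risk update, in the spirit of one node of a dynamic
# Bayesian network: P(fall | activity, sensor) via Bayes' rule.
# All probabilities below are illustrative assumptions, not the paper's.

PRIOR_FALL_GIVEN_ACTIVITY = {   # contextual prior from activity recognition
    "walking": 0.02,
    "climbing_stairs": 0.08,
    "sitting": 0.002,
}
P_ALARM_GIVEN_FALL = 0.90       # sensor sensitivity (assumed)
P_ALARM_GIVEN_NO_FALL = 0.05    # sensor false-alarm rate (assumed)

def fall_posterior(activity: str, alarm: bool) -> float:
    prior = PRIOR_FALL_GIVEN_ACTIVITY[activity]
    like_fall = P_ALARM_GIVEN_FALL if alarm else 1 - P_ALARM_GIVEN_FALL
    like_ok = P_ALARM_GIVEN_NO_FALL if alarm else 1 - P_ALARM_GIVEN_NO_FALL
    evidence = like_fall * prior + like_ok * (1 - prior)
    return like_fall * prior / evidence

print(fall_posterior("climbing_stairs", alarm=True))   # ~0.61
print(fall_posterior("sitting", alarm=True))           # ~0.035
```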

  11. A single-rate context-dependent learning process underlies rapid adaptation to familiar object dynamics.

    Science.gov (United States)

    Ingram, James N; Howard, Ian S; Flanagan, J Randall; Wolpert, Daniel M

    2011-09-01

    Motor learning has been extensively studied using dynamic (force-field) perturbations. These induce movement errors that result in adaptive changes to the motor commands. Several state-space models have been developed to explain how trial-by-trial errors drive the progressive adaptation observed in such studies. These models have been applied to adaptation involving novel dynamics, which typically occurs over tens to hundreds of trials, and which appears to be mediated by a dual-rate adaptation process. In contrast, when manipulating objects with familiar dynamics, subjects adapt rapidly within a few trials. Here, we apply state-space models to familiar dynamics, asking whether adaptation is mediated by a single-rate or dual-rate process. Previously, we reported a task in which subjects rotate an object with known dynamics. By presenting the object at different visual orientations, adaptation was shown to be context-specific, with limited generalization to novel orientations. Here we show that a multiple-context state-space model, with a generalization function tuned to visual object orientation, can reproduce the time-course of adaptation and de-adaptation as well as the observed context-dependent behavior. In contrast to the dual-rate process associated with novel dynamics, we show that a single-rate process mediates adaptation to familiar object dynamics. The model predicts that during exposure to the object across multiple orientations, there will be a degree of independence for adaptation and de-adaptation within each context, and that the states associated with all contexts will slowly de-adapt during exposure in one particular context. We confirm these predictions in two new experiments. Results of the current study thus highlight similarities and differences in the processes engaged during exposure to novel versus familiar dynamics. In both cases, adaptation is mediated by multiple context-specific representations. In the case of familiar object dynamics
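
    The abstract names the model class but not its equations. Below is a minimal sketch of a single-rate, multiple-context state-space learner: one adaptation state per visual orientation, with errors shared across contexts through a Gaussian generalization function of orientation difference. The retention factor, learning rate, and tuning width are arbitrary illustrative values, not the fitted parameters from the study:

```python
# Sketch of a single-rate, multiple-context state-space model of adaptation.
# Retention A, learning rate B, and tuning width are illustrative assumptions.
import numpy as np

orientations = np.array([0.0, 90.0, 180.0, 270.0])   # contexts (degrees)
x = np.zeros(len(orientations))                      # one state per context
A, B, width = 0.99, 0.3, 30.0                        # retention, rate, tuning

def generalization(current_deg: float) -> np.ndarray:
    """Gaussian tuning over angular distance to each context."""
    d = np.abs((orientations - current_deg + 180.0) % 360.0 - 180.0)
    return np.exp(-0.5 * (d / width) ** 2)

def trial(current_deg: float, perturbation: float) -> float:
    """One trial: output, error, and single-rate update of all states."""
    global x
    g = generalization(current_deg)
    output = g @ x                       # net compensation in this context
    error = perturbation - output
    x = A * x + B * g * error            # shared single-rate update
    return error

for t in range(40):
    trial(0.0, perturbation=1.0)         # repeated exposure at 0 degrees
print(round(trial(0.0, 1.0), 3))         # small residual error at 0 degrees
print(round(trial(90.0, 1.0), 3))        # larger error: limited generalization
```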

  12. A single-rate context-dependent learning process underlies rapid adaptation to familiar object dynamics.

    Directory of Open Access Journals (Sweden)

    James N Ingram

    2011-09-01

    Full Text Available Motor learning has been extensively studied using dynamic (force-field) perturbations. These induce movement errors that result in adaptive changes to the motor commands. Several state-space models have been developed to explain how trial-by-trial errors drive the progressive adaptation observed in such studies. These models have been applied to adaptation involving novel dynamics, which typically occurs over tens to hundreds of trials, and which appears to be mediated by a dual-rate adaptation process. In contrast, when manipulating objects with familiar dynamics, subjects adapt rapidly within a few trials. Here, we apply state-space models to familiar dynamics, asking whether adaptation is mediated by a single-rate or dual-rate process. Previously, we reported a task in which subjects rotate an object with known dynamics. By presenting the object at different visual orientations, adaptation was shown to be context-specific, with limited generalization to novel orientations. Here we show that a multiple-context state-space model, with a generalization function tuned to visual object orientation, can reproduce the time-course of adaptation and de-adaptation as well as the observed context-dependent behavior. In contrast to the dual-rate process associated with novel dynamics, we show that a single-rate process mediates adaptation to familiar object dynamics. The model predicts that during exposure to the object across multiple orientations, there will be a degree of independence for adaptation and de-adaptation within each context, and that the states associated with all contexts will slowly de-adapt during exposure in one particular context. We confirm these predictions in two new experiments. Results of the current study thus highlight similarities and differences in the processes engaged during exposure to novel versus familiar dynamics. In both cases, adaptation is mediated by multiple context-specific representations. In the case of familiar

  13. A framework for extracting and representing project knowledge contexts using topic models and dynamic knowledge maps

    Science.gov (United States)

    Xu, Jin; Li, Zheng; Li, Shuliang; Zhang, Yanyan

    2015-07-01

    There is still a lack of effective paradigms and tools for analysing and discovering the contents and relationships of project knowledge contexts in the field of project management. In this paper, a new framework for extracting and representing project knowledge contexts using topic models and dynamic knowledge maps in big data environments is proposed and developed. The conceptual paradigm, theoretical underpinning, extended topic model, and illustration examples of the ontology model for project knowledge maps are presented, with further research work envisaged.
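
    The framework's extended topic model is not specified in the abstract, but its basic building block — extracting latent topics from project documents as candidate knowledge contexts — can be sketched with a standard LDA implementation. The toy corpus and the use of scikit-learn's plain LDA (rather than the authors' extended model) are assumptions for illustration:

```python
# Plain-LDA sketch of topic extraction from project documents, as a stand-in
# for the paper's extended topic model. Corpus and settings are illustrative.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "schedule milestone delay risk mitigation plan",
    "budget cost overrun forecast baseline estimate",
    "risk register mitigation contingency schedule",
    "cost budget invoice payment estimate baseline",
]
vec = CountVectorizer()
X = vec.fit_transform(docs)

lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(X)

terms = vec.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[::-1][:4]]
    print(f"topic {k}: {', '.join(top)}")   # e.g. schedule/risk vs budget/cost
```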

  14. Ubiquitous Geo-Sensing for Context-Aware Analysis: Exploring Relationships between Environmental and Human Dynamics

    Directory of Open Access Journals (Sweden)

    Euro Beinat

    2012-07-01

    Full Text Available Ubiquitous geo-sensing enables context-aware analyses of physical and social phenomena, i.e., analyzing one phenomenon in the context of another. Although such context-aware analysis can potentially enable a more holistic understanding of spatio-temporal processes, it has rarely been documented in the scientific literature so far. In this paper we analyzed collective human behavior in the context of the weather. We therefore explored the complex relationships between these two spatio-temporal phenomena to provide novel insights into the dynamics of urban systems. Aggregated mobile phone data, which served as a proxy for collective human behavior, was linked with the weather data from climate stations in the case study area, the city of Udine, Northern Italy. To identify and characterize potential patterns within the weather-human relationships, we developed a hybrid approach which integrates several spatio-temporal statistical analysis methods. Thereby we show that exploratory factor analysis, when applied to a number of meteorological variables, can be used to differentiate between normal and adverse weather conditions. Further, we measured the strength of the relationship between the ‘global’ adverse weather conditions and the spatially explicit effective variations in user-generated mobile network traffic for three distinct periods using the Maximal Information Coefficient (MIC). The analyses result in three spatially referenced maps of MICs which reveal interesting insights into collective human dynamics in the context of weather, but also raise several new scientific challenges.
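
    The Maximal Information Coefficient used in the paper is available in the open-source minepy package. The sketch below computes MIC between a synthetic weather index and a synthetic mobile-traffic deviation series; the data and the nonlinear link are invented, and the MINE parameters are the defaults proposed in the original MIC paper:

```python
# MIC between a synthetic weather index and synthetic mobile-traffic
# deviations. Requires the third-party minepy package (pip install minepy).
import numpy as np
from minepy import MINE

rng = np.random.default_rng(0)
weather = rng.normal(size=500)                                # weather factor
traffic = 0.8 * weather**2 + rng.normal(scale=0.5, size=500)  # nonlinear link

mine = MINE(alpha=0.6, c=15)      # defaults from the original MIC paper
mine.compute_score(weather, traffic)
print(round(mine.mic(), 3))       # high MIC despite near-zero linear correlation
```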

  15. Measurement of the top quark mass with the dynamical likelihood method using lepton plus jets events with b-tags in p anti-p collisions at s**(1/2) = 1.96-TeV

    Energy Technology Data Exchange (ETDEWEB)

    Abulencia, A.; Acosta, D.; Adelman, Jahred A.; Affolder, Anthony A.; Akimoto, T.; Albrow, M.G.; Ambrose, D.; Amerio, S.; Amidei, D.; Anastassov, A.; Anikeev, K.; /Taiwan,

    2005-12-01

    This report describes a measurement of the top quark mass, M{sub top}, with the dynamical likelihood method (DLM) using the CDF II detector at the Fermilab Tevatron. The Tevatron produces top/anti-top (t{bar t}) pairs in p{bar p} collisions at a center-of-mass energy of 1.96 TeV. The data sample used in this analysis was accumulated from March 2002 through August 2004, which corresponds to an integrated luminosity of 318 pb{sup -1}. They use the t{bar t} candidates in the ''lepton+jets'' decay channel, requiring at least one jet identified as a b quark by finding a displaced secondary vertex. The DLM defines a likelihood for each event based on the differential cross section as a function of M{sub top} per unit phase space volume of the final partons, multiplied by the transfer functions from jet to parton energies. The method takes into account all possible jet combinations in an event, and the event likelihoods are multiplied together to derive the top quark mass by the maximum likelihood method. Using the 63 t{bar t} candidates observed in the data, with 9.2 events expected from background, they measure the top quark mass to be 173.2{sub -2.4}{sup +2.6}(stat.) {+-} 3.2(syst.) GeV/c{sup 2}, or 173.2{sub -4.0}{sup +4.1} GeV/c{sup 2}.
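
    The per-event likelihoods of the DLM come from the differential cross section and transfer functions, which are beyond a short sketch, but the final step — multiplying event likelihoods over a mass grid and extracting the mass and its uncertainty at the maximum — is easy to illustrate. The Gaussian per-event curves below are a toy stand-in for the real physics:

```python
# Illustration of the final DLM step: combine per-event likelihood curves
# L_i(m) over a mass grid and locate the maximum-likelihood top mass.
# Gaussian per-event curves replace the real cross-section-based likelihoods.
import numpy as np

rng = np.random.default_rng(1)
m_grid = np.linspace(150.0, 200.0, 501)              # GeV/c^2
true_mass, resolution, n_events = 173.0, 12.0, 63

log_l = np.zeros_like(m_grid)
for _ in range(n_events):
    peak = rng.normal(true_mass, resolution)         # per-event preferred mass
    log_l += -0.5 * ((m_grid - peak) / resolution) ** 2

best = m_grid[np.argmax(log_l)]
# 1-sigma interval from the Delta(log L) = 0.5 rule:
inside = m_grid[log_l >= log_l.max() - 0.5]
print(f"M_top ~ {best:.1f} +{inside.max()-best:.1f} -{best-inside.min():.1f}")
```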

  16. The performance model of dynamic virtual organization (VO) formations within grid computing context

    International Nuclear Information System (INIS)

    Han Liangxiu

    2009-01-01

    Grid computing aims to enable 'resource sharing and coordinated problem solving in dynamic, multi-institutional virtual organizations (VOs)'. Within the grid computing context, a successful dynamic VO formation means that a number of individuals and institutions associated with certain resources join together and form a new VO in order to effectively execute tasks within given time steps. To date, while the concept of VOs has been accepted, little research has been done on the impact of effective dynamic virtual organization formations. In this paper, we develop a performance model of dynamic VO formation and analyze the effect of different complex organizational structures and their various statistical parameter properties on dynamic VO formations from three aspects: (1) the probability of a successful VO formation under different organizational structures and changes in statistical parameters, e.g. average degree; (2) the effect of task complexity on dynamic VO formations; (3) the impact of network scale on dynamic VO formations. The experimental results show that the proposed model can be used to understand the dynamic VO formation performance of the simulated organizations. The work provides a good path to understanding how to effectively schedule and utilize resources over a complex grid network and therefore improve overall performance within a grid environment.
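
    The abstract describes the model only qualitatively. As a heavily simplified illustration of the kind of question it asks — how the probability of successfully assembling a VO depends on average degree — the sketch below uses Erdos-Renyi random graphs and counts a formation as successful when all randomly chosen resource holders lie in the giant component. That success criterion is an assumption of this sketch, not the paper's definition; the third-party networkx package is required:

```python
# Toy estimate of VO-formation success probability vs. average degree.
# Proxy criterion (assumed): all k randomly chosen resource nodes lie in the
# largest connected component of the collaboration network.
import random
import networkx as nx

def formation_probability(n=200, avg_degree=4.0, k=5, trials=300):
    p = avg_degree / (n - 1)            # Erdos-Renyi edge probability
    success = 0
    for _ in range(trials):
        g = nx.erdos_renyi_graph(n, p)
        members = random.sample(list(g.nodes), k)
        giant = max(nx.connected_components(g), key=len)
        if all(m in giant for m in members):
            success += 1
    return success / trials

for d in (1.0, 2.0, 4.0, 8.0):
    print(d, formation_probability(avg_degree=d))   # rises with average degree
```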

  17. Earthquake likelihood model testing

    Science.gov (United States)

    Schorlemmer, D.; Gerstenberger, M.C.; Wiemer, S.; Jackson, D.D.; Rhoades, D.A.

    2007-01-01

    The Regional Earthquake Likelihood Models (RELM) project aims to produce and evaluate alternate models of earthquake potential (probability per unit volume, magnitude, and time) for California. Based on differing assumptions, these models are produced to test the validity of their assumptions and to explore which models should be incorporated in seismic hazard and risk evaluation. Tests based on physical and geological criteria are useful but we focus on statistical methods using future earthquake catalog data only. We envision two evaluations: a test of consistency with observed data and a comparison of all pairs of models for relative consistency. Both tests are based on the likelihood method, and both are fully prospective (i.e., the models are not adjusted to fit the test data). To be tested, each model must assign a probability to any possible event within a specified region of space, time, and magnitude. For our tests the models must use a common format: earthquake rates in specified “bins” with location, magnitude, time, and focal mechanism limits. Seismology cannot yet deterministically predict individual earthquakes; however, it should seek the best possible models for forecasting earthquake occurrence. This paper describes the statistical rules of an experiment to examine and test earthquake forecasts. The primary purposes of the tests described below are to evaluate physical models for earthquakes, assure that source models used in seismic hazard and risk studies are consistent with earthquake data, and provide quantitative measures by which models can be assigned weights in a consensus model or be judged as suitable for particular regions. In this paper we develop a statistical method for testing earthquake likelihood models. A companion paper (Schorlemmer and Gerstenberger 2007, this issue) discusses the actual implementation of these tests in the framework of the RELM initiative. Statistical testing of hypotheses is a common task and a
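
    The testing recipe the paper describes can be made concrete with a small sketch: a forecast specifies expected earthquake rates per space-magnitude bin, its joint Poisson log-likelihood is evaluated against observed counts, and comparing that score with scores from catalogs simulated under the forecast yields a consistency quantile (the idea behind the L-test). The rates and counts below are invented for illustration:

```python
# Poisson joint log-likelihood of an earthquake-rate forecast (L-test idea).
# Forecast rates and observed counts below are illustrative only.
import numpy as np
from scipy.stats import poisson

rates = np.array([0.5, 1.2, 0.1, 2.0, 0.7])   # forecast: events per bin
observed = np.array([1, 0, 0, 3, 1])          # catalog counts per bin

def joint_log_likelihood(counts, lam):
    return poisson.logpmf(counts, lam).sum()

obs_ll = joint_log_likelihood(observed, rates)

# Simulate catalogs under the forecast to locate the observed score.
rng = np.random.default_rng(0)
sim_ll = np.array([joint_log_likelihood(rng.poisson(rates), rates)
                   for _ in range(10_000)])
quantile = (sim_ll <= obs_ll).mean()
print(f"log L = {obs_ll:.2f}, consistency quantile = {quantile:.2f}")
```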

  18. Maintenance grouping strategy for multi-component systems with dynamic contexts

    International Nuclear Information System (INIS)

    Vu, Hai Canh; Do, Phuc; Barros, Anne; Bérenguer, Christophe

    2014-01-01

    This paper presents a dynamic maintenance grouping strategy for multi-component systems with both “positive” and “negative” economic dependencies. Positive dependencies are commonly due to setup cost, whereas negative dependencies are related to shutdown cost. Grouping maintenance activities can save part of the setup cost, but can at the same time increase the shutdown cost. Until now, both types of dependencies have been jointly taken into account only for simple system structures such as pure series. The first aim of this paper is to investigate the case of systems with any combination of basic structures (series, parallel or k-out-of-n structures). A cost model and a heuristic optimization scheme are proposed, since the optimization of a maintenance grouping strategy for such multi-component systems leads to an NP-complete problem. The second objective is to propose a finite-horizon (dynamic) model in order to optimize the maintenance strategy online in the presence of dynamic contexts (changes in the environment, working conditions, the production process, etc.). A numerical example of a 16-component system is finally introduced to illustrate the use and the advantages of the proposed approach in the maintenance optimization framework. - Highlights: • A dynamic grouping maintenance strategy for complex structure systems is proposed. • Impacts of the system structure on grouping maintenance are investigated. • A grouping approach based on the rolling horizon and a GA algorithm is proposed. • Different dynamic contexts and their impacts on grouping maintenance are studied. • The proposed approach can help to update the maintenance planning in dynamic contexts
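
    The paper's cost model and heuristic are not reproduced in the abstract; the sketch below only illustrates the basic economic trade-off it describes: grouping k activities saves (k-1) setup costs, while shifting activities away from their individually optimal dates and lengthening the common shutdown adds costs. The quadratic shift penalty and all numbers are assumptions of this sketch:

```python
# Toy economic profit of grouping maintenance activities: setup savings
# minus date-shift penalties and extra shutdown cost (all values assumed).

SETUP_COST = 100.0          # saved once per extra activity in a group
SHUTDOWN_COST_RATE = 30.0   # cost per unit of extra common shutdown time

def grouping_profit(optimal_dates, shift_penalty=2.0, extra_shutdown=1.5):
    """Profit of executing all activities at one common date."""
    k = len(optimal_dates)
    common = sum(optimal_dates) / k                  # simple common date
    setup_saving = (k - 1) * SETUP_COST
    # Quadratic penalty for moving each activity off its optimal date
    # (an assumed stand-in for the paper's penalty functions).
    penalty = sum(shift_penalty * (t - common) ** 2 for t in optimal_dates)
    shutdown = SHUTDOWN_COST_RATE * extra_shutdown
    return setup_saving - penalty - shutdown

print(grouping_profit([10.0, 11.0, 13.0]))   # grouping pays off
print(grouping_profit([2.0, 15.0, 30.0]))    # too far apart: negative profit
```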

  19. Capturing Context-Related Change in Emotional Dynamics via Fixed Moderated Time Series Analysis.

    Science.gov (United States)

    Adolf, Janne K; Voelkle, Manuel C; Brose, Annette; Schmiedek, Florian

    2017-01-01

    Much of recent affect research relies on intensive longitudinal studies to assess daily emotional experiences. The resulting data are analyzed with dynamic models to capture regulatory processes involved in emotional functioning. Daily contexts, however, are commonly ignored. This may not only result in biased parameter estimates and wrong conclusions, but also ignores the opportunity to investigate contextual effects on emotional dynamics. With fixed moderated time series analysis, we present an approach that resolves this problem by estimating context-dependent change in dynamic parameters in single-subject time series models. The approach examines parameter changes of known shape and thus addresses the problem of observed intra-individual heterogeneity (e.g., changes in emotional dynamics due to observed changes in daily stress). In comparison to existing approaches to unobserved heterogeneity, model estimation is facilitated and different forms of change can readily be accommodated. We demonstrate the approach's viability given relatively short time series by means of a simulation study. In addition, we present an empirical application, targeting the joint dynamics of affect and stress and how these co-vary with daily events. We discuss potentials and limitations of the approach and close with an outlook on the broader implications for understanding emotional adaption and development.
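
    As a minimal concrete instance of the approach, consider a single-subject AR(1) model whose autoregressive parameter changes linearly with an observed daily-stress moderator, y_t = c + (phi0 + phi1*stress_t)*y_{t-1} + e_t. Because the moderator is observed and the shape of the change is known, the model can be estimated by least squares on an interaction regressor; the data-generating values below are arbitrary:

```python
# Fixed moderated AR(1): the autoregressive parameter varies linearly with
# an observed context moderator (e.g., daily stress). Simulated illustration.
import numpy as np

rng = np.random.default_rng(42)
T = 200
stress = rng.integers(0, 2, size=T).astype(float)   # observed daily context
c, phi0, phi1 = 0.2, 0.3, 0.4                       # true parameters

y = np.zeros(T)
for t in range(1, T):
    y[t] = c + (phi0 + phi1 * stress[t]) * y[t - 1] + rng.normal(scale=0.5)

# Design matrix: intercept, lagged affect, and stress-by-lag interaction.
X = np.column_stack([np.ones(T - 1), y[:-1], stress[1:] * y[:-1]])
beta, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
print(np.round(beta, 2))   # estimates of (c, phi0, phi1), near (0.2, 0.3, 0.4)
```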

  20. Dynamical control of quantum systems in the context of mean ergodic theorems

    International Nuclear Information System (INIS)

    Bernád, J Z

    2017-01-01

    Equidistant and non-equidistant single pulse ‘bang–bang’ dynamical controls are investigated in the context of mean ergodic theorems. We derive the conditions under which, in the limit of infinitely many pulses, both the equidistant and the non-equidistant dynamical control converge to the same unitary evolution. It is demonstrated that the generator of this evolution can be obtained by projecting the generator of the free evolution onto the commutant of the unitary operator representing the pulse. Inequalities are derived to prove this statement, and in the case of the non-equidistant approach these inequalities are optimised as a function of the time intervals. (paper)

  1. Word class and context affect alpha-band oscillatory dynamics in an older population

    Directory of Open Access Journals (Sweden)

    Monika Mellem

    2012-04-01

    Full Text Available Differences in the oscillatory EEG dynamics of reading open class and closed class words have previously been found (Bastiaansen et al., 2005) and are thought to reflect differences in lexical-semantic content between these word classes. In particular, the theta band (4–7 Hz) seems to play a prominent role in lexical-semantic retrieval. We tested whether this theta effect is robust in an older population of subjects. Additionally, we examined how the context of a word can modulate the oscillatory dynamics underlying retrieval for the two different classes of words. Older participants (mean age 55) read words presented either in syntactically correct sentences or in a scrambled order (scrambled sentence context) while their EEG was recorded. We performed time-frequency analysis to examine how power varied based on the context or class of the word. We observed larger power decreases in the alpha (8–12 Hz) band between 200–700 ms for open class compared to closed class words, but this was true only for the scrambled sentence context. We did not observe differences in theta power between these conditions. Context exerted an effect on the alpha and low beta (13–18 Hz) bands between 0–700 ms. These results suggest that the word class effects on theta power changes previously observed in a younger participant sample are not a robust effect in this older population. Though this is an indirect comparison between studies, it may suggest the existence of aging effects on word retrieval dynamics in different populations. Additionally, the interaction between word class and context suggests that word retrieval mechanisms interact with sentence-level comprehension mechanisms in the alpha band.

  2. Prototype Development: Context-Driven Dynamic XML Ophthalmologic Data Capture Application

    Science.gov (United States)

    Schwei, Kelsey M; Kadolph, Christopher; Finamore, Joseph; Cancel, Efrain; McCarty, Catherine A; Okorie, Asha; Thomas, Kate L; Allen Pacheco, Jennifer; Pathak, Jyotishman; Ellis, Stephen B; Denny, Joshua C; Rasmussen, Luke V; Tromp, Gerard; Williams, Marc S; Vrabec, Tamara R; Brilliant, Murray H

    2017-01-01

    Background The capture and integration of structured ophthalmologic data into electronic health records (EHRs) has historically been a challenge. However, the importance of this activity for patient care and research is critical. Objective The purpose of this study was to develop a prototype of a context-driven dynamic extensible markup language (XML) ophthalmologic data capture application for research and clinical care that could be easily integrated into an EHR system. Methods Stakeholders in the medical, research, and informatics fields were interviewed and surveyed to determine data and system requirements for ophthalmologic data capture. On the basis of these requirements, an ophthalmology data capture application was developed to collect and store discrete data elements with important graphical information. Results The context-driven data entry application supports several features, including ink-over drawing capability for documenting eye abnormalities, context-based Web controls that guide data entry based on preestablished dependencies, and an adaptable database or XML schema that stores Web form specifications and allows for immediate changes in form layout or content. The application utilizes Web services to enable data integration with a variety of EHRs for retrieval and storage of patient data. Conclusions This paper describes the development process used to create a context-driven dynamic XML data capture application for optometry and ophthalmology. The list of ophthalmologic data elements identified as important for care and research can be used as a baseline list for future ophthalmologic data collection activities. PMID:28903894
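
    The application's actual schema is not published in the abstract; the sketch below merely illustrates the idea of an XML form specification with pre-established dependencies between controls, evaluated with Python's standard xml.etree. All element and attribute names are hypothetical:

```python
# Hypothetical XML form spec with a context dependency between controls,
# parsed with the standard library; element/attribute names are invented.
import xml.etree.ElementTree as ET

FORM_SPEC = """
<form name="eye_exam">
  <field id="diagnosis" type="choice" options="normal,glaucoma"/>
  <field id="iop_followup" type="number"
         depends_on="diagnosis" visible_when="glaucoma"/>
</form>
"""

def visible_fields(spec_xml: str, answers: dict) -> list:
    """Return the field ids shown given current answers and dependencies."""
    root = ET.fromstring(spec_xml)
    shown = []
    for field in root.iter("field"):
        dep = field.get("depends_on")
        if dep is None or answers.get(dep) == field.get("visible_when"):
            shown.append(field.get("id"))
    return shown

print(visible_fields(FORM_SPEC, {"diagnosis": "normal"}))    # ['diagnosis']
print(visible_fields(FORM_SPEC, {"diagnosis": "glaucoma"}))  # both fields
```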

  3. The behavior of the likelihood ratio test for testing missingness

    OpenAIRE

    Hens, Niel; Aerts, Marc; Molenberghs, Geert; Thijs, Herbert

    2003-01-01

    To assess the sensitivity of conclusions to model choices in the context of selection models for non-random dropout, one can compare the different missingness mechanisms with each other, e.g. by likelihood ratio tests. The finite sample behavior of the null distribution and the power of the likelihood ratio test are studied under a variety of missingness mechanisms. Keywords: missing data; sensitivity analysis; likelihood ratio test; missing mechanisms
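
    The mechanics of the test under study are standard: twice the log-likelihood difference between the richer dropout model and the null is referred to a chi-squared distribution, and the record's point is precisely that this asymptotic reference can behave poorly in finite samples. A generic sketch with assumed log-likelihood values:

```python
# Generic likelihood ratio test; the log-likelihoods are assumed numbers
# standing in for fitted MAR (null) and MNAR (alternative) dropout models.
from scipy.stats import chi2

ll_null, ll_alt, extra_params = -512.4, -509.1, 1
lr_stat = 2.0 * (ll_alt - ll_null)
p_value = chi2.sf(lr_stat, df=extra_params)
print(f"LR = {lr_stat:.2f}, asymptotic p = {p_value:.3f}")
# Caveat (the record's theme): the chi-squared reference may be unreliable
# in finite samples, so the null distribution is better checked by simulation.
```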

  4. Review of Elaboration Likelihood Model of persuasion

    OpenAIRE

    藤原, 武弘; 神山, 貴弥

    1989-01-01

    This article mainly introduces the Elaboration Likelihood Model (ELM), proposed by Petty & Cacioppo, that is, a general attitude change theory. The ELM postulates two routes to persuasion: central and peripheral. Attitude change via the central route is viewed as resulting from a diligent consideration of the issue-relevant information presented. On the other hand, attitude change via the peripheral route is viewed as resulting from peripheral cues in the persuasion context. Secondly we compare these tw...

  5. Prototype Development: Context-Driven Dynamic XML Ophthalmologic Data Capture Application.

    Science.gov (United States)

    Peissig, Peggy; Schwei, Kelsey M; Kadolph, Christopher; Finamore, Joseph; Cancel, Efrain; McCarty, Catherine A; Okorie, Asha; Thomas, Kate L; Allen Pacheco, Jennifer; Pathak, Jyotishman; Ellis, Stephen B; Denny, Joshua C; Rasmussen, Luke V; Tromp, Gerard; Williams, Marc S; Vrabec, Tamara R; Brilliant, Murray H

    2017-09-13

    The capture and integration of structured ophthalmologic data into electronic health records (EHRs) has historically been a challenge. However, the importance of this activity for patient care and research is critical. The purpose of this study was to develop a prototype of a context-driven dynamic extensible markup language (XML) ophthalmologic data capture application for research and clinical care that could be easily integrated into an EHR system. Stakeholders in the medical, research, and informatics fields were interviewed and surveyed to determine data and system requirements for ophthalmologic data capture. On the basis of these requirements, an ophthalmology data capture application was developed to collect and store discrete data elements with important graphical information. The context-driven data entry application supports several features, including ink-over drawing capability for documenting eye abnormalities, context-based Web controls that guide data entry based on preestablished dependencies, and an adaptable database or XML schema that stores Web form specifications and allows for immediate changes in form layout or content. The application utilizes Web services to enable data integration with a variety of EHRs for retrieval and storage of patient data. This paper describes the development process used to create a context-driven dynamic XML data capture application for optometry and ophthalmology. The list of ophthalmologic data elements identified as important for care and research can be used as a baseline list for future ophthalmologic data collection activities. ©Peggy Peissig, Kelsey M Schwei, Christopher Kadolph, Joseph Finamore, Efrain Cancel, Catherine A McCarty, Asha Okorie, Kate L Thomas, Jennifer Allen Pacheco, Jyotishman Pathak, Stephen B Ellis, Joshua C Denny, Luke V Rasmussen, Gerard Tromp, Marc S Williams, Tamara R Vrabec, Murray H Brilliant. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 13.09.2017.

  6. Large-Scale Context-Aware Volume Navigation using Dynamic Insets

    KAUST Repository

    Al-Awami, Ali

    2012-07-01

    Latest developments in electron microscopy (EM) technology produce high resolution images that enable neuro-scientists to identify and put together the complex neural connections in a nervous system. However, because of the massive size and underlying complexity of this kind of data, processing, navigation and analysis suffer drastically in terms of time and effort. In this work, we propose the use of state-of-the-art navigation techniques, such as dynamic insets, built on a peta-scale volume visualization framework to provide focus and context-awareness to help neuro-scientists in their mission to analyze, reconstruct, navigate and explore EM neuroscience data.

  7. Enhancing learners’ emotions in an L2 context through emotionalized dynamic assessment

    Directory of Open Access Journals (Sweden)

    Parisa Abdolrezapour

    2013-10-01

    Full Text Available The aim of this study was to gain a more in-depth understanding of students’ emotions in an EFL context by applying dynamic assessment (DA) procedures to the development of learners’ emotional intelligence. The study, with 50 intermediate learners aged 12–15, used three groups: a control group, which was taught under the institute's normal procedures; a comparison group, which received DA; and an experimental group, which received emotionalized dynamic assessment (EDA) procedures, in the form of an intervention focusing on emotional characteristics of Goleman's emotional intelligence framework, with the express purpose of inducing learners to work with their emotions. The study shows the potential of EDA for increasing one's emotional intelligence and affords practical guidelines to language teachers as to how to incorporate behaviors relating to emotional intelligence into assessment procedures.

  8. Socio-Cultural Dynamics of Education in the Context of the Post-Non-Classical Science

    Directory of Open Access Journals (Sweden)

    V. A. Ignatova

    2012-01-01

    Full Text Available The paper deals with the interrelations between society, education and culture. Using a comparative analysis of classical approaches to defining the above spheres, the author comes to the conclusion that the nature of socio-cultural processes can be explored and described most consistently by applying comprehensive models of the post-non-classical science and considering civilization, education and culture in the context of the unified dynamic flow of socio-cultural genesis. The research investigates the dialectics of socio-cultural processes in the light of the systematic synergetic approach, with the advancing role of education in socio-cultural dynamics being revealed and substantiated. The author emphasizes its inevitably rising priority due to the sustained development of civilization bringing about a new environmentally-oriented meta-culture. The obtained results can be used in pedagogic research methodology, and in designing and modeling the educational process, its content, technology and organization.

  9. Context-dependent retrieval of information by neural-network dynamics with continuous attractors.

    Science.gov (United States)

    Tsuboshita, Yukihiro; Okamoto, Hiroshi

    2007-08-01

    Memory retrieval in neural networks has traditionally been described by dynamic systems with discrete attractors. However, recent neurophysiological findings of graded persistent activity suggest that memory retrieval in the brain is more likely to be described by dynamic systems with continuous attractors. To explore what sort of information processing is achieved by continuous-attractor dynamics, keyword extraction from documents by a network of bistable neurons, which gives robust continuous attractors, is examined. Given an associative network of terms, a continuous attractor led by propagation of neuronal activation in this network appears to represent keywords that express underlying meaning of a document encoded in the initial state of the network-activation pattern. A dominant hypothesis in cognitive psychology is that long-term memory is archived in the network structure, which resembles associative networks of terms. Our results suggest that keyword extraction by the neural-network dynamics with continuous attractors might symbolically represent context-dependent retrieval of short-term memory from long-term memory in the brain.
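
    The network equations are not given in the abstract; the sketch below only captures the qualitative mechanism: activation seeded by a document's terms spreads over a term-association matrix through units with a steep, bistable-like sigmoid, and the terms active in the attractor the network settles into are taken as keywords. The association matrix, gain, and threshold are invented:

```python
# Spreading activation over a term-association network with steep-sigmoid
# (bistable-like) units; matrix, gain, and threshold are invented values.
import numpy as np

terms = ["neuron", "memory", "attractor", "market", "price"]
W = np.array([[0.0, 0.6, 0.7, 0.0, 0.0],    # symmetric term associations
              [0.6, 0.0, 0.5, 0.1, 0.0],
              [0.7, 0.5, 0.0, 0.0, 0.0],
              [0.0, 0.1, 0.0, 0.0, 0.8],
              [0.0, 0.0, 0.0, 0.8, 0.0]])

def settle(seed, gain=8.0, theta=0.5, steps=50):
    """Iterate the network to a fixed point from a document-seeded state."""
    x = seed.astype(float)
    for _ in range(steps):
        drive = W @ x + seed                 # network input + document input
        x = 1.0 / (1.0 + np.exp(-gain * (drive - theta)))
    return x

doc_seed = np.array([1.0, 0.0, 1.0, 0.0, 0.0])   # terms present in document
x = settle(doc_seed)
print([t for t, a in zip(terms, x) if a > 0.5])  # neuron/memory/attractor
```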

  10. Likelihood devices in spatial statistics

    NARCIS (Netherlands)

    Zwet, E.W. van

    1999-01-01

    One of the main themes of this thesis is the application to spatial data of modern semi- and nonparametric methods. Another, closely related theme is maximum likelihood estimation from spatial data. Maximum likelihood estimation is not common practice in spatial statistics. The method of moments

  11. Context-dependent colonization dynamics: Regional reward contagion drives local compression in aquatic beetles.

    Science.gov (United States)

    Pintar, Matthew R; Resetarits, William J

    2017-09-01

    Habitat selection by colonizing organisms is an important factor in determining species abundance and community dynamics at multiple spatial scales. Many organisms select habitat patches based on intrinsic patch quality, but patches exist in complex landscapes linked by dispersal and colonization, forming metapopulations and metacommunities. Perceived patch quality can be influenced by neighbouring patches through spatial contagion, wherein perceived quality of one patch can extend beyond its borders and either increase or decrease the colonization of neighbouring patches and localities. These spatially explicit colonization dynamics can result in habitat compression, wherein more colonists occupy a patch or locality than in the absence of spatial context dependence. Previous work on contagion/compression focused primarily on the role of predators in driving colonization patterns. Our goal was to determine whether resource abundance can drive multi-scale colonization dynamics of aquatic beetles through the processes of contagion and compression in naturally colonized experimental pools. We established two levels (high/low quality) of within-patch resource abundances (leaf litter) using an experimental landscape of mesocosms, and assayed colonization by 35 species of aquatic beetles. Patches were arranged in localities (sets of two patches), which consisted of a combination of two patch-level resource levels in a 2 × 2 factorial design, allowing us to assay colonization at both locality and patch levels. We demonstrate that patterns of species abundance and richness of colonizing aquatic beetles are determined by patch quality and context-dependent processes at multiple spatial scales. Localities that consisted of at least one high-quality patch were colonized at equivalent rates that were higher than localities containing only low-quality patches, displaying regional reward contagion. In localities that consisted of one high- and one low-quality patch, reward

  12. Extended likelihood inference in reliability

    International Nuclear Information System (INIS)

    Martz, H.F. Jr.; Beckman, R.J.; Waller, R.A.

    1978-10-01

    Extended likelihood methods of inference are developed in which subjective information in the form of a prior distribution is combined with sampling results by means of an extended likelihood function. The extended likelihood function is standardized for use in obtaining extended likelihood intervals. Extended likelihood intervals are derived for the mean of a normal distribution with known variance, the failure-rate of an exponential distribution, and the parameter of a binomial distribution. Extended second-order likelihood methods are developed and used to solve several prediction problems associated with the exponential and binomial distributions. In particular, such quantities as the next failure-time, the number of failures in a given time period, and the time required to observe a given number of failures are predicted for the exponential model with a gamma prior distribution on the failure-rate. In addition, six types of life testing experiments are considered. For the binomial model with a beta prior distribution on the probability of nonsurvival, methods are obtained for predicting the number of nonsurvivors in a given sample size and for predicting the required sample size for observing a specified number of nonsurvivors. Examples illustrate each of the methods developed. Finally, comparisons are made with Bayesian intervals in those cases where these are known to exist
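
    One of these constructions is easy to make concrete: for the exponential model with a gamma prior on the failure rate, the posterior after n failures in total test time T is Gamma(a+n, b+T), and the predictive distribution of the next failure time is Lomax (Pareto type II). A sketch of a predictive interval under assumed prior parameters and data:

```python
# Predictive interval for the next failure time: exponential model with a
# gamma prior on the failure rate. Prior parameters and data are assumed.

a0, b0 = 2.0, 100.0        # gamma prior on the rate: shape a0, rate b0
n, total_time = 5, 400.0   # observed failures and cumulative test time

a, b = a0 + n, b0 + total_time          # posterior Gamma(a, b) on the rate

def predictive_sf(x):
    """P(next failure time > x): Lomax survival function."""
    return (b / (b + x)) ** a

def predictive_quantile(p):
    """Invert the survival function to get the p-quantile."""
    return b * ((1.0 - p) ** (-1.0 / a) - 1.0)

print(predictive_quantile(0.05), predictive_quantile(0.95))  # 90% interval
```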

  13. Young women's dynamic family size preferences in the context of transitioning fertility.

    Science.gov (United States)

    Yeatman, Sara; Sennott, Christie; Culpepper, Steven

    2013-10-01

    Dynamic theories of family size preferences posit that they are not a fixed and stable goal but rather are akin to a moving target that changes within individuals over time. Nonetheless, in high-fertility contexts, changes in family size preferences tend to be attributed to low construct validity and measurement error instead of genuine revisions in preferences. To address the appropriateness of this incongruity, the present study examines evidence for the sequential model of fertility among a sample of young Malawian women living in a context of transitioning fertility. Using eight waves of closely spaced data and fixed-effects models, we find that these women frequently change their reported family size preferences and that these changes are often associated with changes in their relationship and reproductive circumstances. The predictability of change gives credence to the argument that ideal family size is a meaningful construct, even in this higher-fertility setting. Changes are not equally predictable across all women, however, and gamma regression results demonstrate that women for whom reproduction is a more distant goal change their fertility preferences in less-predictable ways.

  14. Young Women’s Dynamic Family Size Preferences in the Context of Transitioning Fertility

    Science.gov (United States)

    Yeatman, Sara; Sennott, Christie; Culpepper, Steven

    2013-01-01

    Dynamic theories of family size preferences posit that they are not a fixed and stable goal but rather are akin to a moving target that changes within individuals over time. Nonetheless, in high-fertility contexts, changes in family size preferences tend to be attributed to low construct validity and measurement error instead of genuine revisions in preferences. To address the appropriateness of this incongruity, the present study examines evidence for the sequential model of fertility among a sample of young Malawian women living in a context of transitioning fertility. Using eight waves of closely spaced data and fixed-effects models, we find that these women frequently change their reported family size preferences and that these changes are often associated with changes in their relationship and reproductive circumstances. The predictability of change gives credence to the argument that ideal family size is a meaningful construct, even in this higher-fertility setting. Changes are not equally predictable across all women, however, and gamma regression results demonstrate that women for whom reproduction is a more distant goal change their fertility preferences in less-predictable ways. PMID:23619999

  15. Context-dependent JPEG backward-compatible high-dynamic range image compression

    Science.gov (United States)

    Korshunov, Pavel; Ebrahimi, Touradj

    2013-10-01

    High-dynamic range (HDR) imaging is expected, together with ultrahigh definition and high-frame-rate video, to become a technology that may change the photo, TV, and film industries. Many cameras and displays capable of capturing and rendering both HDR images and video are already available in the market. The popularity and full public adoption of HDR content is, however, hindered by the lack of standards for the evaluation of quality, file formats, and compression, as well as by the large legacy base of low-dynamic range (LDR) displays that are unable to render HDR. To facilitate the widespread use of HDR, backward compatibility of HDR with commonly used legacy technologies for storage, rendering, and compression of video and images is necessary. Although many tone-mapping algorithms have been developed for generating viewable LDR content from HDR, there is no consensus on which algorithm to use and under which conditions. Via a series of subjective evaluations, we demonstrate the dependency of the perceptual quality of the tone-mapped LDR images on the context: environmental factors, display parameters, and the image content itself. Based on the results of these subjective tests, we propose to extend the JPEG file format, the most popular image format, in a backward-compatible manner to also handle HDR images. An architecture to achieve such backward compatibility with JPEG is proposed. A simple implementation of lossy compression demonstrates the efficiency of the proposed architecture compared with state-of-the-art HDR image compression.
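
    The proposed architecture is not detailed in the abstract beyond being JPEG backward-compatible. A common pattern for such designs, sketched below under that assumption, stores a tone-mapped LDR base layer as an ordinary JPEG plus a quantized log-ratio residual from which the HDR values can be rebuilt; the global Reinhard operator and the 8-bit residual coding are illustrative choices, and numpy and Pillow are required:

```python
# Backward-compatible HDR sketch (an assumed design, not the paper's codec):
# JPEG base layer = tone-mapped LDR; side channel = quantized log-ratio image.
import numpy as np
from PIL import Image

hdr = np.random.default_rng(0).gamma(2.0, 2.0, size=(64, 64)) * 50.0  # toy HDR

ldr = hdr / (1.0 + hdr)                       # global Reinhard tone mapping
Image.fromarray((ldr * 255).astype(np.uint8), mode="L").save("base.jpg",
                                                             quality=90)

ratio = np.log(hdr / np.maximum(ldr, 1e-6))   # what the LDR layer lost
lo, hi = ratio.min(), ratio.max()             # would travel as file metadata
residual = np.round((ratio - lo) / (hi - lo) * 255).astype(np.uint8)
Image.fromarray(residual, mode="L").save("residual.jpg", quality=90)

# Legacy decoders show base.jpg; an HDR-aware decoder recombines both layers:
base = np.asarray(Image.open("base.jpg"), dtype=np.float64) / 255.0
res = np.asarray(Image.open("residual.jpg"), dtype=np.float64) / 255.0
hdr_rec = np.maximum(base, 1e-6) * np.exp(res * (hi - lo) + lo)
print(np.abs(hdr_rec - hdr).mean() / hdr.mean())  # modest relative error
```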

  16. Dynamic Exposure to Alcohol Advertising in a Sports Context Influences Implicit Attitudes.

    Science.gov (United States)

    Zerhouni, Oulmann; Bègue, Laurent; Duke, Aaron A; Flaudias, Valentin

    2016-02-01

    Experimental studies investigating the impact of advertising with ecological stimuli on alcohol-related cognition are scarce. This research investigated the cognitive processes involved in learning implicit attitudes toward alcohol after incidental exposure to alcohol advertisements presented in a dynamic context. We hypothesized that incidental exposure to a specific alcohol brand would lead to heightened positive implicit attitudes toward alcohol due to a mere exposure effect. In total, 108 participants were randomly exposed to excerpts of dynamic sporting events with and without advertising for a specific brand of alcohol, after completing self-reported measures of alcohol-related expectancies, alcohol consumption, and attitudes toward sport. Participants then completed a lexical decision task and an affective priming task. We showed that participants were faster to detect the brand name after being exposed to advertising during a sports game, and that implicit attitudes of participants toward the brand were more positive after they were exposed to advertising, even when alcohol usage patterns were controlled for. Incidental exposure to alcohol sponsorship in sport events impacts implicit attitudes toward the advertised brand and alcohol in general. The effect of incidental advertising on implicit attitudes is also likely to be due to a mere exposure effect. However, further studies should address this point specifically. Copyright © 2016 by the Research Society on Alcoholism.

  17. ANALYSIS DYNAMICS VALUES FORMULATION IN THE CONTEXT OF THE BUSINESS ORGANIZATION’ S MISSION

    Directory of Open Access Journals (Sweden)

    Marius Costel Esi

    2015-02-01

    Full Text Available The goals of economic activity reveal a number of aspects that express the need to re-evaluate the way in which the analysis of the dynamics of values may be correlated with the formulation of the business mission. Under these conditions, managerial strategies undertaken at the level of the business may be validated insofar as they reveal the purposes and objectives assumed by decision makers (particularly top managers). Moreover, compliance with the eligibility criteria according to which management strategies are formulated should, in our opinion, aim at improving the decision-making process. Such a decision-making process involves a judicious understanding by economic actors of the way in which the dynamics of values can be analyzed in relation to the formulation of the business organization's mission. In these circumstances, a first objective of this research is the analysis of the dynamics of values in the context of formulating the business mission. Through this approach, we strive to show the conditions that make the formulation of the business mission possible in relation to organizational culture. A second objective concerns the process of defining and stating the organizational mission, a process linked to the axiological dimension of the business organization's mission statement. This amounts, in the light of our analysis, to a business model in which the objectives, strategies, and mission of the business organization become materialized insofar as the contextuality of the venture is validated in relation to socio-economic prospects. The existence of social and economic phenomena therefore involves a series of connections between the different levels at which the business organization operates, which in fact provides its legitimacy.

  18. The likelihood of Latino women to seek help in response to interpersonal victimization: An examination of individual, interpersonal and sociocultural influences

    Directory of Open Access Journals (Sweden)

    Chiara Sabina

    2014-07-01

    Full Text Available Help-seeking is a process that is influenced by individual, interpersonal, and sociocultural factors. The current study examined these influences on the likelihood of seeking help (police, pressing charges, medical services, social services, and informal help) for interpersonal violence among a national sample of Latino women. Women living in high-density Latino neighborhoods in the USA were interviewed by phone in their preferred language. Women reported being, on average, between "somewhat likely" and "very likely" to seek help should they experience interpersonal victimization. Sequential linear regression results indicated that individual (age, depression), interpersonal (having children, past victimization), and sociocultural (immigrant status, acculturation) factors were associated with the self-reported likelihood of seeking help for interpersonal violence. Having children was consistently related to a greater likelihood to seek all forms of help. Overall, women appear to respond to violence in ways that reflect their ecological context. Help-seeking is best understood within a multi-layered and dynamic context.

  19. Dynamics of wages in the region and the problem of measurement of wages in the context of economic instability

    Directory of Open Access Journals (Sweden)

    S. S. Gordeev

    2010-12-01

    Full Text Available The paper deals with the analysis of the current state and basic tendencies in the dynamics of wages. The authors consider the basic contradictions in the context of the establishment of the market institution of wages in the subjects of the Russian Federation. The dynamics of wages is appraised on the basis of the tax accounting data of the regions. According to the authors, this approach allows the current processes in the sphere of remuneration of labor to be reflected more objectively in the context of economic instability.

  20. Obtaining reliable Likelihood Ratio tests from simulated likelihood functions

    DEFF Research Database (Denmark)

    Andersen, Laura Mørch

    It is standard practice by researchers and the default option in many statistical programs to base test statistics for mixed models on simulations using asymmetric draws (e.g. Halton draws). This paper shows that when the estimated likelihood functions depend on standard deviations of mixed param...

  1. I. WORKING MEMORY CAPACITY IN CONTEXT: MODELING DYNAMIC PROCESSES OF BEHAVIOR, MEMORY, AND DEVELOPMENT.

    Science.gov (United States)

    Simmering, Vanessa R

    2016-09-01

    Working memory is a vital cognitive skill that underlies a broad range of behaviors. Higher cognitive functions are reliably predicted by working memory measures from two domains: children's performance on complex span tasks, and infants' performance in looking paradigms. Despite the similar predictive power across these research areas, theories of working memory development have not connected these different task types and developmental periods. The current project takes a first step toward bridging this gap by presenting a process-oriented theory, focusing on two tasks designed to assess visual working memory capacity in infants (the change-preference task) versus children and adults (the change detection task). Previous studies have shown inconsistent results, with capacity estimates increasing from one to four items during infancy, but only two to three items during early childhood. A probable source of this discrepancy is the different task structures used with each age group, but prior theories were not sufficiently specific to explain how performance relates across tasks. The current theory focuses on cognitive dynamics, that is, how memory representations are formed, maintained, and used within specific task contexts over development. This theory was formalized in a computational model to generate three predictions: 1) capacity estimates in the change-preference task should continue to increase beyond infancy; 2) capacity estimates should be higher in the change-preference versus change detection task when tested within individuals; and 3) performance should correlate across tasks because both rely on the same underlying memory system. I also tested a fourth prediction, that development across tasks could be explained through increasing real-time stability, realized computationally as strengthening connectivity within the model. Results confirmed these predictions, supporting the cognitive dynamics account of performance and developmental changes in real

  2. Trust and community. Exploring the meanings, contexts and dynamics of community renewable energy

    Energy Technology Data Exchange (ETDEWEB)

    Walker, Gordon [University of Lancaster, Department of Geography, Lancaster LA1 4YN (United Kingdom); Devine-Wright, Patrick [University of Manchester, The School of Environment and Development, Humanities Bridgeford Street Building, Oxford Road, Manchester M13 9PL (United Kingdom); Hunter, Sue; High, Helen; Evans, Bob [University of Lancaster, Department of Geography, Lancaster LA1 4YN (United Kingdom); University of Manchester, The School of Environment and Development, Humanities Bridgeford Street Building, Oxford Road, Manchester M13 9PL (United Kingdom)

    2010-06-15

    Community renewable energy projects have recently been promoted and supported in the UK by government policy. A community approach, it is argued in the rhetoric of both government and grassroots activists, will change the experience and outcomes of sustainable energy technology implementation. In this paper, we consider how interpersonal and social trust is implicated in the different meanings given to community in RE programmes and projects, and in the qualities and outcomes that are implied or assumed by taking a community approach. We examine how these meanings play out in examples of projects on the ground, focusing on two contrasting cases in which the relationships between those involved locally have exhibited different patterns of cohesiveness and fracture. We argue that trust does have a necessary part to play in the contingencies and dynamics of community RE projects and in the outcomes they can achieve. Trust between local people and groups that take projects forward is part of the package of conditions which can help projects work. Whilst trust may therefore be functional for the development of community RE and potentially can be enhanced by the adoption of a community approach, this cannot be either assured or assumed under the wide diversity of contexts, conditions and arrangements under which community RE is being pursued and practiced. (author)

  3. Trust and community: Exploring the meanings, contexts and dynamics of community renewable energy

    Energy Technology Data Exchange (ETDEWEB)

    Walker, Gordon, E-mail: g.p.walker@lancaster.ac.u [University of Lancaster, Department of Geography, Lancaster LA1 4YN (United Kingdom); Devine-Wright, Patrick [University of Manchester, School of Environment and Development, Humanities Bridgeford Street Building, Oxford Road, Manchester M13 9PL (United Kingdom); Hunter, Sue; High, Helen; Evans, Bob [University of Lancaster, Department of Geography, Lancaster LA1 4YN (United Kingdom); University of Manchester, School of Environment and Development, Humanities Bridgeford Street Building, Oxford Road, Manchester M13 9PL (United Kingdom)

    2010-06-15

    Community renewable energy projects have recently been promoted and supported in the UK by government policy. A community approach, it is argued in the rhetoric of both government and grassroots activists, will change the experience and outcomes of sustainable energy technology implementation. In this paper, we consider how interpersonal and social trust is implicated in the different meanings given to community in RE programmes and projects, and in the qualities and outcomes that are implied or assumed by taking a community approach. We examine how these meanings play out in examples of projects on the ground, focusing on two contrasting cases in which the relationships between those involved locally have exhibited different patterns of cohesiveness and fracture. We argue that trust does have a necessary part to play in the contingencies and dynamics of community RE projects and in the outcomes they can achieve. Trust between local people and groups that take projects forward is part of the package of conditions which can help projects work. Whilst trust may therefore be functional for the development of community RE and potentially can be enhanced by the adoption of a community approach, this cannot be either assured or assumed under the wide diversity of contexts, conditions and arrangements under which community RE is being pursued and practiced.

  4. On the Context-Aware, Dynamic Spectrum Access for Robust Intraplatoon Communications

    Directory of Open Access Journals (Sweden)

    Michał Sybis

    2018-01-01

    Full Text Available Vehicle platooning is a promising technology that allows traffic efficiency and passenger safety to be improved. Platoons that use cooperative adaptive cruise control, however, require a reliable radio link between platoon members to ensure the required distance between the cars within the platoon, thus maintaining platoon safety. Nowadays, the communication can be realized with the use of 802.11p or cellular vehicle-to-vehicle (C-V2V) technology, but neither of these is able to provide a reliable link, especially in the presence of high traffic or in urban scenarios. Therefore, in this paper, we propose a dynamic spectrum management mechanism in V2V communications for platooning purposes. A management system architecture is proposed that comprises the use of context-aware databases, sensing nodes, and a spectrum allocation entity. The proposed robust system design aims to keep only the minimum necessary information transmitted over the conventional intelligent transportation system (ITS) channel, while moving the remaining data (nonsafety, service-aided, or infotainment) to an alternative channel that is selected from the available pool of spectrum white spaces. The initial analysis indicates that the proposed system may significantly improve the performance of wireless communications for the purpose of vehicle platooning.
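
    The allocation entity is described only architecturally; the sketch below illustrates the core policy the design implies: safety-critical platoon messages stay on the conventional ITS control channel, while non-safety streams are assigned to the least-occupied white-space channel reported by the sensing nodes. Channel names and occupancy figures are invented:

```python
# Toy spectrum-allocation policy for platooning traffic: safety data stays on
# the ITS control channel; other streams go to the emptiest white-space
# channel reported by sensing nodes. All channels/values are illustrative.

ITS_CONTROL_CHANNEL = "ITS-G5-CCH"

def allocate(streams, whitespace_occupancy):
    """Map each (name, is_safety) stream to a channel."""
    allocation = {}
    for name, is_safety in streams:
        if is_safety:
            allocation[name] = ITS_CONTROL_CHANNEL
        else:
            # Pick the least-occupied white-space channel, then account
            # for the load this stream adds to it.
            ch = min(whitespace_occupancy, key=whitespace_occupancy.get)
            whitespace_occupancy[ch] += 0.1
            allocation[name] = ch
    return allocation

streams = [("cacc_control", True), ("map_update", False), ("video", False)]
occupancy = {"WS-1": 0.2, "WS-2": 0.5, "WS-3": 0.3}
print(allocate(streams, occupancy))
# {'cacc_control': 'ITS-G5-CCH', 'map_update': 'WS-1', 'video': 'WS-1'}
```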

  5. Trust and community: Exploring the meanings, contexts and dynamics of community renewable energy

    International Nuclear Information System (INIS)

    Walker, Gordon; Devine-Wright, Patrick; Hunter, Sue; High, Helen; Evans, Bob

    2010-01-01

    Community renewable energy projects have recently been promoted and supported in the UK by government policy. A community approach, it is argued in the rhetoric of both government and grassroots activists, will change the experience and outcomes of sustainable energy technology implementation. In this paper, we consider how interpersonal and social trust is implicated in the different meanings given to community in RE programmes and projects, and in the qualities and outcomes that are implied or assumed by taking a community approach. We examine how these meanings play out in examples of projects on the ground, focusing on two contrasting cases in which the relationships between those involved locally have exhibited different patterns of cohesiveness and fracture. We argue that trust does have a necessary part to play in the contingencies and dynamics of community RE projects and in the outcomes they can achieve. Trust between local people and groups that take projects forward is part of the package of conditions which can help projects work. Whilst trust may therefore be functional for the development of community RE and potentially can be enhanced by the adoption of a community approach, this cannot be either assured or assumed under the wide diversity of contexts, conditions and arrangements under which community RE is being pursued and practiced.

  6. How mechanical context and feedback jointly determine the use of mechanical variables in length perception by dynamic touch

    NARCIS (Netherlands)

    Menger, Rudmer; Withagen, Rob

    Earlier studies have revealed that both mechanical context and feedback determine what mechanical invariant is used to perceive length by dynamic touch. In the present article, the authors examined how these two factors jointly constrain the informational variable that is relied upon. Participants

  7. How mechanical context and feedback jointly determine the use of mechanical variables in length perception by dynamic touch

    NARCIS (Netherlands)

    Menger, Rudmer; Withagen, Rob

    2009-01-01

    Earlier studies have revealed that both mechanical context and feedback determine what mechanical invariant is used to perceive length by dynamic touch. In the present article, the authors examined how these two factors jointly constrain the informational variable that is relied upon. Participants

  8. Counseling Pretreatment and the Elaboration Likelihood Model of Attitude Change.

    Science.gov (United States)

    Heesacker, Martin

    1986-01-01

    Results of the application of the Elaboration Likelihood Model (ELM) to a counseling context revealed that more favorable attitudes toward counseling occurred as subjects' ego involvement increased and as intervention quality improved. Counselor credibility affected the degree to which subjects' attitudes reflected argument quality differences.…

  8. Likelihood estimators for multivariate extremes

    KAUST Repository

    Huser, Raphaël; Davison, Anthony C.; Genton, Marc G.

    2015-01-01

    The main approach to inference for multivariate extremes consists in approximating the joint upper tail of the observations by a parametric family arising in the limit for extreme events. The latter may be expressed in terms of componentwise maxima, high threshold exceedances or point processes, yielding different but related asymptotic characterizations and estimators. The present paper clarifies the connections between the main likelihood estimators, and assesses their practical performance. We investigate their ability to estimate the extremal dependence structure and to predict future extremes, using exact calculations and simulation, in the case of the logistic model.
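
    As a concrete instance of the kind of likelihood inference studied here, the sketch below fits the dependence parameter of the bivariate logistic model with unit Fréchet margins by maximum likelihood; the density follows by differentiating G(x, y) = exp{-(x^(-1/α) + y^(-1/α))^α}. This is our own minimal illustration, not the authors' code.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def neg_log_lik(alpha, x, y):
    """Negative log-likelihood of the bivariate logistic model, 0 < alpha <= 1."""
    t = x ** (-1.0 / alpha) + y ** (-1.0 / alpha)
    V = t ** alpha                     # exponent measure, G = exp(-V)
    # density: g = exp(-V) * (xy)^(-1/alpha - 1) * t^(alpha - 2) * (V + (1 - alpha)/alpha)
    log_g = (-V + (-1.0 / alpha - 1.0) * (np.log(x) + np.log(y))
             + (alpha - 2.0) * np.log(t) + np.log(V + (1.0 - alpha) / alpha))
    return -np.sum(log_g)

def fit_logistic(x, y):
    """x, y: componentwise maxima transformed to unit Frechet margins."""
    res = minimize_scalar(neg_log_lik, bounds=(1e-3, 1.0), args=(x, y),
                          method="bounded")
    return res.x                       # estimated dependence parameter alpha

# alpha_hat = fit_logistic(frechet_x, frechet_y)  # alpha -> 1 means independence
```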

  9. Haptic subjective vertical shows context dependence: task and vision play a role during dynamic tilt stimulation.

    Science.gov (United States)

    Wright, William Geoffrey; Glasauer, Stefan

    2003-10-01

    Perceiving one's vertical is an integral part of functioning efficiently in an environment physically polarized along that dimension. Determining the direction of gravity is not a task left only to inertial sensors such as the vestibular organs; rather, as numerous studies have shown, it is influenced visually and somatosensorily. In addition, there is evidence that higher-order cognitive effects such as expectancies and context are critical in perception of the vertical. One's ability to integrate these various inputs during normal activity is not generally questioned; any doubts are satisfied by observing a waiter navigating a crowded restaurant with a tray balanced on one hand, neither tripping nor dropping an entree. But how these various sources are integrated is still debated. Most research on subjective vertical perception has used visual matching/alignment tasks, verbal reports, or saccadic eye movements as dependent measures. Although a motor task involving a joystick or indicator to be aligned with gravity without visual feedback is used much less frequently, there is good evidence that individuals easily orient limbs to an external gravity-aligned coordinate axis while statically tilted. When exposed to a dynamic situation, the central nervous system should be no more challenged by the task of determining the subjective vertical than during static conditions, because our spatial orientation systems were likely selected for just that. In addition, the sensitive calibration between visual and other sensory input must also have been key to its selection. This sensory interaction can be tested by changing the relation between the various sources. With the advent of virtual reality technology, a complex and "natural" visual stimulus is achievable and easily manipulable. How one tests perception of verticality is also a pertinent question when researching spatial orientation systems. The system's performance may be better…

  10. Interdependence and dynamics of essential services in an extensive risk context: a case study in Montserrat, West Indies

    Science.gov (United States)

    Sword-Daniels, V. L.; Rossetto, T.; Wilson, T. M.; Sargeant, S.

    2015-05-01

    The essential services that support urban living are complex and interdependent, and their disruption in disasters directly affects society. Yet there are few empirical studies to inform our understanding of the vulnerabilities and resilience of complex infrastructure systems in disasters. This research takes a systems thinking approach to explore the dynamic behaviour of a network of essential services, in the presence and absence of volcanic ashfall hazards in Montserrat, West Indies. Adopting a case study methodology and qualitative methods to gather empirical data, we centre the study on the healthcare system and its interconnected network of essential services. We identify different types of relationship between sectors and develop a new interdependence classification system for analysis. Relationships are further categorised by hazard conditions, for use in extensive risk contexts. During heightened volcanic activity, relationships between systems transform in both number and type: connections increase across the network by 41%, and adapt to increase cooperation and information sharing. Interconnections add capacities to the network, increasing the resilience of prioritised sectors. This in-depth and context-specific approach provides a new methodology for studying the dynamics of infrastructure interdependence in an extensive risk context, and can be adapted for use in other hazard contexts.

  11. Maximum likelihood of phylogenetic networks.

    Science.gov (United States)

    Jin, Guohua; Nakhleh, Luay; Snir, Sagi; Tuller, Tamir

    2006-11-01

    Horizontal gene transfer (HGT) is believed to be ubiquitous among bacteria, and plays a major role in their genome diversification as well as their ability to develop resistance to antibiotics. In light of its evolutionary significance and implications for human health, developing accurate and efficient methods for detecting and reconstructing HGT is imperative. In this article we provide a new HGT-oriented likelihood framework for many problems that involve phylogeny-based HGT detection and reconstruction. Besides the formulation of various likelihood criteria, we show that most of these problems are NP-hard, and offer heuristics for efficient and accurate reconstruction of HGT under these criteria. We implemented our heuristics and used them to analyze biological as well as synthetic data. In both cases, our criteria and heuristics exhibited very good performance with respect to identifying the correct number of HGT events as well as inferring their correct location on the species tree. Implementations of the criteria and heuristics, as well as the hardness proofs, are available from the authors upon request. Hardness proofs can also be downloaded at http://www.cs.tau.ac.il/~tamirtul/MLNET/Supp-ML.pdf

  12. Architecture of firm dynamic capabilities across inter-organizational activities: Explaining innovativeness in the context of nanotechnology

    Science.gov (United States)

    Petricevic, Olga

    In this dissertation I first develop a theoretical framework that explores different components of dynamic capabilities related to a firm's boundary-spanning linkages across two different types of inter-organizational activities: alliances and networks. I argue that there are four different subsets of dynamic capabilities simultaneously at work: alliance opportunity-sensing, alliance opportunity-seizing, network opportunity-sensing, and network opportunity-seizing. Furthermore, I argue that there are significant interaction effects between these distinctive subsets, driving the firm's overall effectiveness in sensing and seizing novel and innovative external opportunities. In order to explore potential interdependencies and draw distinctions among the different dynamic capability subsets, I integrate concepts from two theoretical perspectives that often neglect the emphasis of the other: the dynamic capability view and the social network perspective. I then test the hypothesized relationships in the context of firms actively patenting in nanotechnology. Nanotechnology innovations are multidisciplinary in nature and require search and discovery across multiple inter-organizational, scientific, geographic, industry, or technological domains by a particular firm. The findings offer support for the conceptualization of dynamic capabilities as consisting of distinct subsets of capabilities for the sensing and the seizing of external new-knowledge opportunities. The findings suggest that a firm's innovativeness in an interdisciplinary scientific field such as nanotechnology is a function of a vector of multi-dimensional, context-specific dynamic capabilities. Furthermore, the findings also suggest that there are inherent trade-offs embedded in different dimensions of dynamic capabilities when deployed across a wide range of inter-organizational relationships.

  13. Binding neutral information to emotional contexts: Brain dynamics of long-term recognition memory.

    Science.gov (United States)

    Ventura-Bort, Carlos; Löw, Andreas; Wendt, Julia; Moltó, Javier; Poy, Rosario; Dolcos, Florin; Hamm, Alfons O; Weymar, Mathias

    2016-04-01

    There is abundant evidence in memory research that emotional stimuli are better remembered than neutral stimuli. However, the effects of an emotionally charged context on memory for associated neutral elements are also important, particularly in trauma and stress-related disorders, where strong memories are often activated by neutral cues due to their emotional associations. In the present study, we used event-related potentials (ERPs) to investigate long-term recognition memory (1-week delay) for neutral objects that had been paired with emotionally arousing or neutral scenes during encoding. Context effects were clearly evident in the ERPs: an early frontal ERP old/new difference (300-500 ms) was enhanced for objects encoded in unpleasant compared to pleasant and neutral contexts, and a late central-parietal old/new difference (400-700 ms) was observed for objects paired with both pleasant and unpleasant contexts but not for items paired with neutral backgrounds. Interestingly, objects encoded in emotional contexts (and novel objects) also prompted an enhanced early frontal positivity (180-220 ms) compared to objects paired with neutral scenes, indicating early perceptual significance. The present data suggest that emotional, particularly unpleasant, backgrounds strengthen memory for items encountered within these contexts and engage automatic and explicit recognition processes. These results could help in understanding the binding mechanisms involved in the activation of trauma-related memories by neutral cues.

  14. The self-adaptation to dynamic failures for efficient virtual organization formations in grid computing context

    International Nuclear Information System (INIS)

    Han Liangxiu

    2009-01-01

    Grid computing aims to enable 'resource sharing and coordinated problem solving in dynamic, multi-institutional virtual organizations (VOs)'. However, due to the nature of heterogeneous and dynamic resources, dynamic failures in the distributed grid environment usually occur more often than in traditional computation platforms, causing VO formations to fail. In this paper, we develop a novel mechanism for self-adapting to dynamic failures during VO formations. Such a self-adaptive scheme allows an individual member of a VO to automatically find another available or replaceable member once a failure happens, and therefore lets systems recover automatically from dynamic failures. We define dynamic failure situations of a system by using two standard indicators: mean time between failures (MTBF) and mean time to recover (MTTR). We model both MTBF and MTTR as Poisson distributions. We investigate and analyze the efficiency of the proposed self-adaptation mechanism by comparing the success probability of VO formations before and after adopting it in three different cases: (1) different failure situations; (2) different organizational structures and scales; (3) different task complexities. The experimental results show that the proposed scheme can automatically adapt to dynamic failures and effectively improve dynamic VO formation performance in the event of node failures, which provides a valuable addition to the field.
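
    A minimal Monte Carlo sketch of the idea, under stated assumptions (steady-state node availability derived from MTBF/MTTR, and replacement from a spare pool standing in for the self-adaptive search); it is not the paper's implementation.

```python
import random

def formation_success(n_members, mtbf, mttr, spares=0, trials=100_000):
    """Estimate P(successful VO formation) by Monte Carlo."""
    avail = mtbf / (mtbf + mttr)          # steady-state availability of a node
    ok = 0
    for _ in range(trials):
        failures = sum(random.random() >= avail for _ in range(n_members))
        # self-adaptation stand-in: each failed member may be replaced
        if failures <= spares:
            ok += 1
    return ok / trials

print("no adaptation:", formation_success(8, mtbf=200.0, mttr=10.0))
print("with 2 spares:", formation_success(8, mtbf=200.0, mttr=10.0, spares=2))
```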

  15. A Predictive Likelihood Approach to Bayesian Averaging

    Directory of Open Access Journals (Sweden)

    Tomáš Jeřábek

    2015-01-01

    Multivariate time series forecasting is applied in a wide range of economic activities related to regional competitiveness and is the basis of almost all macroeconomic analysis. In this paper we combine multivariate density forecasts of GDP growth, inflation and real interest rates from four models: two types of Bayesian vector autoregression (BVAR) models, a New Keynesian dynamic stochastic general equilibrium (DSGE) model of a small open economy, and a DSGE-VAR model. The performance of the models is evaluated using historical data covering the domestic economy and the foreign economy, which is represented by the countries of the Eurozone. Because the forecast accuracies of the models differ, weighting schemes based on the predictive likelihood, the trace of the past MSE matrix, and model ranks are used to combine the models. The equal-weight scheme is used as a simple combination scheme. The results show that optimally combined densities are comparable to the best individual models.
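
    A small sketch of the predictive-likelihood weighting scheme mentioned above: weights proportional to each model's past predictive likelihood, followed by a linear pool of the forecast densities. Function names and data are placeholders, not the paper's code.

```python
import numpy as np

def predictive_likelihood_weights(log_pred_lik):
    """log_pred_lik: per-model log predictive likelihoods summed over the
    evaluation window; returns normalized combination weights."""
    z = np.asarray(log_pred_lik, dtype=float)
    z -= z.max()                      # guard against overflow in exp
    w = np.exp(z)
    return w / w.sum()

def pooled_density(weights, model_pdfs, x):
    """Linear opinion pool: sum_i w_i * p_i(x)."""
    return sum(w * pdf(x) for w, pdf in zip(weights, model_pdfs))

# w = predictive_likelihood_weights([-152.3, -149.8, -151.0, -150.2])
# forecast = pooled_density(w, [bvar1_pdf, bvar2_pdf, dsge_pdf, dsgevar_pdf], x)
```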

  16. Maximum likelihood versus likelihood-free quantum system identification in the atom maser

    International Nuclear Information System (INIS)

    Catana, Catalin; Kypraios, Theodore; Guţă, Mădălin

    2014-01-01

    We consider the problem of estimating a dynamical parameter of a Markovian quantum open system (the atom maser), by performing continuous-time measurements on the system's output (outgoing atoms). Two estimation methods are investigated and compared. Firstly, the maximum likelihood estimator (MLE) takes into account the full measurement data and is asymptotically optimal in terms of its mean square error. Secondly, the ‘likelihood-free’ method of approximate Bayesian computation (ABC) produces an approximation of the posterior distribution for a given set of summary statistics, by sampling trajectories at different parameter values and comparing them with the measurement data via the chosen statistics. Building on previous results which showed that atom counts are poor statistics for certain values of the Rabi angle, we apply MLE to the full measurement data and estimate its Fisher information. We then select several correlation statistics, such as waiting times and the distribution of successive identical detections, and use them as input to the ABC algorithm. The resulting posterior distribution follows the data likelihood closely, showing that the selected statistics capture ‘most’ of the statistical information about the Rabi angle.
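
    The ABC step can be illustrated with a generic rejection sampler; the simulator, summary statistics, and tolerance below are toy stand-ins, not the atom-maser model.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, n=500):
    # toy stand-in for simulating a measurement trajectory at parameter theta
    return rng.poisson(theta, size=n)

def summary(traj):
    # e.g. mean count and fraction of successive identical detections
    return np.array([traj.mean(), np.mean(traj[1:] == traj[:-1])])

def abc_rejection(observed, prior_draw, n_draws=20_000, eps=0.1):
    s_obs = summary(observed)
    accepted = [theta for theta in (prior_draw() for _ in range(n_draws))
                if np.linalg.norm(summary(simulate(theta)) - s_obs) < eps]
    return np.array(accepted)          # approximate posterior sample

# posterior = abc_rejection(data, prior_draw=lambda: rng.uniform(0.5, 2.0))
```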

  17. A maximum likelihood framework for protein design

    Directory of Open Access Journals (Sweden)

    Philippe Hervé

    2006-06-01

    Background: The aim of protein design is to predict amino-acid sequences compatible with a given target structure. Traditionally envisioned as a purely thermodynamic question, this problem can also be understood in a wider context, where additional constraints are captured by learning the sequence patterns displayed by natural proteins of known conformation. In this latter perspective, however, we still need a theoretical formalization of the question, leading to general and efficient learning methods, and allowing for the selection of fast and accurate objective functions quantifying sequence/structure compatibility. Results: We propose a formulation of the protein design problem in terms of model-based statistical inference. Our framework uses the maximum likelihood principle to optimize the unknown parameters of a statistical potential, which we call an inverse potential to contrast with classical potentials used for structure prediction. We propose an implementation based on Markov chain Monte Carlo, in which the likelihood is maximized by gradient descent and is numerically estimated by thermodynamic integration. The fit of the models is evaluated by cross-validation. We apply this to a simple pairwise contact potential, supplemented with a solvent-accessibility term, and show that the resulting models have a better predictive power than currently available pairwise potentials. Furthermore, the model comparison method presented here allows one to measure the relative contribution of each component of the potential, and to choose the optimal number of accessibility classes, which turns out to be much higher than classically considered. Conclusion: Altogether, this reformulation makes it possible to test a wide diversity of models, using different forms of potentials, or accounting for other factors than just the constraint of thermodynamic stability. Ultimately, such model-based statistical analyses may help to understand the forces…

  18. Global Innovation Systems—A conceptual framework for innovation dynamics in transnational contexts

    NARCIS (Netherlands)

    Binz, Christian; Truffer, Bernhard|info:eu-repo/dai/nl/6603148005

    2017-01-01

    This paper proposes a framework for the analysis of technological innovation processes in transnational contexts. By drawing on existing innovation system concepts and recent elaborations on the globalization of innovation, we develop a multi-scalar conceptualization of innovation systems. Two key…

  1. Context-sensitive Dynamic Ordinal Regression for Intensity Estimation of Facial Action Units

    NARCIS (Netherlands)

    Rudovic, Ognjen; Pavlovic, Vladimir; Pantic, Maja

    2015-01-01

    Modeling intensity of facial action units from spontaneously displayed facial expressions is challenging mainly because of high variability in subject-specific facial expressiveness, head-movements, illumination changes, etc. These factors make the target problem highly context-sensitive. However, …

  2. A guideline for the validation of likelihood ratio methods used for forensic evidence evaluation

    NARCIS (Netherlands)

    Meuwly, Didier; Ramos, Daniel; Haraksim, Rudolf

    2017-01-01

    This Guideline proposes a protocol for the validation of forensic evaluation methods at the source level, using the Likelihood Ratio framework as defined within the Bayes’ inference model. In the context of the inference of identity of source, the Likelihood Ratio is used to evaluate the strength of…

  3. Dynamic temperature dependence patterns in future energy demand models in the context of climate change

    International Nuclear Information System (INIS)

    Hekkenberg, M.; Moll, H.C.; Uiterkamp, A.J.M. Schoot

    2009-01-01

    Energy demand depends on outdoor temperature in a 'u'-shaped fashion. Various studies have used this temperature dependence to investigate the effects of climate change on energy demand. Such studies contain implicit or explicit assumptions describing the expected socio-economic changes that may affect future energy demand. This paper critically analyzes these implicit or explicit assumptions and their possible effect on the studies' outcomes. First, we analyze the interaction between the socio-economic structure and the temperature dependence pattern (TDP) of energy demand. We find that socio-economic changes may alter the TDP in various ways. Next, we investigate how current studies manage these dynamics in socio-economic structure. We find that many studies systematically misrepresent the possible effect of socio-economic changes on the TDP of energy demand. Finally, we assess the consequences of these misrepresentations in an energy demand model based on temperature dependence and climate scenarios. Our model results indicate that expected socio-economic dynamics generally lead to an underestimation of future energy demand in models that misrepresent such dynamics. We conclude that future energy demand models should improve the incorporation of socio-economic dynamics. We propose dynamically modeling several key parameters and using direct meteorological data instead of degree days.
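
    The u-shaped TDP can be written as a piecewise-linear (degree-day style) demand curve, with socio-economic change entering through time-varying parameters. The functional form and all numbers below are assumptions for exposition, not the paper's model.

```python
def demand(temp_c, t_heat=15.0, t_cool=22.0, base=100.0,
           heat_slope=2.0, cool_slope=3.0):
    """Daily energy demand as a piecewise-linear function of temperature."""
    if temp_c < t_heat:                # heating branch of the 'u'
        return base + heat_slope * (t_heat - temp_c)
    if temp_c > t_cool:                # cooling branch of the 'u'
        return base + cool_slope * (temp_c - t_cool)
    return base                        # comfort band: temperature-independent

# socio-economic dynamics (e.g. air-conditioning uptake) enter by letting the
# parameters drift over the scenario horizon instead of holding them fixed:
for year, slope in [(2020, 3.0), (2050, 4.5)]:
    print(year, demand(30.0, cool_slope=slope))
```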

  4. Dynamics of context and psychological well-being : the role of subjective health perceptions, personality factors and spirituality / Qambeshile Michael Temane

    OpenAIRE

    Temane, Qambeshile Michael

    2006-01-01

    There is a lacuna in the field of positive psychology as far as the conceptualisation of influences of environmental contexts on psychological well-being is concerned, and there is also a lack of credible empirical findings on the dynamics of processes involved. The aim of the current study was to test various models on the possible mediating role of subjective perceptions of health, personality factors and spirituality in the dynamics of context and psychological well-being. ...

  5. Mother-Infant Dyadic State Behaviour: Dynamic Systems in the Context of Risk

    Science.gov (United States)

    Coburn, Shayna S.; Crnic, Keith A.; Ross, Emily K.

    2015-01-01

    Dynamic systems methods offer invaluable insight into the nuances of the early parent-child relationship. This prospective study aimed to highlight the characteristics of mother-infant dyadic behavior at 12 weeks post-partum using state space grid analysis (total n = 322). We also examined whether maternal prenatal depressive symptoms and…

  6. Reexamining Demotivators and Motivators: A Longitudinal Study of Japanese Freshmen's Dynamic System in an EFL Context

    Science.gov (United States)

    Kikuchi, Keita

    2017-01-01

    Twenty Japanese university freshmen majoring in International Studies (N = 4) and Nursing (N = 16) participated in a 10-month project examining changes in their motivation. Using monthly focus group interviews and a 35-item questionnaire, the dynamic systems of various types of learners of English over two semesters were explored. Trajectories of…

  7. Dynamics of a low-density tiger population in Southeast Asia in the context of improved law enforcement.

    Science.gov (United States)

    Duangchantrasiri, Somphot; Umponjan, Mayuree; Simcharoen, Saksit; Pattanavibool, Anak; Chaiwattana, Soontorn; Maneerat, Sompoch; Kumar, N Samba; Jathanna, Devcharan; Srivathsa, Arjun; Karanth, K Ullas

    2016-06-01

    Recovering small populations of threatened species is an important global conservation strategy. Monitoring the anticipated recovery, however, often relies on uncertain abundance indices rather than on rigorous demographic estimates. To counter the severe threat from poaching of wild tigers (Panthera tigris), the Government of Thailand established an intensive patrolling system in 2005 to protect and recover its largest source population in Huai Kha Khaeng Wildlife Sanctuary. Concurrently, we assessed the dynamics of this tiger population over the next 8 years with rigorous photographic capture-recapture methods. From 2006 to 2012, we sampled across 624-1026 km² with 137-200 camera traps. Cameras deployed for 21,359 trap days yielded photographic records of 90 distinct individuals. We used closed-model Bayesian spatial capture-recapture methods to estimate tiger abundances annually. Abundance estimates were integrated with likelihood-based open-model analyses to estimate annual and overall rates of survival, recruitment, and changes in abundance. Estimates of demographic parameters fluctuated widely: annual density ranged from 1.25 to 2.01 tigers/100 km², abundance from 35 to 58 tigers, survival from 79.6% to 95.5%, and annual recruitment from 0 to 25 tigers. The number of distinct individuals photographed demonstrates the value of photographic capture-recapture methods for assessments of population dynamics in rare and elusive species that are identifiable from natural markings. Possibly because of poaching pressure, overall tiger densities at Huai Kha Khaeng were 82-90% lower than in ecologically comparable sites in India. However, intensified patrolling after 2006 appeared to reduce poaching and was correlated with marginal improvement in tiger survival and recruitment. Our results suggest that population recovery of low-density tiger populations may be slower than anticipated by current global strategies aimed at doubling the number of wild tigers.
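
    For intuition about the capture-recapture idea behind these abundance estimates, a deliberately simplified, non-spatial two-occasion estimator (Chapman's version of Lincoln-Petersen) is sketched below; the study itself used far richer Bayesian spatial capture-recapture models, and the numbers are invented for illustration.

```python
def chapman_estimate(n1, n2, m2):
    """n1, n2: animals identified on occasions 1 and 2; m2: seen on both."""
    return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1

# 30 tigers photographed in session 1, 28 in session 2, 18 in both:
print(chapman_estimate(30, 28, 18))   # about 46 individuals
```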

  8. Phase field modelling of dynamic thermal fracture in the context of irradiation damage

    CERN Document Server

    Schlüter, Alexander; Müller, Ralf; Tomut, Marilena; Trautmann, Christina; Weick, Helmut; Plate, Carolin

    2015-01-01

    This work presents a continuum mechanics approach to model fracturing processes in brittle materials that are subjected to rapidly applied high-temperature gradients. Such a type of loading typically occurs when a solid is exposed to an intense high-energy particle beam that deposits a large amount of energy into a small sample volume. Given the rapid energy deposition leading to a fast temperature increase, dynamic effects have to be considered. Our existing phase field model for dynamic fracture is thus extended in a way that allows modelling of thermally induced fracture. A finite element scheme is employed to solve the governing partial differential equations numerically. Finally, the functionality of our model is illustrated by two examples.

  9. Homogeneous development and segregation - Power dynamics in the urban context: San Jose project case of Manizales city

    International Nuclear Information System (INIS)

    Noguera de Echeverri, Ana Patricia; Gomez Sanchez, Diana Marcela

    2013-01-01

    This article seeks to show specific situations in which power is mobilized by urban dynamics in the context of development as a discourse that generates homogeneous models of the city. These models are imposed on local contexts to generate economic progress, but their implementation is linked to urban conflicts related to segregation and social exclusion. This aspect points out the inconsistency between the global discourses of development and the local conditions of communities directly facing the results of their application. The arguments presented below are the result of various investigations carried out by the research group Environmental Thought at the National University of Colombia, Manizales campus, in the context of the environmental crisis, development and the urban environment. In the period 2011-2012, we addressed the topic of the environmental and aesthetic configurations of the city of Manizales, in terms of spatial planning and urban living. This research is the most concrete support for the contextual references expressed in this article, which are based on extensive fieldwork addressing different social sectors of the city.

  10. Theoretical aspects of synthetic measurement of the development dynamics in the context of city

    Directory of Open Access Journals (Sweden)

    Zbyszko Pawlak

    2012-12-01

    Background: The paper presents the theoretical basis for a proposal to model the development dynamics of modern cities by means of a properly constructed synthetic indicator. In addition to quantifying the development of the social and economic systems of cities, its implementation allows the identification of nonlinear processes such as phase transitions, which occur e.g. under the influence of technological and social innovations. The economic and physical approach to this allows one to learn more about the nature of these processes and to devise new instruments supporting the management of urban areas under conditions of increasing competitiveness. Methods: Mathematical modeling of social and economic processes, and an economic-physical approach to the dynamics of nonlinearly developing systems. Results and conclusions: Based on the simulation research conducted, it can be concluded that the synthetic measure of the development of urban areas can be a good tool supporting city management by local authorities. The economic-physical approach to the nonlinear dynamics of urban systems marks out new areas for further research; the determination of the minimum conditions required (the level necessary to stimulate a phase transition) and the analysis of factors allowing the negative consequences of a phase transition to be avoided, especially in smaller cities, seem to be the most important ones.

  11. The Laplace Likelihood Ratio Test for Heteroscedasticity

    Directory of Open Access Journals (Sweden)

    J. Martin van Zyl

    2011-01-01

    It is shown that the likelihood ratio test for heteroscedasticity, assuming the Laplace distribution, gives good results for Gaussian and fat-tailed data. The likelihood ratio test, assuming normality, is very sensitive to any deviation from normality, especially when the observations are from a distribution with fat tails. Such a likelihood test can also be used as a robust test for a constant variance in residuals or a time series if the data is partitioned into groups.
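
    A minimal two-group version of such a test might look as follows, using the closed-form Laplace maximum-likelihood estimates; the setup is our own sketch, not the article's code.

```python
import numpy as np
from scipy.stats import chi2

def laplace_loglik(x):
    b = np.mean(np.abs(x - np.median(x)))   # MLE of the Laplace scale
    return -x.size * (np.log(2 * b) + 1)    # log-likelihood at the MLE

def lr_test_heteroscedasticity(x1, x2):
    # H0: common scale (group-specific medians); H1: separate scale per group
    d = np.concatenate([np.abs(x1 - np.median(x1)), np.abs(x2 - np.median(x2))])
    ll0 = -d.size * (np.log(2 * d.mean()) + 1)
    ll1 = laplace_loglik(x1) + laplace_loglik(x2)
    lr = 2 * (ll1 - ll0)
    return lr, chi2.sf(lr, df=1)            # asymptotic chi-square p-value

# lr, p = lr_test_heteroscedasticity(residuals_group1, residuals_group2)
```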

  12. Modeling and simulation of a controlled steam generator in the context of dynamic reliability using a Stochastic Hybrid Automaton

    International Nuclear Information System (INIS)

    Babykina, Génia; Brînzei, Nicolae; Aubry, Jean-François; Deleuze, Gilles

    2016-01-01

    The paper proposes a modeling framework to support Monte Carlo simulations of the behavior of a complex industrial system. The aim is to analyze the system dependability in the presence of random events, described by any type of probability distributions. Continuous dynamic evolutions of physical parameters are taken into account by a system of differential equations. Dynamic reliability is chosen as the theoretical framework. Based on finite state automata theory, the formal model is built by parallel composition of elementary sub-models using a bottom-up approach. Considerations of a stochastic nature lead to a model called the Stochastic Hybrid Automaton. The Scilab/Scicos open source environment is used for implementation. The case study is carried out on an example of a steam generator of a nuclear power plant. The behavior of the system is studied by exploring its trajectories. Possible system trajectories are analyzed both empirically, using the results of Monte Carlo simulations, and analytically, using the formal system model. The obtained results are shown to be relevant. The Stochastic Hybrid Automaton appears to be a suitable tool to address the dynamic reliability problem and to model real systems of high complexity; the bottom-up design provides precision and coherency of the system model. - Highlights: • A part of a nuclear power plant is modeled in the context of dynamic reliability. • Stochastic Hybrid Automaton is used as an input model for Monte Carlo simulations. • The model is formally built using a bottom-up approach. • The behavior of the system is analyzed empirically and analytically. • A formally built SHA is shown to be a suitable tool to approach dynamic reliability.

  13. Problem solving and intercultural dynamics in a PBL context: Challenges and solutions

    DEFF Research Database (Denmark)

    Brambini-Pedersen, Jan Vang; Brambini, Annalisa; Prætorius, Thim

    2018-01-01

    Recent years have witnessed an increased internationalization of universities, including AAU Copenhagen's study programs, where 25% of the students are non-Danish. This provides new opportunities and challenges for the students and teachers. This survey investigates whether Danish and non-Danish students… impact the forming phase negatively, which in turn increases the risk of intercultural conflicts. The student survey also indicates that group supervisors might be too task-focused and that they need to pay more attention to group and intercultural dynamics. For developing the PBL model further…

  14. Revisiting the body-schema concept in the context of whole-body postural-focal dynamics.

    Science.gov (United States)

    Morasso, Pietro; Casadio, Maura; Mohan, Vishwanathan; Rea, Francesco; Zenzeri, Jacopo

    2015-01-01

    The body-schema concept is revisited in the context of embodied cognition, further developing the theory formulated by Marc Jeannerod that the motor system is part of a simulation network related to action, whose function is not only to shape the motor system for preparing an action (either overt or covert) but also to provide the self with information on the feasibility and the meaning of potential actions. The proposed computational formulation is based on a dynamical system approach, which is linked to an extension of the equilibrium-point hypothesis, called Passive Motor Paradigm: this dynamical system generates goal-oriented, spatio-temporal, sensorimotor patterns, integrating a direct and inverse internal model in a multi-referential framework. The purpose of such computational model is to operate at the same time as a general synergy formation machinery for planning whole-body actions in humanoid robots and/or for predicting coordinated sensory-motor patterns in human movements. In order to illustrate the computational approach, the integration of simultaneous, even partially conflicting tasks will be analyzed in some detail with regard to postural-focal dynamics, which can be defined as the fusion of a focal task, namely reaching a target with the whole-body, and a postural task, namely maintaining overall stability.

  15. [Family dynamics and chronic illness: children with diabetes in the context of their families].

    Science.gov (United States)

    Wirlach-Bartosik, S; Schubert, M T; Freilinger, M; Schober, E

    2005-01-01

    The present study is based on the assumption of an interaction between family functioning and chronic illness. Using a systemic approach, the intra-familial situation of families with a diabetes-affected child is examined. 44 families were evaluated using a family diagnostic instrument ("Familienbögen") and compared with 31 control families with a healthy child. Furthermore, the study looked at the influence of the level of family functioning on glycemic control, as measured by HbA1c values, and vice versa. Families with a child affected by diabetes showed significantly more dysfunctional domains and higher discrepancies of the ratings in the family diagnostic instrument (p < .05). While diabetes may burden familial dynamics, it may, at the same time, offer opportunities for an improvement of family relationships. However, if physiological parameters deteriorate in the child (poor glycemic control), family problems seem to become less important. Success in the treatment of diabetes patients should therefore not only be measured by the quality of glycemic control, but also by considering psychological factors and aspects of family dynamics.

  16. Assessing the Added Value of Dynamical Downscaling in the Context of Hydrologic Implication

    Science.gov (United States)

    Lu, M.; IM, E. S.; Lee, M. H.

    2017-12-01

    There is a scientific consensus that high-resolution climate simulations downscaled by Regional Climate Models (RCMs) can provide valuable refined information over the target region. However, a significant body of hydrologic impact assessment has been performed using the climate information provided by Global Climate Models (GCMs) in spite of a fundamental spatial-scale gap. This is probably based on the assumption that the substantial biases and the spatial-scale gap in GCM raw data can simply be removed by applying statistical bias correction and spatial disaggregation. Indeed, many previous studies argue that the benefit of dynamical downscaling using RCMs is minimal when linking climate data with a hydrological model, based on comparisons of the impacts of bias-corrected GCMs and bias-corrected RCMs on hydrologic simulations. This may be true for long-term averaged climatological patterns, but it is not necessarily the case when looking into variability across the temporal spectrum. In this study, we investigate the added value of dynamical downscaling, focusing on the performance in capturing climate variability. To do this, we evaluate the performance of a distributed hydrological model over a Korean river basin using the raw output from a GCM and an RCM, and bias-corrected output from the GCM and the RCM. The impacts of climate input data on streamflow simulation are comprehensively analyzed. Acknowledgements: This research is supported by the Korea Agency for Infrastructure Technology Advancement (KAIA) grant funded by the Ministry of Land, Infrastructure and Transport (Grant 17AWMP-B083066-04).
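
    A common form of the statistical bias correction mentioned above is empirical quantile mapping; the sketch below is generic, and the variable names are placeholders rather than the study's data.

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_future):
    """Map future model values through historical model -> observed quantiles."""
    q = np.linspace(0.01, 0.99, 99)
    mq = np.quantile(model_hist, q)     # model climatology quantiles
    oq = np.quantile(obs_hist, q)       # observed climatology quantiles
    # interpolate the correction at each future value
    return np.interp(model_future, mq, oq)

# corrected = quantile_map(gcm_precip_1981_2010, gauge_precip_1981_2010,
#                          gcm_precip_2041_2070)
```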

  17. Corporate governance effect on financial distress likelihood: Evidence from Spain

    Directory of Open Access Journals (Sweden)

    Montserrat Manzaneque

    2016-01-01

    The paper explores some mechanisms of corporate governance (ownership and board characteristics) in Spanish listed companies and their impact on the likelihood of financial distress. An empirical study was conducted between 2007 and 2012 using a matched-pairs research design with 308 observations, half of them classified as distressed and half as non-distressed. Based on the previous study by Pindado, Rodrigues, and De la Torre (2008), a broader concept of bankruptcy is used to define business failure. Employing several conditional logistic models, and in line with other previous studies on bankruptcy, the results confirm that in difficult situations prior to bankruptcy, the impact of board ownership and the proportion of independent directors on the likelihood of business failure is similar to that exerted in more extreme situations. The results go one step further, showing a negative relationship between board size and the likelihood of financial distress. This result is interpreted as a means of creating diversity and improving access to information and resources, especially in contexts where ownership is highly concentrated and large shareholders have great power to influence the board structure. However, the results confirm that ownership concentration does not have a significant impact on the likelihood of financial distress in the Spanish context. It is argued that large shareholders are passive as regards enhanced monitoring of management and, alternatively, do not have enough incentives to hold back financial distress. These findings have important implications in the Spanish context, where several changes in regulatory listing requirements have been carried out with respect to corporate governance, and where there has been no empirical evidence in this regard.

  18. Compliant contact versus rigid contact: A comparison in the context of granular dynamics

    Science.gov (United States)

    Pazouki, Arman; Kwarta, Michał; Williams, Kyle; Likos, William; Serban, Radu; Jayakumar, Paramsothy; Negrut, Dan

    2017-10-01

    We summarize and numerically compare two approaches for modeling and simulating the dynamics of dry granular matter. The first one, the discrete-element method via penalty (DEM-P), is commonly used in the soft matter physics and geomechanics communities; it can be traced back to the work of Cundall and Strack [P. Cundall, Proc. Symp. ISRM, Nancy, France 1, 129 (1971); P. Cundall and O. Strack, Geotechnique 29, 47 (1979), 10.1680/geot.1979.29.1.47]. The second approach, the discrete-element method via complementarity (DEM-C), considers the grains perfectly rigid and enforces nonpenetration via complementarity conditions; it is commonly used in robotics and computer graphics applications and had two strong promoters in Moreau and Jean [J. J. Moreau, in Nonsmooth Mechanics and Applications, edited by J. J. Moreau and P. D. Panagiotopoulos (Springer, Berlin, 1988), pp. 1-82; J. J. Moreau and M. Jean, Proceedings of the Third Biennial Joint Conference on Engineering Systems and Analysis, Montpellier, France, 1996, pp. 201-208]. The DEM-P and DEM-C are manifestly unlike each other: They use different (i) approaches to model the frictional contact problem, (ii) sets of model parameters to capture the physics of interest, and (iii) classes of numerical methods to solve the differential equations that govern the dynamics of the granular material. Herein, we report numerical results for five experiments: shock wave propagation, cone penetration, direct shear, triaxial loading, and hopper flow, which we use to compare the DEM-P and DEM-C solutions. This exercise helps us reach two conclusions. First, both the DEM-P and DEM-C are predictive, i.e., they predict well the macroscale emergent behavior by capturing the dynamics at the microscale. Second, there are classes of problems for which one of the methods has an advantage. Unlike the DEM-P, the DEM-C cannot capture shock-wave propagation through granular media. However, the DEM-C is proficient at handling arbitrary grain…
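
    The core DEM-P ingredient is a penalty force law; a minimal spring-dashpot normal contact between two equal spheres is sketched below with illustrative constants (our own sketch, not the paper's code). DEM-C, by contrast, has no such force law: non-penetration is enforced exactly through complementarity conditions.

```python
import numpy as np

def normal_contact_force(pos_i, pos_j, vel_i, vel_j, radius,
                         k_n=1.0e5, g_n=50.0):
    """Penalty (spring-dashpot) normal force acting on sphere i."""
    d = pos_j - pos_i
    dist = np.linalg.norm(d)
    delta = 2.0 * radius - dist        # penetration depth
    if delta <= 0.0:
        return np.zeros(3)             # no overlap, no force
    n = d / dist                       # contact normal, points i -> j
    v_n = np.dot(vel_j - vel_i, n)     # normal relative velocity
    f = k_n * delta - g_n * v_n        # elastic + viscous contribution
    return -max(f, 0.0) * n            # repulsive only: push i away from j

# In DEM-C there is no force law like this: contact impulses come instead from
# solving a complementarity problem at each time step.
```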

  1. Uncertainty quantification of dynamic responses in the frequency domain in the context of virtual testing

    Science.gov (United States)

    Brehm, Maik; Deraemaeker, Arnaud

    2015-04-01

    For the development of innovative materials, construction types or maintenance strategies, experimental investigations are inevitable to validate theoretical approaches in practice. Numerical simulations, embedded in a general virtual testing approach, are alternatives to expensive experimental investigations. The statistical properties of the dynamic response in the frequency domain obtained from continuously measured data are often the basis for many developments, such as the optimization of damage indicators for structural health monitoring systems or the investigation of data-based frequency response function estimates. Two straightforward numerical simulation approaches exist to derive the statistics of a response due to random excitation and measurement errors. One approach is the sample-based technique, wherein for each excitation sample a time integration solution is needed. This can be computationally very demanding if a high accuracy of the statistical properties is of interest. The other approach consists in using the relationship between the excitation and the response directly in the frequency domain, wherein a weakly stationary process is assumed. This approach is inherently related to an infinite time response, which can hardly be derived from measured data. In this paper, a novel approach is proposed that overcomes the limitations of both aforementioned methods, by providing a fast analytical probabilistic framework for uncertainty quantification to determine accurately the statistics of short-time dynamic responses. It is assumed that the structural system is known and can be described by deterministic parameters. The influences of signal processing techniques, such as linear combinations, windowing, and segmentation used in Welch's method, are considered as well. The performance of the new algorithm is investigated in comparison to both previous approaches on a three-degrees-of-freedom system. The benchmark shows that the novel approach outperforms…
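
    The segmentation-and-windowing procedure whose statistical influence the paper analyzes is Welch's method; SciPy provides a reference implementation, as the short sketch below shows on a toy signal (signal and parameters are ours, for illustration).

```python
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(1)
fs = 1024.0                              # sampling rate in Hz
t = np.arange(0, 8.0, 1 / fs)
# toy 'measured' response: a 12 Hz mode buried in measurement noise
x = np.sin(2 * np.pi * 12.0 * t) + 0.5 * rng.standard_normal(t.size)

# Hann-windowed, 50%-overlapping segments averaged into one PSD estimate
f, pxx = welch(x, fs=fs, window="hann", nperseg=1024, noverlap=512)
print(f[np.argmax(pxx)])                 # peak near 12 Hz
```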

  2. DYNAMIC MATHEMATICS SOFTWARE: A QUANTITATIVE ANALYSIS IN THE CONTEXT OF THE PREPARATION OF MATH TEACHER

    Directory of Open Access Journals (Sweden)

    Olena V. Semenikhina

    2015-09-01

    The results of a pedagogical experiment to clarify how many dynamic mathematics software (DMS) packages a modern math teacher should know are described. Quantitative results of a questionnaire survey of math teachers concerning the use of DMS in the learning process are given. The preferences of the teachers are the software packages Gran and GeoGebra; the preferences of the students are GeoGebra and MathKit. It is noted that math teachers, in determining the number of DMS they should know and be able to use in the future, name 3-5 DMS, while students, future math teachers, name 5-7 DMS. Statistical processing of the results was based on the sign test. It justified the conclusion, at the 0.05 significance level, that the study of 5 DMS is needed.

  3. Influence of Housing Wall Compliance on Shock Absorbers in the Context of Vehicle Dynamics

    Science.gov (United States)

    Pulvirenti, G.; Faria, C.

    2017-10-01

    Shock absorbers play a key role in vehicle dynamics. Researchers have spent significant effort to understand the phenomena associated with this component, but there are still several issues to address, in part because new technology development and design trends continually lead to new challenges, among which weight reduction is crucial. For shock absorbers, weight reduction is related to the use of new materials (e.g. composites) or new design paradigms (e.g. more complex geometry, wall thickness, etc.). All of these are directly linked to wall-compliance values higher than current ones. The present article proposes a first analysis of the phenomena introduced by a high wall compliance, through a modelling approach and various simulations, in order to understand the resulting changes in vehicle behaviour. It is shown that high values of wall compliance lead to increased hysteresis in the force-velocity curve. However, comfort, handling and ride performances are not significantly affected by this design parameter.

  4. Essays on empirical likelihood in economics

    NARCIS (Netherlands)

    Gao, Z.

    2012-01-01

    This thesis intends to exploit the roots of empirical likelihood and its related methods in mathematical programming and computation. The roots will be connected and the connections will induce new solutions for the problems of estimation, computation, and generalization of empirical likelihood.

  5. Migration as a Context-Dependent Dynamic in a World of Global Inequalities

    Directory of Open Access Journals (Sweden)

    Jana Sládková

    2011-11-01

    Full Text Available Global migration is a topic of utmost importance in psychological research. As over 200 million people are on the move across national borders, and many more within their own countries, the processes of these migrations must be examined from different points of view and from different geographical locations. The articles in this special journal issue pointedly illustrate the role of international, national, community, and individual factors that shape these migrations. One cross-cutting theme is the importance of studying how multiple levels of context affect immigrant and migrant experiences. All six contributions, collectively, enrich the often individual-centric psychological literature. Issues of resilience and spaces of resistance emerged as a second cross-cutting theme, pointing to new directions for acculturation research and intervention. The challenge of recognizing diversity within migrant communities and among migration patterns is a third cross-cutting theme essential to address as we work toward a more equal world in which people can more freely chose whether to stay or leave their homes.

  6. Dynamics of animal movement in an ecological context: dragonfly wing damage reduces flight performance and predation success.

    Science.gov (United States)

    Combes, S A; Crall, J D; Mukherjee, S

    2010-06-23

    Much of our understanding of the control and dynamics of animal movement derives from controlled laboratory experiments. While many aspects of animal movement can be probed only in these settings, a more complete understanding of animal locomotion may be gained by linking experiments on relatively simple motions in the laboratory to studies of more complex behaviours in natural settings. To demonstrate the utility of this approach, we examined the effects of wing damage on dragonfly flight performance in both a laboratory drop-escape response and the more natural context of aerial predation. The laboratory experiment shows that hindwing area loss reduces vertical acceleration and average flight velocity, and the predation experiment demonstrates that this type of wing damage results in a significant decline in capture success. Taken together, these results suggest that wing damage may take a serious toll on wild dragonflies, potentially reducing both reproductive success and survival.

  7. Reorganization of personal identity in the context of motivational dynamics and internal dialogical activity.

    Science.gov (United States)

    Batory, Anna

    2014-08-01

    Identity is constantly constructed and reconstructed. It may be assumed that there are six fundamental motivational goals according to which it is organized: self-esteem, self-efficacy, continuity, distinctiveness, belonging, and meaning (Vignoles, 2011). Moreover, identity is shaped by its dialogical nature (Hermans, 2003; van Halen & Janssen, 2004). The longitudinal study was conducted to examine both the motivational and the dialogical basis of identity structure dynamics. The results showed that the more the identity element was connected with a sense of continuity and the more dialogical it was, the greater the perceived centrality of this element was after two months. Furthermore, the more the identity element satisfied the self-esteem and belonging motives, the more positive was the affect ascribed to it. In the behavioral domain of identity, participants more strongly manifested those identity aspects that were earlier rated as more dialogical and satisfying the motive of belonging. The results showed that the motivational underpinnings of identity along with its dialogical nature explain changes in identity structure. © 2014 Scandinavian Psychological Associations and John Wiley & Sons Ltd.

  8. Multi-year climate variability in the Southwestern United States within a context of a dynamically downscaled twentieth century reanalysis

    Science.gov (United States)

    Carrillo, Carlos M.; Castro, Christopher L.; Chang, Hsin-I.; Luong, Thang M.

    2017-12-01

    This investigation evaluates whether there is coherency in warm and cool season precipitation at the low-frequency scale that may be responsible for multi-year droughts in the US Southwest. This low-frequency climate variability at the decadal scale and longer is studied within the context of a twentieth-century reanalysis (20CR) and its dynamically-downscaled version (DD-20CR). A spectral domain matrix methods technique (Multiple-Taper-Method Singular Value Decomposition) is applied to these datasets to identify statistically significant spatiotemporal precipitation patterns for the cool (November-April) and warm (July-August) seasons. The low-frequency variability in the 20CR is evaluated by relating global to continental-scale spatiotemporal variability in moisture flux convergence (MFC) to the occurrence of multiyear droughts and pluvials in Central America, as this region has a demonstrated anti-phase relationship in low-frequency climate variability with northern Mexico and the southwestern US. By using the MFC in lieu of precipitation, this study reveals that the 20CR is able to resolve the low-frequency, multiyear climate variability well. In the context of the DD-20CR, multiyear droughts and pluvials in the southwestern US (in the early twentieth century) are significantly related to this low-frequency climate variability. The precipitation anomalies at these low-frequency timescales are in phase between the cool and warm seasons, consistent with the concept of dual-season drought, as suggested in tree-ring studies.

  9. Emotional insecurity about the community: A dynamic, within-person mediator of child adjustment in contexts of political violence.

    Science.gov (United States)

    Cummings, E Mark; Merrilees, Christine; Taylor, Laura K; Goeke-Morey, Marcie; Shirlow, Peter

    2017-02-01

    Over 1 billion children worldwide are exposed to political violence and armed conflict. The current conclusions are qualified by limited longitudinal research testing sophisticated process-oriented explanatory models for child adjustment outcomes. In this study, consistent with a developmental psychopathology perspective emphasizing the value of process-oriented longitudinal study of child adjustment in developmental and social-ecological contexts, we tested emotional insecurity about the community as a dynamic, within-person mediating process for relations between sectarian community violence and child adjustment. Specifically, this study explored children's emotional insecurity at a person-oriented level of analysis assessed over 5 consecutive years, with child gender examined as a moderator of indirect effects between sectarian community violence and child adjustment. In the context of a five-wave longitudinal research design, participants included 928 mother-child dyads in Belfast (453 boys, 475 girls) drawn from socially deprived, ethnically homogenous areas that had experienced political violence. Youth ranged in age from 10 to 20 years and were 13.24 (SD = 1.83) years old on average at the initial time point. Greater insecurity about the community measured over multiple time points mediated relations between sectarian community violence and youth's total adjustment problems. The pathway from sectarian community violence to emotional insecurity about the community was moderated by child gender, with relations to emotional insecurity about the community stronger for girls than for boys. The results suggest that ameliorating children's insecurity about community in contexts of political violence is an important goal toward improving adolescents' well-being and adjustment. These results are discussed in terms of their translational research implications, consistent with a developmental psychopathology model for the interface between basic and intervention

  10. A Comparison of Pseudo-Maximum Likelihood and Asymptotically Distribution-Free Dynamic Factor Analysis Parameter Estimation in Fitting Covariance-Structure Models to Block-Toeplitz Matrices Representing Single-Subject Multivariate Time-Series

    NARCIS (Netherlands)

    Molenaar, P.C.M.; Nesselroade, J.R.

    1998-01-01

    The study of intraindividual variability pervades empirical inquiry in virtually all subdisciplines of psychology. The statistical analysis of multivariate time-series data - a central product of intraindividual investigations - requires special modeling techniques. The dynamic factor model (DFM) …

  11. Numerical integration methods and layout improvements in the context of dynamic RNA visualization.

    Science.gov (United States)

    Shabash, Boris; Wiese, Kay C

    2017-05-30

    RNA visualization software tools have traditionally presented a static visualization of RNA molecules with limited ability for users to interact with the resulting image once it is complete. Only a few tools have allowed for dynamic structures. One such tool is jViz.RNA. Currently, jViz.RNA employs a unique method for the creation of the RNA molecule layout by mapping the RNA nucleotides into vertices in a graph, which we call the detailed graph, and then utilizes a Newtonian-mechanics-inspired system of forces to calculate a layout for the RNA molecule. The work presented here focuses on improvements to jViz.RNA that allow the drawing of RNA secondary structures according to common drawing conventions, as well as dramatic run-time performance improvements. This is done first by presenting an alternative method for mapping the RNA molecule into a graph, which we call the compressed graph, and then employing advanced numerical integration methods for the compressed graph representation. Comparing the compressed graph and detailed graph implementations, we find that the compressed graph produces results more consistent with RNA drawing conventions. However, we also find that employing the compressed graph method requires a more sophisticated initial layout to produce visualizations that would require minimal user interference. Comparing the two numerical integration methods demonstrates the higher stability of the Backward Euler method, and its resulting ability to handle much larger time steps, a high-priority feature for any software that entails user interaction. The work in this manuscript presents the preferred use of compressed graphs over detailed ones, as well as the advantages of employing the Backward Euler method over the Forward Euler method. These improvements produce more stable as well as visually aesthetic representations of RNA secondary structures. The results presented demonstrate that both the compressed graph representation and the Backward Euler method constitute substantial improvements for dynamic RNA visualization.
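
    To make the integrator comparison above concrete, the following minimal sketch (not jViz.RNA's code; the one-dimensional damped spring and all constants are illustrative assumptions) contrasts Forward and Backward Euler at a time step where only the implicit method remains stable:

        # Forward vs. backward Euler on a damped spring (unit mass):
        # the implicit method stays stable at time steps where the
        # explicit one diverges, which is why it suits interactive layouts.
        import numpy as np

        K, C, DT, STEPS = 10.0, 0.5, 0.5, 200   # stiffness, damping, step, iterations

        def forward_euler(x, v):
            for _ in range(STEPS):
                a = -K * x - C * v              # spring + damping force
                x, v = x + DT * v, v + DT * a
            return x

        def backward_euler(x, v):
            # Implicit update: [x'; v'] = [x; v] + DT * A [x'; v']
            A = np.array([[0.0, 1.0], [-K, -C]])
            M = np.linalg.inv(np.eye(2) - DT * A)
            s = np.array([x, v])
            for _ in range(STEPS):
                s = M @ s
            return s[0]

        print(forward_euler(1.0, 0.0))   # blows up: unstable at this DT
        print(backward_euler(1.0, 0.0))  # decays toward equilibrium

    In a real layout engine the same idea applies per vertex, with Backward Euler requiring one (sparse) linear solve per step in exchange for its stability.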

  12. The great triangular seismic region in eastern Asia: Thoughts on its dynamic context

    Directory of Open Access Journals (Sweden)

    Xianglin Gao

    2011-01-01

    A huge triangle-shaped tectonic region in eastern Asia plays host to numerous major earthquakes. The three boundaries of this region, which contains plateaus, mountains, and intermountain basins, are roughly the Himalayan arc, the Tianshan-Baikal, and the ∼105°E line of longitude. Within this triangular region, tectonism is intense and major deformation occurs both between crustal blocks and within most of them. Outside of this region, rigid blocks move as a whole, with relatively few major earthquakes and relatively weak Cenozoic deformation. On a large tectonic scale, the presence of this broad region of intraplate deformation results from dynamic interactions between the Indian, Philippine Sea-West Pacific, and Eurasian plates, as well as the influence of deep-level mantle flow. The Indian subcontinent, which has continued to move northwards at ∼40 mm/a since its collision with Eurasia, has plunged beneath Tibet, resulting in various movements and deformations along the Himalayan arc that diffuse over a long distance into the hinterland of Asia. The northward crustal escape of Asia from the Himalayan collisional zone turns eastwards and southeastwards along 95°-100°E longitude and defines the eastern Himalayan syntaxis. At the western Himalayan syntaxis, the Pamirs continue to move into central Asia, leading to crustal deformation and earthquakes that are largely accommodated by old EW- or NW-trending faults in the bordering areas between China, Mongolia, and Russia, and are restricted by the stable landmass northwest of the Tianshan-Altai-Baikal region. The subduction of the Philippine and Pacific plates under the Eurasian continent has generated a very long and narrow seismic zone along trenches and island arcs in the marginal seas while imposing only slight horizontal compression on the Asian continent that does not impede the eastward motion of eastern Asia. In the third dimension, there may be southeastward deep mantle flow beneath most of eastern Asia.

  13. A dinâmica familiar no contexto da crise suicida The family dynamics in the context of suicide crisis

    Directory of Open Access Journals (Sweden)

    Liara Lopes Krüger

    2010-04-01

    Families inserted in the suicide context tend to organize their relations around oppressive histories constructed over generations that hinder the development of autonomy and continuity. This paper aims at thinking systemically about the family dynamics of the crisis generated by the suicide attempt of one of its members. In this study, six families participated in a brief intervention developed on the basis of systemic theory. The data were analyzed using the Constant Comparison Method (Grounded Theory), enabling the identification of categories and the construction of hypotheses regarding the family dynamics within the context of the suicide crisis. The results show that participants are limited in their capacity to support the development of an autonomous identity, because the family dynamics identifies new opportunities for self-narration as a menace to the system of loyalties that maintains the continuity of the family, preventing the renegotiation of these codes. Suffering presents itself as an emotion that limits new exchanges, and suicidal behavior emerges as an alternative.

  14. Asymptotic Likelihood Distribution for Correlated & Constrained Systems

    CERN Document Server

    Agarwal, Ujjwal

    2016-01-01

    This report describes my work as a summer student at CERN. It discusses the asymptotic distribution of the likelihood ratio for a total of h parameters, 2 of which are constrained and correlated.

  15. Maximum-Likelihood Detection Of Noncoherent CPM

    Science.gov (United States)

    Divsalar, Dariush; Simon, Marvin K.

    1993-01-01

    Simplified detectors are proposed for use in maximum-likelihood sequence detection of symbols in an alphabet of size M transmitted by uncoded, full-response continuous phase modulation over a radio channel with additive white Gaussian noise. The structures of the receivers are derived from a particular interpretation of the maximum-likelihood metrics. The receivers include front ends whose structure depends only on M, analogous to those in receivers of coherent CPM. The parts of the receivers following the front ends have structures whose complexity depends on N.

  16. Likelihood functions for the analysis of single-molecule binned photon sequences

    Energy Technology Data Exchange (ETDEWEB)

    Gopich, Irina V., E-mail: irinag@niddk.nih.gov [Laboratory of Chemical Physics, National Institute of Diabetes and Digestive and Kidney Diseases, National Institutes of Health, Bethesda, MD 20892 (United States)

    2012-03-02

    Graphical abstract: Folding of a protein with attached fluorescent dyes, the underlying conformational trajectory of interest, and the observed binned photon trajectory. Highlights: • A sequence of photon counts can be analyzed using a likelihood function. • The exact likelihood function for a two-state kinetic model is provided. • Several approximations are considered for an arbitrary kinetic model. • Improved likelihood functions are obtained to treat sequences of FRET efficiencies. - Abstract: We consider the analysis of a class of experiments in which the number of photons in consecutive time intervals is recorded. Sequences of photon counts or, alternatively, of FRET efficiencies can be studied using likelihood-based methods. For a kinetic model of the conformational dynamics and state-dependent Poisson photon statistics, the formalism to calculate the exact likelihood that this model describes such sequences of photons or FRET efficiencies is developed. Explicit analytic expressions for the likelihood function for a two-state kinetic model are provided. The important special case when conformational dynamics are so slow that at most a single transition occurs in a time bin is considered. By making a series of approximations, we eventually recover the likelihood function used in hidden Markov models. In this way, not only is insight gained into the range of validity of this procedure, but also an improved likelihood function can be obtained.
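
    As a rough illustration of the formalism's structure, here is a sketch of a two-state likelihood computed as a product of per-bin count matrices and propagators; the rates and bin size are invented, and the state is assumed constant within a bin (the "slow dynamics" special case mentioned above), so this is not the paper's fully exact expression:

        # Likelihood of a binned photon sequence under a two-state kinetic
        # model with state-dependent Poisson counts (state assumed constant
        # within each bin). All parameter values are illustrative.
        import numpy as np
        from scipy.linalg import expm
        from scipy.stats import poisson

        k12, k21 = 2.0, 1.0        # transition rates between the two states (1/s)
        n1, n2 = 20.0, 5.0         # photon count rates in each state (photons/s)
        dt = 0.1                   # bin size (s)

        K = np.array([[-k12, k21],
                      [k12, -k21]])              # rate matrix (columns sum to zero)
        P = expm(K * dt)                         # state propagator over one bin
        peq = np.array([k21, k12]) / (k12 + k21) # equilibrium populations

        def log_likelihood(counts):
            vec, logL = peq, 0.0
            for c in counts:
                F = np.diag(poisson.pmf(c, np.array([n1, n2]) * dt))
                vec = P @ F @ vec                # weight by counts, then propagate
                norm = vec.sum()                 # renormalize to avoid underflow
                logL += np.log(norm)
                vec = vec / norm
            return logL

        print(log_likelihood([2, 1, 3, 0, 1]))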

  17. Maximum Likelihood and Restricted Likelihood Solutions in Multiple-Method Studies.

    Science.gov (United States)

    Rukhin, Andrew L

    2011-01-01

    A formulation of the problem of combining data from several sources is discussed in terms of random effects models. The unknown measurement precision is assumed not to be the same for all methods. We investigate maximum likelihood solutions in this model. By representing the likelihood equations as simultaneous polynomial equations, the exact form of the Groebner basis for their stationary points is derived when there are two methods. A parametrization of these solutions which allows their comparison is suggested. A numerical method for solving the likelihood equations is outlined, and an alternative to the maximum likelihood method, the restricted maximum likelihood, is studied. In the situation when the methods' variances are considered known, an upper bound on the between-method variance is obtained. The relationship between the likelihood equations and moment-type equations is also discussed.

  18. Context-dependent Dynamic Processes in Attention Deficit/Hyperactivity Disorder: Differentiating Common and Unique Effects of State Regulation Deficits and Delay Aversion

    NARCIS (Netherlands)

    Sonuga-Barke, Edmund J. S.; Wiersema, Jan R.; van der Meere, Jacob J.; Roeyers, Herbert

    The ability to specify differential predictions is a mark of a scientific model's value. State regulation deficits (SRD) and delay aversion (DAv) have both been hypothesized as context-dependent dynamic dysfunctions in ADHD. However, to date there has been no systematic comparison of their common and unique effects.

  19. Likelihood inference for unions of interacting discs

    DEFF Research Database (Denmark)

    Møller, Jesper; Helisová, Katarina

    To the best of our knowledge, this is the first paper which discusses likelihood inference for a random set using a germ-grain model, where the individual grains are unobservable, edge effects occur, and other complications appear. We consider the case where the grains form a disc process modelled...... is specified with respect to a given marked Poisson model (i.e. a Boolean model). We show how edge effects and other complications can be handled by considering a certain conditional likelihood. Our methodology is illustrated by analyzing Peter Diggle's heather dataset, where we discuss the results...... of simulation-based maximum likelihood inference and the effect of specifying different reference Poisson models....

  20. Maximum likelihood estimation for integrated diffusion processes

    DEFF Research Database (Denmark)

    Baltazar-Larios, Fernando; Sørensen, Michael

    We propose a method for obtaining maximum likelihood estimates of parameters in diffusion models when the data is a discrete time sample of the integral of the process, while no direct observations of the process itself are available. The data are, moreover, assumed to be contaminated...... EM-algorithm to obtain maximum likelihood estimates of the parameters in the diffusion model. As part of the algorithm, we use a recent simple method for approximate simulation of diffusion bridges. In simulation studies for the Ornstein-Uhlenbeck process and the CIR process the proposed method works...... by measurement errors. Integrated volatility is an example of this type of observations. Another example is ice-core data on oxygen isotopes used to investigate paleo-temperatures. The data can be viewed as incomplete observations of a model with a tractable likelihood function. Therefore we propose a simulated...

  1. Maintaining symmetry of simulated likelihood functions

    DEFF Research Database (Denmark)

    Andersen, Laura Mørch

    This paper suggests solutions to two different types of simulation errors related to Quasi-Monte Carlo integration. Likelihood functions which depend on standard deviations of mixed parameters are symmetric in nature. This paper shows that antithetic draws preserve this symmetry and thereby...... improves precision substantially. Another source of error is that models testing away mixing dimensions must replicate the relevant dimensions of the quasi-random draws in the simulation of the restricted likelihood. These simulation errors are ignored in the standard estimation procedures used today...
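
    The symmetry-preserving construction is simple to state in code. Below is a minimal sketch (the one-dimensional random-coefficient model and all values are invented for illustration): each quasi-random draw z is paired with -z, so the simulated likelihood is exactly symmetric in the mixing distribution:

        # Antithetic quasi-random draws for a simulated likelihood:
        # pairing z with -z preserves the symmetry of the mixing density.
        import numpy as np
        from scipy.stats import norm, qmc

        base = qmc.Halton(d=1, scramble=False).random(33).ravel()[1:]  # skip u = 0
        z = norm.ppf(base)                 # quasi-random standard normal draws
        z = np.concatenate([z, -z])        # antithetic pairing

        def simulated_likelihood(y, mu, sigma):
            # Toy mixed model: y ~ N(b, 1) with random coefficient b = mu + sigma*z;
            # the simulated likelihood averages the conditional density over draws.
            b = mu + sigma * z
            return norm.pdf(y - b).mean()

        print(simulated_likelihood(0.3, mu=0.1, sigma=0.5))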

  2. Likelihood inference for unions of interacting discs

    DEFF Research Database (Denmark)

    Møller, Jesper; Helisova, K.

    2010-01-01

    This is probably the first paper which discusses likelihood inference for a random set using a germ-grain model, where the individual grains are unobservable, edge effects occur and other complications appear. We consider the case where the grains form a disc process modelled by a marked point...... process, where the germs are the centres and the marks are the associated radii of the discs. We propose to use a recent parametric class of interacting disc process models, where the minimal sufficient statistic depends on various geometric properties of the random set, and the density is specified......-based maximum likelihood inference and the effect of specifying different reference Poisson models....

  3. Composite likelihood estimation of demographic parameters

    Directory of Open Access Journals (Sweden)

    Garrigan Daniel

    2009-11-01

    Background: Most existing likelihood-based methods for fitting historical demographic models to DNA sequence polymorphism data do not scale feasibly up to the level of whole-genome data sets. Computational economies can be achieved by incorporating two forms of pseudo-likelihood: composite and approximate likelihood methods. Composite likelihood enables scaling up to large data sets because it takes the product of marginal likelihoods as an estimator of the likelihood of the complete data set. This approach is especially useful when a large number of genomic regions constitutes the data set. Additionally, approximate likelihood methods can reduce the dimensionality of the data by summarizing the information in the original data by either a sufficient statistic, or a set of statistics. Both composite and approximate likelihood methods hold promise for analyzing large data sets or for use in situations where the underlying demographic model is complex and has many parameters. This paper considers a simple demographic model of allopatric divergence between two populations, in which one of the populations is hypothesized to have experienced a founder event, or population bottleneck. A large resequencing data set from human populations is summarized by the joint frequency spectrum, which is a matrix of the genomic frequency spectrum of derived base frequencies in two populations. A Bayesian Metropolis-coupled Markov chain Monte Carlo (MCMCMC) method for parameter estimation is developed that uses both composite and approximate likelihood methods and is applied to the three different pairwise combinations of the human population resequencing data. The accuracy of the method is also tested on data sets sampled from a simulated population model with known parameters. Results: The Bayesian MCMCMC method also estimates the ratio of effective population size for the X chromosome versus that of the autosomes. The method is shown to estimate, with reasonable accuracy, the parameters of the simulated model.
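
    The core computational trick is compact. The sketch below (with a placeholder binomial model per region, not the paper's divergence model) shows a composite log-likelihood formed as the sum of per-region marginal log-likelihoods, maximized over a parameter grid:

        # Composite likelihood: treat genomic regions as independent and sum
        # their marginal log-likelihoods. The per-region binomial model and
        # the data are invented placeholders.
        import numpy as np
        from scipy.stats import binom

        def composite_loglik(theta, regions):
            """regions: list of (derived_allele_count, sample_size) pairs."""
            return sum(binom.logpmf(k, n, theta) for k, n in regions)

        regions = [(3, 20), (7, 20), (1, 20), (5, 20)]
        grid = np.linspace(0.01, 0.99, 99)
        cl = [composite_loglik(t, regions) for t in grid]
        print("composite-ML estimate:", grid[int(np.argmax(cl))])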

  4. Likelihood-Based Inference in Nonlinear Error-Correction Models

    DEFF Research Database (Denmark)

    Kristensen, Dennis; Rahbæk, Anders

    We consider a class of vector nonlinear error correction models where the transfer function (or loadings) of the stationary relationships is nonlinear. This includes in particular the smooth transition models. A general representation theorem is given which establishes the dynamic properties...... and a linear trend in general. Gaussian likelihood-based estimators are considered for the long-run cointegration parameters and the short-run parameters. Asymptotic theory is provided for these, and it is discussed to what extent asymptotic normality and mixed normality can be found. A simulation study...

  5. Maximum Likelihood Joint Tracking and Association in Strong Clutter

    Directory of Open Access Journals (Sweden)

    Leonid I. Perlovsky

    2013-01-01

    We have developed a maximum likelihood formulation for a joint detection, tracking and association problem. An efficient non-combinatorial algorithm for this problem is developed for the case of strong clutter in radar data. By using an iterative procedure of the dynamic logic process "from vague-to-crisp" explained in the paper, the new tracker overcomes the combinatorial complexity of tracking in highly-cluttered scenarios and results in an orders-of-magnitude improvement in signal-to-clutter ratio.

  6. Water and sediment dynamics in the context of climate change and variability (Cañete river, Peru).

    Science.gov (United States)

    Rosas, Miluska; Vanacker, Veerle; Huggel, Christian; Gutierrez, Ronald R.

    2017-04-01

    Water erosion is one of the main environmental problems in Peru. The elevated rates of soil erosion are related to the rough topography of the Andes, shallow soils, a highly erosive climate and inappropriate land-use management. Agricultural activities are directly affected by the elevated soil erosion rates, either through reduced crop production and/or damage to irrigation infrastructure. Similarly, the development of water infrastructure and hydropower facilities can be negatively affected by high sedimentation rates. However, critical information about sediment production, transport and deposition is still mostly lacking. This paper focuses on sediment dynamics in the context of land use and climate change in the Peruvian Andes. Within the Peruvian Coastal Range, the catchment of the Cañete River is studied, as it plays an important role in the social and economic development of the region and provides water and energy to rural and urban areas. The lower part of the basin is an arid desert, the middle sub-humid part sustains subsistence agriculture, and the upper part of the basin is a treeless high-elevation puna landscape. Snow cover and glaciers are present at its headwaters, located above 5000 m asl. The retreat of glaciers due to climate change is expected to have an impact on water availability and on the production and mobilization of sediment within the river channels. Likewise, climate variability and land cover changes might trigger an important increase in erosion and sediment transport rates. The methodology is principally based on the analysis of sediment samples collected in the basin in the period 1998 to 2001, and the application of a water and sediment routing model. The paper presents new data on the sensitivity of water infrastructure and hydropower facilities to climate-induced changes in sediment mobilization.

  7. Efficient Bit-to-Symbol Likelihood Mappings

    Science.gov (United States)

    Moision, Bruce E.; Nakashima, Michael A.

    2010-01-01

    This innovation is an efficient algorithm designed to perform bit-to-symbol and symbol-to-bit likelihood mappings that represent a significant portion of the complexity of an error-correction-code decoder for high-order constellations. A recent implementation of the algorithm in hardware has yielded an 8-percent reduction in overall area relative to the prior design.
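
    The underlying mapping the algorithm accelerates can be written down directly. This sketch shows the straightforward (unoptimized) reference computation of per-bit log-likelihood ratios from symbol likelihoods; the constellation size and values are illustrative:

        # Symbol-to-bit likelihood mapping: each bit LLR is the log-ratio of
        # summed symbol likelihoods over symbols whose label has that bit
        # equal to 0 versus 1. Unoptimized reference version.
        import numpy as np

        def symbol_to_bit_llrs(symbol_likelihoods, bits_per_symbol):
            """symbol_likelihoods[s] = p(received | symbol with bit label s)."""
            llrs = []
            for k in range(bits_per_symbol):
                mask = 1 << k
                p0 = sum(p for s, p in enumerate(symbol_likelihoods) if not s & mask)
                p1 = sum(p for s, p in enumerate(symbol_likelihoods) if s & mask)
                llrs.append(np.log(p0 / p1))
            return llrs

        # 4 symbols (2 bits per symbol): likelihoods under each symbol hypothesis
        print(symbol_to_bit_llrs([0.5, 0.2, 0.2, 0.1], bits_per_symbol=2))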

  8. Likelihood-ratio-based biometric verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    2002-01-01

    This paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that for single-user verification the likelihood ratio is optimal.

  9. Likelihood Ratio-Based Biometric Verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    The paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that, for single-user verification, the likelihood ratio is optimal.

  10. Phylogenetic analysis using parsimony and likelihood methods.

    Science.gov (United States)

    Yang, Z

    1996-02-01

    The assumptions underlying the maximum-parsimony (MP) method of phylogenetic tree reconstruction were intuitively examined by studying the way the method works. Computer simulations were performed to corroborate the intuitive examination. Parsimony appears to involve very stringent assumptions concerning the process of sequence evolution, such as constancy of substitution rates between nucleotides, constancy of rates across nucleotide sites, and equal branch lengths in the tree. For practical data analysis, the requirement of equal branch lengths means similar substitution rates among lineages (the existence of an approximate molecular clock), relatively long interior branches, and also few species in the data. However, a small amount of evolution is neither a necessary nor a sufficient requirement of the method. The difficulties involved in the application of current statistical estimation theory to tree reconstruction were discussed, and it was suggested that the approach proposed by Felsenstein (1981, J. Mol. Evol. 17: 368-376) for topology estimation, as well as its many variations and extensions, differs fundamentally from the maximum likelihood estimation of a conventional statistical parameter. Evidence was presented showing that the Felsenstein approach does not share the asymptotic efficiency of the maximum likelihood estimator of a statistical parameter. Computer simulations were performed to study the probability that MP recovers the true tree under a hierarchy of models of nucleotide substitution; its performance relative to the likelihood method was especially noted. The results appeared to support the intuitive examination of the assumptions underlying MP. When a simple model of nucleotide substitution was assumed to generate data, the probability that MP recovers the true topology could be as high as, or even higher than, that for the likelihood method. When the assumed model became more complex and realistic, e.g., when substitution rates were

  11. Class-Based Context Quality Optimization For Context Management Frameworks

    DEFF Research Database (Denmark)

    Shawky, Ahmed; Olsen, Rasmus Løvenstein; Pedersen, Jens Myrup

    2012-01-01

    Context-awareness is a key requirement in many of today's networks, services and applications. Context Management systems are in this respect used to provide access to distributed, dynamic context information. The reliability of remotely accessed dynamic context information is challenged by network...

  12. Factors Associated with Young Adults’ Pregnancy Likelihood

    Science.gov (United States)

    Kitsantas, Panagiota; Lindley, Lisa L.; Wu, Huichuan

    2014-01-01

    OBJECTIVES: While progress has been made to reduce adolescent pregnancies in the United States, rates of unplanned pregnancy among young adults (18-29 years) remain high. In this study, we assessed factors associated with perceived likelihood of pregnancy (likelihood of getting pregnant/getting partner pregnant in the next year) among sexually experienced young adults who were not trying to get pregnant and had ever used contraceptives. METHODS: We conducted a secondary analysis of 660 young adults, 18-29 years old in the United States, from the cross-sectional National Survey of Reproductive and Contraceptive Knowledge. Logistic regression and classification tree analyses were conducted to generate profiles of young adults most likely to report anticipating a pregnancy in the next year. RESULTS: Nearly one-third (32%) of young adults indicated they believed they had at least some likelihood of becoming pregnant in the next year. Young adults who believed that avoiding pregnancy was not very important were most likely to report pregnancy likelihood (odds ratio [OR], 5.21; 95% CI, 2.80-9.69), as were young adults for whom avoiding a pregnancy was important but who were not satisfied with their current contraceptive method (OR, 3.93; 95% CI, 1.67-9.24), attended religious services frequently (OR, 3.0; 95% CI, 1.52-5.94), were uninsured (OR, 2.63; 95% CI, 1.31-5.26), and were likely to have unprotected sex in the next three months (OR, 1.77; 95% CI, 1.04-3.01). DISCUSSION: These results may help guide future research and the development of pregnancy prevention interventions targeting sexually experienced young adults. PMID:25782849

  13. Unbinned likelihood analysis of EGRET observations

    International Nuclear Information System (INIS)

    Digel, Seth W.

    2000-01-01

    We present a newly-developed likelihood analysis method for EGRET data that defines the likelihood function without binning the photon data or averaging the instrumental response functions. The standard likelihood analysis applied to EGRET data requires the photons to be binned spatially and in energy, and the point-spread functions to be averaged over energy and inclination angle. The full-width at half-maximum of the point-spread function increases by about 40% from on-axis to 30° inclination and, depending on the binning in energy, can vary by more than that within a single energy bin. The new unbinned method avoids the loss of information that binning and averaging cause and can properly analyze regions where EGRET viewing periods overlap and photons with different inclination angles would otherwise be combined in the same bin. In the poster, we describe the unbinned analysis method and compare its sensitivity with binned analysis for detecting point sources in EGRET data
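
    The unbinned construction is the extended likelihood of an inhomogeneous Poisson process: a sum of log intensities at the observed photon coordinates minus the integrated intensity. The sketch below uses a one-dimensional toy model with a Gaussian point-spread function, not EGRET's actual response:

        # Unbinned (extended) log-likelihood for a point source on a flat
        # background: sum of log intensities at photon positions minus the
        # expected total count. 1-D toy model; values are illustrative.
        import numpy as np

        def unbinned_loglik(photon_x, src_amp, src_pos, bkg_rate, psf_sigma, roi=(-5, 5)):
            psf = (np.exp(-0.5 * ((photon_x - src_pos) / psf_sigma) ** 2)
                   / (psf_sigma * np.sqrt(2 * np.pi)))
            intensity = bkg_rate + src_amp * psf
            expected = bkg_rate * (roi[1] - roi[0]) + src_amp  # integrated intensity
            return np.sum(np.log(intensity)) - expected

        photons = np.array([-0.3, 0.1, 0.2, 1.7, -2.4, 0.05])
        print(unbinned_loglik(photons, src_amp=4.0, src_pos=0.0,
                              bkg_rate=0.2, psf_sigma=0.3))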

  14. Superfast maximum-likelihood reconstruction for quantum tomography

    Science.gov (United States)

    Shang, Jiangwei; Zhang, Zhengyun; Ng, Hui Khoon

    2017-06-01

    Conventional methods for computing maximum-likelihood estimators (MLE) often converge slowly in practical situations, leading to a search for simplifying methods that rely on additional assumptions for their validity. In this work, we provide a fast and reliable algorithm for maximum-likelihood reconstruction that avoids this slow convergence. Our method utilizes a state-of-the-art convex optimization scheme, an accelerated projected-gradient method, that allows one to accommodate the quantum nature of the problem in a different way than in the standard methods. We demonstrate the power of our approach by comparing its performance with other algorithms for n-qubit state tomography. In particular, an eight-qubit situation that purportedly took weeks of computation time in 2005 can now be completed in under a minute for a single set of data, with far higher accuracy than previously possible. This refutes the common claim that MLE reconstruction is slow and reduces the need for alternative methods that often come with difficult-to-verify assumptions. In fact, recent methods assuming Gaussian statistics or relying on compressed sensing ideas are demonstrably inapplicable for the situation under consideration here. Our algorithm can be applied to general optimization problems over the quantum state space; the philosophy of projected gradients can further be utilized for optimization contexts with general constraints.
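
    For orientation, here is a much-simplified projected-gradient sketch of the same idea (without the paper's acceleration): ascend the log-likelihood and project back onto the density-matrix set by projecting the eigenvalues onto the probability simplex. The single-qubit POVM and frequencies are invented for illustration:

        # Simplified (non-accelerated) projected-gradient ML tomography.
        import numpy as np

        def project_to_density_matrix(H):
            H = (H + H.conj().T) / 2
            w, V = np.linalg.eigh(H)
            u = np.sort(w)[::-1]                      # simplex projection of
            css = np.cumsum(u)                        # the eigenvalue vector
            k = np.max(np.where(u - (css - 1) / np.arange(1, len(u) + 1) > 0)[0]) + 1
            w = np.maximum(w - (css[k - 1] - 1) / k, 0)
            return (V * w) @ V.conj().T

        def ml_tomography(povm, freqs, dim, steps=500, lr=0.1):
            rho = np.eye(dim, dtype=complex) / dim
            for _ in range(steps):
                probs = np.maximum(np.real([np.trace(E @ rho) for E in povm]), 1e-12)
                grad = sum(f / p * E for E, f, p in zip(povm, freqs, probs))
                rho = project_to_density_matrix(rho + lr * grad)  # ascend + project
            return rho

        # Single qubit measured along x, y, z (six effects, each axis weighted 1/3)
        I2 = np.eye(2)
        sx = np.array([[0, 1], [1, 0]], dtype=complex)
        sy = np.array([[0, -1j], [1j, 0]])
        sz = np.diag([1.0, -1.0]).astype(complex)
        povm = [(I2 + s * P) / 6 for P in (sx, sy, sz) for s in (+1, -1)]
        freqs = np.array([0.30, 0.0333, 0.1667, 0.1667, 0.25, 0.0833])
        print(np.round(ml_tomography(povm, freqs, dim=2), 3))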

  15. Corporate brand extensions based on the purchase likelihood: governance implications

    Directory of Open Access Journals (Sweden)

    Spyridon Goumas

    2018-03-01

    This paper examines the purchase likelihood of hypothetical service brand extensions from product companies focusing on consumer electronics, based on sector categorization and perceptions of fit between the existing product category and the image of the company. Prior research has recognized that the level of brand knowledge eases the transference of associations and affect to the new products. Similarity to the existing products of the parent company and perceived image also influence the success of brand extensions. However, sector categorization may interfere with this relationship. The purpose of this study is to examine Greek consumers' attitudes towards hypothetical brand extensions, and how these are affected by consumers' existing knowledge about the brand, sector categorization and perceptions of image and category fit of cross-sector extensions. This aim is examined in the context of technological categories, where lesser-known companies exhibited significant purchase likelihood and, in contradiction with the existing literature, service companies did not perform as positively as expected. Additional insights to the existing literature about sector categorization are provided. The effect of both image and category fit is also examined, and predictions regarding the effect of each are made.

  16. Estimation of stochastic frontier models with fixed-effects through Monte Carlo Maximum Likelihood

    NARCIS (Netherlands)

    Emvalomatis, G.; Stefanou, S.E.; Oude Lansink, A.G.J.M.

    2011-01-01

    Estimation of nonlinear fixed-effects models is plagued by the incidental parameters problem. This paper proposes a procedure for choosing appropriate densities for integrating the incidental parameters from the likelihood function in a general context. The densities are based on priors that are

  17. Maximum likelihood estimation for Cox's regression model under nested case-control sampling

    DEFF Research Database (Denmark)

    Scheike, Thomas; Juul, Anders

    2004-01-01

    Nested case-control sampling is designed to reduce the costs of large cohort studies. It is important to estimate the parameters of interest as efficiently as possible. We present a new maximum likelihood estimator (MLE) for nested case-control sampling in the context of Cox's proportional hazards model.

  18. Dimension-Independent Likelihood-Informed MCMC

    KAUST Repository

    Cui, Tiangang; Law, Kody; Marzouk, Youssef

    2015-01-01

    Many Bayesian inference problems require exploring the posterior distribution of high-dimensional parameters, which in principle can be described as functions. By exploiting low-dimensional structure in the change from prior to posterior [distributions], we introduce a suite of MCMC samplers that can adapt to the complex structure of the posterior distribution, yet are well-defined on function space. Posterior sampling in nonlinear inverse problems arising from various partial differential equations and also a stochastic differential equation are used to demonstrate the efficiency of these dimension-independent likelihood-informed samplers.

  19. Multi-Channel Maximum Likelihood Pitch Estimation

    DEFF Research Database (Denmark)

    Christensen, Mads Græsbøll

    2012-01-01

    In this paper, a method for multi-channel pitch estimation is proposed. The method is a maximum likelihood estimator and is based on a parametric model where the signals in the various channels share the same fundamental frequency but can have different amplitudes, phases, and noise characteristics....... This essentially means that the model allows for different conditions in the various channels, like different signal-to-noise ratios, microphone characteristics and reverberation. Moreover, the method does not assume that a certain array structure is used but rather relies on a more general model and is hence...
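
    As a concrete (much-simplified) rendering of this idea, the sketch below scans candidate fundamental frequencies and sums per-channel Gaussian profile log-likelihoods, letting each channel have its own harmonic amplitudes, phases and noise level; the two-channel test signal is synthetic and all parameters are illustrative:

        # Multi-channel ML pitch estimation under a harmonic model: per
        # candidate f0, fit each channel's harmonic amplitudes by least
        # squares and sum channel log-likelihoods (channel-specific noise).
        import numpy as np

        fs, f_true = 8000.0, 220.0
        n = np.arange(400)
        rng = np.random.default_rng(0)

        def make_channel(gain, phase, noise):
            s = (np.sin(2 * np.pi * f_true * n / fs + phase)
                 + 0.5 * np.sin(2 * np.pi * 2 * f_true * n / fs))
            return gain * s + noise * rng.standard_normal(n.size)

        channels = [make_channel(1.0, 0.0, 0.3), make_channel(0.4, 1.1, 0.1)]

        def pitch_loglik(f0, L=2):
            total = 0.0
            for x in channels:
                t = np.arange(len(x))
                Z = np.column_stack([f(2 * np.pi * f0 * l * t / fs)
                                     for l in range(1, L + 1) for f in (np.cos, np.sin)])
                amps, *_ = np.linalg.lstsq(Z, x, rcond=None)
                sigma2 = np.mean((x - Z @ amps) ** 2)     # channel noise estimate
                total += -0.5 * len(x) * np.log(sigma2)   # profiled Gaussian loglik
            return total

        grid = np.arange(100.0, 400.0, 1.0)
        print("estimated f0:", grid[np.argmax([pitch_loglik(f) for f in grid])])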

  20. Dimension-Independent Likelihood-Informed MCMC

    KAUST Repository

    Cui, Tiangang

    2015-01-07

    Many Bayesian inference problems require exploring the posterior distribution of high-dimensional parameters, which in principle can be described as functions. By exploiting low-dimensional structure in the change from prior to posterior [distributions], we introduce a suite of MCMC samplers that can adapt to the complex structure of the posterior distribution, yet are well-defined on function space. Posterior sampling in nonlinear inverse problems arising from various partial differential equations and also a stochastic differential equation are used to demonstrate the efficiency of these dimension-independent likelihood-informed samplers.

  1. Approximate maximum parsimony and ancestral maximum likelihood.

    Science.gov (United States)

    Alon, Noga; Chor, Benny; Pardi, Fabio; Rapoport, Anat

    2010-01-01

    We explore the maximum parsimony (MP) and ancestral maximum likelihood (AML) criteria in phylogenetic tree reconstruction. Both problems are NP-hard, so we seek approximate solutions. We formulate the two problems as Steiner tree problems under appropriate distances. The gist of our approach is the succinct characterization of Steiner trees for a small number of leaves for the two distances. This enables the use of known Steiner tree approximation algorithms. The approach leads to a 16/9 approximation ratio for AML and asymptotically to a 1.55 approximation ratio for MP.

  2. Population dynamics throughout the urban context: A case study in sub-Saharan Africa utilizing remotely sensed imagery and GIS

    Science.gov (United States)

    Benza, Magdalena

    The characteristics of places where people live and work play an important role in explaining complex social, political, economic and demographic processes. In sub-Saharan Africa rapid urban growth combined with rising poverty is creating diverse urban environments inhabited by people with a wide variety of lifestyles. This research examines how spatial patterns of land cover in a southern portion of the West African country of Ghana are associated with particular characteristics of family organization and reproduction decisions. Satellite imagery and landscape metrics are used to create an urban context definition based on landscape patterns using a gradient approach. Census data are used to estimate fertility levels and household structure, and the association between urban context, household composition and fertility levels is modeled through OLS regression, spatial autoregressive models and geographically weighted regression. Results indicate that there are significant differences in fertility levels between different urban contexts, with below average fertility levels found in the most urbanized end of the urban context definition and above average fertility levels found on the opposite end. The spatial patterns identified in the association between urban context and fertility levels indicate that, within the city areas with lower fertility have significant impacts on the reproductive levels of adjacent neighborhoods. Findings also indicate that there are clear patterns that link urban context to living arrangements and fertility levels. Female- and single-headed households are associated with below average fertility levels, a result that connects dropping fertility levels with the spread of smaller nuclear households in developing countries. At the same time, larger extended family households are linked to below average fertility levels for highly clustered areas, a finding that points to the prevalence of extended family housing in the West African city.

  3. Anticipating cognitive effort: roles of perceived error-likelihood and time demands.

    Science.gov (United States)

    Dunn, Timothy L; Inzlicht, Michael; Risko, Evan F

    2017-11-13

    Why are some actions evaluated as effortful? In the present set of experiments we address this question by examining individuals' perception of effort when faced with a trade-off between two putative cognitive costs: how much time a task takes vs. how error-prone it is. Specifically, we were interested in whether individuals anticipate engaging in a small amount of hard work (i.e., low time requirement, but high error-likelihood) vs. a large amount of easy work (i.e., high time requirement, but low error-likelihood) as being more effortful. In between-subject designs, Experiments 1 through 3 demonstrated that individuals anticipate options that are high in perceived error-likelihood (yet less time consuming) as more effortful than options that are perceived to be more time consuming (yet low in error-likelihood). Further, when asked to evaluate which of the two tasks was (a) more effortful, (b) more error-prone, and (c) more time consuming, effort-based and error-based choices closely tracked one another, but this was not the case for time-based choices. Utilizing a within-subject design, Experiment 4 demonstrated an overall pattern of judgments similar to that of Experiments 1 through 3. However, judgments of error-likelihood and time demand predicted effort judgments to a similar degree. Results are discussed within the context of extant accounts of cognitive control, with considerations of how error-likelihood and time demands may independently and conjunctively factor into judgments of cognitive effort.

  4. Simulation-based marginal likelihood for cluster strong lensing cosmology

    Science.gov (United States)

    Killedar, M.; Borgani, S.; Fabjan, D.; Dolag, K.; Granato, G.; Meneghetti, M.; Planelles, S.; Ragone-Figueroa, C.

    2018-01-01

    Comparisons between observed and predicted strong lensing properties of galaxy clusters have been routinely used to claim either tension or consistency with Λ cold dark matter cosmology. However, standard approaches to such cosmological tests are unable to quantify the preference for one cosmology over another. We advocate approximating the relevant Bayes factor using a marginal likelihood that is based on the following summary statistic: the posterior probability distribution function for the parameters of the scaling relation between Einstein radii and cluster mass, α and β. We demonstrate, for the first time, a method of estimating the marginal likelihood using the X-ray selected z > 0.5 Massive Cluster Survey clusters as a case in point and employing both N-body and hydrodynamic simulations of clusters. We investigate the uncertainty in this estimate and consequential ability to compare competing cosmologies, which arises from incomplete descriptions of baryonic processes, discrepancies in cluster selection criteria, redshift distribution and dynamical state. The relation between triaxial cluster masses at various overdensities provides a promising alternative to the strong lensing test.

  5. Morphological Analysis in Context versus Isolation: Use of a Dynamic Assessment Task with School-Age Children

    Science.gov (United States)

    Ram, Gayatri; Marinellie, Sally A.; Benigno, Joann; McCarthy, John

    2013-01-01

    Purpose: The current study investigated the ability of typically developing children in Grades 3 and 5 to use morphological analysis to determine the meanings of derived words with and without context clues. Also of interest was the relation between children's reading practices and their performance in determining the meanings of derived words.…

  6. Types and Dynamics of Gendered Space: A Case of Emirati Female Learners in a Single-Gender Context

    Science.gov (United States)

    Alzeer, Gergana

    2018-01-01

    This article is concerned with gendered spaces as they emerge from exploring Emirati female learners' spatiality in a single-gender context. By conducting ethnographic research and utilising Lefebvre's triad of perceived, conceived and lived space for the analysis and categorisation of students' spaces, three types of gendered spaces emerged:…

  7. Transfer Entropy as a Log-Likelihood Ratio

    Science.gov (United States)

    Barnett, Lionel; Bossomaier, Terry

    2012-09-01

    Transfer entropy, an information-theoretic measure of time-directed information transfer between joint processes, has steadily gained popularity in the analysis of complex stochastic dynamics in diverse fields, including the neurosciences, ecology, climatology, and econometrics. We show that for a broad class of predictive models, the log-likelihood ratio test statistic for the null hypothesis of zero transfer entropy is a consistent estimator for the transfer entropy itself. For finite Markov chains, furthermore, no explicit model is required. In the general case, an asymptotic χ2 distribution is established for the transfer entropy estimator. The result generalizes the equivalence in the Gaussian case of transfer entropy and Granger causality, a statistical notion of causal influence based on prediction via vector autoregression, and establishes a fundamental connection between directed information transfer and causality in the Wiener-Granger sense.
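
    In the Gaussian/linear case this equivalence is easy to exercise numerically: the transfer entropy from X to Y is half the log ratio of residual variances of two autoregressions, and 2N times it is the log-likelihood ratio statistic with an asymptotic χ2 distribution. A minimal sketch with invented lag-1 dynamics:

        # Transfer entropy (X -> Y) as a log-likelihood ratio in the
        # Gaussian case: compare AR models of Y with and without X's past.
        import numpy as np
        from scipy.stats import chi2

        rng = np.random.default_rng(1)
        T = 5000
        x = rng.standard_normal(T)
        y = np.zeros(T)
        for t in range(1, T):                  # X drives Y with one-step delay
            y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + rng.standard_normal()

        def resid_var(target, regressors):
            Z = np.column_stack(regressors)
            beta, *_ = np.linalg.lstsq(Z, target, rcond=None)
            return np.mean((target - Z @ beta) ** 2)

        full = resid_var(y[1:], [np.ones(T - 1), y[:-1], x[:-1]])
        restricted = resid_var(y[1:], [np.ones(T - 1), y[:-1]])
        te = 0.5 * np.log(restricted / full)   # transfer entropy estimate (nats)
        lr_stat = 2 * (T - 1) * te             # log-likelihood ratio statistic
        print("TE =", te, " p =", chi2.sf(lr_stat, df=1))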

  8. Participatory Climate Research in a Dynamic Urban Context: Activities of the Consortium for Climate Risk in the Urban Northeast (CCRUN)

    Science.gov (United States)

    Horton, Radley M.; Bader, Daniel A.; Montalto, Franco; Solecki, William

    2016-01-01

    The Consortium for Climate Risk in the Urban Northeast (CCRUN), one of ten NOAA-RISAs, supports resilience efforts in the urban corridor stretching from Philadelphia to Boston. Challenges and opportunities include the diverse set of needs in broad urban contexts, as well as the integration of interdisciplinary perspectives. CCRUN is addressing these challenges through strategies including: 1) the development of an integrated project framework, 2) stakeholder surveys, 3) leveraging extreme weather events as focusing opportunities, and 4) a seminar series that enables scientists and stakeholders to partner. While recognizing that the most extreme weather events will always lead to surprises (even with sound planning), CCRUN endeavors to remain flexible by facilitating place-based research in an interdisciplinary context.

  9. Maximum Likelihood Reconstruction for Magnetic Resonance Fingerprinting.

    Science.gov (United States)

    Zhao, Bo; Setsompop, Kawin; Ye, Huihui; Cauley, Stephen F; Wald, Lawrence L

    2016-08-01

    This paper introduces a statistical estimation framework for magnetic resonance (MR) fingerprinting, a recently proposed quantitative imaging paradigm. Within this framework, we present a maximum likelihood (ML) formalism to estimate multiple MR tissue parameter maps directly from highly undersampled, noisy k-space data. A novel algorithm, based on variable splitting, the alternating direction method of multipliers, and the variable projection method, is developed to solve the resulting optimization problem. Representative results from both simulations and in vivo experiments demonstrate that the proposed approach yields significantly improved accuracy in parameter estimation, compared to the conventional MR fingerprinting reconstruction. Moreover, the proposed framework provides new theoretical insights into the conventional approach. We show analytically that the conventional approach is an approximation to the ML reconstruction; more precisely, it is exactly equivalent to the first iteration of the proposed algorithm for the ML reconstruction, provided that a gridding reconstruction is used as an initialization.

  10. Subtracting and Fitting Histograms using Profile Likelihood

    CERN Document Server

    D'Almeida, F M L

    2008-01-01

    It is known that many interesting signals expected at the LHC are of unknown shape and strongly contaminated by background events. These signals will be difficult to detect during the first years of LHC operation due to the initial low luminosity. In this work, we present a method of subtracting histograms based on the profile likelihood function when the background is previously estimated by Monte Carlo events and one has low statistics. Estimators for the signal in each bin of the histogram difference are calculated, as well as limits for the signals at 68.3% confidence level, in a low-statistics case with an exponential background and a Gaussian signal. The method can also be used to fit histograms when the signal shape is known. Our results show a good performance and avoid the problem of negative values when subtracting histograms.

  11. Modelling maximum likelihood estimation of availability

    International Nuclear Information System (INIS)

    Waller, R.A.; Tietjen, G.L.; Rock, G.W.

    1975-01-01

    Suppose the performance of a nuclear powered electrical generating power plant is continuously monitored to record the sequence of failure and repairs during sustained operation. The purpose of this study is to assess one method of estimating the performance of the power plant when the measure of performance is availability. That is, we determine the probability that the plant is operational at time t. To study the availability of a power plant, we first assume statistical models for the variables, X and Y, which denote the time-to-failure and the time-to-repair variables, respectively. Once those statistical models are specified, the availability, A(t), can be expressed as a function of some or all of their parameters. Usually those parameters are unknown in practice and so A(t) is unknown. This paper discusses the maximum likelihood estimator of A(t) when the time-to-failure model for X is an exponential density with parameter λ and the time-to-repair model for Y is an exponential density with parameter θ. Under the assumption of exponential models for X and Y, it follows that the instantaneous availability at time t is A(t) = λ/(λ+θ) + [θ/(λ+θ)]·exp{−[(1/λ) + (1/θ)]t} for t > 0. Also, the steady-state availability is A(∞) = λ/(λ+θ). We use the observations from n failure-repair cycles of the power plant, say X_1, X_2, ..., X_n, Y_1, Y_2, ..., Y_n, to present the maximum likelihood estimators of A(t) and A(∞). The exact sampling distributions for those estimators and some statistical properties are discussed before a simulation model is used to determine 95% simulation intervals for A(t). The methodology is applied to two examples which approximate the operating history of two nuclear power plants. (author)
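
    In the abstract's parametrization (where A(∞) = λ/(λ+θ)), λ and θ play the roles of mean time-to-failure and mean time-to-repair, so their maximum likelihood estimates from n cycles are the sample means, and A(t) is estimated by plugging them in. A minimal sketch with simulated failure/repair data:

        # Plug-in MLE of availability A(t) from n failure-repair cycles,
        # with lambda and theta as the exponential means (illustrative data).
        import numpy as np

        def availability(t, lam, theta):
            s = lam + theta
            return lam / s + (theta / s) * np.exp(-(1 / lam + 1 / theta) * t)

        rng = np.random.default_rng(0)
        X = rng.exponential(scale=100.0, size=50)   # times to failure (hours)
        Y = rng.exponential(scale=8.0, size=50)     # times to repair (hours)
        lam_hat, theta_hat = X.mean(), Y.mean()     # MLEs of the two means

        print("A(24 h) ~", availability(24.0, lam_hat, theta_hat))
        print("A(inf) ~", lam_hat / (lam_hat + theta_hat))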

  12. Monte Carlo Maximum Likelihood Estimation for Generalized Long-Memory Time Series Models

    NARCIS (Netherlands)

    Mesters, G.; Koopman, S.J.; Ooms, M.

    2016-01-01

    An exact maximum likelihood method is developed for the estimation of parameters in a non-Gaussian nonlinear density function that depends on a latent Gaussian dynamic process with long-memory properties. Our method relies on the method of importance sampling and on a linear Gaussian approximating model …

  13. Three Levels of Push-Pull Dynamics among Chinese International Students' Decision to Study Abroad in the Canadian Context

    Science.gov (United States)

    Chen, Jun Mian

    2017-01-01

    The extant literature on student migration flows generally focus on the traditional push-pull factors of migration at the individual level. Such a tendency excludes the broader levels affecting international student mobility. This paper proposes a hybrid of three levels of push-pull dynamics (micro-individual decision-making, meso-academic…

  14. Modeling context-sensitive, dynamic activity travel behavior by linking short-and long-term responses to accumulated stress

    NARCIS (Netherlands)

    Psarra, I.; Liao, F.; Arentze, T.A.; Timmermans, H.J.P.

    2014-01-01

    Whereas existing activity-based models of travel demand simulate activity-travel patterns for a typical day, dynamic models simulate behavioral responses to endogenous or exogenous change along various time horizons. Prior research predominantly addressed a specific kind of change, which usually affected

  15. Applying the Dynamic Social Systems Model to HIV Prevention in a Rural African Context: The Maasai and the "Esoto" Dance

    Science.gov (United States)

    Siegler, Aaron J.; Mbwambo, Jessie K.; DiClemente, Ralph J.

    2013-01-01

    This study applied the Dynamic Social Systems Model (DSSM) to the issue of HIV risk among the Maasai tribe of Tanzania, using data from a cross-sectional, cluster survey among 370 randomly selected participants from Ngorongoro and Siha Districts. A culturally appropriate survey instrument was developed to explore traditions reportedly coadunate…

  16. FlowMax: A Computational Tool for Maximum Likelihood Deconvolution of CFSE Time Courses.

    Directory of Open Access Journals (Sweden)

    Maxim Nikolaievich Shokhirev

    The immune response is a concerted dynamic multi-cellular process. Upon infection, the dynamics of lymphocyte populations are an aggregate of molecular processes that determine the activation, division, and longevity of individual cells. The timing of these single-cell processes is remarkably widely distributed, with some cells undergoing their third division while others undergo their first. High cell-to-cell variability and technical noise pose challenges for interpreting popular dye-dilution experiments objectively. It remains an unresolved challenge to avoid under- or over-interpretation of such data when phenotyping gene-targeted mouse models or patient samples. Here we develop and characterize a computational methodology to parameterize a cell population model in the context of noisy dye-dilution data. To enable objective interpretation of model fits, our method estimates fit sensitivity and redundancy by stochastically sampling the solution landscape, calculating parameter sensitivities, and clustering to determine the maximum-likelihood solution ranges. Our methodology accounts for both technical and biological variability by using a cell fluorescence model as an adaptor during population model fitting, resulting in improved fit accuracy without the need for ad hoc objective functions. We have incorporated our methodology into an integrated phenotyping tool, FlowMax, and used it to analyze B cells from two NFκB knockout mice with distinct phenotypes; we not only confirm previously published findings at a fraction of the expended effort and cost, but reveal a novel phenotype of nfkb1/p105/p50 in limiting the proliferative capacity of B cells following B-cell receptor stimulation. In addition to complementing experimental work, FlowMax is suitable for high-throughput analysis of dye-dilution studies within clinical and pharmacological screens, with objective and quantitative conclusions.

  17. Maximum likelihood estimation for Cox's regression model under nested case-control sampling

    DEFF Research Database (Denmark)

    Scheike, Thomas Harder; Juul, Anders

    2004-01-01

    Nested case-control sampling is designed to reduce the costs of large cohort studies. It is important to estimate the parameters of interest as efficiently as possible. We present a new maximum likelihood estimator (MLE) for nested case-control sampling in the context of Cox's proportional hazards model. The MLE is computed by the EM-algorithm, which is easy to implement in the proportional hazards setting. Standard errors are estimated by a numerical profile likelihood approach based on EM aided differentiation. The work was motivated by a nested case-control study that hypothesized that insulin-like growth factor I was associated with ischemic heart disease. The study was based on a population of 3784 Danes and 231 cases of ischemic heart disease where controls were matched on age and gender. We illustrate the use of the MLE for these data and show how the maximum likelihood framework can be used...

  18. Multi-stage gene normalization for full-text articles with context-based species filtering for dynamic dictionary entry selection.

    Science.gov (United States)

    Tsai, Richard Tzong-Han; Lai, Po-Ting

    2011-10-03

    Gene normalization (GN) is the task of identifying the unique database IDs of genes and proteins in the literature. The best-known public competition of GN systems is the GN task of the BioCreative challenge, which has been held four times since 2003. The last two BioCreatives, II.5 & III, had two significant differences from earlier tasks: firstly, they provided full-length articles in addition to abstracts; and secondly, they included multiple species without providing species ID information. Full papers introduce more complex targets for GN processing, while the inclusion of multiple species vastly increases the potential size of dictionaries needed for GN. BioCreative III GN uses Threshold Average Precision at a median of k errors per query (TAP-k), a new measure closely related to the well-known average precision, but also reflecting the reliability of the score provided by each GN system. To use full-paper text, we employed a multi-stage GN algorithm and a ranking method which exploit information in different sections and parts of a paper. To handle the inclusion of multiple unknown species, we developed two context-based dynamic strategies to select dictionary entries related to the species that appear in the paper: section-wide and article-wide context. Our originally submitted BioCreative III system uses a static dictionary containing only the most common species entries. It already exceeds the BioCreative III average team performance by at least 24% in every evaluation. However, using our proposed dynamic dictionary strategies, we were able to further improve TAP-5, TAP-10, and TAP-20 by 16.47%, 13.57% and 6.01%, respectively, on the Gold 50 test set. Our best dynamic strategy outperforms the best BioCreative III systems in TAP-10 on the Silver 50 test set and in TAP-5 on the Silver 507 set. Our experimental results demonstrate the superiority of our proposed dynamic dictionary selection strategies over our original static strategy and most BioCreative III systems.
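
    The dynamic-dictionary idea reduces, at its core, to activating per-species gene dictionaries keyed on species cues found in the relevant context. The sketch below is a toy rendering of that selection step (the cue list and gene entries are invented placeholders, not the system's actual resources):

        # Context-based dynamic dictionary selection: activate only the
        # gene-name entries of species mentioned in the given context.
        SPECIES_CUES = {"human": 9606, "homo sapiens": 9606,
                        "mouse": 10090, "mus musculus": 10090}

        GENE_DICT = {9606: {"p53": "GeneID:7157"},      # illustrative entries
                     10090: {"p53": "GeneID:22059"}}

        def active_dictionary(context_text):
            text = context_text.lower()
            active = {}
            for cue, taxid in SPECIES_CUES.items():
                if cue in text:
                    for name, gene_id in GENE_DICT.get(taxid, {}).items():
                        active.setdefault(name, set()).add(gene_id)
            return active

        section = "We examined p53 expression in mouse embryonic fibroblasts."
        print(active_dictionary(section))   # only the mouse entry is activated

    Section-wide selection applies this to one section's text; article-wide selection applies it to the whole paper.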

  19. The Application of Dynamic Capabilities in E-commerce Innovation Context : The Implications for Chinese E-commerce companies

    OpenAIRE

    Chen, YongJia; Liang, WeiMin

    2007-01-01

    This study mainly investigated how Chinese E-commerce companies should cope with E-commerce innovation with specific dynamic capabilities. E-commerce (Electronic Commerce) innovation includes three phases of innovation based on technology and time: web-based commerce, mobile commerce (M-commerce) and ubiquitous commerce (U-commerce). These caused not only technological changes but also organizational changes. To cope with E-commerce innovation, a prerequisite is to understand the impact...

  20. Likelihood analysis of the minimal AMSB model

    Energy Technology Data Exchange (ETDEWEB)

    Bagnaschi, E.; Weiglein, G. [DESY, Hamburg (Germany); Borsato, M.; Chobanova, V.; Lucio, M.; Santos, D.M. [Universidade de Santiago de Compostela, Santiago de Compostela (Spain); Sakurai, K. [Institute for Particle Physics Phenomenology, University of Durham, Science Laboratories, Department of Physics, Durham (United Kingdom); University of Warsaw, Faculty of Physics, Institute of Theoretical Physics, Warsaw (Poland); Buchmueller, O.; Citron, M.; Costa, J.C.; Richards, A. [Imperial College, High Energy Physics Group, Blackett Laboratory, London (United Kingdom); Cavanaugh, R. [Fermi National Accelerator Laboratory, Batavia, IL (United States); University of Illinois at Chicago, Physics Department, Chicago, IL (United States); De Roeck, A. [Experimental Physics Department, CERN, Geneva (Switzerland); Antwerp University, Wilrijk (Belgium); Dolan, M.J. [School of Physics, University of Melbourne, ARC Centre of Excellence for Particle Physics at the Terascale, Melbourne (Australia); Ellis, J.R. [King' s College London, Theoretical Particle Physics and Cosmology Group, Department of Physics, London (United Kingdom); CERN, Theoretical Physics Department, Geneva (Switzerland); Flaecher, H. [University of Bristol, H.H. Wills Physics Laboratory, Bristol (United Kingdom); Heinemeyer, S. [Campus of International Excellence UAM+CSIC, Madrid (Spain); Instituto de Fisica Teorica UAM-CSIC, Madrid (Spain); Instituto de Fisica de Cantabria (CSIC-UC), Cantabria (Spain); Isidori, G. [Physik-Institut, Universitaet Zuerich, Zurich (Switzerland); Luo, F. [Kavli IPMU (WPI), UTIAS, The University of Tokyo, Kashiwa, Chiba (Japan); Olive, K.A. [School of Physics and Astronomy, University of Minnesota, William I. Fine Theoretical Physics Institute, Minneapolis, MN (United States)

    2017-04-15

    We perform a likelihood analysis of the minimal anomaly-mediated supersymmetry-breaking (mAMSB) model using constraints from cosmology and accelerator experiments. We find that either a wino-like or a Higgsino-like neutralino LSP, $\chi^0_1$, may provide the cold dark matter (DM), both with similar likelihoods. The upper limit on the DM density from Planck and other experiments enforces $m_{\chi^0_1} \lesssim 3$ TeV after the inclusion of Sommerfeld enhancement in its annihilations. If most of the cold DM density is provided by the $\chi^0_1$, the measured value of the Higgs mass favours a limited range of $\tan\beta \sim 5$ (and also $\tan\beta \sim 45$ if $\mu > 0$) but the scalar mass $m_0$ is poorly constrained. In the wino-LSP case, $m_{3/2}$ is constrained to about 900 TeV and $m_{\chi^0_1}$ to $2.9 \pm 0.1$ TeV, whereas in the Higgsino-LSP case $m_{3/2}$ has just a lower limit $\gtrsim 650$ TeV ($\gtrsim 480$ TeV) and $m_{\chi^0_1}$ is constrained to $1.12\,(1.13) \pm 0.02$ TeV in the $\mu > 0$ ($\mu < 0$) scenario. In neither case can the anomalous magnetic moment of the muon, $(g-2)_\mu$, be improved significantly relative to its Standard Model (SM) value, nor do flavour measurements constrain the model significantly, and there are poor prospects for discovering supersymmetric particles at the LHC, though there are some prospects for direct DM detection. On the other hand, if the $\chi^0_1$ contributes only a fraction of the cold DM density, future LHC $E_T$-based searches for gluinos, squarks and heavier chargino and neutralino states, as well as disappearing-track searches in the wino-like LSP region, will be relevant, and interference effects enable BR($B_{s,d} \to \mu^+\mu^-$) to agree with the data better than in the SM in the case of wino-like DM with $\mu > 0$. (orig.)

  1. Dimension-independent likelihood-informed MCMC

    KAUST Repository

    Cui, Tiangang

    2015-10-08

    Many Bayesian inference problems require exploring the posterior distribution of high-dimensional parameters that represent the discretization of an underlying function. This work introduces a family of Markov chain Monte Carlo (MCMC) samplers that can adapt to the particular structure of a posterior distribution over functions. Two distinct lines of research intersect in the methods developed here. First, we introduce a general class of operator-weighted proposal distributions that are well defined on function space, such that the performance of the resulting MCMC samplers is independent of the discretization of the function. Second, by exploiting local Hessian information and any associated low-dimensional structure in the change from prior to posterior distributions, we develop an inhomogeneous discretization scheme for the Langevin stochastic differential equation that yields operator-weighted proposals adapted to the non-Gaussian structure of the posterior. The resulting dimension-independent and likelihood-informed (DILI) MCMC samplers may be useful for a large class of high-dimensional problems where the target probability measure has a density with respect to a Gaussian reference measure. Two nonlinear inverse problems are used to demonstrate the efficiency of these DILI samplers: an elliptic PDE coefficient inverse problem and path reconstruction in a conditioned diffusion.
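
    The full DILI construction is involved, but its starting point, proposals that remain well defined as the discretization is refined, can be illustrated with the simpler preconditioned Crank-Nicolson (pCN) sampler, which DILI generalizes with likelihood-informed operator weighting. A minimal sketch, with `neg_log_like` and the prior square root as user-supplied stand-ins:

```python
# Sketch of the dimension-robust pCN sampler for a posterior with Gaussian
# prior N(0, C); DILI generalizes the scalar step size `beta` to an operator
# informed by likelihood Hessian information. `neg_log_like` and `C_sqrt`
# (a square root of the prior covariance) are user-supplied stand-ins.
import numpy as np

def pcn(neg_log_like, C_sqrt, n_dim, n_steps=5000, beta=0.2, seed=0):
    rng = np.random.default_rng(seed)
    u = C_sqrt @ rng.standard_normal(n_dim)       # start from a prior draw
    phi = neg_log_like(u)
    samples = []
    for _ in range(n_steps):
        xi = C_sqrt @ rng.standard_normal(n_dim)  # prior-distributed noise
        v = np.sqrt(1.0 - beta**2) * u + beta * xi  # pCN proposal
        phi_v = neg_log_like(v)
        # Acceptance involves only the data misfit; the prior is preserved
        # exactly, which is what makes the sampler discretization-invariant.
        if np.log(rng.uniform()) < phi - phi_v:
            u, phi = v, phi_v
        samples.append(u.copy())
    return np.asarray(samples)
```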

  2. Likelihood Analysis of Supersymmetric SU(5) GUTs

    CERN Document Server

    Bagnaschi, E.

    2017-01-01

    We perform a likelihood analysis of the constraints from accelerator experiments and astrophysical observations on supersymmetric (SUSY) models with SU(5) boundary conditions on soft SUSY-breaking parameters at the GUT scale. The parameter space of the models studied has 7 parameters: a universal gaugino mass $m_{1/2}$, distinct masses for the scalar partners of matter fermions in five- and ten-dimensional representations of SU(5), $m_5$ and $m_{10}$, and for the $\\mathbf{5}$ and $\\mathbf{\\bar 5}$ Higgs representations $m_{H_u}$ and $m_{H_d}$, a universal trilinear soft SUSY-breaking parameter $A_0$, and the ratio of Higgs vevs $\\tan \\beta$. In addition to previous constraints from direct sparticle searches, low-energy and flavour observables, we incorporate constraints based on preliminary results from 13 TeV LHC searches for jets + MET events and long-lived particles, as well as the latest PandaX-II and LUX searches for direct Dark Matter detection. In addition to previously-identified mechanisms for bringi...

  3. Reducing the likelihood of long tennis matches.

    Science.gov (United States)

    Barnett, Tristan; Brown, Alan; Pollard, Graham

    2006-01-01

    Long matches can cause problems for tournaments. For example, the starting times of subsequent matches can be substantially delayed, causing inconvenience to players, spectators, officials and television scheduling. They can even be seen as unfair in the tournament setting when the winner of a very long match, who may have negative aftereffects from such a match, plays the winner of an average or shorter length match in the next round. Long matches can also lead to injuries to the participating players. One factor that can lead to long matches is the use of the advantage set as the fifth set, as in the Australian Open, the French Open and Wimbledon. Another factor is long rallies and a greater than average number of points per game; this tends to occur more frequently on slower surfaces such as at the French Open. The mathematical method of generating functions is used to show that the likelihood of long matches can be substantially reduced by using the tiebreak game in the fifth set, or more effectively by using a new type of game, the 50-40 game, throughout the match. Key points: the cumulant generating function has nice properties for calculating the parameters of distributions in a tennis match; a final tiebreaker set reduces the length of matches, as currently used in the US Open; a new 50-40 game reduces the length of matches whilst maintaining comparable probabilities for the better player to win the match.
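
    To make the scoring-format argument concrete, the sketch below computes the server's game-winning probability under the standard deuce format (classical closed form) and under a no-deuce race; encoding the 50-40 game as "server needs four points before the receiver gets three" is our reading of the rule, not a detail taken from the paper.

```python
# Worked example: server's probability of winning one game, given a
# per-point win probability p. Standard game: classical closed form with
# deuce. The "50-40" game is encoded as a race where the server needs 4
# points before the receiver gets 3 (our reading of the rule; no deuce).
from functools import lru_cache

def p_standard_game(p):
    q = 1.0 - p
    deuce = p**2 / (1.0 - 2.0 * p * q)            # P(win from deuce)
    return p**4 * (1 + 4*q + 10*q**2) + 20 * p**3 * q**3 * deuce

@lru_cache(maxsize=None)
def p_race(p, server_needs, receiver_needs):
    if server_needs == 0:
        return 1.0
    if receiver_needs == 0:
        return 0.0
    return (p * p_race(p, server_needs - 1, receiver_needs)
            + (1 - p) * p_race(p, server_needs, receiver_needs - 1))

for p in (0.60, 0.65):
    print(p, round(p_standard_game(p), 3), round(p_race(p, 4, 3), 3))
```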

  5. Maximum likelihood window for time delay estimation

    International Nuclear Information System (INIS)

    Lee, Young Sup; Yoon, Dong Jin; Kim, Chi Yup

    2004-01-01

    Time delay estimation for the detection of leak location in underground pipelines is critically important. Because the exact leak location depends upon the precision of the time delay between sensor signals due to leak noise and the speed of elastic waves, the estimation of time delay has been one of the key issues in leak locating with the time arrival difference method. In this study, an optimal maximum likelihood window is considered to obtain a better estimation of the time delay. The method has been validated in experiments, where it provided much clearer and more precise peaks in the cross-correlation functions of leak signals. The leak location error was less than 1% of the distance between sensors; for example, the error was not greater than 3 m for 300 m long underground pipelines. In addition to the experiments, an intensive theoretical analysis in terms of signal processing is presented. The improved leak locating with the suggested method is due to the windowing effect in the frequency domain, which weights the significant frequencies.
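
    The core quantity being sharpened is the lag of the cross-correlation peak between the two sensor signals. A minimal sketch with synthetic leak-like noise follows; the paper's specific maximum likelihood frequency weighting is not reproduced here, only the baseline peak-picking step it improves.

```python
# Baseline time-delay estimation by cross-correlation peak picking, the
# step the maximum likelihood window sharpens (the ML frequency weighting
# itself is not reproduced here). Signals are synthetic leak-like noise.
import numpy as np

fs, true_lag = 1000, 37                      # sample rate (Hz), delay (samples)
rng = np.random.default_rng(1)
source = rng.standard_normal(5000)
s1 = source + 0.3 * rng.standard_normal(5000)
s2 = np.roll(source, true_lag) + 0.3 * rng.standard_normal(5000)

xcorr = np.correlate(s2 - s2.mean(), s1 - s1.mean(), mode="full")
lags = np.arange(-len(s1) + 1, len(s2))
delay = lags[np.argmax(xcorr)] / fs          # estimated delay in seconds
print(delay)  # ~0.037 s; delay * wave speed gives the leak position offset
```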

  6. Optimized Large-scale CMB Likelihood and Quadratic Maximum Likelihood Power Spectrum Estimation

    Science.gov (United States)

    Gjerløw, E.; Colombo, L. P. L.; Eriksen, H. K.; Górski, K. M.; Gruppuso, A.; Jewell, J. B.; Plaszczynski, S.; Wehus, I. K.

    2015-11-01

    We revisit the problem of exact cosmic microwave background (CMB) likelihood and power spectrum estimation with the goal of minimizing computational costs through linear compression. This idea was originally proposed for CMB purposes by Tegmark et al., and here we develop it into a fully functioning computational framework for large-scale polarization analysis, adopting WMAP as a working example. We compare five different linear bases (pixel space, harmonic space, noise covariance eigenvectors, signal-to-noise covariance eigenvectors, and signal-plus-noise covariance eigenvectors) in terms of compression efficiency, and find that the computationally most efficient basis is the signal-to-noise eigenvector basis, which is closely related to the Karhunen-Loeve and Principal Component transforms, in agreement with previous suggestions. For this basis, the information in 6836 unmasked WMAP sky map pixels can be compressed into a smaller set of 3102 modes, with a maximum error increase of any single multipole of 3.8% at ℓ ≤ 32 and a maximum shift in the mean values of a joint distribution of an amplitude-tilt model of 0.006σ. This compression reduces the computational cost of a single likelihood evaluation by a factor of 5, from 38 to 7.5 CPU seconds, and it also results in a more robust likelihood by implicitly regularizing nearly degenerate modes. Finally, we use the same compression framework to formulate a numerically stable and computationally efficient variation of the Quadratic Maximum Likelihood implementation, which requires less than 3 GB of memory and 2 CPU minutes per iteration for ℓ ≤ 32, rendering low-ℓ QML CMB power spectrum analysis fully tractable on a standard laptop.
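
    The signal-to-noise eigenvector basis can be obtained from a generalized eigenproblem between the signal and noise covariance matrices. A toy sketch, with made-up covariances standing in for the CMB signal and noise:

```python
# Toy sketch of signal-to-noise eigenmode compression. S and N below are
# invented covariance matrices standing in for the CMB signal and noise
# covariances; the generalized eigenproblem S v = lambda N v yields modes
# ordered by signal-to-noise ratio.
import numpy as np
from scipy.linalg import eigh

n = 200
rng = np.random.default_rng(2)
A = rng.standard_normal((n, n))
N = A @ A.T / n + np.eye(n)                  # toy noise covariance (SPD)
B = rng.standard_normal((n, 5))
S = B @ B.T                                   # toy low-rank signal covariance

vals, vecs = eigh(S, N)                       # ascending S/N eigenvalues
keep = vals > 1e-6                            # drop modes with no signal
proj = vecs[:, keep].T                        # compression operator

d = rng.standard_normal(n)                    # a data vector to compress
print(keep.sum(), (proj @ d).shape)           # few modes carry the signal
```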

  7. Impact of relationship dynamics and gender roles in the protection of HIV discordant heterosexual couples: an exploratory study in the Puerto Rican context.

    Science.gov (United States)

    Orengo-Aguayo, Rosaura; Pérez-Jiménez, David

    2009-03-01

    Most of the HIV/AIDS prevention efforts have not taken into consideration the context of the relationship and the gender constructs that influence relationship dynamics. These efforts have failed to view HIV prevention as a collaborative process between partners. Therefore, it is important to explore how relationship dynamics and gender constructs influence how men and women involved in an HIV discordant heterosexual relationship, visualize their role in the protection of their partners in order to design more effective prevention interventions. Five Puerto Rican HIV discordant heterosexual couples were interviewed via a qualitative semi-structured interview. The taped interviews were transcribed and analyzed using content analysis according to a set of defined categories. Women visualized their role as one of convincing their partners to use protection as well as being strong and firm in the demand of its use. Men viewed their role as one of being more supportive and willing to use protection, but recognized their resistance towards the use of condoms. Relationship dynamics such as communication and support promoted protection. Traditional and non-traditional gender roles were assumed by both men and women. Traditional gender roles inhibited protection but were also used in positive ways to promote it. Men showed a greater initiative to break with traditional gender norms. A positive relationship, marked by communication and support could serve as a facilitator in the protection and in the transformation of traditional gender norms. This points out to the need of viewing HIV/AIDS prevention as a collaborative rather than individualistic process.

  8. Likelihood analysis of supersymmetric SU(5) GUTs

    Energy Technology Data Exchange (ETDEWEB)

    Bagnaschi, E.; Weiglein, G. [DESY, Hamburg (Germany); Costa, J.C.; Buchmueller, O.; Citron, M.; Richards, A.; De Vries, K.J. [Imperial College, High Energy Physics Group, Blackett Laboratory, London (United Kingdom); Sakurai, K. [University of Durham, Science Laboratories, Department of Physics, Institute for Particle Physics Phenomenology, Durham (United Kingdom); University of Warsaw, Faculty of Physics, Institute of Theoretical Physics, Warsaw (Poland); Borsato, M.; Chobanova, V.; Lucio, M.; Martinez Santos, D. [Universidade de Santiago de Compostela, Santiago de Compostela (Spain); Cavanaugh, R. [Fermi National Accelerator Laboratory, Batavia, IL (United States); University of Illinois at Chicago, Physics Department, Chicago, IL (United States); Roeck, A. de [CERN, Experimental Physics Department, Geneva (Switzerland); Antwerp University, Wilrijk (Belgium); Dolan, M.J. [University of Melbourne, ARC Centre of Excellence for Particle Physics at the Terascale, School of Physics, Parkville (Australia); Ellis, J.R. [King' s College London, Theoretical Particle Physics and Cosmology Group, Department of Physics, London (United Kingdom); Theoretical Physics Department, CERN, Geneva 23 (Switzerland); Flaecher, H. [University of Bristol, H.H. Wills Physics Laboratory, Bristol (United Kingdom); Heinemeyer, S. [Campus of International Excellence UAM+CSIC, Cantoblanco, Madrid (Spain); Instituto de Fisica Teorica UAM-CSIC, Madrid (Spain); Instituto de Fisica de Cantabria (CSIC-UC), Santander (Spain); Isidori, G. [Universitaet Zuerich, Physik-Institut, Zurich (Switzerland); Olive, K.A. [University of Minnesota, William I. Fine Theoretical Physics Institute, School of Physics and Astronomy, Minneapolis, MN (United States)

    2017-02-15

    We perform a likelihood analysis of the constraints from accelerator experiments and astrophysical observations on supersymmetric (SUSY) models with SU(5) boundary conditions on soft SUSY-breaking parameters at the GUT scale. The parameter space of the models studied has seven parameters: a universal gaugino mass m{sub 1/2}, distinct masses for the scalar partners of matter fermions in five- and ten-dimensional representations of SU(5), m{sub 5} and m{sub 10}, and for the 5 and anti 5 Higgs representations m{sub H{sub u}} and m{sub H{sub d}}, a universal trilinear soft SUSY-breaking parameter A{sub 0}, and the ratio of Higgs vevs tan β. In addition to previous constraints from direct sparticle searches, low-energy and flavour observables, we incorporate constraints based on preliminary results from 13 TeV LHC searches for jets + E{sub T} events and long-lived particles, as well as the latest PandaX-II and LUX searches for direct Dark Matter detection. In addition to previously identified mechanisms for bringing the supersymmetric relic density into the range allowed by cosmology, we identify a novel u{sub R}/c{sub R} - χ{sup 0}{sub 1} coannihilation mechanism that appears in the supersymmetric SU(5) GUT model and discuss the role of ν{sub τ} coannihilation. We find complementarity between the prospects for direct Dark Matter detection and SUSY searches at the LHC. (orig.)

  9. Likelihood analysis of supersymmetric SU(5) GUTs

    Energy Technology Data Exchange (ETDEWEB)

    Bagnaschi, E. [DESY, Hamburg (Germany); Costa, J.C. [Imperial College, London (United Kingdom). Blackett Lab.; Sakurai, K. [Durham Univ. (United Kingdom). Inst. for Particle Physics Phenomonology; Warsaw Univ. (Poland). Inst. of Theoretical Physics; Collaboration: MasterCode Collaboration; and others

    2016-10-15

    We perform a likelihood analysis of the constraints from accelerator experiments and astrophysical observations on supersymmetric (SUSY) models with SU(5) boundary conditions on soft SUSY-breaking parameters at the GUT scale. The parameter space of the models studied has 7 parameters: a universal gaugino mass m{sub 1/2}, distinct masses for the scalar partners of matter fermions in five- and ten-dimensional representations of SU(5), m{sub 5} and m{sub 10}, and for the 5 and anti 5 Higgs representations m{sub H{sub u}} and m{sub H{sub d}}, a universal trilinear soft SUSY-breaking parameter A{sub 0}, and the ratio of Higgs vevs tan β. In addition to previous constraints from direct sparticle searches, low-energy and flavour observables, we incorporate constraints based on preliminary results from 13 TeV LHC searches for jets+E{sub T} events and long-lived particles, as well as the latest PandaX-II and LUX searches for direct Dark Matter detection. In addition to previously-identified mechanisms for bringing the supersymmetric relic density into the range allowed by cosmology, we identify a novel u{sub R}/c{sub R}-χ{sup 0}{sub 1} coannihilation mechanism that appears in the supersymmetric SU(5) GUT model and discuss the role of ν{sub τ} coannihilation. We find complementarity between the prospects for direct Dark Matter detection and SUSY searches at the LHC.

  10. Grain-scale numerical modeling of granular mechanics and fluid dynamics and application in a glacial context

    DEFF Research Database (Denmark)

    Damsgaard, Anders; Egholm, David Lundbek; Beem, Lucas H.

    The macroscopic behavior of granular materials is the result of the self-organizing complexity of the constituent grains. Granular materials are known for their ability to change phase, where each phase is characterized by distinct mechanical properties. This rich generic phenomenology has made it difficult to constrain generalized and adequate mathematical models for their mechanical behavior. Glaciers and ice streams often move by deformation of underlying melt-water saturated sediments. Glacier flow models including subglacial sediment deformation use simplified a priori assumptions for sediment ... the method imposes intense computational requirements on the computational time step. The majority of steps in the granular dynamics algorithm are massively parallel, which makes the DEM an obvious candidate for exploiting the capabilities of modern GPUs. The granular computations are coupled to a fluid...

  11. Dealing with the Empty Vehicle Movements in Personal Rapid Transit System with Batteries Constraints in a Dynamic Context

    Directory of Open Access Journals (Sweden)

    Ezzeddine Fatnassi

    2017-01-01

    The Personal Rapid Transit (PRT) system is an emerging transportation tool that relies on a set of small driverless electric vehicles to transport people on demand. Because of this on-demand characteristic, many PRT vehicles move empty, which results in a high level of wasted transportation capacity. The problem is aggravated when the PRT vehicles have limited electric battery capacity. This paper deals with the problem in a real-time context by minimizing the set of empty vehicle movements. First, a mathematical formulation to benchmark the waiting time of passengers in PRT systems is proposed. Then, a simulation model that captures the main features of the PRT system is developed. A decision support system which integrates several real-time solution strategies as well as a simulation module is proposed. The dispatching strategies are evaluated and compared based on the simulation model, and the efficiency of the method is tested through extensive test studies.

  12. How parents whose children have been conceived with donor gametes make their disclosure decision: contexts, influences, and couple dynamics.

    Science.gov (United States)

    Shehab, Dena; Duff, Julia; Pasch, Lauri A; Mac Dougall, Kirstin; Scheib, Joanna E; Nachtigall, Robert D

    2008-01-01

    To describe parents' disclosure decision-making process. In-depth ethnographic interviews. Participants were recruited from 11 medical infertility practices and 1 sperm bank in Northern California. One hundred forty-one married couples who had conceived a child using donor gametes (62 with donor sperm, 79 with donor oocytes). Husbands and wives were interviewed together and separately. Thematic analysis of interview transcripts. Ninety-five percent of couples came to a united disclosure decision, some "intuitively," but most after discussions influenced by the couples' local sociopolitical environment, professional opinion, counseling, religious and cultural background, family relationships, and individual personal, psychological, and ethical beliefs. Couples who were not initially in agreement ultimately came to a decision after one partner deferred to the wishes or opinions of the other. Deferral could reflect the result of a prior agreement, one partner's recognition of the other's experiential or emotional expertise, or direct persuasion. In disclosing couples, men frequently deferred to their wives, whereas, in nondisclosing couples, women always deferred to their husbands. Although the majority of couples were in initial agreement about disclosure, for many the disclosure decision was a complex, negotiated process reflecting a wide range of influences and contexts.

  13. Maximum likelihood estimation for cytogenetic dose-response curves

    International Nuclear Information System (INIS)

    Frome, E.L; DuFrain, R.J.

    1983-10-01

    In vitro dose-response curves are used to describe the relation between the yield of dicentric chromosome aberrations and radiation dose for human lymphocytes. The dicentric yields follow the Poisson distribution, and the expected yield depends on both the magnitude and the temporal distribution of the dose for low-LET radiation. A general dose-response model that describes this relation has been obtained by Kellerer and Rossi using the theory of dual radiation action. The yield of elementary lesions is κ[γd + g(t, τ)d²], where t is the time and d is the dose. The coefficient of the d² term is determined by the recovery function and the temporal mode of irradiation. Two special cases of practical interest are split-dose and continuous exposure experiments, and the resulting models are intrinsically nonlinear in the parameters. A general-purpose maximum likelihood estimation procedure is described and illustrated with numerical examples from both experimental designs. Poisson regression analysis is used for estimation, hypothesis testing, and regression diagnostics. Results are discussed in the context of exposure assessment procedures for both acute and chronic human radiation exposure.
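
    For an acute exposure the model reduces to a Poisson yield with mean proportional to αd + βd², which can be fitted by directly maximizing the Poisson log-likelihood. A minimal sketch with invented data (the dose points and counts below are illustrative, not from the paper):

```python
# Minimal sketch: maximum likelihood fit of an acute-exposure special case,
# Y ~ Poisson(cells * (alpha*d + beta*d^2)). Doses and counts below are
# invented for illustration, not data from the paper.
import numpy as np
from scipy.optimize import minimize

dose = np.array([0.5, 1.0, 2.0, 3.0, 4.0])        # Gy (hypothetical)
cells = np.array([5000, 5000, 4000, 3000, 2000])  # cells scored per dose
dicentrics = np.array([8, 25, 70, 120, 175])      # hypothetical counts

def negloglik(theta):
    alpha, beta = theta
    mu = cells * (alpha * dose + beta * dose**2)
    return -(dicentrics * np.log(mu) - mu).sum()  # Poisson log-lik, no const

fit = minimize(negloglik, x0=[1e-3, 1e-3],
               bounds=[(1e-8, None), (1e-8, None)])
print(fit.x)  # ML estimates of the linear and quadratic coefficients
```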

  14. Maximum likelihood estimation for cytogenetic dose-response curves

    International Nuclear Information System (INIS)

    Frome, E.L.; DuFrain, R.J.

    1986-01-01

    In vitro dose-response curves are used to describe the relation between chromosome aberrations and radiation dose for human lymphocytes. The lymphocytes are exposed to low-LET radiation, and the resulting dicentric chromosome aberrations follow the Poisson distribution. The expected yield depends on both the magnitude and the temporal distribution of the dose. A general dose-response model that describes this relation has been presented by Kellerer and Rossi (1972, Current Topics on Radiation Research Quarterly 8, 85-158; 1978, Radiation Research 75, 471-488) using the theory of dual radiation action. Two special cases of practical interest are split-dose and continuous exposure experiments, and the resulting dose-time-response models are intrinsically nonlinear in the parameters. A general-purpose maximum likelihood estimation procedure is described, and estimation for the nonlinear models is illustrated with numerical examples from both experimental designs. Poisson regression analysis is used for estimation, hypothesis testing, and regression diagnostics. Results are discussed in the context of exposure assessment procedures for both acute and chronic human radiation exposure

  16. The Dark Side of Context: Context Reinstatement Can Distort Memory.

    Science.gov (United States)

    Doss, Manoj K; Picart, Jamila K; Gallo, David A

    2018-04-01

    It is widely assumed that context reinstatement benefits memory, but our experiments revealed that context reinstatement can systematically distort memory. Participants viewed pictures of objects superimposed over scenes, and we later tested their ability to differentiate these old objects from similar new objects. Context reinstatement was manipulated by presenting objects on the reinstated or switched scene at test. Not only did context reinstatement increase correct recognition of old objects, but it also consistently increased incorrect recognition of similar objects as old ones. This false recognition effect was robust, as it was found in several experiments, occurred after both immediate and delayed testing, and persisted with high confidence even after participants were warned to avoid the distorting effects of context. To explain this memory illusion, we propose that context reinstatement increases the likelihood of confusing conceptual and perceptual information, potentially in medial temporal brain regions that integrate this information.

  17. Context-specific control and context selection in conflict tasks.

    Science.gov (United States)

    Schouppe, Nathalie; Ridderinkhof, K Richard; Verguts, Tom; Notebaert, Wim

    2014-02-01

    This study investigated whether participants prefer contexts with relatively little cognitive conflict and whether this preference is related to context-specific control. A conflict selection task was administered in which participants had to choose between two categories that contained different levels of conflict. One category was associated with 80% congruent Stroop trials and 20% incongruent Stroop trials, while the other category was associated with only 20% congruent Stroop trials and 80% incongruent Stroop trials. As predicted, participants selected the low-conflict category more frequently, indicating that participants avoid contexts with high-conflict likelihood. Furthermore, we predicted a correlation between this preference for the low-conflict category and the control implementation associated with the categories (i.e., context-specific proportion congruency effect, CSPC effect). Results however did not show such a correlation, thereby failing to support a relationship between context control and context selection. Copyright © 2013 Elsevier B.V. All rights reserved.

  18. Advancing the application of systems thinking in health: managing rural China health system development in complex and dynamic contexts.

    Science.gov (United States)

    Zhang, Xiulan; Bloom, Gerald; Xu, Xiaoxin; Chen, Lin; Liang, Xiaoyun; Wolcott, Sara J

    2014-08-26

    This paper explores the evolution of schemes for rural health finance in China as a case study of the long and complex process of health system development. It argues that the evolution of these schemes has been the outcome of the response of a large number of agents to a rapidly changing context and of efforts by the government to influence this adaptation process and achieve public health goals. The study draws on several sources of data including a review of official policy documents and academic papers and in-depth interviews with key policy actors at national level and at a sample of localities. The study identifies three major transition points associated with changes in broad development strategy and demonstrates how the adaptation of large numbers of actors to these contextual changes had a major impact on the performance of the health system. Further, it documents how the Ministry of Health viewed its role as both an advocate for the interests of health facilities and health workers and as the agency responsible for ensuring that government health system objectives were met. It is argued that a major reason for the resilience of the health system and its ability to adapt to rapid economic and institutional change was the ability of the Ministry to provide overall strategy leadership. Additionally, it postulates that a number of interest groups have emerged, which now also seek to influence the pathway of health system development. This history illustrates the complex and political nature of the management of health system development and reform. The paper concludes that governments will need to increase their capacity to analyze the health sector as a complex system and to manage change processes.

  19. Dynamic Effects of Self-Relevance and Task on the Neural Processing of Emotional Words in Context.

    Science.gov (United States)

    Fields, Eric C; Kuperberg, Gina R

    2015-01-01

    We used event-related potentials (ERPs) to examine the interactions between task, emotion, and contextual self-relevance on processing words in social vignettes. Participants read scenarios that were in either third person (other-relevant) or second person (self-relevant) and we recorded ERPs to a neutral, pleasant, or unpleasant critical word. In a previously reported study (Fields and Kuperberg, 2012) with these stimuli, participants were tasked with producing a third sentence continuing the scenario. We observed a larger LPC to emotional words than neutral words in both the self-relevant and other-relevant scenarios, but this effect was smaller in the self-relevant scenarios because the LPC was larger on the neutral words (i.e., a larger LPC to self-relevant than other-relevant neutral words). In the present work, participants simply answered comprehension questions that did not refer to the emotional aspects of the scenario. Here we observed quite a different pattern of interaction between self-relevance and emotion: the LPC was larger to emotional vs. neutral words in the self-relevant scenarios only, and there was no effect of self-relevance on neutral words. Taken together, these findings suggest that the LPC reflects a dynamic interaction between specific task demands, the emotional properties of a stimulus, and contextual self-relevance. We conclude by discussing implications and future directions for a functional theory of the emotional LPC.

  20. Applying the Dynamic Social Systems Model to HIV prevention in a rural African context: the Maasai and the esoto dance.

    Science.gov (United States)

    Siegler, Aaron J; Mbwambo, Jessie K; DiClemente, Ralph J

    2013-12-01

    This study applied the Dynamic Social Systems Model (DSSM) to the issue of HIV risk among the Maasai tribe of Tanzania, using data from a cross-sectional, cluster survey among 370 randomly selected participants from Ngorongoro and Siha Districts. A culturally appropriate survey instrument was developed to explore traditions reportedly coadunate with sexual partnership, including "wife sharing", fertility rituals, and various traditional dances. One dance, esoto, accounted for more than two thirds of participants' lifetime sexual partners (n = 10.5). The DSSM, combining structural and systems theories, was applied to systematize complex multilevel factors regarding esoto practice. Participants reported multifaceted beliefs regarding esoto; a majority viewed the dance as exciting and essential, yet most men feared social stigma and three quarters of women had experienced physical punishment for nonattendance. In multivariate logistic regression, esoto attendance was predicted by female gender (adjusted odds ratio [AOR] = 4.67, 95% confidence interval [CI] = 1.6-13.2), higher positive beliefs regarding esoto (AOR = 2.84, 95% CI = 1.9-4.2), and Maasai life cycle events (AOR = 0.06, 95% CI = 0.01-0.47). The DSSM proved useful for characterizing esoto and for revealing feedback loops that maintain esoto, thus indicating avenues for future interventions.

  2. Penalized Maximum Likelihood Estimation for univariate normal mixture distributions

    International Nuclear Information System (INIS)

    Ridolfi, A.; Idier, J.

    2001-01-01

    Due to singularities of the likelihood function, the maximum likelihood approach for the estimation of the parameters of normal mixture models is an acknowledged ill posed optimization problem. Ill posedness is solved by penalizing the likelihood function. In the Bayesian framework, it amounts to incorporating an inverted gamma prior in the likelihood function. A penalized version of the EM algorithm is derived, which is still explicit and which intrinsically assures that the estimates are not singular. Numerical evidence of the latter property is put forward with a test
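
    One standard way to realize such a penalty is an inverted-gamma prior IG(a, b) on each component variance, which turns the M-step variance update into a shrunken ratio that can never reach zero. The sketch below follows this generic MAP derivation; the (a, b) values and the exact parametrization are assumptions, not necessarily the paper's.

```python
# Sketch of penalized EM for a univariate two-component normal mixture.
# An inverted-gamma penalty IG(a, b) on each variance modifies the M-step
# so the variances cannot collapse to zero; the (a, b) values and this
# exact parametrization are generic MAP choices, not the paper's.
import numpy as np

def penalized_em(x, a=2.0, b=0.1, n_iter=200):
    n = len(x)
    w = np.array([0.5, 0.5])
    mu = np.array([x.min(), x.max()], dtype=float)
    var = np.array([x.var(), x.var()])
    for _ in range(n_iter):
        # E-step (constant factors cancel in the responsibilities)
        dens = w * np.exp(-(x[:, None] - mu)**2 / (2 * var)) / np.sqrt(var)
        r = dens / dens.sum(axis=1, keepdims=True)
        nk = r.sum(axis=0)
        # M-step with penalized variance update: never singular
        w = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        ss = (r * (x[:, None] - mu)**2).sum(axis=0)
        var = (2 * b + ss) / (2 * (a + 1) + nk)
    return w, mu, var

rng = np.random.default_rng(3)
x = np.concatenate([rng.normal(0, 1, 300), rng.normal(4, 0.5, 200)])
print(penalized_em(x))
```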

  3. How Four Scientists Integrate Thermodynamic and Kinetic Theory, Context, Analogies, and Methods in Protein-Folding and Dynamics Research: Implications for Biochemistry Instruction.

    Science.gov (United States)

    Jeffery, Kathleen A; Pelaez, Nancy; Anderson, Trevor R

    2018-01-01

    To keep biochemistry instruction current and relevant, it is crucial to expose students to cutting-edge scientific research and how experts reason about processes governed by thermodynamics and kinetics such as protein folding and dynamics. This study focuses on how experts explain their research into this topic with the intention of informing instruction. Previous research has modeled how expert biologists incorporate research methods, social or biological context, and analogies when they talk about their research on mechanisms. We used this model as a guiding framework to collect and analyze interview data from four experts. The similarities and differences that emerged from analysis indicate that all experts integrated theoretical knowledge with their research context, methods, and analogies when they explained how phenomena operate, in particular by mapping phenomena to mathematical models; they explored different processes depending on their explanatory aims, but readily transitioned between different perspectives and explanatory models; and they explained thermodynamic and kinetic concepts of relevance to protein folding in different ways that aligned with their particular research methods. We discuss how these findings have important implications for teaching and future educational research. © 2018 K. A. Jeffery et al. CBE—Life Sciences Education © 2018 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  4. Seasonal species interactions minimize the impact of species turnover on the likelihood of community persistence.

    Science.gov (United States)

    Saavedra, Serguei; Rohr, Rudolf P; Fortuna, Miguel A; Selva, Nuria; Bascompte, Jordi

    2016-04-01

    Many of the observed species interactions embedded in ecological communities are not permanent, but are characterized by temporal changes that are observed along with abiotic and biotic variations. While work has been done describing and quantifying these changes, little is known about their consequences for species coexistence. Here, we investigate the extent to which changes of species composition impact the likelihood of persistence of the predator-prey community in the highly seasonal Białowieża Primeval Forest (northeast Poland), and the extent to which seasonal changes of species interactions (predator diet) modulate the expected impact. This likelihood is estimated extending recent developments on the study of structural stability in ecological communities. We find that the observed species turnover causes the likelihood of community persistence to vary strongly between summer and winter. Importantly, we demonstrate that the observed seasonal interaction changes minimize the variation in the likelihood of persistence associated with species turnover across the year. We find that these community dynamics can be explained as the coupling of individual species to their environment by minimizing both the variation in persistence conditions and the interaction changes between seasons. Our results provide a homeostatic explanation for seasonal species interactions and suggest that monitoring the association of interaction changes with the level of variation in community dynamics can provide a good indicator of the response of species to environmental pressures.

  5. Dynamics of Metropolitan Landscapes and Daily Mobility Flows in the Italian Context. An Analysis Based on the Theory of Graphs

    Directory of Open Access Journals (Sweden)

    Amedeo Ganciu

    2018-02-01

    The distribution of services across a territory generates daily commuting flows, which strongly influence the development of the territory and often cause congestion in large areas. This negatively affects the environmental, economic and social components of the metropolitan landscape. Using graph theory, we constructed and analyzed flow networks of various typologies (by transportation mode and moving time) in the two main Italian metropolitan areas: Rome (MCR) and Milan (MCM). The analysis of these networks provided strategic information on the dynamics of the two urban macro-systems. In particular, the aims of our study were to: (i) identify the characteristics, distribution and direction of the main attractive forces within the regional systems under study; (ii) identify the main differences in size and structure of commuter networks between the two metropolitan areas and between the two regional systems that include the two mother cities; and (iii) identify the main differences in the size and structure of the two commuting networks by transport mode (private, public, non-motorized mobility) and travel time. The results highlighted significant differences between the two case studies regarding flow volumes, network complexity and structure, and the spatial extension of the territories governed by the two metropolitan areas. MCR is a strongly monocentric urban system with a regional influence centred on the mother city of Rome, while MCM is a diffuse polycentric regional metropolitan system centred on multiple mother cities. The findings may have a role in urban planning choices and in the evaluation of policies aimed at favoring sustainable mobility.

  6. Maximum likelihood approach for several stochastic volatility models

    International Nuclear Information System (INIS)

    Camprodon, Jordi; Perelló, Josep

    2012-01-01

    Volatility measures the amplitude of price fluctuations. Despite it being one of the most important quantities in finance, volatility is not directly observable. Here we apply a maximum likelihood method which assumes that price and volatility follow a two-dimensional diffusion process where volatility is the stochastic diffusion coefficient of the log-price dynamics. We apply this method to the simplest versions of the expOU, the OU and the Heston stochastic volatility models and we study their performance in terms of the log-price probability, the volatility probability, and its Mean First-Passage Time. The approach has some predictive power on the future returns amplitude by only knowing the current volatility. The assumed models do not consider long-range volatility autocorrelation and the asymmetric return-volatility cross-correlation, but the method still yields these two important stylized facts very naturally. We apply the method to different market indices, with good performance in all cases. (paper)

  7. Analysis and prediction of pest dynamics in an agroforestry context using Tiko'n, a generic tool to develop food web models

    Science.gov (United States)

    Rojas, Marcela; Malard, Julien; Adamowski, Jan; Carrera, Jaime Luis; Maas, Raúl

    2017-04-01

    While it is known that climate change will impact future plant-pest population dynamics, potentially affecting crop damage, agroforestry with its enhanced biodiversity is said to reduce the outbreaks of pest insects by providing natural enemies for the control of pest populations. This premise is known in the literature as the natural enemy hypothesis and has been widely studied qualitatively. However, disagreement still exists on whether biodiversity enhancement reduces pest outbreaks, showing the need of quantitatively understanding the mechanisms behind the interactions between pests and natural enemies, also known as trophic interactions. Crop pest models that study insect population dynamics in agroforestry contexts are very rare, and pest models that take trophic interactions into account are even rarer. This may be due to the difficulty of representing complex food webs in a quantifiable model. There is therefore a need for validated food web models that allow users to predict the response of these webs to changes in climate in agroforestry systems. In this study we present Tiko'n, a Python-based software whose API allows users to rapidly build and validate trophic web models; the program uses a Bayesian inference approach to calibrate the models according to field data, allowing for the reuse of literature data from various sources and reducing the need for extensive field data collection. Tiko'n was run using coffee leaf miner (Leucoptera coffeella) and associated parasitoid data from a shaded coffee plantation, showing the mechanisms of insect population dynamics within a tri-trophic food web in an agroforestry system.

  8. Efficient Detection of Repeating Sites to Accelerate Phylogenetic Likelihood Calculations.

    Science.gov (United States)

    Kobert, K; Stamatakis, A; Flouri, T

    2017-03-01

    The phylogenetic likelihood function (PLF) is the major computational bottleneck in several applications of evolutionary biology such as phylogenetic inference, species delimitation, model selection, and divergence times estimation. Given the alignment, a tree and the evolutionary model parameters, the likelihood function computes the conditional likelihood vectors for every node of the tree. Vector entries for which all input data are identical result in redundant likelihood operations which, in turn, yield identical conditional values. Such operations can be omitted for improving run-time and, using appropriate data structures, reducing memory usage. We present a fast, novel method for identifying and omitting such redundant operations in phylogenetic likelihood calculations, and assess the performance improvement and memory savings attained by our method. Using empirical and simulated data sets, we show that a prototype implementation of our method yields up to 12-fold speedups and uses up to 78% less memory than one of the fastest and most highly tuned implementations of the PLF currently available. Our method is generic and can seamlessly be integrated into any phylogenetic likelihood implementation. [Algorithms; maximum likelihood; phylogenetic likelihood function; phylogenetics]. © The Author(s) 2016. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.
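
    The central idea, assigning each (node, site) pair a compact identifier so that sites with identical identifiers provably share conditional likelihood vectors, can be sketched in a few lines (the tree encoding and sequences below are made up; a real implementation works on the actual alignment and CLV buffers):

```python
# Sketch of bottom-up site-repeat detection: each (node, site) gets a
# compact identifier; two sites with equal identifiers at a node would
# produce identical conditional likelihood vectors (CLVs), so only one
# needs computing. The tiny tree and sequences are made up.
def detect_repeats(tips, postorder):
    """tips: {leaf: sequence string}; postorder: [(node, left, right), ...]"""
    site_ids = {leaf: [ord(c) for c in seq] for leaf, seq in tips.items()}
    for node, left, right in postorder:
        seen, ids = {}, []
        for pair in zip(site_ids[left], site_ids[right]):
            if pair not in seen:          # first occurrence: compute the CLV
                seen[pair] = len(seen)
            ids.append(seen[pair])        # later occurrences reuse it
        site_ids[node] = ids
    return site_ids

tips = {"A": "ACCA", "B": "ACCA", "C": "AGGT"}
post = [("AB", "A", "B"), ("root", "AB", "C")]
print(detect_repeats(tips, post))
# At node "AB", sites 0/3 and 1/2 share identifiers: two CLV computations saved.
```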

  9. Planck intermediate results: XVI. Profile likelihoods for cosmological parameters

    DEFF Research Database (Denmark)

    Bartlett, J.G.; Cardoso, J.-F.; Delabrouille, J.

    2014-01-01

    We explore the 2013 Planck likelihood function with a high-precision multi-dimensional minimizer (Minuit). This allows a refinement of the CDM best-fit solution with respect to previously-released results, and the construction of frequentist confidence intervals using profile likelihoods. The agr...

  10. Planck 2013 results. XV. CMB power spectra and likelihood

    DEFF Research Database (Denmark)

    Tauber, Jan; Bartlett, J.G.; Bucher, M.

    2014-01-01

    This paper presents the Planck 2013 likelihood, a complete statistical description of the two-point correlation function of the CMB temperature fluctuations that accounts for all known relevant uncertainties, both instrumental and astrophysical in nature. We use this likelihood to derive our best...

  11. The modified signed likelihood statistic and saddlepoint approximations

    DEFF Research Database (Denmark)

    Jensen, Jens Ledet

    1992-01-01

    SUMMARY: For a number of tests in exponential families we show that the use of a normal approximation to the modified signed likelihood ratio statistic r * is equivalent to the use of a saddlepoint approximation. This is also true in a large deviation region where the signed likelihood ratio...... statistic r is of order √ n. © 1992 Biometrika Trust....

  12. Validation of DNA-based identification software by computation of pedigree likelihood ratios.

    Science.gov (United States)

    Slooten, K

    2011-08-01

    Disaster victim identification (DVI) can be aided by DNA-evidence, by comparing the DNA-profiles of unidentified individuals with those of surviving relatives. The DNA-evidence is used optimally when such a comparison is done by calculating the appropriate likelihood ratios. Though conceptually simple, the calculations can be quite involved, especially with large pedigrees, precise mutation models etc. In this article we describe a series of test cases designed to check if software designed to calculate such likelihood ratios computes them correctly. The cases include both simple and more complicated pedigrees, among which inbred ones. We show how to calculate the likelihood ratio numerically and algebraically, including a general mutation model and possibility of allelic dropout. In Appendix A we show how to derive such algebraic expressions mathematically. We have set up these cases to validate new software, called Bonaparte, which performs pedigree likelihood ratio calculations in a DVI context. Bonaparte has been developed by SNN Nijmegen (The Netherlands) for the Netherlands Forensic Institute (NFI). It is available free of charge for non-commercial purposes (see www.dnadvi.nl for details). Commercial licenses can also be obtained. The software uses Bayesian networks and the junction tree algorithm to perform its calculations. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
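
    For intuition, a single-locus paternity likelihood ratio can be computed by brute-force enumeration of transmitted alleles, essentially the simplest of the test cases such software must reproduce. The sketch below ignores mutation and dropout (both of which the validated software models) and uses invented allele frequencies:

```python
# Toy single-locus paternity LR by enumeration of transmitted alleles,
# ignoring mutation and dropout (both of which the validated software
# models). Allele frequencies and genotypes are invented.
from itertools import product

def p_child(child, mother, father_allele_dist):
    """P(child genotype | mother genotype, father's allele distribution)."""
    total = 0.0
    for m, (f, pf) in product(mother, father_allele_dist.items()):
        if sorted((m, f)) == sorted(child):
            total += 0.5 * pf             # each maternal allele w.p. 1/2
    return total

freqs = {"a": 0.1, "b": 0.3, "c": 0.6}    # population allele frequencies
child, mother, alleged = ("a", "b"), ("b", "b"), ("a", "c")
alleged_dist = {al: alleged.count(al) / 2 for al in set(alleged)}

num = p_child(child, mother, alleged_dist)  # H1: alleged man is the father
den = p_child(child, mother, freqs)         # H2: a random man is the father
print(num / den)                            # LR = 0.5 / 0.1 = 5.0
```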

  13. Likelihood analysis of parity violation in the compound nucleus

    International Nuclear Information System (INIS)

    Bowman, D.; Sharapov, E.

    1993-01-01

    We discuss the determination of the root-mean-squared matrix element of the parity-violating interaction between compound-nuclear states using likelihood analysis. We briefly review the relevant features of the statistical model of the compound nucleus and the formalism of likelihood analysis. We then discuss the application of likelihood analysis to data on parity-violating longitudinal asymmetries. The reliability of the extracted value of the matrix element and the errors assigned to it is stressed. Using experimental data and Monte Carlo techniques, we treat both the situation where the spins of the p-wave resonances are known and the situation where they are not. We conclude that likelihood analysis provides a reliable way to determine M and its confidence interval. We briefly discuss some problems associated with the normalization of the likelihood function.

  14. On-line validation of linear process models using generalized likelihood ratios

    International Nuclear Information System (INIS)

    Tylee, J.L.

    1981-12-01

    A real-time method for testing the validity of linear models of nonlinear processes is described and evaluated. Using generalized likelihood ratios, the model dynamics are continually monitored to see if the process has moved far enough away from the nominal linear model operating point to justify generation of a new linear model. The method is demonstrated using a seventh-order model of a natural circulation steam generator
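
    As an illustration of the flavor of such a monitor, the sketch below applies a windowed GLR test for an unknown mean shift in Gaussian model residuals; the closed form (Σx)²/(2wσ²) follows from maximizing the log-likelihood ratio over the shift size. The window length and threshold are illustrative, not values from the report.

```python
# Illustrative windowed GLR monitor on model residuals: under the nominal
# linear model the innovations are ~N(0, sigma2); maximizing the log
# likelihood ratio over an unknown mean shift in the last w samples gives
# the closed form (sum of residuals)^2 / (2*w*sigma2). Window and threshold
# values are invented, not taken from the report.
import numpy as np

def glr_mean_shift(innov, w, sigma2):
    s = np.convolve(innov, np.ones(w), mode="valid")   # rolling sums
    return s**2 / (2 * w * sigma2)

rng = np.random.default_rng(4)
resid = rng.normal(0.0, 1.0, 500)
resid[300:] += 1.5                     # process drifts away from the model
stat = glr_mean_shift(resid, w=30, sigma2=1.0)
alarm = int(np.argmax(stat > 10.0))    # first window exceeding the threshold
print(alarm)                           # roughly where relinearization is due
```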

  15. Climatic and ecological future of the Amazon: likelihood and causes of change

    OpenAIRE

    B. Cook; N. Zeng; J.-H. Yoon

    2010-01-01

    Some recent climate modeling results suggested a possible dieback of the Amazon rainforest under future climate change, a prediction that raised considerable interest as well as controversy. To determine the likelihood and causes of such changes, we analyzed the output of 15 models from the Intergovernmental Panel on Climate Change Fourth Assessment Report (IPCC/AR4) and a dynamic vegetation model, VEGAS, driven by the output of these climate models. Our results suggest that the core of the Amazon rainforest...

  16. Habit and context

    DEFF Research Database (Denmark)

    Mueller Loose, Simone; Jaeger, S. R.

    ... but, like the influence of context, quantification of its importance is lacking. To contribute to closing this gap, we analyse food diary data from 100+ New Zealand consumers quantitatively with a variance component analysis. Food diaries, recording the eating occasion, beverages and meal food ... were used to examine the contribution of context factors (eating occasion, where, with whom), habit (share of a beverage in the consumption portfolio) and socio-demographic characteristics (gender, age) to explaining the binary choice of seven main beverage types (water, hot beverages, milk, carbonated beverages ...). The share of a beverage in a consumer's portfolio was the strongest predictor of its consumption likelihood. The impact of this habit measure differed across beverages; for instance, it played a larger role for hot beverages and water than for the consumption of beer and wine. Eating occasion and its interaction with place of consumption had the highest explanatory power...

  17. The Context of Creating Space: Assessing the Likelihood of College LGBT Center Presence

    Science.gov (United States)

    Fine, Leigh E.

    2012-01-01

    LGBT (lesbian, gay, bisexual, and transgender) resource centers are campus spaces dedicated to the success of sexual minority students. However, only a small handful of American colleges and universities have such spaces. Political opportunity and resource mobilization theory can provide a useful framework for understanding what contextual factors…

  18. Posterior distributions for likelihood ratios in forensic science.

    Science.gov (United States)

    van den Hout, Ardo; Alberink, Ivo

    2016-09-01

    Evaluation of evidence in forensic science is discussed using posterior distributions for likelihood ratios. Instead of eliminating the uncertainty by integrating (Bayes factor) or by conditioning on parameter values, uncertainty in the likelihood ratio is retained by parameter uncertainty derived from posterior distributions. A posterior distribution for a likelihood ratio can be summarised by the median and credible intervals. Using the posterior mean of the distribution is not recommended. An analysis of forensic data for body height estimation is undertaken. The posterior likelihood approach has been criticised both theoretically and with respect to applicability. This paper addresses the latter and illustrates an interesting application area. Copyright © 2016 The Chartered Society of Forensic Sciences. Published by Elsevier Ireland Ltd. All rights reserved.
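
    The mechanics are straightforward once posterior parameter draws are available: evaluate the likelihood ratio at each draw and summarize the resulting distribution by its median and a credible interval. A toy sketch with an invented body-height model (none of the numbers are from the paper):

```python
# Sketch of the posterior-distribution-of-the-LR idea: push posterior
# parameter draws through the likelihood ratio and summarize by median and
# credible interval. The body-height model and all numbers are invented.
import numpy as np

rng = np.random.default_rng(5)
# Hypothetical posterior draws (e.g., from earlier MCMC) for the mean
# height under the prosecution and defense hypotheses:
mu1 = rng.normal(181.0, 1.0, 10_000)
mu2 = rng.normal(175.0, 0.5, 10_000)
sd1, sd2 = 6.5, 7.0
x = 183.0                              # height estimated from the evidence

def norm_pdf(x, m, s):
    return np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))

lr = norm_pdf(x, mu1, sd1) / norm_pdf(x, mu2, sd2)    # one LR per draw
print(np.median(lr), np.percentile(lr, [2.5, 97.5]))  # median and 95% CrI
```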

  19. Practical likelihood analysis for spatial generalized linear mixed models

    DEFF Research Database (Denmark)

    Bonat, W. H.; Ribeiro, Paulo Justiniano

    2016-01-01

    We investigate an algorithm for maximum likelihood estimation of spatial generalized linear mixed models based on the Laplace approximation. We compare our algorithm with a set of alternative approaches for two datasets from the literature. The Rhizoctonia root rot and the Rongelap data are, respectively, examples of binomial and count datasets modeled by spatial generalized linear mixed models. Our results show that the Laplace approximation provides estimates similar to Markov chain Monte Carlo likelihood, Monte Carlo expectation maximization, and the modified Laplace approximation. Some advantages of the Laplace approximation include the computation of the maximized log-likelihood value, which can be used for model selection and tests, and the possibility to obtain realistic confidence intervals for model parameters based on profile likelihoods. The Laplace approximation also avoids the tuning...

  20. Algorithms of maximum likelihood data clustering with applications

    Science.gov (United States)

    Giada, Lorenzo; Marsili, Matteo

    2002-12-01

    We address the problem of data clustering by introducing an unsupervised, parameter-free approach based on the maximum likelihood principle. Starting from the observation that data sets belonging to the same cluster share a common information, we construct an expression for the likelihood of any possible cluster structure. The likelihood in turn depends only on the Pearson correlation coefficients of the data. We discuss clustering algorithms that provide a fast and reliable approximation to maximum likelihood configurations. Compared to standard clustering methods, our approach has the advantages that (i) it is parameter free, (ii) the number of clusters need not be fixed in advance and (iii) the interpretation of the results is transparent. In order to test our approach and compare it with standard clustering algorithms, we analyze two very different data sets: time series of financial market returns and gene expression data. We find that different maximization algorithms produce similar cluster structures whereas the outcome of standard algorithms has a much wider variability.

  1. Generalized empirical likelihood methods for analyzing longitudinal data

    KAUST Repository

    Wang, S.; Qian, L.; Carroll, R. J.

    2010-01-01

    Efficient estimation of parameters is a major objective in analyzing longitudinal data. We propose two generalized empirical likelihood based methods that take into consideration within-subject correlations. A nonparametric version of the Wilks

  2. Maximum likelihood estimation of finite mixture model for economic data

    Science.gov (United States)

    Phoong, Seuk-Yen; Ismail, Mohd Tahir

    2014-06-01

    A finite mixture model is a mixture model with a finite number of components. These models provide a natural representation of heterogeneity across a finite number of latent classes, and are also known as latent class models or unsupervised learning models. Recently, fitting finite mixture models by maximum likelihood estimation has drawn considerable attention from statisticians, mainly because maximum likelihood estimation is a powerful statistical method that yields consistent estimates as the sample size increases to infinity. Maximum likelihood estimation is therefore used in the present paper to fit a finite mixture model in order to explore the relationship between nonlinear economic data. A two-component normal mixture model is fitted by maximum likelihood estimation to investigate the relationship between stock market prices and rubber prices for the sampled countries. The results show a negative relationship between rubber price and stock market price for Malaysia, Thailand, the Philippines and Indonesia.
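
    A minimal EM sketch for fitting a two-component normal mixture by maximum likelihood; the data are synthetic stand-ins for the stock-market and rubber-price series used by the authors.

    ```python
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(42)
    x = np.concatenate([rng.normal(-1.0, 0.7, 300), rng.normal(2.0, 1.0, 200)])

    # Starting values; w = weights, mu = means, sd = standard deviations
    w, mu, sd = np.array([0.5, 0.5]), np.array([-2.0, 1.0]), np.array([1.0, 1.0])
    for _ in range(200):
        # E-step: responsibility of each component for each observation
        dens = w * norm.pdf(x[:, None], mu, sd)
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: weighted maximum likelihood updates
        nk = r.sum(axis=0)
        w = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        sd = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)

    loglik = np.log((w * norm.pdf(x[:, None], mu, sd)).sum(axis=1)).sum()
    print(w, mu, sd, loglik)
    ```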

  3. Attitude towards, and likelihood of, complaining in the banking ...

    African Journals Online (AJOL)

    aims to determine customers' attitudes towards complaining as well as their likelihood of voicing a .... is particularly powerful and impacts greatly on customer satisfaction and retention. ...... 'Cross-national analysis of hotel customers' attitudes ...

  4. Narrow band interference cancelation in OFDM: Astructured maximum likelihood approach

    KAUST Repository

    Sohail, Muhammad Sadiq; Al-Naffouri, Tareq Y.; Al-Ghadhban, Samir N.

    2012-01-01

    This paper presents a maximum likelihood (ML) approach to mitigate the effect of narrow band interference (NBI) in a zero padded orthogonal frequency division multiplexing (ZP-OFDM) system. The NBI is assumed to be time variant and asynchronous

  5. Context updates are hierarchical

    Directory of Open Access Journals (Sweden)

    Anton Karl Ingason

    2016-10-01

    This squib studies the order in which elements are added to the shared context of interlocutors in a conversation. It focuses on context updates within one hierarchical structure and argues that structurally higher elements are entered into the context before lower elements, even if the structurally higher elements are pronounced after the lower elements. The crucial data are drawn from a comparison of relative clauses in two head-initial languages, English and Icelandic, and two head-final languages, Korean and Japanese. The findings have consequences for any theory of a dynamic semantics.

  6. On the likelihood function of Gaussian max-stable processes

    KAUST Repository

    Genton, M. G.; Ma, Y.; Sang, H.

    2011-01-01

    We derive a closed form expression for the likelihood function of a Gaussian max-stable process indexed by ℝ^d at p ≤ d+1 sites, d ≥ 1. We demonstrate the gain in efficiency in the maximum composite likelihood estimators of the covariance matrix from p = 2 to p = 3 sites in ℝ^2 by means of a Monte Carlo simulation study.

  7. Incorporating Nuisance Parameters in Likelihoods for Multisource Spectra

    CERN Document Server

    Conway, J.S.

    2011-01-01

    We describe here the general mathematical approach to constructing likelihoods for fitting observed spectra in one or more dimensions with multiple sources, including the effects of systematic uncertainties represented as nuisance parameters, when the likelihood is to be maximized with respect to these parameters. We consider three types of nuisance parameters: simple multiplicative factors, source spectra "morphing" parameters, and parameters representing statistical uncertainties in the predicted source spectra.
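
    A toy version of such a likelihood with a single multiplicative nuisance parameter: Poisson bins, a signal-strength parameter mu, and a background normalisation theta constrained by a unit Gaussian, maximised over both. The spectra and the 10% uncertainty are invented for the demonstration.

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm, poisson

    sig = np.array([1.0, 3.0, 6.0, 3.0, 1.0])     # hypothetical signal spectrum
    bkg = np.array([10.0, 8.0, 6.0, 8.0, 10.0])   # hypothetical background spectrum
    obs = np.array([12, 13, 14, 10, 11])          # invented observed counts

    def nll(pars):
        mu, theta = pars
        # theta multiplies the background (10% prior uncertainty); the
        # unit-Gaussian constraint term keeps it anchored at zero
        expected = mu * sig + bkg * (1.0 + 0.1 * theta)
        return -(poisson.logpmf(obs, expected).sum() + norm.logpdf(theta))

    fit = minimize(nll, x0=[1.0, 0.0], method="Nelder-Mead")
    print("mu_hat, theta_hat =", fit.x)
    ```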

  8. On the likelihood function of Gaussian max-stable processes

    KAUST Repository

    Genton, M. G.

    2011-05-24

    We derive a closed form expression for the likelihood function of a Gaussian max-stable process indexed by ℝd at p≤d+1 sites, d≥1. We demonstrate the gain in efficiency in the maximum composite likelihood estimators of the covariance matrix from p=2 to p=3 sites in ℝ2 by means of a Monte Carlo simulation study. © 2011 Biometrika Trust.

  9. Tapered composite likelihood for spatial max-stable models

    KAUST Repository

    Sang, Huiyan

    2014-05-01

    Spatial extreme value analysis is useful to environmental studies, in which extreme value phenomena are of interest and meaningful spatial patterns can be discerned. Max-stable process models are able to describe such phenomena. This class of models is asymptotically justified to characterize the spatial dependence among extremes. However, likelihood inference is challenging for such models because their corresponding joint likelihood is unavailable and only bivariate or trivariate distributions are known. In this paper, we propose a tapered composite likelihood approach by utilizing lower dimensional marginal likelihoods for inference on parameters of various max-stable process models. We consider a weighting strategy based on a "taper range" to exclude distant pairs or triples. The "optimal taper range" is selected to maximize various measures of the Godambe information associated with the tapered composite likelihood function. This method substantially reduces the computational cost and improves the efficiency over equally weighted composite likelihood estimators. We illustrate its utility with simulation experiments and an analysis of rainfall data in Switzerland.
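
    The weighting idea can be sketched independently of the max-stable machinery: below, a pairwise log-likelihood for a Gaussian field with exponential correlation drops all pairs beyond a taper range (weight zero), and the range parameter is estimated by maximising the tapered sum. In the real application the Gaussian pair density would be replaced by the bivariate max-stable density; all numbers are illustrative.

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar
    from scipy.stats import multivariate_normal

    rng = np.random.default_rng(3)
    sites = rng.uniform(0, 10, size=(30, 2))
    D = np.linalg.norm(sites[:, None] - sites[None, :], axis=-1)
    z = rng.multivariate_normal(np.zeros(30), np.exp(-D / 2.0))  # true range 2

    def tapered_pl(rho, taper=3.0):
        total = 0.0
        for i in range(len(z)):
            for j in range(i + 1, len(z)):
                if D[i, j] >= taper:
                    continue                 # taper weight 0: drop distant pairs
                r = np.exp(-D[i, j] / rho)
                total += multivariate_normal.logpdf(
                    [z[i], z[j]], cov=np.array([[1.0, r], [r, 1.0]]))
        return total

    fit = minimize_scalar(lambda r: -tapered_pl(r), bounds=(0.1, 10.0), method="bounded")
    print("estimated range parameter:", fit.x)
    ```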

  10. Dissociating response conflict and error likelihood in anterior cingulate cortex.

    Science.gov (United States)

    Yeung, Nick; Nieuwenhuis, Sander

    2009-11-18

    Neuroimaging studies consistently report activity in anterior cingulate cortex (ACC) in conditions of high cognitive demand, leading to the view that ACC plays a crucial role in the control of cognitive processes. According to one prominent theory, the sensitivity of ACC to task difficulty reflects its role in monitoring for the occurrence of competition, or "conflict," between responses to signal the need for increased cognitive control. However, a contrasting theory proposes that ACC is the recipient rather than source of monitoring signals, and that ACC activity observed in relation to task demand reflects the role of this region in learning about the likelihood of errors. Response conflict and error likelihood are typically confounded, making the theories difficult to distinguish empirically. The present research therefore used detailed computational simulations to derive contrasting predictions regarding ACC activity and error rate as a function of response speed. The simulations demonstrated a clear dissociation between conflict and error likelihood: fast response trials are associated with low conflict but high error likelihood, whereas slow response trials show the opposite pattern. Using the N2 component as an index of ACC activity, an EEG study demonstrated that when conflict and error likelihood are dissociated in this way, ACC activity tracks conflict and is negatively correlated with error likelihood. These findings support the conflict-monitoring theory and suggest that, in speeded decision tasks, ACC activity reflects current task demands rather than the retrospective coding of past performance.

  11. Tapered composite likelihood for spatial max-stable models

    KAUST Repository

    Sang, Huiyan; Genton, Marc G.

    2014-01-01

    Spatial extreme value analysis is useful to environmental studies, in which extreme value phenomena are of interest and meaningful spatial patterns can be discerned. Max-stable process models are able to describe such phenomena. This class of models is asymptotically justified to characterize the spatial dependence among extremes. However, likelihood inference is challenging for such models because their corresponding joint likelihood is unavailable and only bivariate or trivariate distributions are known. In this paper, we propose a tapered composite likelihood approach by utilizing lower dimensional marginal likelihoods for inference on parameters of various max-stable process models. We consider a weighting strategy based on a "taper range" to exclude distant pairs or triples. The "optimal taper range" is selected to maximize various measures of the Godambe information associated with the tapered composite likelihood function. This method substantially reduces the computational cost and improves the efficiency over equally weighted composite likelihood estimators. We illustrate its utility with simulation experiments and an analysis of rainfall data in Switzerland.

  12. A preliminary evaluation of the generalized likelihood ratio for detecting and identifying control element failures in a transport aircraft

    Science.gov (United States)

    Bundick, W. T.

    1985-01-01

    The application of the Generalized Likelihood Ratio technique to the detection and identification of aircraft control element failures has been evaluated in a linear digital simulation of the longitudinal dynamics of a B-737 aircraft. Simulation results show that the technique has potential but that the effects of wind turbulence and Kalman filter model errors are problems which must be overcome.
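
    Stripped of the Kalman-filter machinery, the GLR idea reduces to maximising, over the unknown failure onset, the log-likelihood ratio of "bias present" versus "no bias" in the innovations. A toy version with simulated unit-variance residuals (the injected bias and all numbers are invented):

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    res = rng.normal(size=200)     # healthy innovations: zero mean, unit variance
    res[120:] += 1.5               # simulated failure injects a bias from t = 120

    def glr(r):
        # For onset k, the MLE of the bias is the post-onset mean nu_hat, and the
        # log likelihood ratio of "bias" vs "no bias" is m * nu_hat**2 / 2.
        stats = []
        for k in range(1, len(r)):
            seg = r[k:]
            stats.append(len(seg) * seg.mean() ** 2 / 2.0)
        return np.array(stats)

    s = glr(res)
    print("most likely onset:", s.argmax() + 1, " GLR statistic:", s.max())
    ```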

  13. Gravitational wave chirp search: no-signal cumulative distribution of the maximum likelihood detection statistic

    International Nuclear Information System (INIS)

    Croce, R P; Demma, Th; Longo, M; Marano, S; Matta, V; Pierro, V; Pinto, I M

    2003-01-01

    The cumulative distribution of the supremum of a set (bank) of correlators is investigated in the context of maximum likelihood detection of gravitational wave chirps from coalescing binaries with unknown parameters. Accurate (lower-bound) approximants are introduced based on a suitable generalization of previous results by Mohanty. Asymptotic properties (in the limit where the number of correlators goes to infinity) are highlighted. The validity of numerical simulations made on small-size banks is extended to banks of any size, via a Gaussian correlation inequality

  14. Operation Context

    DEFF Research Database (Denmark)

    Stüben, Henning; Tietjen, Anne

    2006-01-01

    Abstract: This paper seeks to challenge the notion of context from an operational perspective. Can we grasp the forces that shape the complex conditions for an architectural or urban design within the notion of context? By shifting the gaze towards the agency of architecture, contextual analysis...

  15. Diagnostics of the Enterprise Economic Security and the Role of Information and Communication in the Context of Sustainability of Dynamical Equilibrium, Operation and Development

    Directory of Open Access Journals (Sweden)

    Skrynkovskyy Ruslan M.

    2015-03-01

    Full Text Available In the scientific article a system for diagnostics of the enterprise economic security is developed. It has been determined that the main business indicators for diagnostics of the enterprise economic security are: the level of the enterprise financial stability (contains the indicator for the enterprise provision with its own funds, rate of independence, financial stability indicator, current assets to equity ratio, liquid ratio, absolute liquidity ratio, current liquidity ratio; level of the enterprise production activity (calculated on the output-capital ratio, capital-labor ratio, index of workforce productivity, quality indicators of fixed assets, production potential indicator, production profitability ratio, input-output coefficient; level of organizational and administrative activities of the enterprise (takes into account the ratio of administrative expenses to the rate of increase in production volume, rate of saving of the managerial apparatus, rate of information processing; level of employee loyalty to the enterprise (calculated on the rate of personnel turnover, rate of personnel continuity, indicator of employee satisfaction, personnel development indicator, education level of employees; level of scientific and technical and innovative activity of the enterprise (including index of profitability of innovations, profitability of expenditures on research and development works; level of investment activity of the enterprise (includes index of investment profitability, rate of investment activity, rate of return on investments, rate of investment in production; level of market reliability (calculated on index of return on sales, index of return on net assets, index of marketability, level of market research. It has been identified that an important role in the context of sustainability of dynamical equilibrium, operation and development of enterprises is played by information and communication.

  16. Likelihood-based methods for evaluating principal surrogacy in augmented vaccine trials.

    Science.gov (United States)

    Liu, Wei; Zhang, Bo; Zhang, Hui; Zhang, Zhiwei

    2017-04-01

    There is growing interest in assessing immune biomarkers, which are quick to measure and potentially predictive of long-term efficacy, as surrogate endpoints in randomized, placebo-controlled vaccine trials. This can be done under a principal stratification approach, with principal strata defined using a subject's potential immune responses to vaccine and placebo (the latter may be assumed to be zero). In this context, principal surrogacy refers to the extent to which vaccine efficacy varies across principal strata. Because a placebo recipient's potential immune response to vaccine is unobserved in a standard vaccine trial, augmented vaccine trials have been proposed to produce the information needed to evaluate principal surrogacy. This article reviews existing methods based on an estimated likelihood and a pseudo-score (PS) and proposes two new methods based on a semiparametric likelihood (SL) and a pseudo-likelihood (PL), for analyzing augmented vaccine trials. Unlike the PS method, the SL method does not require a model for missingness, which can be advantageous when immune response data are missing by happenstance. The SL method is shown to be asymptotically efficient, and it performs similarly to the PS and PL methods in simulation experiments. The PL method appears to have a computational advantage over the PS and SL methods.

  17. Awareness of Entities, Activities and Contexts in Ambient Systems

    DEFF Research Database (Denmark)

    Kristensen, Bent Bruun

    2013-01-01

    Ambient systems are modeled by entities, activities and contexts, where entities exist in contexts and engage in activities. A context supports a dynamic collection of entities by services and offers awareness information about the entities. Activities also exist in contexts and model ongoing collaborations between entities. Activities and local contexts also obtain awareness information from the context about the dynamic collection of entities. Similarly, activities, local contexts and entities are offered awareness information about activities and local contexts.

  18. Constraint likelihood analysis for a network of gravitational wave detectors

    International Nuclear Information System (INIS)

    Klimenko, S.; Rakhmanov, M.; Mitselmakher, G.; Mohanty, S.

    2005-01-01

    We propose a coherent method for detection and reconstruction of gravitational wave signals with a network of interferometric detectors. The method is derived by using the likelihood ratio functional for unknown signal waveforms. In the likelihood analysis, the global maximum of the likelihood ratio over the space of waveforms is used as the detection statistic. We identify a problem with this approach. In the case of an aligned pair of detectors, the detection statistic depends on the cross correlation between the detectors as expected, but this dependence disappears even for infinitesimally small misalignments. We solve the problem by applying constraints on the likelihood functional and obtain a new class of statistics. The resulting method can be applied to data from a network consisting of any number of detectors with arbitrary detector orientations. The method allows reconstruction of the source coordinates and of the waveforms of the two polarization components of a gravitational wave. We study the performance of the method with numerical simulations and find the reconstruction of the source coordinates to be more accurate than in the standard likelihood method

  19. Context in a wider context

    Directory of Open Access Journals (Sweden)

    John Traxler

    2011-07-01

    This paper attempts to review and reconsider the role of context in mobile learning and starts by outlining definitions of context-aware mobile learning as the technologies have become more mature, more robust and more widely available and as the notion of context has become progressively richer. The future role of context-aware mobile learning is considered within the context of the future of mobile learning as it moves from the challenges and opportunities of pedagogy and technology to the challenges and opportunities of policy, scale, sustainability, equity and engagement with augmented reality, «blended learning», «learner devices», «user-generated contexts» and the «internet of things». This is essentially a perspective on mobile learning, and other forms of technology-enhanced learning (TEL), where educators and their institutions set the agenda and manage change. There are, however, other perspectives on context. The increasing availability and use of smart-phones and other personal mobile devices with similar powerful functionality means that the experience of context for many people, in the form of personalized or location-based services, is an increasingly social and informal experience, rather than a specialist or educational experience. This is part of the transformative impact of mobility and connectedness on our societies brought about by these universal, ubiquitous and pervasive technologies. This paper contributes a revised understanding of context in the wider context (sic) of the transformations taking place in our societies. These are subtle but pervasive transformations of jobs, work and the economy, of our sense of time, space and place, of knowing and learning, and of community and identity. This leads to a radical reconsideration of context as the notions of ‹self› and ‹other› are transformed.

  20. Cosmic shear measurement with maximum likelihood and maximum a posteriori inference

    Science.gov (United States)

    Hall, Alex; Taylor, Andy

    2017-06-01

    We investigate the problem of noise bias in maximum likelihood and maximum a posteriori estimators for cosmic shear. We derive the leading and next-to-leading order biases and compute them in the context of galaxy ellipticity measurements, extending previous work on maximum likelihood inference for weak lensing. We show that a large part of the bias on these point estimators can be removed using information already contained in the likelihood when a galaxy model is specified, without the need for external calibration. We test these bias-corrected estimators on simulated galaxy images similar to those expected from planned space-based weak lensing surveys, with promising results. We find that the introduction of an intrinsic shape prior can help with mitigation of noise bias, such that the maximum a posteriori estimate can be made less biased than the maximum likelihood estimate. Second-order terms offer a check on the convergence of the estimators, but are largely subdominant. We show how biases propagate to shear estimates, demonstrating in our simple set-up that shear biases can be reduced by orders of magnitude and potentially to within the requirements of planned space-based surveys at mild signal-to-noise ratio. We find that second-order terms can exhibit significant cancellations at low signal-to-noise ratio when Gaussian noise is assumed, which has implications for inferring the performance of shear-measurement algorithms from simplified simulations. We discuss the viability of our point estimators as tools for lensing inference, arguing that they allow for the robust measurement of ellipticity and shear.

  1. Profile-likelihood Confidence Intervals in Item Response Theory Models.

    Science.gov (United States)

    Chalmers, R Philip; Pek, Jolynn; Liu, Yang

    2017-01-01

    Confidence intervals (CIs) are fundamental inferential devices which quantify the sampling variability of parameter estimates. In item response theory, CIs have been primarily obtained from large-sample Wald-type approaches based on standard error estimates, derived from the observed or expected information matrix, after parameters have been estimated via maximum likelihood. An alternative approach to constructing CIs is to quantify sampling variability directly from the likelihood function with a technique known as profile-likelihood confidence intervals (PL CIs). In this article, we introduce PL CIs for item response theory models, compare PL CIs to classical large-sample Wald-type CIs, and demonstrate important distinctions among these CIs. CIs are then constructed for parameters directly estimated in the specified model and for transformed parameters which are often obtained post-estimation. Monte Carlo simulation results suggest that PL CIs perform consistently better than Wald-type CIs for both non-transformed and transformed parameters.
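
    The construction is easy to state in one dimension: a (1 - alpha) PL CI collects all parameter values whose profile log-likelihood lies within chi-square(1, 1 - alpha)/2 of the maximum. The sketch below does this for a binomial proportion rather than an IRT model, where one would re-maximise over the remaining item parameters at each grid point; the data are invented.

    ```python
    import numpy as np
    from scipy.optimize import brentq
    from scipy.stats import chi2

    n, y = 40, 9                 # invented binomial data
    p_hat = y / n

    def loglik(p):
        return y * np.log(p) + (n - y) * np.log(1 - p)

    cut = loglik(p_hat) - chi2.ppf(0.95, df=1) / 2      # a drop of ~1.92
    lo = brentq(lambda p: loglik(p) - cut, 1e-6, p_hat)
    hi = brentq(lambda p: loglik(p) - cut, p_hat, 1 - 1e-6)
    wald = p_hat + np.array([-1, 1]) * 1.96 * np.sqrt(p_hat * (1 - p_hat) / n)
    print(f"95% PL CI: ({lo:.3f}, {hi:.3f})   Wald CI: ({wald[0]:.3f}, {wald[1]:.3f})")
    ```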

  2. Generalized empirical likelihood methods for analyzing longitudinal data

    KAUST Repository

    Wang, S.

    2010-02-16

    Efficient estimation of parameters is a major objective in analyzing longitudinal data. We propose two generalized empirical likelihood based methods that take into consideration within-subject correlations. A nonparametric version of the Wilks theorem for the limiting distributions of the empirical likelihood ratios is derived. It is shown that one of the proposed methods is locally efficient among a class of within-subject variance-covariance matrices. A simulation study is conducted to investigate the finite sample properties of the proposed methods and compare them with the block empirical likelihood method by You et al. (2006) and the normal approximation with a correctly estimated variance-covariance. The results suggest that the proposed methods are generally more efficient than existing methods which ignore the correlation structure, and better in coverage compared to the normal approximation with correctly specified within-subject correlation. An application illustrating our methods and supporting the simulation study results is also presented.
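
    For intuition, the sketch below computes the classical empirical likelihood ratio for a scalar mean, solving the Lagrange-multiplier equation for the weights; the generalized longitudinal methods above add within-subject correlation structure on top of this basic construction. Data are synthetic.

    ```python
    import numpy as np
    from scipy.optimize import brentq

    def el_statistic(x, mu):
        """-2 log empirical likelihood ratio for the mean; ~ chi2(1) under H0.
        Maximises prod(n*p_i) s.t. sum p_i*(x_i - mu) = 0, which gives
        p_i = 1 / (n * (1 + lam*(x_i - mu))) with lam solving the score equation.
        Requires mu to lie strictly inside the range of the data."""
        z = x - mu
        lo = -1.0 / z.max() + 1e-8     # bracket keeping all weights positive
        hi = -1.0 / z.min() - 1e-8
        lam = brentq(lambda l: np.sum(z / (1 + l * z)), lo, hi)
        return 2.0 * np.sum(np.log1p(lam * z))

    rng = np.random.default_rng(0)
    x = rng.exponential(2.0, size=50)
    print(el_statistic(x, mu=2.0))     # compare with the chi2(1) cutoff 3.84
    ```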

  3. The Influence of Hedonic and Utilitarian Motivators on Likelihood to Buy a Tourism Package

    Directory of Open Access Journals (Sweden)

    Alexandra VINEREAN

    2013-12-01

    To fully understand the pattern of choice, it is important that any explanation of consumer behavior be accompanied by a complete understanding of the interplay between a consumer's functional goals and experiential preferences within the decision context. Consumer researchers have increasingly begun to investigate consumer choice based on distinctions that involve the purchase and consumption of goods for pleasure versus for more utilitarian and instrumental purposes. Consumers are often faced with choices between hedonic and utilitarian alternatives that are at least partly driven by emotional desires rather than cold cognitive deliberations. This research applies factor analysis and multiple linear regression to responses from 150 international respondents and their perception of hedonic and utilitarian motivators on likelihood to buy a tourism package.

  4. Likelihood ratio sequential sampling models of recognition memory.

    Science.gov (United States)

    Osth, Adam F; Dennis, Simon; Heathcote, Andrew

    2017-02-01

    The mirror effect - a phenomenon whereby a manipulation produces opposite effects on hit and false alarm rates - is a benchmark regularity of recognition memory. A likelihood ratio decision process, basing recognition on the relative likelihood that a stimulus is a target or a lure, naturally predicts the mirror effect, and so has been widely adopted in quantitative models of recognition memory. Glanzer, Hilford, and Maloney (2009) demonstrated that likelihood ratio models, assuming Gaussian memory strength, are also capable of explaining regularities observed in receiver-operating characteristics (ROCs), such as greater target than lure variance. Despite its central place in theorising about recognition memory, however, this class of models has not been tested using response time (RT) distributions. In this article, we develop a linear approximation to the likelihood ratio transformation, which we show predicts the same regularities as the exact transformation. This development enabled us to develop a tractable model of recognition-memory RT based on the diffusion decision model (DDM), with inputs (drift rates) provided by an approximate likelihood ratio transformation. We compared this "LR-DDM" to a standard DDM where all targets and lures receive their own drift rate parameters. Both were implemented as hierarchical Bayesian models and applied to four datasets. Model selection taking into account parsimony favored the LR-DDM, which requires fewer parameters than the standard DDM but still fits the data well. These results support log-likelihood based models as providing an elegant explanation of the regularities of recognition memory, not only in terms of choices made but also in terms of the times it takes to make them.

  5. Unbinned likelihood maximisation framework for neutrino clustering in Python

    Energy Technology Data Exchange (ETDEWEB)

    Coenders, Stefan [Technische Universitaet Muenchen, Boltzmannstr. 2, 85748 Garching (Germany)

    2016-07-01

    Although an astrophysical neutrino flux has been detected with IceCube, the sources of astrophysical neutrinos remain hidden up to now. A detection of a neutrino point source is a smoking gun for hadronic processes and acceleration of cosmic rays. The search for neutrino sources has many degrees of freedom, for example steady versus transient, point-like versus extended sources, et cetera. Here, we introduce a Python framework designed for unbinned likelihood maximisations as used in searches for neutrino point sources by IceCube. By implementing source scenarios in a modular way, likelihood searches of various kinds can be implemented in a user-friendly way, without sacrificing speed and memory management.
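
    The standard unbinned point-source likelihood underlying such frameworks mixes a per-event signal pdf S_i and background pdf B_i, L(n_s) = prod_i [(n_s/N) S_i + (1 - n_s/N) B_i], maximised over the number of signal events n_s. A self-contained toy in two dimensions, a sketch only (window size, source position and event counts are all made up):

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    rng = np.random.default_rng(5)
    area = 100.0                                           # 10 x 10 sky window
    bkg = rng.uniform(0, 10, size=(1000, 2))               # isotropic background
    src = rng.normal(loc=[5, 5], scale=0.2, size=(15, 2))  # 15 injected events
    ev = np.vstack([bkg, src])

    # Per-event signal pdf (2-d Gaussian around the source) and background pdf
    S = np.exp(-((ev - [5, 5]) ** 2).sum(axis=1) / (2 * 0.2**2)) / (2 * np.pi * 0.2**2)
    B = np.full(len(ev), 1.0 / area)

    def nll(ns):
        f = ns / len(ev)
        return -np.sum(np.log(f * S + (1 - f) * B))

    fit = minimize_scalar(nll, bounds=(0.0, 100.0), method="bounded")
    print("fitted n_s:", fit.x)    # should land near the 15 injected events
    ```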

  6. Nearly Efficient Likelihood Ratio Tests of the Unit Root Hypothesis

    DEFF Research Database (Denmark)

    Jansson, Michael; Nielsen, Morten Ørregaard

    Seemingly absent from the arsenal of currently available "nearly efficient" testing procedures for the unit root hypothesis, i.e. tests whose local asymptotic power functions are indistinguishable from the Gaussian power envelope, is a test admitting a (quasi-)likelihood ratio interpretation. We show that the likelihood ratio unit root test derived in a Gaussian AR(1) model with standard normal innovations is nearly efficient in that model. Moreover, these desirable properties carry over to more complicated models allowing for serially correlated and/or non-Gaussian innovations.

  7. A note on estimating errors from the likelihood function

    International Nuclear Information System (INIS)

    Barlow, Roger

    2005-01-01

    The points at which the log likelihood falls by 1/2 from its maximum value are often used to give the 'errors' on a result, i.e. the 68% central confidence interval. The validity of this is examined for two simple cases: a lifetime measurement and a Poisson measurement. Results are compared with the exact Neyman construction and with the simple Bartlett approximation. It is shown that the accuracy of the log likelihood method is poor, and the Bartlett construction explains why it is flawed
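
    The rule being examined is easy to reproduce: for a Poisson measurement with n observed events, log L(mu) = n log(mu) - mu up to a constant, and the Delta(log L) = 1/2 points bracket the quoted 68% interval. A quick numerical check, which, as the record argues, should not be mistaken for exact coverage:

    ```python
    import numpy as np
    from scipy.optimize import brentq

    n = 5                                    # observed Poisson count
    loglik = lambda mu: n * np.log(mu) - mu  # log L up to a constant
    cut = loglik(n) - 0.5                    # the Delta(log L) = 1/2 rule

    lo = brentq(lambda m: loglik(m) - cut, 1e-6, n)
    hi = brentq(lambda m: loglik(m) - cut, n, 10 * n)
    print(f"Delta-lnL interval for n={n}: ({lo:.2f}, {hi:.2f})")
    ```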

  8. Nearly Efficient Likelihood Ratio Tests for Seasonal Unit Roots

    DEFF Research Database (Denmark)

    Jansson, Michael; Nielsen, Morten Ørregaard

    In an important generalization of zero frequency autoregressive unit root tests, Hylleberg, Engle, Granger, and Yoo (1990) developed regression-based tests for unit roots at the seasonal frequencies in quarterly time series. We develop likelihood ratio tests for seasonal unit roots and show that these tests are "nearly efficient" in the sense of Elliott, Rothenberg, and Stock (1996), i.e. that their local asymptotic power functions are indistinguishable from the Gaussian power envelope. Currently available nearly efficient testing procedures for seasonal unit roots are regression-based and require the choice of a GLS detrending parameter, which our likelihood ratio tests do not.

  9. LDR: A Package for Likelihood-Based Sufficient Dimension Reduction

    Directory of Open Access Journals (Sweden)

    R. Dennis Cook

    2011-03-01

    We introduce a new Matlab software package that implements several recently proposed likelihood-based methods for sufficient dimension reduction. Current capabilities include estimation of reduced subspaces with a fixed dimension d, as well as estimation of d by use of likelihood-ratio testing, permutation testing and information criteria. The methods are suitable for preprocessing data for both regression and classification. Implementations of related estimators are also available. Although the software is more oriented to command-line operation, a graphical user interface is also provided for prototype computations.

  10. Likelihood ratio decisions in memory: three implied regularities.

    Science.gov (United States)

    Glanzer, Murray; Hilford, Andrew; Maloney, Laurence T

    2009-06-01

    We analyze four general signal detection models for recognition memory that differ in their distributional assumptions. Our analyses show that a basic assumption of signal detection theory, the likelihood ratio decision axis, implies three regularities in recognition memory: (1) the mirror effect, (2) the variance effect, and (3) the z-ROC length effect. For each model, we present the equations that produce the three regularities and show, in computed examples, how they do so. We then show that the regularities appear in data from a range of recognition studies. The analyses and data in our study support the following generalization: Individuals make efficient recognition decisions on the basis of likelihood ratios.

  11. Likelihood-based inference for cointegration with nonlinear error-correction

    DEFF Research Database (Denmark)

    Kristensen, Dennis; Rahbek, Anders Christian

    2010-01-01

    We consider a class of nonlinear vector error correction models where the transfer function (or loadings) of the stationary relationships is nonlinear. This includes in particular the smooth transition models. A general representation theorem is given which establishes the dynamic properties... and a linear trend in general. Gaussian likelihood-based estimators are considered for the long-run cointegration parameters, and the short-run parameters. Asymptotic theory is provided for these and it is discussed to what extent asymptotic normality and mixed normality can be found. A simulation study...

  12. Understanding the properties of diagnostic tests - Part 2: Likelihood ratios.

    Science.gov (United States)

    Ranganathan, Priya; Aggarwal, Rakesh

    2018-01-01

    Diagnostic tests are used to identify subjects with and without disease. In a previous article in this series, we examined some attributes of diagnostic tests - sensitivity, specificity, and predictive values. In this second article, we look at likelihood ratios, which are useful for the interpretation of diagnostic test results in everyday clinical practice.
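
    The arithmetic behind likelihood ratios in this setting is short enough to show in full: LR+ = sensitivity/(1 - specificity), LR- = (1 - sensitivity)/specificity, and the post-test probability follows from Bayes' theorem on the odds scale. The sensitivity, specificity and pre-test probability below are invented.

    ```python
    # Invented test characteristics and pre-test probability
    sens, spec = 0.90, 0.80
    lr_pos = sens / (1 - spec)              # LR for a positive result: 4.5
    lr_neg = (1 - sens) / spec              # LR for a negative result: 0.125

    pretest = 0.20
    odds = pretest / (1 - pretest)
    post_odds = odds * lr_pos               # Bayes' theorem on the odds scale
    posttest = post_odds / (1 + post_odds)
    print(f"LR+ = {lr_pos:.2f}, LR- = {lr_neg:.3f}, post-test P = {posttest:.2f}")
    ```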

  13. Comparison of likelihood testing procedures for parallel systems with covariances

    International Nuclear Information System (INIS)

    Ayman Baklizi; Isa Daud; Noor Akma Ibrahim

    1998-01-01

    In this paper we investigate and compare the behavior of the likelihood ratio, Rao's and Wald's statistics for testing hypotheses on the parameters of the simple linear regression model based on parallel systems with covariances. These statistics are asymptotically equivalent (Barndorff-Nielsen and Cox, 1994); however, their relative performances in finite samples are not generally known. A Monte Carlo experiment is conducted to simulate the sizes and the powers of these statistics for complete samples and in the presence of time censoring. Comparisons of the statistics are made according to the attainment of the assumed size of the test and their powers at various points in the parameter space. The results show that the likelihood ratio statistic appears to have the best performance in terms of the attainment of the assumed size of the test. Power comparisons show that the Rao statistic has some advantage over the Wald statistic in almost all of the space of alternatives, while the likelihood ratio statistic occupies either the first or the last position in terms of power. Overall, the likelihood ratio statistic appears to be more appropriate for the model under study, especially for small sample sizes
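
    The same asymptotically equivalent trio can be written down in a few lines for a simpler model than the paper's censored regression setting; here, testing a Poisson rate (all numbers synthetic):

    ```python
    import numpy as np

    rng = np.random.default_rng(11)
    x = rng.poisson(2.4, size=40)
    lam0, lam_hat, n = 2.0, x.mean(), len(x)       # H0: lambda = 2.0

    lr = 2 * n * (lam0 - lam_hat + lam_hat * np.log(lam_hat / lam0))
    wald = (lam_hat - lam0) ** 2 / (lam_hat / n)   # Fisher information at the MLE
    rao = (lam_hat - lam0) ** 2 / (lam0 / n)       # Fisher information at the null
    print(lr, wald, rao)    # each referred to a chi2(1) distribution
    ```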

  14. Maximum likelihood estimation of the attenuated ultrasound pulse

    DEFF Research Database (Denmark)

    Rasmussen, Klaus Bolding

    1994-01-01

    The attenuated ultrasound pulse is divided into two parts: a stationary basic pulse and a nonstationary attenuation pulse. A standard ARMA model is used for the basic pulse, and a nonstandard ARMA model is derived for the attenuation pulse. The maximum likelihood estimator of the attenuated...

  15. Planck 2013 results. XV. CMB power spectra and likelihood

    CERN Document Server

    Ade, P.A.R.; Armitage-Caplan, C.; Arnaud, M.; Ashdown, M.; Atrio-Barandela, F.; Aumont, J.; Baccigalupi, C.; Banday, A.J.; Barreiro, R.B.; Bartlett, J.G.; Battaner, E.; Benabed, K.; Benoit, A.; Benoit-Levy, A.; Bernard, J.P.; Bersanelli, M.; Bielewicz, P.; Bobin, J.; Bock, J.J.; Bonaldi, A.; Bonavera, L.; Bond, J.R.; Borrill, J.; Bouchet, F.R.; Boulanger, F.; Bridges, M.; Bucher, M.; Burigana, C.; Butler, R.C.; Calabrese, E.; Cardoso, J.F.; Catalano, A.; Challinor, A.; Chamballu, A.; Chiang, L.Y.; Chiang, H.C.; Christensen, P.R.; Church, S.; Clements, D.L.; Colombi, S.; Colombo, L.P.L.; Combet, C.; Couchot, F.; Coulais, A.; Crill, B.P.; Curto, A.; Cuttaia, F.; Danese, L.; Davies, R.D.; Davis, R.J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Delouis, J.M.; Desert, F.X.; Dickinson, C.; Diego, J.M.; Dole, H.; Donzelli, S.; Dore, O.; Douspis, M.; Dunkley, J.; Dupac, X.; Efstathiou, G.; Elsner, F.; Ensslin, T.A.; Eriksen, H.K.; Finelli, F.; Forni, O.; Frailis, M.; Fraisse, A.A.; Franceschi, E.; Gaier, T.C.; Galeotta, S.; Galli, S.; Ganga, K.; Giard, M.; Giardino, G.; Giraud-Heraud, Y.; Gjerlow, E.; Gonzalez-Nuevo, J.; Gorski, K.M.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Gudmundsson, J.E.; Hansen, F.K.; Hanson, D.; Harrison, D.; Helou, G.; Henrot-Versille, S.; Hernandez-Monteagudo, C.; Herranz, D.; Hildebrandt, S.R.; Hivon, E.; Hobson, M.; Holmes, W.A.; Hornstrup, A.; Hovest, W.; Huffenberger, K.M.; Hurier, G.; Jaffe, T.R.; Jaffe, A.H.; Jewell, J.; Jones, W.C.; Juvela, M.; Keihanen, E.; Keskitalo, R.; Kiiveri, K.; Kisner, T.S.; Kneissl, R.; Knoche, J.; Knox, L.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lahteenmaki, A.; Lamarre, J.M.; Lasenby, A.; Lattanzi, M.; Laureijs, R.J.; Lawrence, C.R.; Le Jeune, M.; Leach, S.; Leahy, J.P.; Leonardi, R.; Leon-Tavares, J.; Lesgourgues, J.; Liguori, M.; Lilje, P.B.; Lindholm, V.; Linden-Vornle, M.; Lopez-Caniego, M.; Lubin, P.M.; Macias-Perez, J.F.; Maffei, B.; Maino, D.; Mandolesi, N.; Marinucci, D.; Maris, M.; Marshall, D.J.; Martin, P.G.; Martinez-Gonzalez, E.; Masi, S.; Matarrese, S.; Matthai, F.; Mazzotta, P.; Meinhold, P.R.; Melchiorri, A.; Mendes, L.; Menegoni, E.; Mennella, A.; Migliaccio, M.; Millea, M.; Mitra, S.; Miville-Deschenes, M.A.; Molinari, D.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Moss, A.; Munshi, D.; Naselsky, P.; Nati, F.; Natoli, P.; Netterfield, C.B.; Norgaard-Nielsen, H.U.; Noviello, F.; Novikov, D.; Novikov, I.; O'Dwyer, I.J.; Orieux, F.; Osborne, S.; Oxborrow, C.A.; Paci, F.; Pagano, L.; Pajot, F.; Paladini, R.; Paoletti, D.; Partridge, B.; Pasian, F.; Patanchon, G.; Paykari, P.; Perdereau, O.; Perotto, L.; Perrotta, F.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Ponthieu, N.; Popa, L.; Poutanen, T.; Pratt, G.W.; Prezeau, G.; Prunet, S.; Puget, J.L.; Rachen, J.P.; Rahlin, A.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Ricciardi, S.; Riller, T.; Ringeval, C.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Roudier, G.; Rowan-Robinson, M.; Rubino-Martin, J.A.; Rusholme, B.; Sandri, M.; Sanselme, L.; Santos, D.; Savini, G.; Scott, D.; Seiffert, M.D.; Shellard, E.P.S.; Spencer, L.D.; Starck, J.L.; Stolyarov, V.; Stompor, R.; Sudiwala, R.; Sureau, F.; Sutton, D.; Suur-Uski, A.S.; Sygnet, J.F.; Tauber, J.A.; Tavagnacco, D.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Tucci, M.; Tuovinen, J.; Turler, M.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Varis, J.; Vielva, P.; Villa, F.; Vittorio, N.; Wade, L.A.; Wandelt, B.D.; 
Wehus, I.K.; White, M.; White, S.D.M.; Yvon, D.; Zacchei, A.; Zonca, A.

    2014-01-01

    We present the Planck likelihood, a complete statistical description of the two-point correlation function of the CMB temperature fluctuations. We use this likelihood to derive the Planck CMB power spectrum over three decades in multipole l, covering 2 ≤ l ≤ 2500. At l ≥ 50, we employ a correlated Gaussian likelihood approximation based on angular cross-spectra derived from the 100, 143 and 217 GHz channels. We validate our likelihood through an extensive suite of consistency tests, and assess the impact of residual foreground and instrumental uncertainties on cosmological parameters. We find good internal agreement among the high-l cross-spectra with residuals of a few μK^2 at l ≤ 1000. We compare our results with foreground-cleaned CMB maps, and with cross-spectra derived from the 70 GHz Planck map, and find broad agreement in terms of spectrum residuals and cosmological parameters. The best-fit LCDM cosmology is in excellent agreement with preliminary Planck polarisation spectra. The standard LCDM cosmology is well constrained b...

  16. Robust Gaussian Process Regression with a Student-t Likelihood

    NARCIS (Netherlands)

    Jylänki, P.P.; Vanhatalo, J.; Vehtari, A.

    2011-01-01

    This paper considers the robust and efficient implementation of Gaussian process regression with a Student-t observation model, which has a non-log-concave likelihood. The challenge with the Student-t model is the analytically intractable inference which is why several approximative methods have

  17. MAXIMUM-LIKELIHOOD-ESTIMATION OF THE ENTROPY OF AN ATTRACTOR

    NARCIS (Netherlands)

    SCHOUTEN, JC; TAKENS, F; VANDENBLEEK, CM

    In this paper, a maximum-likelihood estimate of the (Kolmogorov) entropy of an attractor is proposed that can be obtained directly from a time series. Also, the relative standard deviation of the entropy estimate is derived; it is dependent on the entropy and on the number of samples used in the

  18. A simplification of the likelihood ratio test statistic for testing ...

    African Journals Online (AJOL)

    The traditional likelihood ratio test statistic for testing hypothesis about goodness of fit of multinomial probabilities in one, two and multi – dimensional contingency table was simplified. Advantageously, using the simplified version of the statistic to test the null hypothesis is easier and faster because calculating the expected ...
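
    For reference, the traditional (unsimplified) likelihood ratio statistic for a two-way table is G^2 = 2 * sum O log(O/E), compared against a chi-square distribution; the record's contribution is an algebraic simplification of this quantity. A minimal computation with invented counts:

    ```python
    import numpy as np
    from scipy.stats import chi2

    obs = np.array([[22., 18.], [10., 30.]])              # invented 2x2 counts
    exp = np.outer(obs.sum(1), obs.sum(0)) / obs.sum()    # independence fit
    g2 = 2 * np.sum(obs * np.log(obs / exp))              # G^2 statistic
    df = (obs.shape[0] - 1) * (obs.shape[1] - 1)
    print(g2, chi2.sf(g2, df))
    ```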

  19. Adaptive Unscented Kalman Filter using Maximum Likelihood Estimation

    DEFF Research Database (Denmark)

    Mahmoudi, Zeinab; Poulsen, Niels Kjølstad; Madsen, Henrik

    2017-01-01

    The purpose of this study is to develop an adaptive unscented Kalman filter (UKF) by tuning the measurement noise covariance. We use the maximum likelihood estimation (MLE) and the covariance matching (CM) method to estimate the noise covariance. The multi-step prediction errors generated...

  20. LIKELIHOOD ESTIMATION OF PARAMETERS USING SIMULTANEOUSLY MONITORED PROCESSES

    DEFF Research Database (Denmark)

    Friis-Hansen, Peter; Ditlevsen, Ove Dalager

    2004-01-01

    The topic is maximum likelihood inference from several simultaneously monitored response processes of a structure to obtain knowledge about the parameters of other not monitored but important response processes when the structure is subject to some Gaussian load field in space and time. The considered example is a ship sailing with a given speed through a Gaussian wave field.

  1. Likelihood-based inference for clustered line transect data

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus; Schweder, Tore

    2006-01-01

    The uncertainty in estimation of spatial animal density from line transect surveys depends on the degree of spatial clustering in the animal population. To quantify the clustering we model line transect data as independent thinnings of spatial shot-noise Cox processes. Likelihood-based inference...

  2. Likelihood-based inference for clustered line transect data

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus Plenge; Schweder, Tore

    The uncertainty in estimation of spatial animal density from line transect surveys depends on the degree of spatial clustering in the animal population. To quantify the clustering we model line transect data as independent thinnings of spatial shot-noise Cox processes. Likelihood-based inference...

  3. Composite likelihood and two-stage estimation in family studies

    DEFF Research Database (Denmark)

    Andersen, Elisabeth Anne Wreford

    2004-01-01

    In this paper register based family studies provide the motivation for linking a two-stage estimation procedure in copula models for multivariate failure time data with a composite likelihood approach. The asymptotic properties of the estimators in both parametric and semi-parametric models are d...

  4. Reconceptualizing Social Influence in Counseling: The Elaboration Likelihood Model.

    Science.gov (United States)

    McNeill, Brian W.; Stoltenberg, Cal D.

    1989-01-01

    Presents Elaboration Likelihood Model (ELM) of persuasion (a reconceptualization of the social influence process) as alternative model of attitude change. Contends ELM unifies conflicting social psychology results and can potentially account for inconsistent research findings in counseling psychology. Provides guidelines on integrating…

  5. Cases in which ancestral maximum likelihood will be confusingly misleading.

    Science.gov (United States)

    Handelman, Tomer; Chor, Benny

    2017-05-07

    Ancestral maximum likelihood (AML) is a phylogenetic tree reconstruction criterion that "lies between" maximum parsimony (MP) and maximum likelihood (ML). ML has long been known to be statistically consistent. On the other hand, Felsenstein (1978) showed that MP is statistically inconsistent, and even positively misleading: there are cases where the parsimony criterion, applied to data generated according to one tree topology, will be optimized on a different tree topology. The question of whether AML is statistically consistent or not has been open for a long time. Mossel et al. (2009) have shown that AML can "shrink" short tree edges, resulting in a star tree with no internal resolution, which yields a better AML score than the original (resolved) model. This result implies that AML is statistically inconsistent, but not that it is positively misleading, because the star tree is compatible with any other topology. We show that AML is confusingly misleading: for some simple, four-taxa (resolved) tree, the ancestral likelihood optimization criterion is maximized on an incorrect (resolved) tree topology, as well as on a star tree (both with specific edge lengths), while the tree with the original, correct topology has strictly lower ancestral likelihood. Interestingly, the two short edges in the incorrect, resolved tree topology are of length zero, and are not adjacent, so this resolved tree is in fact a simple path. While for MP the underlying phenomenon can be described as long edge attraction, it turns out that here we have long edge repulsion.

  6. Multilevel maximum likelihood estimation with application to covariance matrices

    Czech Academy of Sciences Publication Activity Database

    Turčičová, Marie; Mandel, J.; Eben, Kryštof

    Published online: 23 January 2018. ISSN 0361-0926. R&D Projects: GA ČR GA13-34856S. Institutional support: RVO:67985807. Keywords: Fisher information; high dimension; hierarchical maximum likelihood; nested parameter spaces; spectral diagonal covariance model; sparse inverse covariance model. Subject RIV: BB - Applied Statistics, Operational Research. Impact factor: 0.311, year: 2016

  7. Outlier detection in nonlinear regression with the likelihood displacement statistical method

    Directory of Open Access Journals (Sweden)

    Siti Tabi'atul Hasanah

    2012-11-01

    An outlier is an observation that is very different (extreme) from the other observations, i.e. data that do not follow the general pattern of the model. Sometimes outliers provide information that cannot be provided by other data, which is why outliers should not simply be eliminated. Outliers can also be influential observations. There are many methods that can be used to detect outliers. Previous studies addressed outlier detection for linear regression; here, detection of outliers is developed for nonlinear regression, specifically multiplicative nonlinear regression. Detection uses the likelihood displacement (LD) statistic, a method that detects outliers by removing the suspected outlying observations. Parameters are estimated by the maximum likelihood method. The LD method then identifies the observations suspected to be outliers. The accuracy of the LD method in detecting outliers is shown by comparing the MSE of LD with the MSE of the ordinary regression. The test statistic used is Λ; when the initial hypothesis is rejected, the observation is declared an outlier.
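
    The case-deletion idea behind likelihood displacement is easy to sketch for ordinary linear regression (the record's multiplicative nonlinear model is swapped for a linear one to keep the example short): LD_i = 2[log L(theta-hat) - log L(theta-hat with case i deleted)], with large values flagging suspect observations. Data and the planted outlier are synthetic.

    ```python
    import numpy as np

    def fit(y, X):
        beta = np.linalg.lstsq(X, y, rcond=None)[0]
        s2 = np.mean((y - X @ beta) ** 2)      # MLE of the error variance
        return beta, s2

    def loglik(y, X, beta, s2):
        r = y - X @ beta
        return -0.5 * len(y) * np.log(2 * np.pi * s2) - 0.5 * r @ r / s2

    rng = np.random.default_rng(2)
    X = np.column_stack([np.ones(50), rng.normal(size=50)])
    y = X @ [1.0, 2.0] + rng.normal(scale=0.5, size=50)
    y[17] += 5.0                               # planted outlier

    beta, s2 = fit(y, X)
    full = loglik(y, X, beta, s2)
    ld = []
    for i in range(len(y)):
        keep = np.arange(len(y)) != i
        bi, s2i = fit(y[keep], X[keep])
        ld.append(2 * (full - loglik(y, X, bi, s2i)))   # likelihood displacement
    print("most suspicious case:", int(np.argmax(ld))) # expect 17
    ```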

  8. Gaussian copula as a likelihood function for environmental models

    Science.gov (United States)

    Wani, O.; Espadas, G.; Cecinati, F.; Rieckermann, J.

    2017-12-01

    Parameter estimation of environmental models always comes with uncertainty. To formally quantify this parametric uncertainty, a likelihood function needs to be formulated, which is defined as the probability of observations given fixed values of the parameter set. A likelihood function allows us to infer parameter values from observations using Bayes' theorem. The challenge is to formulate a likelihood function that reliably describes the error generating processes which lead to the observed monitoring data, such as rainfall and runoff. If the likelihood function is not representative of the error statistics, the parameter inference will give biased parameter values. Several uncertainty estimation methods that are currently being used employ Gaussian processes as a likelihood function, because of their favourable analytical properties. Box-Cox transformation is suggested to deal with non-symmetric and heteroscedastic errors, e.g. for flow data which are typically more uncertain in high flows than in periods with low flows. A problem with transformations is that the results are conditional on hyper-parameters, for which it is difficult to formulate the analyst's belief a priori. In an attempt to address this problem, in this research work we suggest learning the nature of the error distribution from the errors made by the model in "past" forecasts. We use a Gaussian copula to generate semiparametric error distributions. 1) We show that this copula can then be used as a likelihood function to infer parameters, breaking away from the practice of using multivariate normal distributions. 2) Based on the results from a didactical example of predicting rainfall runoff, we demonstrate that the copula captures the predictive uncertainty of the model. 3) Finally, we find that the properties of autocorrelation and heteroscedasticity of errors are captured well by the copula, eliminating the need to use transforms. In summary, our findings suggest that copulas are an...
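
    A sketch of the central construction: past model errors define empirical marginals, a probability integral transform maps new errors to normal scores, and a Gaussian copula (here with a hypothetical AR(1) correlation in time) supplies the dependence part of the log-likelihood. The error distributions and the correlation structure are invented.

    ```python
    import numpy as np
    from scipy.stats import norm, multivariate_normal

    rng = np.random.default_rng(9)
    past_errors = np.sort(rng.standard_t(df=4, size=500))  # heavy-tailed "history"

    def to_uniform(e):
        # probability integral transform via the empirical CDF of past errors
        ranks = np.searchsorted(past_errors, e, side="right")
        return np.clip(ranks / (len(past_errors) + 1), 1e-4, 1 - 1e-4)

    def copula_loglik(e, rho):
        # Gaussian copula density on the normal scores z: phi_R(z) / prod phi(z_i);
        # the marginal log-densities would be added for the full likelihood.
        z = norm.ppf(to_uniform(e))
        n = len(z)
        R = rho ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))  # AR(1)
        return multivariate_normal.logpdf(z, cov=R) - norm.logpdf(z).sum()

    new_errors = rng.standard_t(df=4, size=20)
    print(copula_loglik(new_errors, rho=0.3))
    ```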

  9. Theory and context / Theory in context

    DEFF Research Database (Denmark)

    Glaveanu, Vlad Petre

    2014-01-01

    It is debatable whether the psychology of creativity is a field in crisis or not. There are clear signs of increased fragmentation and a scarcity of integrative efforts, but is this necessarily bad? Do we need more comprehensive theories of creativity and a return to old epistemological questions? This depends on how one understands theory. Against a view of theoretical work as aiming towards generality, universality, uniformity, completeness, and singularity, I advocate for a dynamic perspective in which theory is plural, multifaceted, and contextual. Far from 'waiting for the Messiah'... ...trans-disciplinary manner. Consideration needs to be given as well to connected scholarship focusing on imagination, innovation, and improvisation. Last but not least, an expanded theory of context cannot ignore the institutional context of doing research on creativity. Creativity scholars are facing...

  10. Modeling gene expression measurement error: a quasi-likelihood approach

    Directory of Open Access Journals (Sweden)

    Strimmer Korbinian

    2003-03-01

    Background: Using suitable error models for gene expression measurements is essential in the statistical analysis of microarray data. However, the true probabilistic model underlying gene expression intensity readings is generally not known. Instead, in currently used approaches either some simple parametric model is assumed (usually a transformed normal distribution) or the empirical distribution is estimated. However, both these strategies may not be optimal for gene expression data, as the non-parametric approach ignores known structural information whereas the fully parametric models run the risk of misspecification. A further related problem is the choice of a suitable scale for the model (e.g. observed vs. log-scale). Results: Here a simple semi-parametric model for gene expression measurement error is presented. In this approach inference is based on an approximate likelihood function (the extended quasi-likelihood). Only partial knowledge about the unknown true distribution is required to construct this function. In the case of gene expression this information is available in the form of the postulated (e.g. quadratic) variance structure of the data. As the quasi-likelihood behaves (almost) like a proper likelihood, it allows for the estimation of calibration and variance parameters, and it is also straightforward to obtain corresponding approximate confidence intervals. Unlike most other frameworks, it also allows analysis on any preferred scale, i.e. both on the original linear scale as well as on a transformed scale. It can also be employed in regression approaches to model systematic (e.g. array or dye) effects. Conclusions: The quasi-likelihood framework provides a simple and versatile approach to analyze gene expression data that does not make any strong distributional assumptions about the underlying error model. For several simulated as well as real data sets it provides a better fit to the data than competing models. In an example it also...

  11. Exclusion probabilities and likelihood ratios with applications to mixtures.

    Science.gov (United States)

    Slooten, Klaas-Jan; Egeland, Thore

    2016-01-01

    The statistical evidence obtained from mixed DNA profiles can be summarised in several ways in forensic casework including the likelihood ratio (LR) and the Random Man Not Excluded (RMNE) probability. The literature has seen a discussion of the advantages and disadvantages of likelihood ratios and exclusion probabilities, and part of our aim is to bring some clarification to this debate. In a previous paper, we proved that there is a general mathematical relationship between these statistics: RMNE can be expressed as a certain average of the LR, implying that the expected value of the LR, when applied to an actual contributor to the mixture, is at least equal to the inverse of the RMNE. While the mentioned paper presented applications for kinship problems, the current paper demonstrates the relevance for mixture cases, and for this purpose, we prove some new general properties. We also demonstrate how to use the distribution of the likelihood ratio for donors of a mixture, to obtain estimates for exceedance probabilities of the LR for non-donors, of which the RMNE is a special case corresponding to LR > 0. In order to derive these results, we need to view the likelihood ratio as a random variable. In this paper, we describe how such a randomization can be achieved. The RMNE is usually invoked only for mixtures without dropout. In mixtures, artefacts like dropout and drop-in are commonly encountered and we address this situation too, illustrating our results with a basic but widely implemented model, a so-called binary model. The precise definitions, modelling and interpretation of the required concepts of dropout and drop-in are not entirely obvious, and we attempt to clarify them here in a general likelihood framework for a binary model.
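
    A single-locus toy calculation showing the two summaries side by side, together with the inequality from the record connecting them; the allele frequencies are invented, and real casework multiplies across loci.

    ```python
    # Alleles observed in the mixture and their (invented) population frequencies
    p = {"A": 0.10, "B": 0.20, "C": 0.05}
    s = sum(p.values())
    rmne = s ** 2          # P(random person carries only mixture alleles), HWE
    print("RMNE:", rmne, "  1/RMNE:", 1 / rmne)
    # The record's relationship: for a true contributor, E[LR] >= 1/RMNE,
    # i.e. the LR for an actual donor is expected to be at least ~8.2 here.
    ```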

  12. Dynamics

    CERN Document Server

    Goodman, Lawrence E

    2001-01-01

    Beginning text presents complete theoretical treatment of mechanical model systems and deals with technological applications. Topics include introduction to calculus of vectors, particle motion, dynamics of particle systems and plane rigid bodies, technical applications in plane motions, theory of mechanical vibrations, and more. Exercises and answers appear in each chapter.

  13. Context matters!

    DEFF Research Database (Denmark)

    Bojesen, Anders

    2004-01-01

    for granted and unproblematic, although it is agreed to be of great importance. By crystallising three different modes of contextualised competence thinking (prescriptive, descriptive and analytical) the paper shows that the underlying assumptions about context - the interaction between the individual...... and the social - has major consequences for the specific enactment of competence. The paper argues in favour of a second order observation strategy for the context of competence. But in doing so it also shows that prevailing second-order competence theories so far, in criticising (counter) positions (and...

  14. Body mass dynamics of Daphnia in the context of life history theory: an approach based on contribution analysis (in Russian, Abstract in English)

    NARCIS (Netherlands)

    Polishchuk, L.; Vijverberg, J.

    2006-01-01

    In this paper, we elaborate on the method of contribution analysis in relation to body mass dynamics which has been proposed recently (Polishchuk, Vijverberg, 2005. Oecologia. V. 144. P. 268-277). We suggest that contribution analysis as applied to body mass dynamics makes a bridge between

  15. Process criticality accident likelihoods, magnitudes and emergency planning. A focus on solution accidents

    International Nuclear Information System (INIS)

    McLaughlin, Thomas P.

    2003-01-01

    This paper presents analyses and applications of data from reactor and critical experiment research on the dynamics of nuclear excursions in solution media. Available criticality accident information is also discussed and shown to provide strong evidence of the overwhelming likelihood of accidents in liquid media over other forms and to support the measured data. These analyses are shown to provide valuable insights into key parameters important to understanding solution excursion dynamics in general and in evaluating practical upper bounds on criticality accident magnitudes. This understanding and these upper bounds are directly applicable to the evaluation of the consequences of postulated criticality accidents. These bounds are also essential in order to comply with national and international consensus standards and regulatory requirements for emergency planning. (author)

  16. A composite likelihood approach for spatially correlated survival data

    Science.gov (United States)

    Paik, Jane; Ying, Zhiliang

    2013-01-01

    The aim of this paper is to provide a composite likelihood approach to handle spatially correlated survival data using pairwise joint distributions. With e-commerce data, a recent question of interest in marketing research has been to describe spatially clustered purchasing behavior and to assess whether geographic distance is the appropriate metric to describe purchasing dependence. We present a model for the dependence structure of time-to-event data subject to spatial dependence to characterize purchasing behavior from the motivating example from e-commerce data. We assume the Farlie-Gumbel-Morgenstern (FGM) distribution and then model the dependence parameter as a function of geographic and demographic pairwise distances. For estimation of the dependence parameters, we present pairwise composite likelihood equations. We prove that the resulting estimators exhibit key properties of consistency and asymptotic normality under certain regularity conditions in the increasing-domain framework of spatial asymptotic theory. PMID:24223450
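
    A minimal sketch of the central computation, under simplifying assumptions that are not the paper's specification: exponential margins, no censoring, and a made-up distance model theta_ij = g*exp(-d_ij/r), clipped to the FGM range [-1, 1]:

        import numpy as np
        from scipy.optimize import minimize

        # Pairwise composite log-likelihood with exponential margins and an FGM
        # copula whose parameter decays with pairwise distance.
        # FGM copula density: c(u, v; theta) = 1 + theta*(1 - 2u)*(1 - 2v).
        def neg_composite_loglik(params, t, d):
            lam, g, r = np.exp(params)              # keep all parameters positive
            u = 1.0 - np.exp(-lam * t)              # marginal CDF values
            logf = np.log(lam) - lam * t            # marginal log-densities
            total = 0.0
            n = len(t)
            for i in range(n):
                for j in range(i + 1, n):
                    theta = np.clip(g * np.exp(-d[i, j] / r), -1.0, 1.0)
                    c = 1.0 + theta * (1.0 - 2.0 * u[i]) * (1.0 - 2.0 * u[j])
                    total += np.log(c) + logf[i] + logf[j]
            return -total

        rng = np.random.default_rng(1)
        n = 40
        coords = rng.uniform(0.0, 10.0, size=(n, 2))
        d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
        t = rng.exponential(scale=2.0, size=n)      # toy (actually independent) times
        fit = minimize(neg_composite_loglik, x0=np.log([0.4, 0.5, 1.0]),
                       args=(t, d), method="Nelder-Mead")
        print(np.exp(fit.x))                        # (rate, g, r) estimates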

  18. Secondary Analysis under Cohort Sampling Designs Using Conditional Likelihood

    Directory of Open Access Journals (Sweden)

    Olli Saarela

    2012-01-01

    Full Text Available Under cohort sampling designs, additional covariate data are collected on cases of a specific type and a randomly selected subset of noncases, primarily for the purpose of studying associations with a time-to-event response of interest. With such data available, interest may arise in reusing them to study associations between the additional covariate data and a secondary non-time-to-event response variable, usually collected for the whole study cohort at the outset of the study. Following earlier literature, we refer to such a situation as secondary analysis. We outline a general conditional likelihood approach for secondary analysis under cohort sampling designs and discuss the specific situations of case-cohort and nested case-control designs. We also review alternative methods based on full likelihood and inverse probability weighting. We compare the alternative methods for secondary analysis in two simulated settings and apply them in a real-data example.

  19. GENERALIZATION OF RAYLEIGH MAXIMUM LIKELIHOOD DESPECKLING FILTER USING QUADRILATERAL KERNELS

    Directory of Open Access Journals (Sweden)

    S. Sridevi

    2013-02-01

    Full Text Available Speckle is the most prevalent noise in clinical ultrasound images. It appears as light and dark spots that obscure pixel intensities. In fetal ultrasound images, edges and local fine detail are especially important for obstetricians and gynecologists carrying out prenatal diagnosis of congenital heart disease. A robust despeckling filter therefore has to be designed to suppress speckle noise efficiently while simultaneously preserving these features. The proposed filter generalizes the Rayleigh maximum likelihood filter by exploiting statistical tools as tuning parameters and by using differently shaped quadrilateral kernels to estimate the noise-free pixel from its neighborhood. The performance of several filters, namely the Median, Kuwahara, Frost, Homogeneous mask and Rayleigh maximum likelihood filters, is compared with that of the proposed filter in terms of PSNR and image profile. The proposed filter surpasses the conventional filters.
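
    A sketch of the underlying Rayleigh maximum likelihood step, using a plain square kernel in place of the paper's quadrilateral kernels; taking the filter output as the Rayleigh mean is one of several reasonable choices and is an assumption here:

        import numpy as np

        # For each pixel, estimate the Rayleigh scale from its neighbourhood by
        # maximum likelihood, sigma_hat^2 = sum(x^2) / (2N), and output the
        # Rayleigh mean sigma_hat * sqrt(pi/2).
        def rayleigh_ml_filter(img, k=2):
            h, w = img.shape
            padded = np.pad(img.astype(float), k, mode="reflect")
            out = np.empty((h, w))
            n = (2 * k + 1) ** 2
            for i in range(h):
                for j in range(w):
                    win = padded[i:i + 2 * k + 1, j:j + 2 * k + 1]
                    sigma = np.sqrt(np.sum(win ** 2) / (2.0 * n))
                    out[i, j] = sigma * np.sqrt(np.pi / 2.0)
            return out

        rng = np.random.default_rng(0)
        clean = np.full((64, 64), 50.0)
        noisy = rng.rayleigh(scale=clean)            # pure speckle, flat region
        print(noisy.std(), rayleigh_ml_filter(noisy).std())   # variance collapses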

  20. Likelihood inference for a nonstationary fractional autoregressive model

    DEFF Research Database (Denmark)

    Johansen, Søren; Ørregård Nielsen, Morten

    2010-01-01

    This paper discusses model-based inference in an autoregressive model for fractional processes which allows the process to be fractional of order d or d-b. Fractional differencing involves infinitely many past values and because we are interested in nonstationary processes we model the data X_{1},...,X_{T} given the initial values X_{-n}, n=0,1,..., as is usually done. The initial values are not modeled but assumed to be bounded. This represents a considerable generalization relative to all previous work where it is assumed that initial values are zero. For the statistical analysis we assume...... the conditional Gaussian likelihood and for the probability analysis we also condition on initial values but assume that the errors in the autoregressive model are i.i.d. with suitable moment conditions. We analyze the conditional likelihood and its derivatives as stochastic processes in the parameters, including...

  1. Maximum Likelihood Compton Polarimetry with the Compton Spectrometer and Imager

    Energy Technology Data Exchange (ETDEWEB)

    Lowell, A. W.; Boggs, S. E.; Chiu, C. L.; Kierans, C. A.; Sleator, C.; Tomsick, J. A.; Zoglauer, A. C. [Space Sciences Laboratory, University of California, Berkeley (United States); Chang, H.-K.; Tseng, C.-H.; Yang, C.-Y. [Institute of Astronomy, National Tsing Hua University, Taiwan (China); Jean, P.; Ballmoos, P. von [IRAP Toulouse (France); Lin, C.-H. [Institute of Physics, Academia Sinica, Taiwan (China); Amman, M. [Lawrence Berkeley National Laboratory (United States)

    2017-10-20

    Astrophysical polarization measurements in the soft gamma-ray band are becoming more feasible as detectors with high position and energy resolution are deployed. Previous work has shown that the minimum detectable polarization (MDP) of an ideal Compton polarimeter can be improved by ∼21% when an unbinned, maximum likelihood method (MLM) is used instead of the standard approach of fitting a sinusoid to a histogram of azimuthal scattering angles. Here we outline a procedure for implementing this maximum likelihood approach for real, nonideal polarimeters. As an example, we use the recent observation of GRB 160530A with the Compton Spectrometer and Imager. We find that the MDP for this observation is reduced by 20% when the MLM is used instead of the standard method.
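
    The contrast between the two approaches is easiest to see in code. Below is a minimal unbinned fit for an ideal polarimeter; the modulation form, numbers, and sampling scheme are illustrative assumptions, not COSI's instrument response:

        import numpy as np
        from scipy.optimize import minimize

        # Idealised azimuthal scattering-angle density:
        # p(phi) = (1/2pi) * (1 + mu * cos(2*(phi - phi0))), mu in [0, 1).
        def neg_log_like(params, phi):
            mu, phi0 = params
            return -np.sum(np.log((1.0 + mu * np.cos(2.0 * (phi - phi0)))
                                  / (2.0 * np.pi)))

        rng = np.random.default_rng(0)
        true_mu, true_phi0 = 0.4, 0.3

        # Draw angles by rejection sampling from the modulation curve.
        phi = []
        while len(phi) < 5000:
            cand = rng.uniform(0.0, 2.0 * np.pi)
            accept = (1.0 + true_mu * np.cos(2.0 * (cand - true_phi0))) / (1.0 + true_mu)
            if rng.uniform() < accept:
                phi.append(cand)
        phi = np.array(phi)

        fit = minimize(neg_log_like, x0=[0.2, 0.0], args=(phi,),
                       bounds=[(0.0, 0.99), (-np.pi, np.pi)])
        print(fit.x)   # recovers (mu, phi0), phi0 up to its 180-degree symmetry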

  2. Physical constraints on the likelihood of life on exoplanets

    Science.gov (United States)

    Lingam, Manasvi; Loeb, Abraham

    2018-04-01

    One of the most fundamental questions in exoplanetology is to determine whether a given planet is habitable. We estimate the relative likelihood of a planet's propensity towards habitability by considering key physical characteristics such as the role of temperature on ecological and evolutionary processes, and atmospheric losses via hydrodynamic escape and stellar wind erosion. From our analysis, we demonstrate that Earth-sized exoplanets in the habitable zone around M-dwarfs seemingly display much lower prospects of being habitable relative to Earth, owing to the higher incident ultraviolet fluxes and closer distances to the host star. We illustrate our results by specifically computing the likelihood (of supporting life) for the recently discovered exoplanets, Proxima b and TRAPPIST-1e, which we find to be several orders of magnitude smaller than that of Earth.

  3. THESEUS: maximum likelihood superpositioning and analysis of macromolecular structures.

    Science.gov (United States)

    Theobald, Douglas L; Wuttke, Deborah S

    2006-09-01

    THESEUS is a command line program for performing maximum likelihood (ML) superpositions and analysis of macromolecular structures. While conventional superpositioning methods use ordinary least-squares (LS) as the optimization criterion, ML superpositions provide substantially improved accuracy by down-weighting variable structural regions and by correcting for correlations among atoms. ML superpositioning is robust and insensitive to the specific atoms included in the analysis, and thus it does not require subjective pruning of selected variable atomic coordinates. Output includes both likelihood-based and frequentist statistics for accurate evaluation of the adequacy of a superposition and for reliable analysis of structural similarities and differences. THESEUS performs principal components analysis for analyzing the complex correlations found among atoms within a structural ensemble. ANSI C source code and selected binaries for various computing platforms are available under the GNU open source license from http://monkshood.colorado.edu/theseus/ or http://www.theseus3d.org.

  4. Deformation of log-likelihood loss function for multiclass boosting.

    Science.gov (United States)

    Kanamori, Takafumi

    2010-09-01

    The purpose of this paper is to study loss functions in multiclass classification. In classification problems, the decision function is estimated by minimizing an empirical loss function, and then, the output label is predicted by using the estimated decision function. We propose a class of loss functions which is obtained by a deformation of the log-likelihood loss function. There are four main reasons why we focus on the deformed log-likelihood loss function: (1) this is a class of loss functions which has not been deeply investigated so far, (2) in terms of computation, a boosting algorithm with a pseudo-loss is available to minimize the proposed loss function, (3) the proposed loss functions provide a clear correspondence between the decision functions and conditional probabilities of output labels, (4) the proposed loss functions satisfy the statistical consistency of the classification error rate which is a desirable property in classification problems. Based on (3), we show that the deformed log-likelihood loss provides a model of mislabeling which is useful as a statistical model of medical diagnostics. We also propose a robust loss function against outliers in multiclass classification based on our approach. The robust loss function is a natural extension of the existing robust loss function for binary classification. A model of mislabeling and a robust loss function are useful to cope with noisy data. Some numerical studies are presented to show the robustness of the proposed loss function. A mathematical characterization of the deformed log-likelihood loss function is also presented. Copyright 2010 Elsevier Ltd. All rights reserved.
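
    The paper's specific deformation family is not reproduced here, but the flavor of the idea can be illustrated with the familiar t-logarithm deformation (an assumed stand-in), which recovers the log-likelihood loss at t = 1 and bounds the penalty on badly mislabeled examples for t < 1, one route to robustness against mislabeling:

        import numpy as np

        def softmax(z):
            z = z - z.max(axis=-1, keepdims=True)
            e = np.exp(z)
            return e / e.sum(axis=-1, keepdims=True)

        def log_t(x, t):
            # Deformed logarithm: log_t(x) = (x^(1-t) - 1)/(1 - t); log_1 = log.
            if np.isclose(t, 1.0):
                return np.log(x)
            return (x ** (1.0 - t) - 1.0) / (1.0 - t)

        def deformed_log_loss(scores, labels, t=1.0):
            # Multiclass loss -log_t(p_y): unbounded at t = 1, bounded by
            # 1/(1-t) for t < 1, which caps the influence of outliers.
            p = softmax(scores)
            return -log_t(p[np.arange(len(labels)), labels], t).mean()

        scores = np.array([[2.0, 0.1, -1.0], [0.2, 0.3, 3.0]])
        labels = np.array([0, 1])            # second example looks mislabeled
        for t in (1.0, 0.5):
            print(t, deformed_log_loss(scores, labels, t))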

  5. Bayesian interpretation of Generalized empirical likelihood by maximum entropy

    OpenAIRE

    Rochet, Paul

    2011-01-01

    We study a parametric estimation problem related to moment condition models. As an alternative to the generalized empirical likelihood (GEL) and the generalized method of moments (GMM), a Bayesian approach to the problem can be adopted, extending the MEM procedure to parametric moment conditions. We show in particular that a large number of GEL estimators can be interpreted as a maximum entropy solution. Moreover, we provide a more general field of applications by proving the method to be rob...

  6. Menyoal Elaboration Likelihood Model (ELM) dan Teori Retorika

    OpenAIRE

    Yudi Perbawaningsih

    2012-01-01

    Abstract: Persuasion is a communication process to establish or change attitudes, which can be understood through theory of Rhetoric and theory of Elaboration Likelihood Model (ELM). This study elaborates these theories in a Public Lecture series which to persuade the students in choosing their concentration of study. The result shows that in term of persuasion effectiveness it is not quite relevant to separate the message and its source. The quality of source is determined by the quality of ...

  7. Maximum Likelihood, Consistency and Data Envelopment Analysis: A Statistical Foundation

    OpenAIRE

    Rajiv D. Banker

    1993-01-01

    This paper provides a formal statistical basis for the efficiency evaluation techniques of data envelopment analysis (DEA). DEA estimators of the best practice monotone increasing and concave production function are shown to be also maximum likelihood estimators if the deviation of actual output from the efficient output is regarded as a stochastic variable with a monotone decreasing probability density function. While the best practice frontier estimator is biased below the theoretical front...

  8. Multiple Improvements of Multiple Imputation Likelihood Ratio Tests

    OpenAIRE

    Chan, Kin Wai; Meng, Xiao-Li

    2017-01-01

    Multiple imputation (MI) inference handles missing data by first properly imputing the missing values $m$ times, and then combining the $m$ analysis results from applying a complete-data procedure to each of the completed datasets. However, the existing method for combining likelihood ratio tests has multiple defects: (i) the combined test statistic can be negative in practice when the reference null distribution is a standard $F$ distribution; (ii) it is not invariant to re-parametrization; ...

  9. Maximum likelihood convolutional decoding (MCD) performance due to system losses

    Science.gov (United States)

    Webster, L.

    1976-01-01

    A model for predicting the computational performance of a maximum likelihood convolutional decoder (MCD) operating in a noisy carrier reference environment is described. This model is used to develop a subroutine that will be utilized by the Telemetry Analysis Program to compute the MCD bit error rate. When this computational model is averaged over noisy reference phase errors using a high-rate interpolation scheme, the results are found to agree quite favorably with experimental measurements.

  11. Penggunaan Elaboration Likelihood Model dalam Menganalisis Penerimaan Teknologi Informasi

    OpenAIRE

    vitrian, vitrian2

    2010-01-01

    This article discusses some technology acceptance models in an organization. A thorough analysis of how a technology comes to be accepted helps managers plan its implementation and make sure that the new technology can enhance the organization's performance. The Elaboration Likelihood Model (ELM) is one model that sheds light on the behavioral factors in the acceptance of information technology. The basic tenet of ELM states that human behavior in principle can be influenced through central r...

  12. Statistical Bias in Maximum Likelihood Estimators of Item Parameters.

    Science.gov (United States)

    1982-04-01

    [Scanned report form; the abstract is largely illegible, with legible fragments concerning the bias in maximum likelihood estimators of item parameters.]

  13. Empirical Likelihood in Nonignorable Covariate-Missing Data Problems.

    Science.gov (United States)

    Xie, Yanmei; Zhang, Biao

    2017-04-20

    Missing covariate data occurs often in regression analysis, which frequently arises in the health and social sciences as well as in survey sampling. We study methods for the analysis of a nonignorable covariate-missing data problem in an assumed conditional mean function when some covariates are completely observed but other covariates are missing for some subjects. We adopt the semiparametric perspective of Bartlett et al. (Improving upon the efficiency of complete case analysis when covariates are MNAR. Biostatistics 2014;15:719-30) on regression analyses with nonignorable missing covariates, in which they have introduced the use of two working models, the working probability model of missingness and the working conditional score model. In this paper, we study an empirical likelihood approach to nonignorable covariate-missing data problems with the objective of effectively utilizing the two working models in the analysis of covariate-missing data. We propose a unified approach to constructing a system of unbiased estimating equations, where there are more equations than unknown parameters of interest. One useful feature of these unbiased estimating equations is that they naturally incorporate the incomplete data into the data analysis, making it possible to seek efficient estimation of the parameter of interest even when the working regression function is not specified to be the optimal regression function. We apply the general methodology of empirical likelihood to optimally combine these unbiased estimating equations. We propose three maximum empirical likelihood estimators of the underlying regression parameters and compare their efficiencies with other existing competitors. We present a simulation study to compare the finite-sample performance of various methods with respect to bias, efficiency, and robustness to model misspecification. The proposed empirical likelihood method is also illustrated by an analysis of a data set from the US National Health and
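
    A minimal sketch of the inner empirical likelihood computation, for the simplest estimating equation g(X, theta) = X - theta with one scalar parameter; the paper's missing-data equations and their optimal combination are not reproduced here:

        import numpy as np
        from scipy.optimize import minimize_scalar

        # Profile empirical likelihood for a scalar mean: weights w_i maximise
        # sum(log(n*w_i)) subject to sum(w_i * (x_i - theta)) = 0, giving
        # w_i = 1/(n*(1 + lam*g_i)) with lam solving a 1-D convex problem.
        def el_log_ratio(theta, x):
            g = x - theta
            lo = -1.0 / g.max() + 1e-6       # keep all 1 + lam*g_i > 0
            hi = -1.0 / g.min() - 1e-6       # (assumes theta inside data range)
            dual = lambda lam: -np.sum(np.log1p(lam * g))
            lam = minimize_scalar(dual, bounds=(lo, hi), method="bounded").x
            return -2.0 * dual(lam)          # ~ chi^2(1) at the true theta

        rng = np.random.default_rng(0)
        x = rng.exponential(scale=2.0, size=200)
        for theta in (x.mean(), 1.5):
            print(theta, el_log_ratio(theta, x))   # 0 at the sample mean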

  14. Democracy, Autocracy and the Likelihood of International Conflict

    OpenAIRE

    Tangerås, Thomas

    2008-01-01

    This is a game-theoretic analysis of the link between regime type and international conflict. The democratic electorate can credibly punish the leader for bad conflict outcomes, whereas the autocratic selectorate cannot. For the fear of being thrown out of office, democratic leaders are (i) more selective about the wars they initiate and (ii) on average win more of the wars they start. Foreign policy behaviour is found to display strategic complementarities. The likelihood of interstate war, ...

  15. Moment Conditions Selection Based on Adaptive Penalized Empirical Likelihood

    Directory of Open Access Journals (Sweden)

    Yunquan Song

    2014-01-01

    Full Text Available Empirical likelihood is a very popular method that has been widely used in the fields of artificial intelligence (AI) and data mining, as tablets, mobile applications, and social media come to dominate the technology landscape. This paper proposes an empirical likelihood shrinkage method to efficiently estimate unknown parameters and select correct moment conditions simultaneously, when the model is defined by moment restrictions of which some are possibly misspecified. We show that our method enjoys oracle-like properties; that is, it consistently selects the correct moment conditions and, at the same time, its estimator is as efficient as the empirical likelihood estimator obtained from all correct moment conditions. Moreover, unlike the GMM, our proposed method allows us to construct confidence regions for the parameters included in the model without estimating the covariances of the estimators. For empirical implementation, we provide some data-driven procedures for selecting the tuning parameter of the penalty function. The simulation results show that the method works remarkably well in terms of correct moment selection and the finite sample properties of the estimators. A real-life example is also carried out to illustrate the new methodology.

  16. Approximate maximum likelihood estimation for population genetic inference.

    Science.gov (United States)

    Bertl, Johanna; Ewing, Gregory; Kosiol, Carolin; Futschik, Andreas

    2017-11-27

    In many population genetic problems, parameter estimation is obstructed by an intractable likelihood function. Therefore, approximate estimation methods have been developed, and with growing computational power, sampling-based methods became popular. However, these methods such as Approximate Bayesian Computation (ABC) can be inefficient in high-dimensional problems. This led to the development of more sophisticated iterative estimation methods like particle filters. Here, we propose an alternative approach that is based on stochastic approximation. By moving along a simulated gradient or ascent direction, the algorithm produces a sequence of estimates that eventually converges to the maximum likelihood estimate, given a set of observed summary statistics. This strategy does not sample much from low-likelihood regions of the parameter space, and is fast, even when many summary statistics are involved. We put considerable efforts into providing tuning guidelines that improve the robustness and lead to good performance on problems with high-dimensional summary statistics and a low signal-to-noise ratio. We then investigate the performance of our resulting approach and study its properties in simulations. Finally, we re-estimate parameters describing the demographic history of Bornean and Sumatran orang-utans.
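
    The stochastic-approximation idea can be sketched on a toy problem; the model, objective, and tuning below are illustrative, not the authors' exact algorithm:

        import numpy as np

        rng = np.random.default_rng(0)

        # Stand-in for an intractable likelihood: data are N(theta, 1) and the
        # summary statistic is the sample mean. Pretending we can only simulate
        # from the model, we climb Q(theta) = -E[(s_sim(theta) - s_obs)^2] with
        # a Kiefer-Wolfowitz finite-difference gradient and decaying steps.
        def summary(theta, n=100):
            return rng.normal(theta, 1.0, size=n).mean()

        s_obs = summary(2.0)              # "observed" summary, true theta = 2

        theta = 0.0
        for k in range(1, 2001):
            a_k = 1.0 / k                 # step-size sequence
            c_k = 1.0 / k ** (1.0 / 6.0)  # finite-difference spacing
            q_plus = -(summary(theta + c_k) - s_obs) ** 2
            q_minus = -(summary(theta - c_k) - s_obs) ** 2
            theta += a_k * (q_plus - q_minus) / (2.0 * c_k)
        print(theta)                      # drifts toward ~2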

  17. Caching and interpolated likelihoods: accelerating cosmological Monte Carlo Markov chains

    Energy Technology Data Exchange (ETDEWEB)

    Bouland, Adam; Easther, Richard; Rosenfeld, Katherine, E-mail: adam.bouland@aya.yale.edu, E-mail: richard.easther@yale.edu, E-mail: krosenfeld@cfa.harvard.edu [Department of Physics, Yale University, New Haven CT 06520 (United States)

    2011-05-01

    We describe a novel approach to accelerating Monte Carlo Markov Chains. Our focus is cosmological parameter estimation, but the algorithm is applicable to any problem for which the likelihood surface is a smooth function of the free parameters and computationally expensive to evaluate. We generate a high-order interpolating polynomial for the log-likelihood using the first points gathered by the Markov chains as a training set. This polynomial then accurately computes the majority of the likelihoods needed in the latter parts of the chains. We implement a simple version of this algorithm as a patch (InterpMC) to CosmoMC and show that it accelerates parameter estimation by a factor of between two and four for well-converged chains. The current code is primarily intended as a "proof of concept", and we argue that there is considerable room for further performance gains. Unlike other approaches to accelerating parameter fits, we make no use of precomputed training sets or special choices of variables, and InterpMC is almost entirely transparent to the user.
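
    The caching idea reduces to a few lines in one dimension. In the sketch below the likelihood, polynomial degree, trigger threshold, and the omitted accuracy check are all invented for illustration; InterpMC itself works on multi-dimensional cosmological likelihoods:

        import numpy as np

        def expensive_loglike(theta):
            return -0.5 * (theta - 1.3) ** 2 / 0.2 ** 2   # stand-in for slow code

        class CachedLogLike:
            def __init__(self, degree=6, n_train=50):
                self.degree, self.n_train = degree, n_train
                self.train_x, self.train_y, self.poly = [], [], None

            def __call__(self, theta):
                if self.poly is not None and \
                        min(self.train_x) <= theta <= max(self.train_x):
                    return self.poly(theta)        # cheap surrogate evaluation
                y = expensive_loglike(theta)       # fall back to the real code
                self.train_x.append(theta)
                self.train_y.append(y)
                if len(self.train_x) == self.n_train:   # fit surrogate once
                    self.poly = np.polynomial.Polynomial.fit(
                        self.train_x, self.train_y, self.degree)
                return y

        loglike = CachedLogLike()
        rng = np.random.default_rng(0)
        for theta in rng.normal(1.3, 0.3, size=500):   # stand-in for proposals
            loglike(theta)                             # only ~50 expensive calls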

  19. Maximum likelihood as a common computational framework in tomotherapy

    International Nuclear Information System (INIS)

    Olivera, G.H.; Shepard, D.M.; Reckwerdt, P.J.; Ruchala, K.; Zachman, J.; Fitchard, E.E.; Mackie, T.R.

    1998-01-01

    Tomotherapy is a dose delivery technique using helical or axial intensity modulated beams. One of the strengths of the tomotherapy concept is that it can incorporate a number of processes into a single piece of equipment. These processes include treatment optimization planning, dose reconstruction and kilovoltage/megavoltage image reconstruction. A common computational technique that could be used for all of these processes would be very appealing. The maximum likelihood estimator, originally developed for emission tomography, can serve as a useful tool in imaging and radiotherapy. We believe that this approach can play an important role in the processes of optimization planning, dose reconstruction and kilovoltage and/or megavoltage image reconstruction. These processes involve computations that require comparable physical methods. They are also based on equivalent assumptions, and they have similar mathematical solutions. As a result, the maximum likelihood approach is able to provide a common framework for all three of these computational problems. We will demonstrate how maximum likelihood methods can be applied to optimization planning, dose reconstruction and megavoltage image reconstruction in tomotherapy. Results for planning optimization, dose reconstruction and megavoltage image reconstruction will be presented. Strengths and weaknesses of the methodology are analysed. Future directions for this work are also suggested. (author)
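
    The building block referred to here is the classic maximum-likelihood EM (MLEM) iteration for a Poisson linear model y ~ Poisson(Ax), shown below with a random stand-in system matrix; sizes and values are illustrative:

        import numpy as np

        def mlem(A, y, n_iter=200):
            x = np.ones(A.shape[1])                   # strictly positive start
            sens = A.sum(axis=0)                      # sensitivity image A^T 1
            for _ in range(n_iter):
                proj = A @ x                          # forward projection
                ratio = y / np.maximum(proj, 1e-12)   # measured / estimated
                x *= (A.T @ ratio) / np.maximum(sens, 1e-12)  # multiplicative update
            return x

        rng = np.random.default_rng(0)
        A = rng.uniform(0.0, 1.0, size=(80, 40))
        x_true = rng.uniform(0.5, 2.0, size=40)
        y = rng.poisson(A @ x_true)
        print(np.round(mlem(A, y)[:5], 2), x_true[:5].round(2))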

  20. Generative Contexts

    Science.gov (United States)

    Lyles, Dan Allen

    Educational research has identified how science, technology, engineering, and mathematics (STEM) practice and education have underperforming metrics in racial and gender diversity, despite decades of intervention. These disparities are part of the construction of a culture of science that is alienating to these populations. Recent studies in a social science framework described as "Generative Justice" have suggested that the context of social and scientific practice might be modified to bring about more just and equitable relations among the disenfranchised by circulating the value they and their non-human allies create back to them in unalienated forms. What is not known are the underlying principles of social and material space that makes a system more or less generative. I employ an autoethnographic method at four sites: a high school science class; a farm committed to "Black and Brown liberation"; a summer program geared towards youth environmental mapping; and a summer workshop for Harlem middle school students. My findings suggest that by identifying instances where material affinity, participatory voice, and creative solidarity are mutually reinforcing, it is possible to create educational contexts that generate unalienated value, and circulate it back to the producers themselves. This cycle of generation may help explain how to create systems of justice that strengthen and grow themselves through successive iterations. The problem of lack of diversity in STEM may be addressed not merely by recruiting the best and the brightest from underrepresented populations, but by changing the context of STEM education to provide tools for its own systematic restructuring.

  1. Communicating likelihoods and probabilities in forecasts of volcanic eruptions

    Science.gov (United States)

    Doyle, Emma E. H.; McClure, John; Johnston, David M.; Paton, Douglas

    2014-02-01

    The issuing of forecasts and warnings of natural hazard events, such as volcanic eruptions, earthquake aftershock sequences and extreme weather, often involves the use of probabilistic terms, particularly when communicated by scientific advisory groups to key decision-makers, who can differ greatly in relative expertise and function in the decision making process. Recipients may also differ in their perception of the relative importance of political and economic influences on interpretation. Consequently, the interpretation of these probabilistic terms can vary greatly due to the framing of the statements, and whether verbal or numerical terms are used. We present a review from the psychology literature on how the framing of information influences communication of these probability terms. It is also unclear as to how people rate their perception of an event's likelihood throughout a time frame when a forecast time window is stated. Previous research has identified that, when presented with a 10-year time window forecast, participants viewed the likelihood of an event occurring ‘today’ as less than that in year 10. Here we show that this skew in perception also occurs for short-term time windows (under one week) that are of most relevance for emergency warnings. In addition, unlike the long-time window statements, the use of the phrasing “within the next…” instead of “in the next…” does not mitigate this skew, nor do we observe significant differences between the perceived likelihoods of scientists and non-scientists. This finding suggests that effects occurring due to the shorter time window may be ‘masking’ any differences in perception due to wording or career background observed for long-time window forecasts. These results have implications for scientific advice, warning forecasts, emergency management decision-making, and public information as any skew in perceived event likelihood towards the end of a forecast time window may result in

  2. Comparisons of likelihood and machine learning methods of individual classification

    Science.gov (United States)

    Guinand, B.; Topchy, A.; Page, K.S.; Burnham-Curtis, M. K.; Punch, W.F.; Scribner, K.T.

    2002-01-01

    Classification methods used in machine learning (e.g., artificial neural networks, decision trees, and k-nearest neighbor clustering) are rarely used with population genetic data. We compare different nonparametric machine learning techniques with parametric likelihood estimations commonly employed in population genetics for purposes of assigning individuals to their population of origin (“assignment tests”). Classifier accuracy was compared across simulated data sets representing different levels of population differentiation (low and high FST), number of loci surveyed (5 and 10), and allelic diversity (average of three or eight alleles per locus). Empirical data for the lake trout (Salvelinus namaycush) exhibiting levels of population differentiation comparable to those used in simulations were examined to further evaluate and compare classification methods. Classification error rates associated with artificial neural networks and likelihood estimators were lower for simulated data sets compared to k-nearest neighbor and decision tree classifiers over the entire range of parameters considered. Artificial neural networks only marginally outperformed the likelihood method for simulated data (0–2.8% lower error rates). The relative performance of each machine learning classifier improved relative likelihood estimators for empirical data sets, suggesting an ability to “learn” and utilize properties of empirical genotypic arrays intrinsic to each population. Likelihood-based estimation methods provide a more accessible option for reliable assignment of individuals to the population of origin due to the intricacies in development and evaluation of artificial neural networks. In recent years, characterization of highly polymorphic molecular markers such as mini- and microsatellites and development of novel methods of analysis have enabled researchers to extend investigations of ecological and evolutionary processes below the population level to the level of
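
    The likelihood side of such an assignment test is simple to state: assign an individual to the population whose allele frequencies give its multilocus genotype the highest log-likelihood, assuming independent loci and Hardy-Weinberg proportions. A sketch with invented frequencies and genotypes:

        import numpy as np

        freqs = {                                  # per population: loci x alleles
            "popA": np.array([[0.7, 0.2, 0.1], [0.5, 0.4, 0.1]]),
            "popB": np.array([[0.1, 0.3, 0.6], [0.2, 0.2, 0.6]]),
        }

        def genotype_loglike(genotype, f):
            # genotype: one (allele_i, allele_j) pair per locus, as indices.
            ll = 0.0
            for locus, (a, b) in enumerate(genotype):
                p, q = f[locus, a], f[locus, b]
                ll += np.log(p * q * (2.0 if a != b else 1.0))  # HW probability
            return ll

        ind = [(0, 1), (0, 0)]          # heterozygous at locus 1, homozygous at 2
        scores = {pop: genotype_loglike(ind, f) for pop, f in freqs.items()}
        print(max(scores, key=scores.get), scores)   # assigns to popA here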

  3. Failed refutations: further comments on parsimony and likelihood methods and their relationship to Popper's degree of corroboration.

    Science.gov (United States)

    de Queiroz, Kevin; Poe, Steven

    2003-06-01

    Kluge's (2001, Syst. Biol. 50:322-330) continued arguments that phylogenetic methods based on the statistical principle of likelihood are incompatible with the philosophy of science described by Karl Popper are based on false premises related to Kluge's misrepresentations of Popper's philosophy. Contrary to Kluge's conjectures, likelihood methods are not inherently verificationist; they do not treat every instance of a hypothesis as confirmation of that hypothesis. The historical nature of phylogeny does not preclude phylogenetic hypotheses from being evaluated using the probability of evidence. The low absolute probabilities of hypotheses are irrelevant to the correct interpretation of Popper's concept termed degree of corroboration, which is defined entirely in terms of relative probabilities. Popper did not advocate minimizing background knowledge; in any case, the background knowledge of both parsimony and likelihood methods consists of the general assumption of descent with modification and additional assumptions that are deterministic, concerning which tree is considered most highly corroborated. Although parsimony methods do not assume (in the sense of entailing) that homoplasy is rare, they do assume (in the sense of requiring to obtain a correct phylogenetic inference) certain things about patterns of homoplasy. Both parsimony and likelihood methods assume (in the sense of implying by the manner in which they operate) various things about evolutionary processes, although violation of those assumptions does not always cause the methods to yield incorrect phylogenetic inferences. Test severity is increased by sampling additional relevant characters rather than by character reanalysis, although either interpretation is compatible with the use of phylogenetic likelihood methods. Neither parsimony nor likelihood methods assess test severity (critical evidence) when used to identify a most highly corroborated tree(s) based on a single method or model and a

  4. Pippi — Painless parsing, post-processing and plotting of posterior and likelihood samples

    Science.gov (United States)

    Scott, Pat

    2012-11-01

    Interpreting samples from likelihood or posterior probability density functions is rarely as straightforward as it seems it should be. Producing publication-quality graphics of these distributions is often similarly painful. In this short note I describe pippi, a simple, publicly available package for parsing and post-processing such samples, as well as generating high-quality PDF graphics of the results. Pippi is easily and extensively configurable and customisable, both in its options for parsing and post-processing samples, and in the visual aspects of the figures it produces. I illustrate some of these using an existing supersymmetric global fit, performed in the context of a gamma-ray search for dark matter. Pippi can be downloaded and followed at http://github.com/patscott/pippi.

  5. How are important life events disclosed on facebook? Relationships with likelihood of sharing and privacy.

    Science.gov (United States)

    Bevan, Jennifer L; Cummings, Megan B; Kubiniec, Ashley; Mogannam, Megan; Price, Madison; Todd, Rachel

    2015-01-01

    This study examined an aspect of Facebook disclosure that has as yet gone unexplored: whether a user prefers to share information directly, for example, through status updates, or indirectly, via photos with no caption or relationship status changes without context or explanation. The focus was on the sharing of important positive and negative life events related to romantic relationships, health, and work/school in relation to likelihood of sharing this type of information on Facebook and general attitudes toward privacy. An online survey of 599 adult Facebook users found that when positive life events were shared, users preferred to do so indirectly, whereas negative life events were more likely to be disclosed directly. Privacy shared little association with how information was shared. Implications for understanding the finer nuances of how news is shared on Facebook are discussed.

  6. Targeted search for continuous gravitational waves: Bayesian versus maximum-likelihood statistics

    International Nuclear Information System (INIS)

    Prix, Reinhard; Krishnan, Badri

    2009-01-01

    We investigate the Bayesian framework for detection of continuous gravitational waves (GWs) in the context of targeted searches, where the phase evolution of the GW signal is assumed to be known, while the four amplitude parameters are unknown. We show that the orthodox maximum-likelihood statistic (known as F-statistic) can be rediscovered as a Bayes factor with an unphysical prior in amplitude parameter space. We introduce an alternative detection statistic ('B-statistic') using the Bayes factor with a more natural amplitude prior, namely an isotropic probability distribution for the orientation of GW sources. Monte Carlo simulations of targeted searches show that the resulting Bayesian B-statistic is more powerful in the Neyman-Pearson sense (i.e., has a higher expected detection probability at equal false-alarm probability) than the frequentist F-statistic.
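
    The maximise-versus-marginalise contrast can be reproduced in a toy Gaussian model. The real analysis involves four amplitude parameters and an isotropic orientation prior; the sketch below deliberately simplifies to one amplitude with a half-normal prior, and all numbers are illustrative:

        import numpy as np

        # Data y = A*h + white noise, template h known, amplitude A unknown.
        # F-like statistic: log-likelihood maximised over all real A.
        # B-like statistic: likelihood marginalised over a half-normal prior.
        rng = np.random.default_rng(0)
        n = 64
        h = np.sin(np.linspace(0.0, 8.0 * np.pi, n))
        h /= np.linalg.norm(h)

        def f_stat(y):
            return 0.5 * (h @ y) ** 2               # max over A of log L ratio

        a_grid = np.linspace(0.0, 10.0, 200)        # positive amplitudes only
        prior = np.exp(-0.5 * (a_grid / 2.0) ** 2)
        prior /= prior.sum()

        def b_stat(y):
            log_lr = a_grid * (h @ y) - 0.5 * a_grid ** 2   # log L(A) - log L(0)
            return np.log(prior @ np.exp(log_lr))

        noise = rng.normal(size=(4000, n))
        signal = noise + 0.8 * h                    # injected amplitude 0.8
        for stat in (f_stat, b_stat):
            thr = np.quantile([stat(y) for y in noise], 0.99)   # 1% false alarm
            power = np.mean([stat(y) > thr for y in signal])
            print(stat.__name__, round(power, 3))   # B beats F in this toy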

  7. Accuracy of maximum likelihood estimates of a two-state model in single-molecule FRET

    Energy Technology Data Exchange (ETDEWEB)

    Gopich, Irina V. [Laboratory of Chemical Physics, National Institute of Diabetes and Digestive and Kidney Diseases, National Institutes of Health, Bethesda, Maryland 20892 (United States)

    2015-01-21

    Photon sequences from single-molecule Förster resonance energy transfer (FRET) experiments can be analyzed using a maximum likelihood method. Parameters of the underlying kinetic model (FRET efficiencies of the states and transition rates between conformational states) are obtained by maximizing the appropriate likelihood function. In addition, the errors (uncertainties) of the extracted parameters can be obtained from the curvature of the likelihood function at the maximum. We study the standard deviations of the parameters of a two-state model obtained from photon sequences with recorded colors and arrival times. The standard deviations can be obtained analytically in a special case when the FRET efficiencies of the states are 0 and 1 and in the limiting cases of fast and slow conformational dynamics. These results are compared with the results of numerical simulations. The accuracy and, therefore, the ability to predict model parameters depend on how fast the transition rates are compared to the photon count rate. In the limit of slow transitions, the key parameters that determine the accuracy are the number of transitions between the states and the number of independent photon sequences. In the fast transition limit, the accuracy is determined by the small fraction of photons that are correlated with their neighbors. The relative standard deviation of the relaxation rate has a “chevron” shape as a function of the transition rate in the log-log scale. The location of the minimum of this function dramatically depends on how well the FRET efficiencies of the states are separated.
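
    A sketch of such a likelihood evaluation for the photon colors, conditioned on the recorded arrival times; this is a simplification of the full likelihood, and the rates, efficiencies, and placeholder photon stream are invented:

        import numpy as np
        from scipy.linalg import expm

        # Forward algorithm over a photon sequence for a two-state model: the
        # states interconvert with rates k12, k21; in state s an acceptor photon
        # (color 1) occurs with probability E_s, a donor photon (0) otherwise.
        def loglike(colors, dts, k12, k21, e1, e2):
            K = np.array([[-k12, k21], [k12, -k21]])     # rate matrix
            p = np.array([k21, k12]) / (k12 + k21)       # equilibrium start
            eff = np.array([e1, e2])
            ll = 0.0
            for c, dt in zip(colors, dts):
                p = expm(K * dt) @ p                     # evolve between photons
                p = (eff if c == 1 else 1.0 - eff) * p   # color emission prob
                s = p.sum()
                ll += np.log(s)
                p /= s                                   # normalise for stability
            return ll

        rng = np.random.default_rng(0)
        dts = rng.exponential(0.1, size=500)             # inter-photon times
        colors = rng.integers(0, 2, size=500)            # placeholder colors
        print(loglike(colors, dts, k12=5.0, k21=3.0, e1=0.9, e2=0.2))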

  8. Applying exclusion likelihoods from LHC searches to extended Higgs sectors

    International Nuclear Information System (INIS)

    Bechtle, Philip; Heinemeyer, Sven; Staal, Oscar; Stefaniak, Tim; Weiglein, Georg

    2015-01-01

    LHC searches for non-standard Higgs bosons decaying into tau lepton pairs constitute a sensitive experimental probe for physics beyond the Standard Model (BSM), such as supersymmetry (SUSY). Recently, the limits obtained from these searches have been presented by the CMS collaboration in a nearly model-independent fashion - as a narrow resonance model - based on the full 8 TeV dataset. In addition to publishing a 95 % C.L. exclusion limit, the full likelihood information for the narrow resonance model has been released. This provides valuable information that can be incorporated into global BSM fits. We present a simple algorithm that maps an arbitrary model with multiple neutral Higgs bosons onto the narrow resonance model and derives the corresponding value for the exclusion likelihood from the CMS search. This procedure has been implemented into the public computer code HiggsBounds (version 4.2.0 and higher). We validate our implementation by cross-checking against the official CMS exclusion contours in three Higgs benchmark scenarios in the Minimal Supersymmetric Standard Model (MSSM), and find very good agreement. Going beyond validation, we discuss the combined constraints of the ττ search and the rate measurements of the SM-like Higgs at 125 GeV in a recently proposed MSSM benchmark scenario, where the lightest Higgs boson obtains SM-like couplings independently of the decoupling of the heavier Higgs states. Technical details for how to access the likelihood information within HiggsBounds are given in the appendix. The program is available at http://higgsbounds.hepforge.org. (orig.)

  9. Australian food life style segments and elaboration likelihood differences

    DEFF Research Database (Denmark)

    Brunsø, Karen; Reid, Mike

    As the global food marketing environment becomes more competitive, the international and comparative perspective of consumers' attitudes and behaviours becomes more important for both practitioners and academics. This research employs the Food-Related Life Style (FRL) instrument in Australia...... in order to 1) determine Australian Life Style Segments and compare these with their European counterparts, and to 2) explore differences in elaboration likelihood among the Australian segments, e.g. consumers' interest and motivation to perceive product related communication. The results provide new...

  10. Maximum-likelihood method for numerical inversion of Mellin transform

    International Nuclear Information System (INIS)

    Iqbal, M.

    1997-01-01

    A method is described for inverting the Mellin transform which uses an expansion in Laguerre polynomials and converts the Mellin transform to a Laplace transform; the maximum-likelihood regularization method is then used to recover the original function of the Mellin transform. The performance of the method is illustrated by the inversion of test functions available in the literature (J. Inst. Math. Appl., 20 (1977) 73; Math. Comput., 53 (1989) 589). The effectiveness of the method is shown by results presented in tables and diagrams.

  11. How to Improve the Likelihood of CDM Approval?

    DEFF Research Database (Denmark)

    Brandt, Urs Steiner; Svendsen, Gert Tinggaard

    2014-01-01

    How can the likelihood of Clean Development Mechanism (CDM) approval be improved in the face of institutional shortcomings? To answer this question, we focus on the three institutional shortcomings of income sharing, risk sharing and corruption prevention concerning afforestation/reforestation (A....../R). Furthermore, three main stakeholders are identified, namely investors, governments and agents in a principal-agent model regarding monitoring and enforcement capacity. Despite huge potential, developing regions such as West Africa have not yet been integrated into A/R CDM projects. Remote sensing, however...

  12. Maximum Likelihood and Bayes Estimation in Randomly Censored Geometric Distribution

    Directory of Open Access Journals (Sweden)

    Hare Krishna

    2017-01-01

    Full Text Available In this article, we study the geometric distribution under randomly censored data. Maximum likelihood estimators and confidence intervals based on Fisher information matrix are derived for the unknown parameters with randomly censored data. Bayes estimators are also developed using beta priors under generalized entropy and LINEX loss functions. Also, Bayesian credible and highest posterior density (HPD credible intervals are obtained for the parameters. Expected time on test and reliability characteristics are also analyzed in this article. To compare various estimates developed in the article, a Monte Carlo simulation study is carried out. Finally, for illustration purpose, a randomly censored real data set is discussed.
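
    For the maximum likelihood part, the randomly censored geometric log-likelihood even has a closed-form maximiser. A sketch with an invented censoring law and parameter values:

        import numpy as np

        rng = np.random.default_rng(0)

        # Lifetime X ~ Geometric(p) on {1, 2, ...}, censored by an independent
        # time T; we observe min(X, T) and delta = 1{X <= T}. With d uncensored
        # observations, l(p) = d*log(p) + A*log(1-p), where
        # A = sum over uncensored (x - 1) + sum over censored t,
        # which is maximised at p_hat = d / (d + A).
        p_true = 0.25
        x = rng.geometric(p_true, size=1000)
        t = rng.geometric(0.1, size=1000)             # random censoring times
        obs = np.minimum(x, t)
        delta = (x <= t).astype(int)

        d = delta.sum()
        a = np.sum((obs - 1) * delta) + np.sum(obs * (1 - delta))
        p_hat = d / (d + a)
        print(p_hat)                                  # close to 0.25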

  13. Elemental composition of cosmic rays using a maximum likelihood method

    International Nuclear Information System (INIS)

    Ruddick, K.

    1996-01-01

    We present a progress report on our attempts to determine the composition of cosmic rays in the knee region of the energy spectrum. We have used three different devices to measure properties of the extensive air showers produced by primary cosmic rays: the Soudan 2 underground detector measures the muon flux deep underground, a proportional tube array samples shower density at the surface of the earth, and a Cherenkov array observes light produced high in the atmosphere. We have begun maximum likelihood fits to these measurements with the hope of determining the nuclear mass number A on an event by event basis. (orig.)

  14. Process criticality accident likelihoods, consequences and emergency planning

    International Nuclear Information System (INIS)

    McLaughlin, T.P.

    1992-01-01

    Evaluation of criticality accident risks in the processing of significant quantities of fissile materials is both complex and subjective, largely due to the lack of accident statistics. Thus, complying with national and international standards and regulations which require an evaluation of the net benefit of a criticality accident alarm system, is also subjective. A review of guidance found in the literature on potential accident magnitudes is presented for different material forms and arrangements. Reasoned arguments are also presented concerning accident prevention and accident likelihoods for these material forms and arrangements. (Author)

  15. Likelihood Estimation of Gamma Ray Bursts Duration Distribution

    OpenAIRE

    Horvath, Istvan

    2005-01-01

    Two classes of Gamma Ray Bursts have been identified so far, characterized by T90 durations shorter and longer than approximately 2 seconds. It was shown that the BATSE 3B data allow a good fit with three Gaussian distributions in log T90. In the same volume of ApJ, another paper suggested that a third class of GRBs may exist. Using the full BATSE catalog, here we present the maximum likelihood estimation, which gives a 0.5% probability of there being only two subclasses. The MC simulation co...
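
    The comparison can be sketched with off-the-shelf tools; synthetic two-population durations stand in for the BATSE catalog here:

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(0)

        # Fit two- and three-component Gaussian mixtures to log10(T90) and
        # compare via the likelihood ratio, echoing the test for a third class.
        log_t90 = np.concatenate([
            rng.normal(-0.3, 0.50, size=500),    # "short" bursts, T90 ~ 0.5 s
            rng.normal(1.5, 0.45, size=1500),    # "long" bursts, T90 ~ 30 s
        ]).reshape(-1, 1)

        fits = {k: GaussianMixture(n_components=k, random_state=0).fit(log_t90)
                for k in (2, 3)}
        lr = 2.0 * (fits[3].score(log_t90) - fits[2].score(log_t90)) * len(log_t90)
        print(round(lr, 2))   # small values favour keeping only two components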

  16. Process criticality accident likelihoods, consequences, and emergency planning

    Energy Technology Data Exchange (ETDEWEB)

    McLaughlin, T.P.

    1991-01-01

    Evaluation of criticality accident risks in the processing of significant quantities of fissile materials is both complex and subjective, largely due to the lack of accident statistics. Thus, complying with standards such as ISO 7753 which mandates that the need for an alarm system be evaluated, is also subjective. A review of guidance found in the literature on potential accident magnitudes is presented for different material forms and arrangements. Reasoned arguments are also presented concerning accident prevention and accident likelihoods for these material forms and arrangements. 13 refs., 1 fig., 1 tab.

  17. Improved Likelihood Function in Particle-based IR Eye Tracking

    DEFF Research Database (Denmark)

    Satria, R.; Sorensen, J.; Hammoud, R.

    2005-01-01

    In this paper we propose a log likelihood-ratio function of foreground and background models used in a particle filter to track the eye region in dark-bright pupil image sequences. This model fuses information from both dark and bright pupil images and their difference image into one model. Our...... enhanced tracker overcomes the issues of prior selection of static thresholds during the detection of feature observations in the bright-dark difference images. The auto-initialization process is performed using a cascaded classifier trained using AdaBoost and adapted to IR eye images. Experiments show good...
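
    A generic sketch of the device described here, a particle filter whose weights come from a foreground/background log likelihood-ratio; the 1-D "image", both appearance models, and all numbers are toy stand-ins for the dark-bright pupil setup:

        import numpy as np

        rng = np.random.default_rng(0)

        def llr(patch):
            fg = -0.5 * np.sum((patch - 1.0) ** 2)   # foreground: bright blob
            bg = -0.5 * np.sum(patch ** 2)           # background: dark
            return fg - bg

        n_px, n_particles, true_pos = 200, 300, 60.0
        particles = rng.uniform(0.0, n_px, n_particles)
        for frame in range(30):
            true_pos += 1.0                          # target drifts right
            pixels = np.exp(-0.5 * ((np.arange(n_px) - true_pos) / 3.0) ** 2)
            particles += rng.normal(0.0, 2.0, n_particles)     # motion model
            idx = np.clip(particles.astype(int), 1, n_px - 2)
            w = np.exp([llr(pixels[i - 1:i + 2]) for i in idx])  # LLR weights
            w /= w.sum()
            estimate = np.sum(w * particles)
            particles = rng.choice(particles, size=n_particles, p=w)  # resample
        print(round(estimate, 1), true_pos)          # estimate tracks the blob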

  18. Estimating likelihood of future crashes for crash-prone drivers

    OpenAIRE

    Subasish Das; Xiaoduan Sun; Fan Wang; Charles Leboeuf

    2015-01-01

    At-fault crash-prone drivers are usually considered as the high risk group for possible future incidents or crashes. In Louisiana, 34% of crashes are repeatedly committed by the at-fault crash-prone drivers who represent only 5% of the total licensed drivers in the state. This research has conducted an exploratory data analysis based on the driver faultiness and proneness. The objective of this study is to develop a crash prediction model to estimate the likelihood of future crashes for the a...

  19. Similar tests and the standardized log likelihood ratio statistic

    DEFF Research Database (Denmark)

    Jensen, Jens Ledet

    1986-01-01

    When testing an affine hypothesis in an exponential family the 'ideal' procedure is to calculate the exact similar test, or an approximation to this, based on the conditional distribution given the minimal sufficient statistic under the null hypothesis. By contrast to this there is a 'primitive......' approach in which the marginal distribution of a test statistic considered and any nuisance parameter appearing in the test statistic is replaced by an estimate. We show here that when using standardized likelihood ratio statistics the 'primitive' procedure is in fact an 'ideal' procedure to order O(n -3...

  1. Likelihood Approximation With Parallel Hierarchical Matrices For Large Spatial Datasets

    KAUST Repository

    Litvinenko, Alexander; Sun, Ying; Genton, Marc G.; Keyes, David E.

    2017-01-01

    The main goal of this article is to introduce the parallel hierarchical matrix library HLIBpro to the statistical community. We describe the HLIBCov package, which is an extension of the HLIBpro library for approximating large covariance matrices and maximizing likelihood functions. We show that an approximate Cholesky factorization of a dense matrix of size $2M\times 2M$ can be computed on a modern multi-core desktop in a few minutes. Further, HLIBCov is used for estimating unknown parameters such as the covariance length, variance and smoothness parameter of a Matérn covariance function by maximizing the joint Gaussian log-likelihood function. The computational bottleneck is the expensive linear algebra arising from large, dense covariance matrices. Covariance matrices are therefore approximated in the hierarchical ($\mathcal{H}$-) matrix format with computational cost $\mathcal{O}(k^2 n \log^2 n / p)$ and storage $\mathcal{O}(k n \log n)$, where the rank $k$ is a small integer (typically $k<25$), $p$ is the number of cores and $n$ is the number of locations on a fairly general mesh. We demonstrate a synthetic example in which the true parameter values are known. For reproducibility we provide the C++ code, the documentation, and the synthetic data.
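
    The objective being maximised can be written out densely for a small problem; the H-matrix machinery replaces the O(n^3) Cholesky below, and the grid size and parameter values are illustrative:

        import numpy as np
        from scipy.spatial.distance import cdist
        from scipy.special import gamma, kv
        from scipy.linalg import cho_factor, cho_solve

        # Joint Gaussian log-likelihood under a Matern covariance with length
        # scale ell, variance s2 and smoothness nu.
        def matern(d, ell, s2, nu):
            r = np.sqrt(2.0 * nu) * np.maximum(d, 1e-12) / ell
            c = s2 * 2.0 ** (1.0 - nu) / gamma(nu) * r ** nu * kv(nu, r)
            c[d < 1e-12] = s2                        # exact value at distance 0
            return c

        def loglike(z, locs, ell, s2, nu):
            C = matern(cdist(locs, locs), ell, s2, nu)
            C[np.diag_indices_from(C)] += 1e-8       # tiny jitter for stability
            f = cho_factor(C, lower=True)
            logdet = 2.0 * np.sum(np.log(np.diag(f[0])))
            return -0.5 * (z @ cho_solve(f, z) + logdet
                           + len(z) * np.log(2.0 * np.pi))

        rng = np.random.default_rng(0)
        locs = rng.uniform(0.0, 1.0, size=(300, 2))
        C = matern(cdist(locs, locs), ell=0.2, s2=1.0, nu=1.5)
        C[np.diag_indices_from(C)] += 1e-8
        z = np.linalg.cholesky(C) @ rng.normal(size=300)
        print(loglike(z, locs, ell=0.2, s2=1.0, nu=1.5))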

  2. Likelihood inference for a fractionally cointegrated vector autoregressive model

    DEFF Research Database (Denmark)

    Johansen, Søren; Ørregård Nielsen, Morten

    2012-01-01

    We consider model based inference in a fractionally cointegrated (or cofractional) vector autoregressive model with a restricted constant term, based on the Gaussian likelihood conditional on initial values. The model nests the I(d) VAR model. We give conditions on the parameters such that the process X_{t} is fractional of order d and cofractional of order d-b; that is, there exist vectors β for which β'X_{t} is fractional of order d-b, and no other fractionality order is possible. We define the statistical model by 0<b≤d, but conduct inference when the true values satisfy b0≠1/2 and d0-b0...... process in the parameters when errors are i.i.d. with suitable moment conditions and initial values are bounded. When the limit is deterministic this implies uniform convergence in probability of the conditional likelihood function. If the true value b0>1/2, we prove that the limit distribution of (β...

  3. Likelihood-Based Inference of B Cell Clonal Families.

    Directory of Open Access Journals (Sweden)

    Duncan K Ralph

    2016-10-01

    Full Text Available The human immune system depends on a highly diverse collection of antibody-making B cells. B cell receptor sequence diversity is generated by a random recombination process called "rearrangement" forming progenitor B cells, then a Darwinian process of lineage diversification and selection called "affinity maturation." The resulting receptors can be sequenced in high throughput for research and diagnostics. Such a collection of sequences contains a mixture of various lineages, each of which may be quite numerous, or may consist of only a single member. As a step to understanding the process and result of this diversification, one may wish to reconstruct lineage membership, i.e. to cluster sampled sequences according to which came from the same rearrangement events. We call this clustering problem "clonal family inference." In this paper we describe and validate a likelihood-based framework for clonal family inference based on a multi-hidden Markov Model (multi-HMM) framework for B cell receptor sequences. We describe an agglomerative algorithm to find a maximum likelihood clustering, two approximate algorithms with various trade-offs of speed versus accuracy, and a third, fast algorithm for finding specific lineages. We show that under simulation these algorithms greatly improve upon existing clonal family inference methods, and that they also give significantly different clusters than previous methods when applied to two real data sets.

  4. Risk factors and likelihood of Campylobacter colonization in broiler flocks

    Directory of Open Access Journals (Sweden)

    SL Kuana

    2007-09-01

    Full Text Available Campylobacter was investigated in cecal droppings, feces, and cloacal swabs of 22 flocks of 3- to 5-week-old broilers. Risk factors and the likelihood of the presence of this agent in these flocks were determined. Management practices, such as cleaning and disinfection, feeding, drinkers, and litter treatments, were assessed. Results were evaluated using the Odds Ratio (OR) test, and their significance was tested by Fisher's test (p<0.05). A Campylobacter prevalence of 81.8% was found in the broiler flocks (18/22), and within positive flocks, it varied between 85 and 100%. Campylobacter incidence among sample types was homogeneous, being 81.8% in cecal droppings, 80.9% in feces, and 80.4% in cloacal swabs (230). Flocks fed by automatic feeding systems presented higher incidence of Campylobacter as compared to those fed by tube feeders. Litter was reused in 63.6% of the farms, and, despite the lack of statistical significance, there was a higher likelihood of Campylobacter incidence when litter was reused. Foot baths were not used in 45.5% of the flocks, and the use of foot baths together with deficient lime management increased the number of positive flocks, although with no statistical significance. The evaluated parameters were not significantly associated with Campylobacter colonization in the assessed broiler flocks.
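
    The two statistics used throughout the study, the odds ratio and Fisher's exact test, are straightforward to reproduce with SciPy. The 2x2 counts below are made up for illustration; the paper's own tables are not reproduced here:

```python
from scipy.stats import fisher_exact

# Hypothetical 2x2 table: rows = feeding system (automatic vs tube feeders),
# columns = flock status (Campylobacter positive vs negative).
table = [[10, 2],
         [ 8, 2]]

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"OR = {odds_ratio:.2f}, Fisher exact p = {p_value:.3f}")
# The study's significance criterion was p < 0.05.
```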

  5. Menyoal Elaboration Likelihood Model (ELM dan Teori Retorika

    Directory of Open Access Journals (Sweden)

    Yudi Perbawaningsih

    2012-06-01

    Full Text Available Abstract: Persuasion is a communication process to establish or change attitudes, which can be understood through the theory of Rhetoric and the theory of the Elaboration Likelihood Model (ELM). This study elaborates these theories in a public lecture series intended to persuade students in choosing their concentration of study, based on how they process information. Using a survey method, the study finds that, in terms of persuasion effectiveness, it is not quite relevant to separate the message from its source: the two are intertwined, in that the quality of the source is determined by the quality of the message, and vice versa. Separating the persuasion process into the two routes described by ELM theory is therefore not relevant.

  6. Gauging the likelihood of stable cavitation from ultrasound contrast agents.

    Science.gov (United States)

    Bader, Kenneth B; Holland, Christy K

    2013-01-07

    The mechanical index (MI) was formulated to gauge the likelihood of adverse bioeffects from inertial cavitation. However, the MI formulation did not consider bubble activity from stable cavitation. This type of bubble activity can be readily nucleated from ultrasound contrast agents (UCAs) and has the potential to promote beneficial bioeffects. Here, the presence of stable cavitation is determined numerically by tracking the onset of subharmonic oscillations within a population of bubbles for frequencies up to 7 MHz and peak rarefactional pressures up to 3 MPa. In addition, the acoustic pressure rupture threshold of a UCA population was determined using the Marmottant model. The threshold for subharmonic emissions of optimally sized bubbles was found to be lower than the inertial cavitation threshold for all frequencies studied. The rupture thresholds of optimally sized UCAs were found to be lower than the threshold for subharmonic emissions for either single-cycle or steady-state acoustic excitations. Because the thresholds of both subharmonic emissions and UCA rupture are linearly dependent on frequency, an index of the form I_CAV = P_r/f (where P_r is the peak rarefactional pressure in MPa and f is the frequency in MHz) was derived to gauge the likelihood of subharmonic emissions due to stable cavitation activity nucleated from UCAs.
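
    Since the proposed index is a simple ratio, a worked example is immediate (the input values below are hypothetical, not from the paper):

```python
def cavitation_index(p_r_mpa: float, f_mhz: float) -> float:
    """I_CAV = P_r / f, with P_r in MPa and f in MHz, as defined in the abstract."""
    return p_r_mpa / f_mhz

# A 1.5 MPa peak rarefactional pressure at 3 MHz gives the same index
# as 0.5 MPa at 1 MHz, reflecting the linear frequency dependence.
print(cavitation_index(1.5, 3.0), cavitation_index(0.5, 1.0))  # 0.5 0.5
```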

  7. Safe semi-supervised learning based on weighted likelihood.

    Science.gov (United States)

    Kawakita, Masanori; Takeuchi, Jun'ichi

    2014-05-01

    We are interested in developing a safe semi-supervised learning method that works in any situation. Semi-supervised learning postulates that n′ unlabeled data are available in addition to n labeled data. However, almost all of the previous semi-supervised methods require additional assumptions (not only unlabeled data) to make improvements on supervised learning. If such assumptions are not met, then the methods possibly perform worse than supervised learning. Sokolovska, Cappé, and Yvon (2008) proposed a semi-supervised method based on a weighted likelihood approach. They proved that this method asymptotically never performs worse than supervised learning (i.e., it is safe) without any assumption. Their method is attractive because it is easy to implement and is potentially general. Moreover, it is deeply related to a certain statistical paradox. However, the method of Sokolovska et al. (2008) assumes a very limited situation, i.e., classification, discrete covariates, n′→∞, and a maximum likelihood estimator. In this paper, we extend their method by modifying the weight. We prove that our proposal is safe in a significantly wider range of situations as long as n≤n′. Further, we give a geometrical interpretation of the proof of safety through the relationship with the above-mentioned statistical paradox. Finally, we show that the above proposal is asymptotically safe even when n′ …

  8. Orthogonal series generalized likelihood ratio test for failure detection and isolation. [for aircraft control

    Science.gov (United States)

    Hall, Steven R.; Walker, Bruce K.

    1990-01-01

    A new failure detection and isolation algorithm for linear dynamic systems is presented. This algorithm, the Orthogonal Series Generalized Likelihood Ratio (OSGLR) test, is based on the assumption that the failure modes of interest can be represented by truncated series expansions. This assumption leads to a failure detection algorithm with several desirable properties. Computer simulation results are presented for the detection of the failures of actuators and sensors of a C-130 aircraft. The results show that the OSGLR test generally performs as well as the GLR test in terms of time to detect a failure and is more robust to failure mode uncertainty. However, the OSGLR test is also somewhat more sensitive to modeling errors than the GLR test.

  9. Improved anomaly detection using multi-scale PLS and generalized likelihood ratio test

    KAUST Repository

    Madakyaru, Muddu

    2017-02-16

    Process monitoring has a central role in the process industry in enhancing productivity, efficiency, and safety, and in avoiding expensive maintenance. In this paper, a statistical approach that exploits the advantages of multiscale PLS models (MSPLS) and those of a generalized likelihood ratio (GLR) test to better detect anomalies is proposed. Specifically, to account for the multivariate and multi-scale nature of process dynamics, an MSPLS algorithm combining PLS and wavelet analysis is used as the modeling framework. Then, GLR hypothesis testing is applied to the uncorrelated residuals obtained from the MSPLS model to further improve the anomaly detection abilities of these latent-variable-based fault detection methods. Data from a simulated distillation column are used to evaluate the proposed MSPLS-GLR algorithm.
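
    As a rough illustration of the GLR stage alone (the MSPLS modeling step is omitted), the textbook GLR statistic for a mean shift in Gaussian residuals with known variance reduces to N·r̄²/(2σ²) after maximizing the likelihood over the unknown shift. A sketch under those simplifying assumptions:

```python
import numpy as np

def glr_mean_shift(residuals, sigma):
    """Log generalized likelihood ratio for H1: mean = mu (unknown) vs
    H0: mean = 0, i.i.d. Gaussian residuals with known std sigma.
    Maximizing over mu gives mu_hat = mean(residuals)."""
    r = np.asarray(residuals, dtype=float)
    return len(r) * r.mean() ** 2 / (2.0 * sigma ** 2)

rng = np.random.default_rng(1)
clean = rng.normal(0.0, 1.0, 200)    # fault-free model residuals
faulty = rng.normal(0.5, 1.0, 200)   # residuals with a small bias (anomaly)
threshold = 5.0                      # would be set from a target false-alarm rate
for name, r in [("clean", clean), ("faulty", faulty)]:
    stat = glr_mean_shift(r, sigma=1.0)
    print(name, round(stat, 2), "alarm" if stat > threshold else "ok")
```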

  10. BOREAS TE-18 Landsat TM Maximum Likelihood Classification Image of the NSA

    Science.gov (United States)

    Hall, Forrest G. (Editor); Knapp, David

    2000-01-01

    The BOREAS TE-18 team focused its efforts on using remotely sensed data to characterize the successional and disturbance dynamics of the boreal forest for use in carbon modeling. The objective of this classification is to provide the BOREAS investigators with a data product that characterizes the land cover of the NSA. A Landsat-5 TM image from 20-Aug-1988 was used to derive this classification. A standard supervised maximum likelihood classification approach was used to produce this classification. The data are provided in a binary image format file. The data files are available on a CD-ROM (see document number 20010000884), or from the Oak Ridge National Laboratory (ORNL) Distributed Activity Archive Center (DAAC).

  12. Maximum likelihood positioning algorithm for high-resolution PET scanners

    International Nuclear Information System (INIS)

    Gross-Weege, Nicolas; Schug, David; Hallen, Patrick; Schulz, Volkmar

    2016-01-01

    Purpose: In high-resolution positron emission tomography (PET), light-sharing elements are incorporated into typical detector stacks to read out scintillator arrays in which one scintillator element (crystal) is smaller than the size of the readout channel. In order to identify the hit crystal from the measured light distribution, a positioning algorithm is required. One commonly applied positioning algorithm uses the center of gravity (COG) of the measured light distribution. The COG algorithm is limited in spatial resolution by noise and intercrystal Compton scatter. The purpose of this work is to develop a positioning algorithm which overcomes this limitation. Methods: The authors present a maximum likelihood (ML) algorithm which compares a set of expected light distributions, given by probability density functions (PDFs), with the measured light distribution. Instead of modeling the PDFs analytically, the PDFs of the proposed ML algorithm are generated from measured data assuming a single-gamma-interaction model. The algorithm was evaluated with a hot-rod phantom measurement acquired with the preclinical HYPERION II D PET scanner. In order to assess the performance with respect to sensitivity, energy resolution, and image quality, the ML algorithm was compared to a COG algorithm which calculates the COG from a restricted set of channels. The authors studied the energy resolution of the ML and the COG algorithms for incomplete light distributions (missing channel information caused by detector dead time). Furthermore, the authors investigated the effects of a filter based on the likelihood values on sensitivity, energy resolution, and image quality. Results: A sensitivity gain of up to 19% was demonstrated in comparison to the COG algorithm for the selected operation parameters. Energy resolution and image quality were on a similar level for both algorithms. Additionally, the authors demonstrated that the performance of the ML …
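
    The core of such an ML positioning step is a comparison of the measured light vector against each crystal's stored PDF, keeping the crystal with the highest log-likelihood. The sketch below uses independent per-channel Gaussian PDFs as a stand-in for the measured single-gamma-interaction PDFs; all shapes and names are hypothetical:

```python
import numpy as np

def ml_position(light, pdf_mean, pdf_std):
    """Pick the crystal whose stored light-distribution PDF best explains the
    measured light vector. pdf_mean/pdf_std: (n_crystals, n_channels)."""
    # Per-channel Gaussian log-likelihood (constants dropped), summed over channels.
    ll = -0.5 * (((light - pdf_mean) / pdf_std) ** 2
                 + 2.0 * np.log(pdf_std)).sum(axis=1)
    return int(np.argmax(ll)), ll

rng = np.random.default_rng(2)
n_crystals, n_channels = 64, 16
pdf_mean = rng.random((n_crystals, n_channels))
pdf_std = 0.05 + 0.1 * rng.random((n_crystals, n_channels))
truth = 17
light = pdf_mean[truth] + pdf_std[truth] * rng.standard_normal(n_channels)
crystal, ll = ml_position(light, pdf_mean, pdf_std)
print("identified crystal:", crystal, "(true:", truth, ")")
# A likelihood filter as in the paper would additionally reject events whose
# best log-likelihood falls below a threshold (e.g. incomplete distributions).
```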

  13. The asymptotic behaviour of the maximum likelihood function of Kriging approximations using the Gaussian correlation function

    CSIR Research Space (South Africa)

    Kok, S

    2012-07-01

    Full Text Available … continuously as the correlation function hyper-parameters approach zero. Since the global minimizer of the maximum likelihood function is an asymptote in this case, it is unclear if maximum likelihood estimation (MLE) remains valid. Numerical ill-conditioning …

  14. A Non-standard Empirical Likelihood for Time Series

    DEFF Research Database (Denmark)

    Nordman, Daniel J.; Bunzel, Helle; Lahiri, Soumendra N.

    Standard blockwise empirical likelihood (BEL) for stationary, weakly dependent time series requires specifying a fixed block length as a tuning parameter for setting confidence regions. This aspect can be difficult and impacts coverage accuracy. As an alternative, this paper proposes a new version of BEL based on a simple, though non-standard, data-blocking rule which uses a data block of every possible length. Consequently, the method involves no block selection and is also anticipated to exhibit better coverage performance. Its non-standard blocking scheme, however, induces non-standard asymptotics and requires a significantly different development compared to standard BEL. We establish the large-sample distribution of log-ratio statistics from the new BEL method for calibrating confidence regions for mean or smooth function parameters of time series. This limit law is not the usual chi-square …

  15. Neutron spectra unfolding with maximum entropy and maximum likelihood

    International Nuclear Information System (INIS)

    Itoh, Shikoh; Tsunoda, Toshiharu

    1989-01-01

    A new unfolding theory has been established on the basis of the maximum entropy principle and the maximum likelihood method. This theory correctly embodies the Poisson statistics of neutron detection, and always yields a positive solution over the whole energy range. Moreover, the theory unifies the overdetermined and underdetermined problems. For the latter, the ambiguity in assigning a prior probability, i.e. the initial guess in the Bayesian sense, is eliminated by virtue of the principle. An approximate expression of the covariance matrix for the resultant spectra is also presented. An efficient algorithm to solve the nonlinear system which appears in the present study has been established. Results of computer simulation showed the effectiveness of the present theory. (author)
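
    The positivity property noted here is characteristic of multiplicative maximum-likelihood updates for Poisson counts. As a hedged illustration (the classic EM/MLEM fixed-point iteration, not necessarily the authors' exact algorithm), unfolding measurements y ≈ Rφ with response matrix R keeps every iterate positive by construction:

```python
import numpy as np

def mlem_unfold(R, y, n_iter=200):
    """Multiplicative ML update for Poisson-distributed counts y = R @ phi.
    Starting from a positive guess, every iterate stays positive."""
    phi = np.ones(R.shape[1])
    sens = R.sum(axis=0)                    # detector sensitivity per energy bin
    for _ in range(n_iter):
        pred = R @ phi
        phi *= (R.T @ (y / np.maximum(pred, 1e-12))) / np.maximum(sens, 1e-12)
    return phi

# Tiny synthetic, underdetermined example: 3 detectors, 4 energy bins.
R = np.array([[0.6, 0.3, 0.1, 0.0],
              [0.2, 0.4, 0.3, 0.1],
              [0.0, 0.2, 0.4, 0.4]])
true_phi = np.array([5.0, 2.0, 1.0, 3.0])
y = R @ true_phi                            # noise-free measurements for the demo
print(mlem_unfold(R, y).round(2))           # a positive ML-consistent spectrum
```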

  16. Narrow band interference cancelation in OFDM: A structured maximum likelihood approach

    KAUST Repository

    Sohail, Muhammad Sadiq

    2012-06-01

    This paper presents a maximum likelihood (ML) approach to mitigate the effect of narrow band interference (NBI) in a zero padded orthogonal frequency division multiplexing (ZP-OFDM) system. The NBI is assumed to be time variant and asynchronous with the frequency grid of the ZP-OFDM system. The proposed structure based technique uses the fact that the NBI signal is sparse as compared to the ZP-OFDM signal in the frequency domain. The structure is also useful in reducing the computational complexity of the proposed method. The paper also presents a data aided approach for improved NBI estimation. The suitability of the proposed method is demonstrated through simulations. © 2012 IEEE.

  17. Calibration of two complex ecosystem models with different likelihood functions

    Science.gov (United States)

    Hidy, Dóra; Haszpra, László; Pintér, Krisztina; Nagy, Zoltán; Barcza, Zoltán

    2014-05-01

    The biosphere is a sensitive carbon reservoir. Terrestrial ecosystems were approximately carbon neutral during the past centuries, but they became net carbon sinks due to climate-change-induced environmental change and the associated CO2 fertilization effect of the atmosphere. Model studies and measurements indicate that the biospheric carbon sink can saturate in the future due to ongoing climate change, which can act as a positive feedback. Robustness of carbon cycle models is a key issue when trying to choose the appropriate model for decision support. The input parameters of process-based models are decisive for the model output. At the same time, there are several input parameters for which accurate values are hard to obtain directly from experiments, or for which no local measurements are available. Due to the uncertainty associated with the unknown model parameters, significant bias can be experienced if the model is used to simulate the carbon and nitrogen cycle components of different ecosystems. In order to improve model performance, the unknown model parameters have to be estimated. We developed a multi-objective, two-step calibration method based on a Bayesian approach in order to estimate the unknown parameters of the PaSim and Biome-BGC models. Biome-BGC and PaSim are widely used biogeochemical models that simulate the storage and flux of water, carbon, and nitrogen between the ecosystem and the atmosphere, and within the components of terrestrial ecosystems (in this research the developed version of Biome-BGC, referred to as BBGC MuSo, is used). Both models were calibrated regardless of the simulated processes and the type of model parameters. The calibration procedure is based on the comparison of measured data with simulated results via a likelihood function (degree of goodness-of-fit between simulated and measured data). In our research, different likelihood function formulations were used in order to examine the effect of the different model …

  18. Preliminary attempt on maximum likelihood tomosynthesis reconstruction of DEI data

    International Nuclear Information System (INIS)

    Wang Zhentian; Huang Zhifeng; Zhang Li; Kang Kejun; Chen Zhiqiang; Zhu Peiping

    2009-01-01

    Tomosynthesis is a three-dimensional reconstruction method that can remove the effect of superimposition with limited-angle projections. It is especially promising in mammography, where radiation dose is a concern. In this paper, we propose a maximum likelihood tomosynthesis reconstruction algorithm (ML-TS) for the apparent absorption data of diffraction enhanced imaging (DEI). The motivation of this contribution is to develop a tomosynthesis algorithm for low-dose or noisy circumstances and bring DEI closer to clinical application. The theoretical statistical models of DEI data are analyzed and the proposed algorithm is validated with experimental data from the Beijing Synchrotron Radiation Facility (BSRF). The results of ML-TS have better contrast compared with the well-known 'shift-and-add' algorithm and the FBP algorithm. (authors)

  19. H.264 SVC Complexity Reduction Based on Likelihood Mode Decision

    Directory of Open Access Journals (Sweden)

    L. Balaji

    2015-01-01

    Full Text Available H.264 Advanced Video Coding (AVC) was extended to Scalable Video Coding (SVC). SVC runs on different electronic gadgets such as personal computers, HDTV, SDTV, IPTV, and full-HDTV, on which users demand various scalings of the same content. The scaling dimensions include resolution, frame rate, quality, heterogeneous networks, bandwidth, and so forth. Scaling consumes more encoding time and computational complexity during mode selection. In this paper, to reduce encoding time and computational complexity, a fast mode decision algorithm based on likelihood mode decision (LMD) is proposed. LMD is evaluated in both temporal and spatial scaling. From the results, we conclude that LMD performs well compared to previous fast mode decision algorithms. The comparison parameters are time, PSNR, and bit rate. LMD achieves a time saving of 66.65% with a 0.05% loss in PSNR and a 0.17% increase in bit rate compared with the full search method.

  1. Likelihood Approximation With Hierarchical Matrices For Large Spatial Datasets

    KAUST Repository

    Litvinenko, Alexander

    2017-09-03

    We use available measurements to estimate the unknown parameters (variance, smoothness parameter, and covariance length) of a covariance function by maximizing the joint Gaussian log-likelihood function. To overcome cubic complexity in the linear algebra, we approximate the discretized covariance function in the hierarchical (H-) matrix format. The H-matrix format has a log-linear computational cost and storage O(kn log n), where the rank k is a small integer and n is the number of locations. The H-matrix technique allows us to work with general covariance matrices in an efficient way, since H-matrices can approximate inhomogeneous covariance functions, with a fairly general mesh that is not necessarily axes-parallel, and neither the covariance matrix itself nor its inverse have to be sparse. We demonstrate our method with Monte Carlo simulations and an application to soil moisture data. The C, C++ codes and data are freely available.

  2. Music genre classification via likelihood fusion from multiple feature models

    Science.gov (United States)

    Shiu, Yu; Kuo, C.-C. J.

    2005-01-01

    Music genre provides an efficient way to index songs in a music database, and can be used as an effective means to retrieve music of a similar type, i.e. content-based music retrieval. A new two-stage scheme for music genre classification is proposed in this work. At the first stage, we examine a couple of different features, construct their corresponding parametric models (e.g. GMM and HMM), and compute their likelihood functions to yield soft classification results. In particular, the timbre, rhythm, and temporal variation features are considered. Then, at the second stage, these soft classification results are integrated into a hard decision for final music genre classification. Experimental results are given to demonstrate the performance of the proposed scheme.
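
    The second-stage fusion reduces to combining per-feature-model log-likelihoods and taking an argmax over genres. A sketch with hypothetical numbers, assuming the first-stage GMM/HMM models have already produced per-genre log-likelihoods:

```python
import numpy as np

def fuse_and_classify(loglik_by_model, weights=None):
    """loglik_by_model: (n_models, n_genres) log-likelihoods from, e.g., the
    timbre, rhythm, and temporal-variation models. Weighted sum, then argmax."""
    L = np.asarray(loglik_by_model)
    w = np.ones(L.shape[0]) if weights is None else np.asarray(weights)
    fused = w @ L                      # soft (combined) scores per genre
    return int(np.argmax(fused)), fused

genres = ["classical", "jazz", "rock"]
loglik = [[-10.0, -12.0, -15.0],       # timbre model
          [-11.0,  -9.5, -14.0],       # rhythm model
          [-10.5, -10.0, -13.0]]       # temporal-variation model
idx, fused = fuse_and_classify(loglik)
print(genres[idx], fused)              # hard decision from soft results
```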

  3. Marginal Maximum Likelihood Estimation of Item Response Models in R

    Directory of Open Access Journals (Sweden)

    Matthew S. Johnson

    2007-02-01

    Full Text Available Item response theory (IRT) models are a class of statistical models used by researchers to describe the response behaviors of individuals to a set of categorically scored items. The most common IRT models can be classified as generalized linear fixed- and/or mixed-effect models. Although IRT models appear most often in the psychological testing literature, researchers in other fields have successfully utilized IRT-like models in a wide variety of applications. This paper discusses the three major methods of estimation in IRT and develops R functions utilizing the built-in capabilities of the R environment to find the marginal maximum likelihood estimates of the generalized partial credit model. The currently available R package ltm is also discussed.

  4. Maximum likelihood estimation of phase-type distributions

    DEFF Research Database (Denmark)

    Esparza, Luz Judith R

    This work is concerned with the statistical inference of phase-type distributions and the analysis of distributions with rational Laplace transform, known as matrix-exponential distributions. The thesis is focused on the estimation of the maximum likelihood parameters of phase-type distributions for both univariate and multivariate cases. Methods like the EM algorithm and Markov chain Monte Carlo are applied for this purpose. Furthermore, this thesis provides explicit formulae for computing the Fisher information matrix for discrete and continuous phase-type distributions, which is needed to find confidence regions for their estimated parameters. Finally, a new general class of distributions, called bilateral matrix-exponential distributions, is defined. These distributions have the entire real line as domain and can be used, for instance, for modelling. In addition, this class of distributions …

  5. The elaboration likelihood model and communication about food risks.

    Science.gov (United States)

    Frewer, L J; Howard, C; Hedderley, D; Shepherd, R

    1997-12-01

    Factors such as hazard type and source credibility have been identified as important in the establishment of effective strategies for risk communication. The elaboration likelihood model was adapted to investigate the potential impact of hazard type, information source, and persuasive content of information on individual engagement in elaborative, or thoughtful, cognitions about risk messages. One hundred sixty respondents were allocated to one of eight experimental groups, and the effects of source credibility, persuasive content of information, and hazard type were systematically varied. The impact of the different factors on beliefs about the information and on elaborative processing was examined. Low credibility was particularly important in reducing risk perceptions, although persuasive content and hazard type were also influential in determining whether elaborative processing occurred.

  6. Maximum Likelihood Blood Velocity Estimator Incorporating Properties of Flow Physics

    DEFF Research Database (Denmark)

    Schlaikjer, Malene; Jensen, Jørgen Arendt

    2004-01-01

    … (RF) data under investigation. The flow physics properties are exploited in the second term, as the range of velocity values investigated in the cross-correlation analysis is compared to the velocity estimates in the temporal and spatial neighborhood of the signal segment under investigation. The new estimator (CMLE) has been compared to the cross-correlation (CC) estimator and the previously developed maximum likelihood estimator (MLE). The results show that the CMLE can handle a larger velocity search range and is capable of estimating even low velocity levels from tissue motion. The CC and the MLE produce … When the velocity search range is set to twice the limit of the CC and the MLE, the number of incorrect velocity estimates is 0%, 19.1%, and 7.2% for the CMLE, CC, and MLE, respectively. The ability to handle a larger search range and to estimate low velocity levels was confirmed …

  7. Using a dynamical advection to reconstruct a part of the SSH evolution in the context of SWOT, application to the Mediterranean Sea

    Science.gov (United States)

    Rogé, Marine; Morrow, Rosemary; Ubelmann, Clément; Dibarboure, Gérald

    2017-08-01

    The main oceanographic objective of the future SWOT mission is to better characterize the ocean mesoscale and sub-mesoscale circulation, by observing a finer range of ocean topography dynamics down to 20 km wavelength. Despite the very high spatial resolution of the future satellite, it will not capture the time evolution of the shorter mesoscale signals, such as the formation and evolution of small eddies. SWOT will have an exact repeat cycle of 21 days, with near repeats around 5-10 days, depending on the latitude. Here, we investigate a technique to reconstruct the missing 2D SSH signal in the time between two satellite revisits. We use the dynamical interpolation (DI) technique developed by Ubelmann et al. (2015). Based on potential vorticity (hereafter PV) conservation using a one and a half layer quasi-geostrophic model, it features an active advection of the SSH field. This model has been tested in energetic open ocean regions such as the Gulf Stream and the Californian Current, and has given promising results. Here, we test this model in the Western Mediterranean Sea, a lower energy region with complex small scale physics, and compare the SSH reconstruction with the high-resolution Symphonie model. We investigate an extension of the simple dynamical model including a separated mean circulation. We find that the DI gives a 16-18% improvement in the reconstruction of the surface height and eddy kinetic energy fields, compared with a simple linear interpolation, and a 37% improvement in the Northern Current subregion. Reconstruction errors are higher during winter and autumn but statistically, the improvement from the DI is also better for these seasons.

  8. Accelerated maximum likelihood parameter estimation for stochastic biochemical systems

    Directory of Open Access Journals (Sweden)

    Daigle Bernie J

    2012-05-01

    Full Text Available Abstract Background A prerequisite for the mechanistic simulation of a biochemical system is detailed knowledge of its kinetic parameters. Despite recent experimental advances, the estimation of unknown parameter values from observed data is still a bottleneck for obtaining accurate simulation results. Many methods exist for parameter estimation in deterministic biochemical systems; methods for discrete stochastic systems are less well developed. Given the probabilistic nature of stochastic biochemical models, a natural approach is to choose parameter values that maximize the probability of the observed data with respect to the unknown parameters, a.k.a. the maximum likelihood parameter estimates (MLEs). MLE computation for all but the simplest models requires the simulation of many system trajectories that are consistent with experimental data. For models with unknown parameters, this presents a computational challenge, as the generation of consistent trajectories can be an extremely rare occurrence. Results We have developed Monte Carlo Expectation-Maximization with Modified Cross-Entropy Method (MCEM2): an accelerated method for calculating MLEs that combines advances in rare event simulation with a computationally efficient version of the Monte Carlo expectation-maximization (MCEM) algorithm. Our method requires no prior knowledge regarding parameter values, and it automatically provides a multivariate parameter uncertainty estimate. We applied the method to five stochastic systems of increasing complexity, progressing from an analytically tractable pure-birth model to a computationally demanding model of yeast polarization. Our results demonstrate that MCEM2 substantially accelerates MLE computation on all tested models when compared to a stand-alone version of MCEM. Additionally, we show how our method identifies parameter values for certain classes of models more accurately than two recently proposed computationally efficient methods …

  9. CONSTRUCTING A FLEXIBLE LIKELIHOOD FUNCTION FOR SPECTROSCOPIC INFERENCE

    International Nuclear Information System (INIS)

    Czekala, Ian; Andrews, Sean M.; Mandel, Kaisey S.; Green, Gregory M.; Hogg, David W.

    2015-01-01

    We present a modular, extensible likelihood framework for spectroscopic inference based on synthetic model spectra. The subtraction of an imperfect model from a continuously sampled spectrum introduces covariance between adjacent datapoints (pixels) into the residual spectrum. For the high signal-to-noise data with large spectral range that is commonly employed in stellar astrophysics, that covariant structure can lead to dramatically underestimated parameter uncertainties (and, in some cases, biases). We construct a likelihood function that accounts for the structure of the covariance matrix, utilizing the machinery of Gaussian process kernels. This framework specifically addresses the common problem of mismatches in model spectral line strengths (with respect to data) due to intrinsic model imperfections (e.g., in the atomic/molecular databases or opacity prescriptions) by developing a novel local covariance kernel formalism that identifies and self-consistently downweights pathological spectral line “outliers.” By fitting many spectra in a hierarchical manner, these local kernels provide a mechanism to learn about and build data-driven corrections to synthetic spectral libraries. An open-source software implementation of this approach is available at http://iancze.github.io/Starfish, including a sophisticated probabilistic scheme for spectral interpolation when using model libraries that are sparsely sampled in the stellar parameters. We demonstrate some salient features of the framework by fitting the high-resolution V-band spectrum of WASP-14, an F5 dwarf with a transiting exoplanet, and the moderate-resolution K-band spectrum of Gliese 51, an M5 field dwarf

  10. Likelihood of illegal alcohol sales at professional sport stadiums.

    Science.gov (United States)

    Toomey, Traci L; Erickson, Darin J; Lenk, Kathleen M; Kilian, Gunna R

    2008-11-01

    Several studies have assessed the propensity for illegal alcohol sales at licensed alcohol establishments and community festivals, but no previous studies examined the propensity for these sales at professional sport stadiums. In this study, we assessed the likelihood of alcohol sales to both underage youth and obviously intoxicated patrons at professional sports stadiums across the United States, and assessed the factors related to the likelihood of both types of alcohol sales. We conducted pseudo-underage (i.e., persons age 21 or older who appear under 21) and pseudo-intoxicated (i.e., persons feigning intoxication) alcohol purchase attempts at stadiums that house professional hockey, basketball, baseball, and football teams. We conducted the purchase attempts at 16 sport stadiums located in 5 states. We measured 2 outcome variables: pseudo-underage sale (yes, no) and pseudo-intoxicated sale (yes, no), and 3 types of independent variables: (1) seller characteristics, (2) purchase attempt characteristics, and (3) event characteristics. Following univariate and bivariate analyses, we fit a separate series of logistic generalized mixed regression models for each outcome variable. The overall sales rates to the pseudo-underage and pseudo-intoxicated buyers were 18% and 74%, respectively. In the multivariate logistic analyses, we found that the odds of a sale to a pseudo-underage buyer in the stands were 2.9 times as large as the odds of a sale at the concession booths (30% vs. 13%; p = 0.01). The odds of a sale to an obviously intoxicated buyer in the stands were 2.9 times as large as the odds of a sale at the concession booths (89% vs. 73%; p = 0.02). Similar to studies assessing illegal alcohol sales at licensed alcohol establishments and community festivals, findings from this study show the need for interventions specifically focused on illegal alcohol sales at professional sporting events.

  11. Targeted maximum likelihood estimation for a binary treatment: A tutorial.

    Science.gov (United States)

    Luque-Fernandez, Miguel Angel; Schomaker, Michael; Rachet, Bernard; Schnitzer, Mireille E

    2018-04-23

    When estimating the average effect of a binary treatment (or exposure) on an outcome, methods that incorporate propensity scores, the G-formula, or targeted maximum likelihood estimation (TMLE) are preferred over naïve regression approaches, which are biased under misspecification of a parametric outcome model. In contrast, propensity score methods require the correct specification of an exposure model. Double-robust methods only require correct specification of either the outcome or the exposure model. Targeted maximum likelihood estimation is a semiparametric double-robust method that improves the chances of correct model specification by allowing for flexible estimation using (nonparametric) machine-learning methods. It therefore requires weaker assumptions than its competitors. We provide a step-by-step guided implementation of TMLE and illustrate it in a realistic scenario based on cancer epidemiology where assumptions about correct model specification and positivity (i.e., when a study participant has zero probability of receiving the treatment) are nearly violated. This article provides a concise and reproducible educational introduction to TMLE for a binary outcome and exposure. The reader should gain sufficient understanding of TMLE from this introductory tutorial to be able to apply the method in practice. Extensive R code is provided in easy-to-read boxes throughout the article for replicability. Stata users will find a testing implementation of TMLE and additional material in the Appendix S1 and at the following GitHub repository: https://github.com/migariane/SIM-TMLE-tutorial. © 2018 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.

  12. Is there a hierarchy of social inferences? The likelihood and speed of inferring intentionality, mind, and personality.

    Science.gov (United States)

    Malle, Bertram F; Holbrook, Jess

    2012-04-01

    People interpret behavior by making inferences about agents' intentionality, mind, and personality. Past research studied such inferences one at a time; in real life, people make these inferences simultaneously. The present studies therefore examined whether four major inferences (intentionality, desire, belief, and personality), elicited simultaneously in response to an observed behavior, might be ordered in a hierarchy of likelihood and speed. To achieve generalizability, the studies included a wide range of stimulus behaviors, presented them verbally and as dynamic videos, and assessed inferences both in a retrieval paradigm (measuring the likelihood and speed of accessing inferences immediately after they were made) and in an online processing paradigm (measuring the speed of forming inferences during behavior observation). Five studies provide evidence for a hierarchy of social inferences (from intentionality and desire, to belief, to personality) that is stable across verbal and visual presentations and that parallels the order found in developmental and primate research. © 2012 APA, all rights reserved.

  13. Maximum-likelihood estimation of the hyperbolic parameters from grouped observations

    DEFF Research Database (Denmark)

    Jensen, Jens Ledet

    1988-01-01

    … a least-squares problem. The second procedure, Hypesti, first approaches the maximum-likelihood estimate by iterating in the profile log-likelihood function for the scale parameter. Close to the maximum of the likelihood function, the estimation is brought to an end by iteration, using all four parameters …

  14. A short proof that phylogenetic tree reconstruction by maximum likelihood is hard.

    Science.gov (United States)

    Roch, Sebastien

    2006-01-01

    Maximum likelihood is one of the most widely used techniques to infer evolutionary histories. Although it is thought to be intractable, a proof of its hardness has been lacking. Here, we give a short proof that computing the maximum likelihood tree is NP-hard by exploiting a connection between likelihood and parsimony observed by Tuffley and Steel.

  16. Charting Multidisciplinary Team External Dynamics Using a Systems Thinking Approach

    Science.gov (United States)

    Barthelemy, Jean-Francois; Waszak, Martin R.; Jones, Kenneth M.; Silcox, Richard J.; Silva, Walter A.; Nowaczyk, Ronald H.

    1998-01-01

    Using the formalism provided by the Systems Thinking approach, the dynamics present when operating multidisciplinary teams are examined in the context of the NASA Langley Research and Technology Group, an R&D organization organized along functional lines. The paper focuses on external dynamics and examines how an organization creates and nurtures the teams and how it disseminates and retains the lessons and expertise created by the multidisciplinary activities. Key variables are selected and the causal relationships between the variables are identified. Five "stories" are told, each of which touches on a different aspect of the dynamics. The Systems Thinking Approach provides recommendations as to interventions that will facilitate the introduction of multidisciplinary teams and that therefore will increase the likelihood of performing successful multidisciplinary developments. These interventions can be carried out either by individual researchers, line management or program management.

  17. A guideline for the validation of likelihood ratio methods used for forensic evidence evaluation.

    Science.gov (United States)

    Meuwly, Didier; Ramos, Daniel; Haraksim, Rudolf

    2017-07-01

    This Guideline proposes a protocol for the validation of forensic evaluation methods at the source level, using the Likelihood Ratio framework as defined within the Bayesian inference model. In the context of the inference of identity of source, the Likelihood Ratio is used to evaluate the strength of the evidence for a trace specimen, e.g. a fingermark, and a reference specimen, e.g. a fingerprint, to originate from common or different sources. Some theoretical aspects of probabilities necessary for this Guideline were discussed prior to its elaboration, which started after a workshop of forensic researchers and practitioners involved in this topic. In the workshop, the following questions were addressed: "which aspects of a forensic evaluation scenario need to be validated?", "what is the role of the LR as part of a decision process?" and "how to deal with uncertainty in the LR calculation?". The question "what to validate?" focuses on the validation methods and criteria, and "how to validate?" deals with the implementation of the validation protocol. Answers to these questions were deemed necessary with several objectives. First, concepts typical for validation standards [1], such as performance characteristics, performance metrics, and validation criteria, will be adapted or applied by analogy to the LR framework. Second, a validation strategy will be defined. Third, validation methods will be described. Finally, a validation protocol and an example of a validation report will be proposed, which can be applied to the forensic fields developing and validating LR methods for the evaluation of the strength of evidence at source level under the following propositions. Copyright © 2016. Published by Elsevier B.V.

  18. LIKELIHOOD-FREE COSMOLOGICAL INFERENCE WITH TYPE Ia SUPERNOVAE: APPROXIMATE BAYESIAN COMPUTATION FOR A COMPLETE TREATMENT OF UNCERTAINTY

    Energy Technology Data Exchange (ETDEWEB)

    Weyant, Anja; Wood-Vasey, W. Michael [Pittsburgh Particle Physics, Astrophysics, and Cosmology Center (PITT PACC), Physics and Astronomy Department, University of Pittsburgh, Pittsburgh, PA 15260 (United States); Schafer, Chad, E-mail: anw19@pitt.edu [Department of Statistics, Carnegie Mellon University, Pittsburgh, PA 15213 (United States)

    2013-02-20

    Cosmological inference becomes increasingly difficult when complex data-generating processes cannot be modeled by simple probability distributions. With the ever-increasing size of data sets in cosmology, there is an increasing burden placed on adequate modeling; systematic errors in the model will dominate where previously these were swamped by statistical errors. For example, Gaussian distributions are an insufficient representation for errors in quantities like photometric redshifts. Likewise, it can be difficult to quantify analytically the distribution of errors that are introduced in complex fitting codes. Without a simple form for these distributions, it becomes difficult to accurately construct a likelihood function for the data as a function of parameters of interest. Approximate Bayesian computation (ABC) provides a means of probing the posterior distribution when direct calculation of a sufficiently accurate likelihood is intractable. ABC allows one to bypass direct calculation of the likelihood but instead relies upon the ability to simulate the forward process that generated the data. These simulations can naturally incorporate priors placed on nuisance parameters, and hence these can be marginalized in a natural way. We present and discuss ABC methods in the context of supernova cosmology using data from the SDSS-II Supernova Survey. Assuming a flat cosmology and constant dark energy equation of state, we demonstrate that ABC can recover an accurate posterior distribution. Finally, we show that ABC can still produce an accurate posterior distribution when we contaminate the sample with Type IIP supernovae.
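
    The essence of ABC is compact enough to state in code: draw parameters from the prior, simulate the forward process, and keep draws whose simulated summaries land close to the observed ones. A generic rejection-ABC sketch with a toy Gaussian forward model (not the SDSS-II supernova pipeline; all names are illustrative):

```python
import numpy as np

def rejection_abc(observed, simulate, prior_draw, distance, eps, n_draws=20_000):
    """Keep prior draws whose simulated data fall within eps of the observation."""
    accepted = []
    for _ in range(n_draws):
        theta = prior_draw()
        if distance(simulate(theta), observed) < eps:
            accepted.append(theta)
    return np.array(accepted)

rng = np.random.default_rng(3)
true_mu = 1.5
observed = rng.normal(true_mu, 1.0, 50)
s_obs = observed.mean()                                  # summary statistic

posterior = rejection_abc(
    observed=s_obs,
    simulate=lambda mu: rng.normal(mu, 1.0, 50).mean(),  # forward model
    prior_draw=lambda: rng.uniform(-5.0, 5.0),           # flat prior on mu
    distance=lambda a, b: abs(a - b),
    eps=0.05,
)
print(len(posterior), posterior.mean())                  # approx. posterior for mu
```

    In practice the tolerance eps trades bias against acceptance rate, and the summaries must be informative for the parameters of interest; nuisance-parameter priors enter simply as extra draws inside the simulator.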

  20. Temporal Dynamics and Spatial Patterns of Aedes aegypti Breeding Sites, in the Context of a Dengue Control Program in Tartagal (Salta Province, Argentina)

    Science.gov (United States)

    Espinosa, Manuel; Weinberg, Diego; Rotela, Camilo H.; Polop, Francisco; Abril, Marcelo; Scavuzzo, Carlos Marcelo

    2016-01-01

    Background Since 2009, Fundación Mundo Sano has implemented an Aedes aegypti Surveillance and Control Program in Tartagal city (Salta Province, Argentina). The purpose of this study was to analyze the temporal dynamics of the spatial distribution of Ae. aegypti breeding sites over five years of sampling, and the effect of control actions on vector population dynamics. Methodology/Principal Findings Seasonal entomological (larval) samplings were conducted at 17,815 fixed sites in the Tartagal urban area between 2009 and 2014. Based on information on breeding site abundance, satellite remote sensing data (RS), and Geographic Information Systems (GIS), spatial analyses (hotspot and cluster analysis) and predictive modeling (MaxEnt) were performed. The spatial analysis showed a distribution pattern with the highest breeding densities registered on the city outskirts. The model indicated that 75% of the Ae. aegypti distribution is explained by 3 variables: bare soil coverage percentage (44.9%), urbanization coverage percentage (13.5%), and water distribution (11.6%). Conclusions/Significance These results call attention to the way entomological field data and information of geospatial origin (RS/GIS) can be used to infer scenarios which could then be applied in epidemiological surveillance programs and in the determination of dengue control strategies. Predictive maps constructed with systematic spatiotemporal Ae. aegypti data from Tartagal city would allow public health workers to identify and target high-risk areas with appropriate and timely control measures. These tools could help decision-makers improve health system responses and preventive measures related to vector control. PMID:27223693

  1. Seasonal and regional dynamics of M. ulcerans transmission in environmental context: deciphering the role of water bugs as hosts and vectors.

    Science.gov (United States)

    Marion, Estelle; Eyangoh, Sara; Yeramian, Edouard; Doannio, Julien; Landier, Jordi; Aubry, Jacques; Fontanet, Arnaud; Rogier, Christophe; Cassisa, Viviane; Cottin, Jane; Marot, Agnès; Eveillard, Matthieu; Kamdem, Yannick; Legras, Pierre; Deshayes, Caroline; Saint-André, Jean-Paul; Marsollier, Laurent

    2010-07-06

    Buruli ulcer, the third mycobacterial disease after tuberculosis and leprosy, is caused by the environmental mycobacterium M. ulcerans. Various modes of transmission have been suspected for this disease, with no general consensus on any of them up to now. Since laboratory models demonstrated the ability of water bugs to transmit M. ulcerans, particular attention has been focused on the transmission of the bacilli by water bugs as hosts and vectors. However, it is only through detailed knowledge of the biodiversity and ecology of water bugs that the importance of this mode of transmission can be fully assessed. The objective of the work here is to decipher the role of water bugs in M. ulcerans ecology and transmission, based on large-scale field studies. The distribution of M. ulcerans-hosting water bugs was monitored on previously unprecedented time and space scales: a total of 7,407 water bugs, belonging to a large number of different families, were collected over one year in Buruli ulcer endemic and non-endemic areas in central Cameroon. This study demonstrated the presence of M. ulcerans in insect saliva. In addition, the field results provided a full picture of the ecology of transmission in terms of biodiversity and detailed specification of seasonal and regional dynamics, with large temporal heterogeneity in the insect tissue colonization rate and detection of M. ulcerans only in water bug tissues collected in Buruli ulcer endemic areas. The large-scale detection of bacilli in the saliva of biting water bugs gives enhanced weight to their role in M. ulcerans transmission. On practical grounds, beyond the ecological interest, the results concerning seasonal and regional dynamics can provide an efficient tool in the hands of sanitary authorities to monitor environmental risks associated with Buruli ulcer.

  2. Understanding Social Dynamics and School Shootings Requires Larger Context. Commentary on: "Bullying, Romantic Rejection, and Conflicts with Teachers: The Crucial Role of Social Dynamics in the Development of School Shootings--A Systematic Review"

    Science.gov (United States)

    Klein, Jessie

    2014-01-01

    Jessie Klein begins her commentary on "Bullying, Romantic Rejection, and Conflicts with Teachers: The Crucial Role of Social Dynamics in the Development of School Shootings--A Systematic Review" (Sommer, Leuschner, and Scheithauer," International Journal of Developmental Science" v 8, n1-2, p3-24, 2014) by saying that to…

  3. Context-Oriented Language Engineering

    NARCIS (Netherlands)

    T. van der Storm (Tijs)

    2017-01-01

    Context-oriented programming (COP) facilitates creating software that can dynamically adapt to its environment, such as device, user preferences, sensor inputs, and so on. Software language engineering (SLE) is the discipline of principled methods and techniques for creating software …

  4. Digital Citizenship within Global Contexts

    Science.gov (United States)

    Searson, Michael; Hancock, Marsali; Soheil, Nusrat; Shepherd, Gregory

    2015-01-01

    EduSummIT 2013 featured a working group that examined digital citizenship within a global context. Group members recognized that, given today's international, regional, political, and social dynamics, the notion of "global" might be more aspirational than practical. The development of informed policies and practices serving and involving…

  5. Planck 2013 results. XV. CMB power spectra and likelihood

    Science.gov (United States)

    Planck Collaboration; Ade, P. A. R.; Aghanim, N.; Armitage-Caplan, C.; Arnaud, M.; Ashdown, M.; Atrio-Barandela, F.; Aumont, J.; Baccigalupi, C.; Banday, A. J.; Barreiro, R. B.; Bartlett, J. G.; Battaner, E.; Benabed, K.; Benoît, A.; Benoit-Lévy, A.; Bernard, J.-P.; Bersanelli, M.; Bielewicz, P.; Bobin, J.; Bock, J. J.; Bonaldi, A.; Bonavera, L.; Bond, J. R.; Borrill, J.; Bouchet, F. R.; Boulanger, F.; Bridges, M.; Bucher, M.; Burigana, C.; Butler, R. C.; Calabrese, E.; Cardoso, J.-F.; Catalano, A.; Challinor, A.; Chamballu, A.; Chiang, H. C.; Chiang, L.-Y.; Christensen, P. R.; Church, S.; Clements, D. L.; Colombi, S.; Colombo, L. P. L.; Combet, C.; Couchot, F.; Coulais, A.; Crill, B. P.; Curto, A.; Cuttaia, F.; Danese, L.; Davies, R. D.; Davis, R. J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Delouis, J.-M.; Désert, F.-X.; Dickinson, C.; Diego, J. M.; Dole, H.; Donzelli, S.; Doré, O.; Douspis, M.; Dunkley, J.; Dupac, X.; Efstathiou, G.; Elsner, F.; Enßlin, T. A.; Eriksen, H. K.; Finelli, F.; Forni, O.; Frailis, M.; Fraisse, A. A.; Franceschi, E.; Gaier, T. C.; Galeotta, S.; Galli, S.; Ganga, K.; Giard, M.; Giardino, G.; Giraud-Héraud, Y.; Gjerløw, E.; González-Nuevo, J.; Górski, K. M.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Gudmundsson, J. E.; Hansen, F. K.; Hanson, D.; Harrison, D.; Helou, G.; Henrot-Versillé, S.; Hernández-Monteagudo, C.; Herranz, D.; Hildebrandt, S. R.; Hivon, E.; Hobson, M.; Holmes, W. A.; Hornstrup, A.; Hovest, W.; Huffenberger, K. M.; Hurier, G.; Jaffe, A. H.; Jaffe, T. R.; Jewell, J.; Jones, W. C.; Juvela, M.; Keihänen, E.; Keskitalo, R.; Kiiveri, K.; Kisner, T. S.; Kneissl, R.; Knoche, J.; Knox, L.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lähteenmäki, A.; Lamarre, J.-M.; Lasenby, A.; Lattanzi, M.; Laureijs, R. J.; Lawrence, C. R.; Le Jeune, M.; Leach, S.; Leahy, J. P.; Leonardi, R.; León-Tavares, J.; Lesgourgues, J.; Liguori, M.; Lilje, P. B.; Linden-Vørnle, M.; Lindholm, V.; López-Caniego, M.; Lubin, P. M.; Macías-Pérez, J. F.; Maffei, B.; Maino, D.; Mandolesi, N.; Marinucci, D.; Maris, M.; Marshall, D. J.; Martin, P. G.; Martínez-González, E.; Masi, S.; Massardi, M.; Matarrese, S.; Matthai, F.; Mazzotta, P.; Meinhold, P. R.; Melchiorri, A.; Mendes, L.; Menegoni, E.; Mennella, A.; Migliaccio, M.; Millea, M.; Mitra, S.; Miville-Deschênes, M.-A.; Molinari, D.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Moss, A.; Munshi, D.; Murphy, J. A.; Naselsky, P.; Nati, F.; Natoli, P.; Netterfield, C. B.; Nørgaard-Nielsen, H. U.; Noviello, F.; Novikov, D.; Novikov, I.; O'Dwyer, I. J.; Orieux, F.; Osborne, S.; Oxborrow, C. A.; Paci, F.; Pagano, L.; Pajot, F.; Paladini, R.; Paoletti, D.; Partridge, B.; Pasian, F.; Patanchon, G.; Paykari, P.; Perdereau, O.; Perotto, L.; Perrotta, F.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Ponthieu, N.; Popa, L.; Poutanen, T.; Pratt, G. W.; Prézeau, G.; Prunet, S.; Puget, J.-L.; Rachen, J. P.; Rahlin, A.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Ricciardi, S.; Riller, T.; Ringeval, C.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Roudier, G.; Rowan-Robinson, M.; Rubiño-Martín, J. A.; Rusholme, B.; Sandri, M.; Sanselme, L.; Santos, D.; Savini, G.; Scott, D.; Seiffert, M. D.; Shellard, E. P. S.; Spencer, L. D.; Starck, J.-L.; Stolyarov, V.; Stompor, R.; Sudiwala, R.; Sureau, F.; Sutton, D.; Suur-Uski, A.-S.; Sygnet, J.-F.; Tauber, J. 
A.; Tavagnacco, D.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Tucci, M.; Tuovinen, J.; Türler, M.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Varis, J.; Vielva, P.; Villa, F.; Vittorio, N.; Wade, L. A.; Wandelt, B. D.; Wehus, I. K.; White, M.; White, S. D. M.; Yvon, D.; Zacchei, A.; Zonca, A.

    2014-11-01

    This paper presents the Planck 2013 likelihood, a complete statistical description of the two-point correlation function of the CMB temperature fluctuations that accounts for all known relevant uncertainties, both instrumental and astrophysical in nature. We use this likelihood to derive our best estimate of the CMB angular power spectrum from Planck over three decades in multipole moment, ℓ, covering 2 ≤ ℓ ≤ 2500. The main source of uncertainty at ℓ ≲ 1500 is cosmic variance. Uncertainties in small-scale foreground modelling and instrumental noise dominate the error budget at higher ℓs; there we assess the impact of residual foreground and instrumental uncertainties on the final cosmological parameters. We find good internal agreement among the high-ℓ cross-spectra with residuals below a few μK² at ℓ ≲ 1000, in agreement with estimated calibration uncertainties. We compare our results with foreground-cleaned CMB maps derived from all Planck frequencies, as well as with cross-spectra derived from the 70 GHz Planck map, and find broad agreement in terms of spectrum residuals and cosmological parameters. We further show that the best-fit ΛCDM cosmology is in excellent agreement with preliminary Planck EE and TE polarisation spectra. We find that the standard ΛCDM cosmology is well constrained by Planck from the measurements at ℓ ≲ 1500. One specific example is the spectral index of scalar perturbations, for which we report a 5.4σ deviation from scale invariance, ns = 1. Increasing the multipole range beyond ℓ ≃ 1500 does not increase our accuracy for the ΛCDM parameters, but instead allows us to study extensions beyond the standard model. We find no indication of significant departures from the ΛCDM framework. Finally, we report a tension between the Planck best-fit ΛCDM model and the low-ℓ spectrum in the form of a power deficit of 5-10% at ℓ ≲ 40, with a statistical significance of 2.5-3σ. Without a theoretically motivated model for
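
    A toy illustration of the statistical object this record describes: for a noise-free, full-sky experiment the temperature likelihood reduces to -2 ln L = Σℓ (2ℓ+1)[ln(Cℓ/Ĉℓ) + Ĉℓ/Cℓ - 1]. The sketch below evaluates this for an invented spectrum shape; it is a textbook simplification, not the actual Planck likelihood, which must additionally handle masks, noise, and foregrounds.

        import numpy as np

        def cmb_tt_loglike(cl_theory, cl_hat, ell):
            # idealized full-sky, noise-free temperature likelihood:
            # -2 ln L = sum_ell (2*ell + 1) * [ln(Cl/Cl_hat) + Cl_hat/Cl - 1]
            term = np.log(cl_theory / cl_hat) + cl_hat / cl_theory - 1.0
            return -0.5 * np.sum((2 * ell + 1) * term)

        ell = np.arange(2, 2501)
        cl_true = 1.0e3 / (ell * (ell + 1.0))   # invented spectrum shape
        rng = np.random.default_rng(0)
        # cosmic variance: each observed multipole is chi^2 with 2*ell + 1 dof
        cl_hat = cl_true * rng.chisquare(2 * ell + 1) / (2 * ell + 1)
        print(cmb_tt_loglike(cl_true, cl_hat, ell))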

  6. Likelihood ratio model for classification of forensic evidence

    Energy Technology Data Exchange (ETDEWEB)

    Zadora, G., E-mail: gzadora@ies.krakow.pl [Institute of Forensic Research, Westerplatte 9, 31-033 Krakow (Poland); Neocleous, T., E-mail: tereza@stats.gla.ac.uk [University of Glasgow, Department of Statistics, 15 University Gardens, Glasgow G12 8QW (United Kingdom)

    2009-05-29

    One of the problems in the analysis of forensic evidence, such as glass fragments, is the determination of their use-type category, e.g. does a glass fragment originate from an unknown window or container? Very small glass fragments arise during various accidents and criminal offences, and could be carried on the clothes, shoes and hair of participants. It is therefore necessary to obtain information on their physicochemical composition in order to solve the classification problem. Scanning Electron Microscopy coupled with an Energy Dispersive X-ray Spectrometer and the Glass Refractive Index Measurement method are routinely used in many forensic institutes for the investigation of glass. A natural form of glass evidence evaluation for forensic purposes is the likelihood ratio, LR = p(E|H1)/p(E|H2). The main aim of this paper was to study the performance of LR models for glass object classification which considered one or two sources of data variability, i.e. between-glass-object variability and/or within-glass-object variability. Within the proposed model a multivariate kernel density approach was adopted for modelling the between-object distribution and a multivariate normal distribution was adopted for modelling within-object distributions. Moreover, a graphical method of estimating the dependence structure was employed to reduce the highly multivariate problem to several lower-dimensional problems. The performed analysis showed that the best likelihood model was the one which included information about both between- and within-object variability, with variables derived from elemental compositions measured by SEM-EDX and refractive index values determined before (RIb) and after (RIa) the annealing process, in the form dRI = log10|RIa - RIb|. This model gave better results than the model with only between-object variability considered. In addition, when dRI and variables derived from elemental compositions were used, this
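
    To make the LR computation concrete, here is a minimal one-dimensional sketch on synthetic data: the between-object distribution of each use-type category is represented by its training objects smoothed with a normal kernel, and the within-object variability supplies the kernel width. The feature values and category parameters are invented, and the paper's actual model is multivariate with a graphically estimated dependence structure.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(1)
        # hypothetical training data: one feature (think dRI) per glass object
        windows = rng.normal(-4.0, 0.5, 200)     # category 1, between-object spread
        containers = rng.normal(-3.0, 0.7, 200)  # category 2, between-object spread
        within_sd = 0.1                          # assumed within-object variability

        def likelihood_ratio(evidence, cat1, cat2, sd):
            # kernel density over training objects, with the within-object
            # normal distribution as the kernel: LR = p(E|H1) / p(E|H2)
            p1 = np.mean(norm.pdf(evidence, loc=cat1, scale=sd))
            p2 = np.mean(norm.pdf(evidence, loc=cat2, scale=sd))
            return p1 / p2

        print(likelihood_ratio(-3.9, windows, containers, within_sd))  # LR > 1 favours category 1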

  7. Likelihood ratio model for classification of forensic evidence

    International Nuclear Information System (INIS)

    Zadora, G.; Neocleous, T.

    2009-01-01

    One of the problems in the analysis of forensic evidence, such as glass fragments, is the determination of their use-type category, e.g. does a glass fragment originate from an unknown window or container? Very small glass fragments arise during various accidents and criminal offences, and could be carried on the clothes, shoes and hair of participants. It is therefore necessary to obtain information on their physicochemical composition in order to solve the classification problem. Scanning Electron Microscopy coupled with an Energy Dispersive X-ray Spectrometer and the Glass Refractive Index Measurement method are routinely used in many forensic institutes for the investigation of glass. A natural form of glass evidence evaluation for forensic purposes is the likelihood ratio, LR = p(E|H1)/p(E|H2). The main aim of this paper was to study the performance of LR models for glass object classification which considered one or two sources of data variability, i.e. between-glass-object variability and/or within-glass-object variability. Within the proposed model a multivariate kernel density approach was adopted for modelling the between-object distribution and a multivariate normal distribution was adopted for modelling within-object distributions. Moreover, a graphical method of estimating the dependence structure was employed to reduce the highly multivariate problem to several lower-dimensional problems. The performed analysis showed that the best likelihood model was the one which included information about both between- and within-object variability, with variables derived from elemental compositions measured by SEM-EDX and refractive index values determined before (RIb) and after (RIa) the annealing process, in the form dRI = log10|RIa - RIb|. This model gave better results than the model with only between-object variability considered. In addition, when dRI and variables derived from elemental compositions were used, this model outperformed two other

  8. Analysis of the dynamics of venous blood flow in the context of lower limb temperature distribution and tissue composition in the elderly.

    Science.gov (United States)

    Skomudek, Aleksandra; Gilowska, Iwona; Jasiński, Ryszard; Rożek-Piechura, Krystyna

    2017-01-01

    The elderly are particularly vulnerable to degenerative diseases, such as circulatory and respiratory system and vascular system diseases. The objective of this study was therefore to evaluate the distribution of temperature and the dynamics of venous blood flow in the lower limbs (LLs) and to assess the interdependence of these parameters in terms of the somatic components in males and females participating in activities at the University of the Third Age. The study included 60 females (mean age 67.4 years) and 40 males (mean age 67.5 years). A body composition assessment was performed using the bioimpedance technique - Tanita BC-418MA. The following parameters were examined: fat%, fat mass, fat-free mass, and total body water. The minimal, maximal, and mean temperature values and their distributions were examined using an infrared thermographic camera (VarioCAM Head). Measurements of the venous refilling time and the work of the LL venous pump were made using a Rheo Dopplex II PPG. In males, the mean value of the right LL temperature was 30.58 and the mean value of the left LL was 30.28; the P-value was 0.805769. In females, the mean value of the right LL temperature was 29.58 and the mean value of the left limb was 29.52; the P-value was 0.864773. In males, the right limb blood flow was 34.17 and the left limb blood flow was 34.67; the P-value was 0.359137. In females, the right limb blood flow was 26.89 and the left limb blood flow was 26.09; the P-value was 0.796348. Research results showed that the temperature distribution and the dynamics of blood flow are not significantly different between the right and left extremities in both males and females. However, significant temperature differences were found between the gender groups. Significantly higher temperature values in both the right and left extremities were recorded in males than in females.

  9. Analysis of the dynamics of venous blood flow in the context of lower limb temperature distribution and tissue composition in the elderly

    Directory of Open Access Journals (Sweden)

    Skomudek A

    2017-08-01

    Full Text Available Aleksandra Skomudek,1,2 Iwona Gilowska,1,3 Ryszard Jasiński,4 Krystyna Rożek-Piechura4 1Department of Physical Education and Physiotherapy, Opole University of Technology, Opolskie, 2Department of Clinical Physiotherapy, 3Department of Biochemistry and Physiology, 4Department of Physiotherapy and Occupational Therapy in Conservative and Interventional Medicine, University of Physical Education in Wroclaw, Wroclaw, Poland Objective: The elderly are particularly vulnerable to degenerative diseases, such as circulatory and respiratory system and vascular system diseases. The objective of this study was therefore to evaluate the distribution of temperature and the dynamics of venous blood flow in the lower limbs (LLs) and to assess the interdependence of these parameters in terms of the somatic components in males and females participating in activities at the University of the Third Age. Materials and methods: The study included 60 females (mean age 67.4 years) and 40 males (mean age 67.5 years). A body composition assessment was performed using the bioimpedance technique – Tanita BC-418MA. The following parameters were examined: fat%, fat mass, fat-free mass, and total body water. The minimal, maximal, and mean temperature values and their distributions were examined using an infrared thermographic camera (VarioCAM Head). Measurements of the venous refilling time and the work of the LL venous pump were made using a Rheo Dopplex II PPG. Results: In males, the mean value of the right LL temperature was 30.58 and the mean value of the left LL was 30.28; the P-value was 0.805769. In females, the mean value of the right LL temperature was 29.58 and the mean value of the left limb was 29.52; the P-value was 0.864773. In males, the right limb blood flow was 34.17 and the left limb blood flow was 34.67; the P-value was 0.359137. In females, the right limb blood flow was 26.89 and the left limb blood flow was 26.09; the P-value was 0.796348. Conclusion

  10. A simulation study of likelihood inference procedures in rayleigh distribution with censored data

    International Nuclear Information System (INIS)

    Baklizi, S. A.; Baker, H. M.

    2001-01-01

    Inference procedures based on the likelihood function are considered for the one-parameter Rayleigh distribution with type 1 and type 2 censored data. Using simulation techniques, the finite sample performances of the maximum likelihood estimator and the large sample likelihood interval estimation procedures based on the Wald, the Rao, and the likelihood ratio statistics are investigated. It appears that the maximum likelihood estimator is unbiased. The approximate variance estimates obtained from the asymptotic normal distribution of the maximum likelihood estimator are accurate under type 2 censored data, while they tend to be smaller than the actual variances for type 1 censored data of small size. It also appears that interval estimation based on the Wald and Rao statistics requires much larger sample sizes than interval estimation based on the likelihood ratio statistic to attain reasonable accuracy. (authors). 15 refs., 4 tabs
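
    Under type 2 censoring the Rayleigh maximum likelihood estimator has a closed form, which makes the simulation set-up of such a study easy to sketch; the sample size and censoring fraction below are illustrative only.

        import numpy as np

        def rayleigh_mle_type2(observed_sorted, n):
            # Only the r smallest of n lifetimes are observed (type 2 censoring).
            # Maximizing the censored likelihood gives
            # sigma_hat^2 = (sum_i x_(i)^2 + (n - r) * x_(r)^2) / (2 r)
            r = len(observed_sorted)
            s2 = (np.sum(observed_sorted ** 2)
                  + (n - r) * observed_sorted[-1] ** 2) / (2 * r)
            return np.sqrt(s2)

        rng = np.random.default_rng(2)
        sigma, n, r = 2.0, 100, 60
        sample = np.sort(sigma * np.sqrt(-2.0 * np.log(rng.random(n))))  # Rayleigh draws
        print(rayleigh_mle_type2(sample[:r], n))  # should be close to sigma = 2.0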

  11. Efficient algorithms for maximum likelihood decoding in the surface code

    Science.gov (United States)

    Bravyi, Sergey; Suchara, Martin; Vargo, Alexander

    2014-09-01

    We describe two implementations of the optimal error correction algorithm known as the maximum likelihood decoder (MLD) for the two-dimensional surface code with a noiseless syndrome extraction. First, we show how to implement MLD exactly in time O(n²), where n is the number of code qubits. Our implementation uses a reduction from MLD to simulation of matchgate quantum circuits. This reduction however requires a special noise model with independent bit-flip and phase-flip errors. Secondly, we show how to implement MLD approximately for more general noise models using matrix product states (MPS). Our implementation has running time O(nχ³), where χ is a parameter that controls the approximation precision. The key step of our algorithm, borrowed from the density matrix renormalization-group method, is a subroutine for contracting a tensor network on the two-dimensional grid. The subroutine uses MPS with a bond dimension χ to approximate the sequence of tensors arising in the course of contraction. We benchmark the MPS-based decoder against the standard minimum weight matching decoder, observing a significant reduction of the logical error probability for χ ≥ 4.
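
    The following brute-force sketch shows what a maximum likelihood decoder computes, on a classical 3-bit repetition code with independent bit flips rather than the surface code: sum the probability of every error pattern consistent with the syndrome within each logical class, then pick the heavier class. It is exponential in the number of bits and stands in for, but does not implement, the paper's matchgate and MPS constructions.

        from itertools import product

        def mld_repetition(syndrome, p):
            # syndrome = (e1 XOR e2, e2 XOR e3) for error pattern e
            weights = {0: 0.0, 1: 0.0}
            for e in product([0, 1], repeat=3):
                if (e[0] ^ e[1], e[1] ^ e[2]) != syndrome:
                    continue
                prob = 1.0
                for bit in e:
                    prob *= p if bit else (1.0 - p)
                weights[e[0] ^ e[1] ^ e[2]] += prob  # parity labels the logical class
            return max(weights, key=weights.get), weights

        print(mld_repetition((1, 0), p=0.1))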

  12. Maximum likelihood sequence estimation for optical complex direct modulation.

    Science.gov (United States)

    Che, Di; Yuan, Feng; Shieh, William

    2017-04-17

    Semiconductor lasers are versatile optical transmitters in nature. Through the direct modulation (DM), the intensity modulation is realized by the linear mapping between the injection current and the light power, while various angle modulations are enabled by the frequency chirp. Limited by the direct detection, DM lasers used to be exploited only as 1-D (intensity or angle) transmitters by suppressing or simply ignoring the other modulation. Nevertheless, through the digital coherent detection, simultaneous intensity and angle modulations (namely, 2-D complex DM, CDM) can be realized by a single laser diode. The crucial technique of CDM is the joint demodulation of intensity and differential phase with the maximum likelihood sequence estimation (MLSE), supported by a closed-form discrete signal approximation of frequency chirp to characterize the MLSE transition probability. This paper proposes a statistical method for the transition probability to significantly enhance the accuracy of the chirp model. Using the statistical estimation, we demonstrate the first single-channel 100-Gb/s PAM-4 transmission over 1600-km fiber with only 10G-class DM lasers.
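
    The sketch below shows the generic MLSE machinery underlying such a receiver: a Viterbi search over PAM-4 symbol sequences through a short two-tap intersymbol-interference channel, with a Gaussian branch metric standing in for the paper's statistically estimated chirp transition probability. Channel taps, noise level, and block length are arbitrary choices.

        import numpy as np

        LEVELS = np.array([-3.0, -1.0, 1.0, 3.0])  # PAM-4 alphabet

        def mlse_viterbi(y, h, sigma2):
            # y[k] = h[0]*x[k] + h[1]*x[k-1] + noise; state = previous symbol
            n, m = len(y), len(LEVELS)
            cost = np.full(m, np.inf)
            cost[0] = 0.0                      # assume a known start symbol
            back = np.zeros((n, m), dtype=int)
            for k in range(n):
                new_cost = np.full(m, np.inf)
                for cur in range(m):
                    for prev in range(m):
                        pred = h[0] * LEVELS[cur] + h[1] * LEVELS[prev]
                        c = cost[prev] + (y[k] - pred) ** 2 / sigma2
                        if c < new_cost[cur]:
                            new_cost[cur] = c
                            back[k, cur] = prev
                cost = new_cost
            path = [int(np.argmin(cost))]      # traceback of the best sequence
            for k in range(n - 1, 0, -1):
                path.append(back[k, path[-1]])
            return LEVELS[path[::-1]]

        rng = np.random.default_rng(3)
        x = rng.choice(LEVELS, 50)
        h, sigma2 = [1.0, 0.4], 0.05
        y = h[0] * x + h[1] * np.r_[LEVELS[0], x[:-1]] + rng.normal(0.0, np.sqrt(sigma2), 50)
        print(np.mean(mlse_viterbi(y, h, sigma2) == x))  # fraction of symbols recovered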

  13. Ringing Artefact Reduction By An Efficient Likelihood Improvement Method

    Science.gov (United States)

    Fuderer, Miha

    1989-10-01

    In MR imaging, the extent of the acquired spatial frequencies of the object is necessarily finite. The resulting image shows artefacts caused by "truncation" of its Fourier components. These are known as Gibbs artefacts or ringing artefacts. These artefacts are particularly visible when the time-saving reduced acquisition method is used, say, when scanning only the lowest 70% of the 256 data lines. Filtering the data results in loss of resolution. A method is described that estimates the high-frequency data from the low-frequency data lines, with the likelihood of the image as criterion. It is a computationally very efficient method, since it requires practically only two extra Fourier transforms in addition to the normal reconstruction. The results of this method on MR images of human subjects are promising. Evaluations on a 70% acquisition image show about a 20% decrease of the error energy after processing. "Error energy" is defined as the total power of the difference to a 256-data-lines reference image. The elimination of ringing artefacts then appears almost complete.
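
    A one-dimensional numerical sketch of the artefact itself (not of the likelihood-improvement method): reconstructing a sharp profile from only the lowest 70% of its Fourier components produces the characteristic Gibbs overshoot near edges.

        import numpy as np

        n = 256
        profile = np.zeros(n)
        profile[96:160] = 1.0                        # ideal object with sharp edges
        spectrum = np.fft.fft(profile)
        kept = int(0.70 * n)
        truncated = spectrum.copy()
        truncated[kept // 2: n - kept // 2] = 0.0    # discard the highest frequencies
        recon = np.fft.ifft(truncated).real
        print("max overshoot:", recon.max() - 1.0)   # ringing next to the edge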

  14. Scale invariant for one-sided multivariate likelihood ratio tests

    Directory of Open Access Journals (Sweden)

    Samruam Chongcharoen

    2010-07-01

    Full Text Available Suppose X1, X2, ..., Xn is a random sample from an Np(μ, V) distribution. Consider H0: μ1 = μ2 = ... = μp = 0 and H1: μi ≥ 0 for i = 1, 2, ..., p; let H1 - H0 denote the hypothesis that H1 holds but H0 does not, and let ~H0 denote the hypothesis that H0 does not hold. Because the likelihood ratio test (LRT) of H0 versus H1 - H0 is complicated, several ad hoc tests have been proposed. Tang, Gnecco and Geller (1989) proposed an approximate LRT, Follmann (1996) suggested rejecting H0 if the usual test of H0 versus ~H0 rejects H0 with significance level 2α and a weighted sum of the sample means is positive, and Chongcharoen, Singh and Wright (2002) modified Follmann's test to include information about the correlation structure in the sum of the sample means. Chongcharoen and Wright (2007, 2006) give versions of the Tang-Gnecco-Geller tests and Follmann-type tests, respectively, with invariance properties. With the LRT's desired scale-invariance property, we investigate its power by using Monte Carlo techniques and compare it with the tests recommended in Chongcharoen and Wright (2007, 2006).

  15. Maximum-likelihood estimation of recent shared ancestry (ERSA).

    Science.gov (United States)

    Huff, Chad D; Witherspoon, David J; Simonson, Tatum S; Xing, Jinchuan; Watkins, W Scott; Zhang, Yuhua; Tuohy, Therese M; Neklason, Deborah W; Burt, Randall W; Guthery, Stephen L; Woodward, Scott R; Jorde, Lynn B

    2011-05-01

    Accurate estimation of recent shared ancestry is important for genetics, evolution, medicine, conservation biology, and forensics. Established methods estimate kinship accurately for first-degree through third-degree relatives. We demonstrate that chromosomal segments shared by two individuals due to identity by descent (IBD) provide much additional information about shared ancestry. We developed a maximum-likelihood method for the estimation of recent shared ancestry (ERSA) from the number and lengths of IBD segments derived from high-density SNP or whole-genome sequence data. We used ERSA to estimate relationships from SNP genotypes in 169 individuals from three large, well-defined human pedigrees. ERSA is accurate to within one degree of relationship for 97% of first-degree through fifth-degree relatives and 80% of sixth-degree and seventh-degree relatives. We demonstrate that ERSA's statistical power approaches the maximum theoretical limit imposed by the fact that distant relatives frequently share no DNA through a common ancestor. ERSA greatly expands the range of relationships that can be estimated from genetic data and is implemented in a freely available software package.
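
    A cartoon of the central modelling idea, using the standard approximation that, for two relatives separated by d meioses, the length in cM of a shared IBD segment beyond a detection threshold is roughly exponential with mean 100/d. The segment lengths below are invented, and the published ERSA likelihood also models segment counts and population background sharing, which this sketch omits.

        import numpy as np

        def segment_length_loglik(d, lengths_cM, threshold=5.0):
            # excess length beyond the detection threshold ~ Exp(rate = d/100)
            rate = d / 100.0
            excess = np.asarray(lengths_cM) - threshold
            return np.sum(np.log(rate) - rate * excess)

        lengths = [12.0, 7.5, 22.3, 9.1]   # hypothetical detected segments, in cM
        for d in range(2, 9):              # candidate numbers of meioses
            print(d, round(segment_length_loglik(d, lengths), 2))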

  16. Quantifying uncertainty, variability and likelihood for ordinary differential equation models

    LENUS (Irish Health Repository)

    Weisse, Andrea Y

    2010-10-28

    Background: In many applications, ordinary differential equation (ODE) models are subject to uncertainty or variability in initial conditions and parameters. Both uncertainty and variability can be quantified in terms of a probability density function on the state and parameter space. Results: The partial differential equation that describes the evolution of this probability density function has a form that is particularly amenable to application of the well-known method of characteristics. The value of the density at some point in time is directly accessible by the solution of the original ODE, extended by a single extra dimension (for the value of the density). This leads to simple methods for studying uncertainty, variability and likelihood, with significant advantages over more traditional Monte Carlo and related approaches, especially when studying regions with low probability. Conclusions: While such approaches based on the method of characteristics are common practice in other disciplines, their advantages for the study of biological systems have so far remained unrecognized. Several examples illustrate the performance and accuracy of the approach and its limitations.
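
    A minimal sketch of the construction described here, for the logistic equation x' = x(1 - x): the ODE is extended by one extra dimension carrying the density value, which evolves along each trajectory as rho' = -rho * df/dx (here df/dx = 1 - 2x).

        from scipy.integrate import solve_ivp

        def rhs(t, y):
            x, rho = y
            return [x * (1.0 - x),            # the original ODE
                    -rho * (1.0 - 2.0 * x)]   # density transported along it

        x0, rho0 = 0.1, 2.5                   # initial state and density value there
        sol = solve_ivp(rhs, (0.0, 5.0), [x0, rho0], rtol=1e-8)
        x_T, rho_T = sol.y[:, -1]
        print(f"x(5) = {x_T:.4f}, density at x(5) = {rho_T:.4f}")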

  17. Affective mapping: An activation likelihood estimation (ALE) meta-analysis.

    Science.gov (United States)

    Kirby, Lauren A J; Robinson, Jennifer L

    2017-11-01

    Functional neuroimaging has the spatial resolution to explain the neural basis of emotions. Activation likelihood estimation (ALE), as opposed to traditional qualitative meta-analysis, quantifies convergence of activation across studies within affective categories. Others have used ALE to investigate a broad range of emotions, but without the convenience of the BrainMap database. We used the BrainMap database and analysis resources to run separate meta-analyses on coordinates reported for anger, anxiety, disgust, fear, happiness, humor, and sadness. Resultant ALE maps were compared to determine areas of convergence between emotions, as well as to identify affect-specific networks. Five out of the seven emotions demonstrated consistent activation within the amygdala, whereas all emotions consistently activated the right inferior frontal gyrus, which has been implicated as an integration hub for affective and cognitive processes. These data provide the framework for models of affect-specific networks, as well as emotional processing hubs, which can be used for future studies of functional or effective connectivity. Copyright © 2015 Elsevier Inc. All rights reserved.
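
    A deliberately simplified one-dimensional sketch of the ALE computation: each reported focus is blurred into a Gaussian "modelled activation" map, and the per-study maps are combined as a probabilistic union, ALE = 1 - Π(1 - MA_i). Real ALE works on 3-D brain coordinates with sample-size-dependent kernels and a permutation null, none of which appears here.

        import numpy as np

        def ale_map(studies_foci_mm, grid, fwhm=12.0):
            sd = fwhm / 2.355                          # FWHM -> standard deviation
            ale = np.zeros_like(grid)
            for foci in studies_foci_mm:
                ma = np.zeros_like(grid)
                for focus in foci:
                    g = np.exp(-0.5 * ((grid - focus) / sd) ** 2)
                    ma = np.maximum(ma, g)             # modelled activation in [0, 1]
                ale = 1.0 - (1.0 - ale) * (1.0 - ma)   # union across studies
            return ale

        grid = np.linspace(-60.0, 60.0, 241)
        print(ale_map([[-20.0, -18.5], [21.0], [-19.0]], grid).max())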

  18. Dark matter CMB constraints and likelihoods for poor particle physicists

    Energy Technology Data Exchange (ETDEWEB)

    Cline, James M.; Scott, Pat, E-mail: jcline@physics.mcgill.ca, E-mail: patscott@physics.mcgill.ca [Department of Physics, McGill University, 3600 rue University, Montréal, QC, H3A 2T8 (Canada)

    2013-03-01

    The cosmic microwave background provides constraints on the annihilation and decay of light dark matter at redshifts between 100 and 1000, the strength of which depends upon the fraction of energy ending up in the form of electrons and photons. The resulting constraints are usually presented for a limited selection of annihilation and decay channels. Here we provide constraints on the annihilation cross section and decay rate, at discrete values of the dark matter mass mχ, for all the annihilation and decay channels whose secondary spectra have been computed using PYTHIA in arXiv:1012.4515 (''PPPC 4 DM ID: a poor particle physicist cookbook for dark matter indirect detection''), namely e, μ, τ, V → e, V → μ, V → τ, u, d, s, c, b, t, γ, g, W, Z and h. By interpolating in mass, these can be used to find the CMB constraints and likelihood functions from WMAP7 and Planck for a wide range of dark matter models, including those with annihilation or decay into a linear combination of different channels.

  19. Dark matter CMB constraints and likelihoods for poor particle physicists

    International Nuclear Information System (INIS)

    Cline, James M.; Scott, Pat

    2013-01-01

    The cosmic microwave background provides constraints on the annihilation and decay of light dark matter at redshifts between 100 and 1000, the strength of which depends upon the fraction of energy ending up in the form of electrons and photons. The resulting constraints are usually presented for a limited selection of annihilation and decay channels. Here we provide constraints on the annihilation cross section and decay rate, at discrete values of the dark matter mass mχ, for all the annihilation and decay channels whose secondary spectra have been computed using PYTHIA in arXiv:1012.4515 (''PPPC 4 DM ID: a poor particle physicist cookbook for dark matter indirect detection''), namely e, μ, τ, V → e, V → μ, V → τ, u, d, s, c, b, t, γ, g, W, Z and h. By interpolating in mass, these can be used to find the CMB constraints and likelihood functions from WMAP7 and Planck for a wide range of dark matter models, including those with annihilation or decay into a linear combination of different channels

  20. Physical activity may decrease the likelihood of children developing constipation.

    Science.gov (United States)

    Seidenfaden, Sandra; Ormarsson, Orri Thor; Lund, Sigrun H; Bjornsson, Einar S

    2018-01-01

    Childhood constipation is common. We evaluated children diagnosed with constipation, who were referred to an Icelandic paediatric emergency department, and determined the effect of lifestyle factors on its aetiology. The parents of children who were diagnosed with constipation and participated in a phase IIB clinical trial on laxative suppositories answered an online questionnaire about their children's lifestyle and constipation in March-April 2013. The parents of nonconstipated children that visited the paediatric department of Landspitali University Hospital or an Icelandic outpatient clinic answered the same questionnaire. We analysed responses regarding 190 children aged one year to 18 years: 60 with constipation and 130 without. We found that 40% of the constipated children had recurrent symptoms, 27% had to seek medical attention more than once and 33% received medication per rectum. The 47 of 130 control group subjects aged 10-18 were much more likely to exercise more than three times a week (72%) and for more than an hour (62%) than the 26 of 60 constipated children of the same age (42% and 35%, respectively). Constipation risk factors varied with age and many children diagnosed with constipation had recurrent symptoms. Physical activity may affect the likelihood of developing constipation in older children. ©2017 Foundation Acta Paediatrica. Published by John Wiley & Sons Ltd.

  1. Maximum likelihood pedigree reconstruction using integer linear programming.

    Science.gov (United States)

    Cussens, James; Bartlett, Mark; Jones, Elinor M; Sheehan, Nuala A

    2013-01-01

    Large population biobanks of unrelated individuals have been highly successful in detecting common genetic variants affecting diseases of public health concern. However, they lack the statistical power to detect more modest gene-gene and gene-environment interaction effects or the effects of rare variants for which related individuals are ideally required. In reality, most large population studies will undoubtedly contain sets of undeclared relatives, or pedigrees. Although a crude measure of relatedness might sometimes suffice, having a good estimate of the true pedigree would be much more informative if this could be obtained efficiently. Relatives are more likely to share longer haplotypes around disease susceptibility loci and are hence biologically more informative for rare variants than unrelated cases and controls. Distant relatives are arguably more useful for detecting variants with small effects because they are less likely to share masking environmental effects. Moreover, the identification of relatives enables appropriate adjustments of statistical analyses that typically assume unrelatedness. We propose to exploit an integer linear programming optimisation approach to pedigree learning, which is adapted to find valid pedigrees by imposing appropriate constraints. Our method is not restricted to small pedigrees and is guaranteed to return a maximum likelihood pedigree. With additional constraints, we can also search for multiple high-probability pedigrees and thus account for the inherent uncertainty in any particular pedigree reconstruction. The true pedigree is found very quickly by comparison with other methods when all individuals are observed. Extensions to more complex problems seem feasible. © 2012 Wiley Periodicals, Inc.

  2. Constructing diagnostic likelihood: clinical decisions using subjective versus statistical probability.

    Science.gov (United States)

    Kinnear, John; Jackson, Ruth

    2017-07-01

    Although physicians are highly trained in the application of evidence-based medicine, and are assumed to make rational decisions, there is evidence that their decision making is prone to biases. One of the biases that has been shown to affect accuracy of judgements is that of representativeness and base-rate neglect, where the saliency of a person's features leads to overestimation of their likelihood of belonging to a group. This results in the substitution of 'subjective' probability for statistical probability. This study examines clinicians' propensity to make estimations of subjective probability when presented with clinical information that is considered typical of a medical condition. The strength of the representativeness bias is tested by presenting choices in textual and graphic form. Understanding of statistical probability is also tested by omitting all clinical information. For the questions that included clinical information, 46.7% and 45.5% of clinicians made judgements of statistical probability, respectively. Where the question omitted clinical information, 79.9% of clinicians made a judgement consistent with statistical probability. There was a statistically significant difference in responses to the questions with and without representativeness information (χ² (1, n=254)=54.45, p<0.001), indicating that salient clinical information led clinicians to substitute subjective for statistical probability. One of the causes for this representativeness bias may be the way clinical medicine is taught, where stereotypic presentations are emphasised in diagnostic decision making. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
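
    The base-rate effect at issue is easy to quantify with Bayes' theorem. With illustrative numbers (a condition of 1% prevalence whose "typical" presentation occurs in 80% of affected but also 10% of unaffected patients), the statistically correct judgement remains counter-intuitive:

        # illustrative numbers only
        prevalence = 0.01
        p_typical_given_disease = 0.80
        p_typical_given_healthy = 0.10

        p_typical = (p_typical_given_disease * prevalence
                     + p_typical_given_healthy * (1.0 - prevalence))
        posterior = p_typical_given_disease * prevalence / p_typical
        print(f"P(disease | typical presentation) = {posterior:.3f}")  # ~0.075, not 0.8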

  3. Race of source effects in the elaboration likelihood model.

    Science.gov (United States)

    White, P H; Harkins, S G

    1994-11-01

    In a series of experiments, we investigated the effect of race of source on persuasive communications in the Elaboration Likelihood Model (R.E. Petty & J.T. Cacioppo, 1981, 1986). In Experiment 1, we found no evidence that White participants responded to a Black source as a simple negative cue. Experiment 2 suggested the possibility that exposure to a Black source led to low-involvement message processing. In Experiments 3 and 4, a distraction paradigm was used to test this possibility, and it was found that participants under low involvement were highly motivated to process a message presented by a Black source. In Experiment 5, we found that attitudes toward the source's ethnic group, rather than violations of expectancies, accounted for this processing effect. Taken together, the results of these experiments are consistent with S.L. Gaertner and J.F. Dovidio's (1986) theory of aversive racism, which suggests that Whites, because of a combination of egalitarian values and underlying negative racial attitudes, are very concerned about not appearing unfavorable toward Blacks, leading them to be highly motivated to process messages presented by a source from this group.

  4. Health service resilience in Yobe state, Nigeria in the context of the Boko Haram insurgency: a systems dynamics analysis using group model building.

    Science.gov (United States)

    Ager, Alastair K; Lembani, Martina; Mohammed, Abdulaziz; Mohammed Ashir, Garba; Abdulwahab, Ahmad; de Pinho, Helen; Delobelle, Peter; Zarowsky, Christina

    2015-01-01

    Yobe State has faced severe disruption of its health service as a result of the Boko Haram insurgency. A systems dynamics analysis was conducted to identify key pathways of threat to provision and emerging pathways of response and adaptation. Structured interviews were conducted with 39 stakeholders from three local government areas selected to represent the diversity of conflict experience across the state: Damaturu, Fune and Nguru, and with four officers of the PRRINN-MNCH program providing technical assistance for primary care development in the state. A group model building session was convened with 11 senior stakeholders, which used participatory scripts to review thematic analysis of interviews and develop a preliminary systems model linking identified variables. Population migration and transport restrictions have substantially impacted access to health provision. The human resource for health capability of the state has been severely diminished through the outward migration of (especially non-indigenous) health workers and the suspension of programmes providing external technical assistance. The political will of the Yobe State government to strengthen health provision - through lifting a moratorium on recruitment and providing incentives for retention and support of staff - has supported a recovery of health systems functioning. Policies of free-drug provision and decentralized drug supply appear to have been protective of the operation of the health system. Community resources and cohesion have been significant assets in combatting the impacts of the insurgency on service utilization and quality. Staff commitment and motivation - particularly amongst staff indigenous to the state - has protected health care quality and enabled flexibility of human resource deployment. A systems analysis using participatory group model building provided a mechanism to identify key pathways of threat and adaptation with regard to health service functioning. Generalizable

  5. Use of Context in Video Processing

    Science.gov (United States)

    Wu, Chen; Aghajan, Hamid

    Interpreting an event or a scene based on visual data often requires additional contextual information. Contextual information may be obtained from different sources. In this chapter, we discuss two broad categories of contextual sources: environmental context and user-centric context. Environmental context refers to information derived from domain knowledge or from concurrently sensed effects in the area of operation. User-centric context refers to information obtained and accumulated from the user. Both types of context can include static or dynamic contextual elements. Examples from a smart home environment are presented to illustrate how different types of contextual data can be applied to aid the decision-making process.

  6. FLEAD: online frequency likelihood estimation anomaly detection for mobile sensing

    NARCIS (Netherlands)

    Le Viet Duc, L Duc; Scholten, Johan; Havinga, Paul J.M.

    With the rise of smartphone platforms, adaptive sensing becomes a predominant key to overcoming intricate constraints such as smartphones' capabilities and dynamic data. One way to do this is estimating the event probability based on anomaly detection to invoke heavy processes, such as switching on

  7. Estimating likelihood of future crashes for crash-prone drivers

    Directory of Open Access Journals (Sweden)

    Subasish Das

    2015-06-01

    Full Text Available At-fault crash-prone drivers are usually considered the high-risk group for possible future incidents or crashes. In Louisiana, 34% of crashes are repeatedly committed by at-fault crash-prone drivers who represent only 5% of the total licensed drivers in the state. This research conducted an exploratory data analysis based on driver faultiness and proneness. The objective of this study is to develop a crash prediction model to estimate the likelihood of future crashes for at-fault drivers. The logistic regression method is used, employing eight years of traffic crash data (2004–2011) in Louisiana. Crash predictors such as the driver's crash involvement, crash and road characteristics, human factors, collision type, and environmental factors are considered in the model. The at-fault and not-at-fault status of the crashes is used as the response variable. The developed model identified a few important variables and correctly classifies at-fault crashes up to 62.40% with a specificity of 77.25%. This model can identify as many as 62.40% of the crash incidences of at-fault drivers in the upcoming year. Traffic agencies can use the model for monitoring the performance of at-fault crash-prone drivers and making roadway improvements meant to reduce crash proneness. From the findings, it is recommended that crash-prone drivers be targeted for special safety programs regularly through education and regulations.
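
    A sketch of the study's modelling approach on synthetic stand-in data; the features, coefficients, and prevalence below are hypothetical, not Louisiana's, and serve only to show the mechanics of fitting and scoring such a classifier.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import confusion_matrix

        rng = np.random.default_rng(4)
        n = 5000
        X = np.column_stack([
            rng.integers(0, 5, n),    # prior crash involvements
            rng.integers(0, 2, n),    # night-time crash indicator
            rng.integers(16, 85, n),  # driver age
        ])
        logit = -1.0 + 0.5 * X[:, 0] + 0.4 * X[:, 1] - 0.02 * X[:, 2]
        y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))  # at-fault indicator

        model = LogisticRegression().fit(X, y)
        tn, fp, fn, tp = confusion_matrix(y, model.predict(X)).ravel()
        print("sensitivity:", tp / (tp + fn), "specificity:", tn / (tn + fp))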

  8. Smoking increases the likelihood of Helicobacter pylori treatment failure.

    Science.gov (United States)

    Itskoviz, David; Boltin, Doron; Leibovitzh, Haim; Tsadok Perets, Tsachi; Comaneshter, Doron; Cohen, Arnon; Niv, Yaron; Levi, Zohar

    2017-07-01

    Data regarding the impact of smoking on the success of Helicobacter pylori (H. pylori) eradication are conflicting, partially due to the fact that sociodemographic status is associated with both smoking and H. pylori treatment success. We aimed to assess the effect of smoking on H. pylori eradication rates after controlling for sociodemographic confounders. Included were subjects aged 15 years or older, with a first-time positive 13C-urea breath test (13C-UBT) between 2007 and 2014, who underwent a second 13C-UBT after receiving clarithromycin-based triple therapy. Data regarding age, gender, socioeconomic status (SES), smoking (current smokers or "never smoked"), and drug use were extracted from the Clalit health maintenance organization database. Out of 120,914 subjects with a positive first-time 13C-UBT, 50,836 (42.0%) underwent a second 13C-UBT test. After excluding former smokers, 48,130 remained who were eligible for analysis. The mean age was 44.3±18.2 years, 69.2% were females, 87.8% were Jewish and 12.2% Arabs, and 25.5% were current smokers. The overall eradication failure rate was 33.3%: 34.8% in current smokers and 32.8% in subjects who never smoked. In a multivariate analysis, eradication failure was positively associated with current smoking (odds ratio [OR] 1.15, 95% CI 1.10-1.20, p<0.001). In conclusion, smoking was found to significantly increase the likelihood of unsuccessful first-line treatment for H. pylori infection. Copyright © 2017 Editrice Gastroenterologica Italiana S.r.l. Published by Elsevier Ltd. All rights reserved.

  9. Obstetric History and Likelihood of Preterm Birth of Twins.

    Science.gov (United States)

    Easter, Sarah Rae; Little, Sarah E; Robinson, Julian N; Mendez-Figueroa, Hector; Chauhan, Suneet P

    2018-01-05

    The objective of this study was to investigate the relationship between preterm birth in a prior pregnancy and preterm birth in a twin pregnancy. We performed a secondary analysis of a randomized controlled trial evaluating 17-α-hydroxyprogesterone caproate in twins. Women were classified as nulliparous, multiparous with a prior term birth, or multiparous with a prior preterm birth. We used logistic regression to examine the odds of spontaneous preterm birth of twins before 35 weeks according to past obstetric history. Of the 653 women analyzed, 294 were nulliparas, 310 had a prior term birth, and 49 had a prior preterm birth. Prior preterm birth increased the likelihood of spontaneous delivery before 35 weeks (adjusted odds ratio [aOR]: 2.44, 95% confidence interval [CI]: 1.28-4.66), whereas prior term delivery decreased these odds (aOR: 0.55, 95% CI: 0.38-0.78) in the current twin pregnancy compared with the nulliparous reference group. This translated into lower odds of composite neonatal morbidity (aOR: 0.38, 95% CI: 0.27-0.53) for women with a prior term delivery. For women carrying twins, a history of preterm birth increases the odds of spontaneous preterm birth, whereas a prior term birth decreases the odds of spontaneous preterm birth and neonatal morbidity for the current twin pregnancy. These results offer risk stratification and reassurance for clinicians. Thieme Medical Publishers 333 Seventh Avenue, New York, NY 10001, USA.

  10. A comparison of likelihood ratio tests and Rao's score test for three separable covariance matrix structures.

    Science.gov (United States)

    Filipiak, Katarzyna; Klein, Daniel; Roy, Anuradha

    2017-01-01

    The problem of testing the separability of a covariance matrix against an unstructured variance-covariance matrix is studied in the context of multivariate repeated measures data using Rao's score test (RST). The RST statistic is developed with the first component of the separable structure as a first-order autoregressive (AR(1)) correlation matrix or an unstructured (UN) covariance matrix under the assumption of multivariate normality. It is shown that the distribution of the RST statistic under the null hypothesis of any separability does not depend on the true values of the mean or the unstructured components of the separable structure. A significant advantage of the RST is that it can be performed for small samples, even smaller than the dimension of the data, where the likelihood ratio test (LRT) cannot be used, and it outperforms the standard LRT in a number of contexts. Monte Carlo simulations are then used to study the comparative behavior of the null distribution of the RST statistic, as well as that of the LRT statistic, in terms of sample size considerations, and for the estimation of the empirical percentiles. Our findings are compared with existing results where the first component of the separable structure is a compound symmetry (CS) correlation matrix. It is also shown by simulations that the empirical null distribution of the RST statistic converges faster than the empirical null distribution of the LRT statistic to the limiting χ² distribution. The tests are implemented on a real dataset from medical studies. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Autistic disorders and schizophrenia: related or remote? An anatomical likelihood estimation.

    Directory of Open Access Journals (Sweden)

    Charlton Cheung

    Full Text Available Shared genetic and environmental risk factors have been identified for autistic spectrum disorders (ASD) and schizophrenia. Social interaction, communication, emotion processing, sensorimotor gating and executive function are disrupted in both, stimulating debate about whether these are related conditions. Brain imaging studies constitute an informative and expanding resource to determine whether the brain structural phenotype of these disorders is distinct or overlapping. We aimed to synthesize existing datasets characterizing ASD and schizophrenia within a common framework, to quantify their structural similarities. In a novel modification of Anatomical Likelihood Estimation (ALE), 313 foci were extracted from 25 voxel-based studies comprising 660 participants (308 ASD, 352 first-episode schizophrenia) and 801 controls. The results revealed that, compared to controls, lower grey matter volumes within limbic-striato-thalamic circuitry were common to ASD and schizophrenia. Unique features of each disorder included lower grey matter volume in amygdala, caudate, frontal and medial gyrus for schizophrenia and putamen for autism. Thus, in terms of brain volumetrics, ASD and schizophrenia have a clear degree of overlap that may reflect shared etiological mechanisms. However, the distinctive neuroanatomy also mapped in each condition raises the question of how this arises in the context of common etiological pressures.

  12. A strategy to reduce cross-cultural miscommunication and increase the likelihood of improving health outcomes.

    Science.gov (United States)

    Kagawa-Singer, Marjorie; Kassim-Lakha, Shaheen

    2003-06-01

    Encounters between physicians and patients from different cultural backgrounds are becoming commonplace. Physicians strive to improve health outcomes and increase quality of life for every patient, yet these discordant encounters appear to be a significant factor, beyond socioeconomic barriers, in creating the unequal and avoidable excess burden of disease borne by members of ethnic minority populations in the United States. Most clinicians lack the information to understand how culture influences the clinical encounter and the skills to effectively bridge potential differences. New strategies are required to expand medical training to adequately address culturally discordant encounters among the physicians, their patients, and the families, for all three may have different concepts regarding the nature of the disease, expectations about treatment, and modes of appropriate communication beyond language. The authors provide an anthropological perspective of the fundamental relationship between culture and health, and outline systemic changes needed within the social and legal structures of the health care system to reduce the risk of cross-cultural miscommunication and increase the likelihood of improving health outcomes for all populations within the multicultural U.S. society. The authors define the strengths inherent within every culture, provide a guideline for the clinician to evaluate disease and illness within its cultural context, and outline the clinical skills required to negotiate among potential differences to reach mutually desired goals for care. Last, they indicate the structural changes required in the health care setting to enable and support such practice.

  13. High-order Composite Likelihood Inference for Max-Stable Distributions and Processes

    KAUST Repository

    Castruccio, Stefano; Huser, Raphaël; Genton, Marc G.

    2015-01-01

    In multivariate or spatial extremes, inference for max-stable processes observed at a large collection of locations is a very challenging problem in computational statistics, and current approaches typically rely on less expensive composite likelihoods constructed from small subsets of data. In this work, we explore the limits of modern state-of-the-art computational facilities to perform full likelihood inference and to efficiently evaluate high-order composite likelihoods. With extensive simulations, we assess the loss of information of composite likelihood estimators with respect to a full likelihood approach for some widely-used multivariate or spatial extreme models, we discuss how to choose composite likelihood truncation to improve the efficiency, and we also provide recommendations for practitioners. This article has supplementary material online.
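
    A compact sketch of the pairwise (second-order) composite likelihood idea, using a Gaussian process with exponential correlation as a tractable stand-in for the max-stable models above; in the spatial-extremes setting the bivariate Gaussian densities would be replaced by bivariate max-stable ones.

        import numpy as np
        from scipy.optimize import minimize_scalar
        from scipy.stats import multivariate_normal

        def pairwise_nll(range_par, data, sites):
            # sum bivariate log-densities over all site pairs instead of
            # evaluating the intractable full joint density
            nll = 0.0
            for i in range(len(sites)):
                for j in range(i + 1, len(sites)):
                    r = np.exp(-abs(sites[i] - sites[j]) / range_par)
                    cov = [[1.0, r], [r, 1.0]]
                    nll -= multivariate_normal.logpdf(data[:, [i, j]], cov=cov).sum()
            return nll

        rng = np.random.default_rng(5)
        sites = np.linspace(0.0, 10.0, 8)
        true_cov = np.exp(-np.abs(sites[:, None] - sites[None, :]) / 2.0)
        data = rng.multivariate_normal(np.zeros(8), true_cov, size=200)
        fit = minimize_scalar(pairwise_nll, bounds=(0.1, 10.0),
                              args=(data, sites), method="bounded")
        print("estimated range parameter:", fit.x)  # should sit near the true 2.0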

  14. High-order Composite Likelihood Inference for Max-Stable Distributions and Processes

    KAUST Repository

    Castruccio, Stefano

    2015-09-29

    In multivariate or spatial extremes, inference for max-stable processes observed at a large collection of locations is a very challenging problem in computational statistics, and current approaches typically rely on less expensive composite likelihoods constructed from small subsets of data. In this work, we explore the limits of modern state-of-the-art computational facilities to perform full likelihood inference and to efficiently evaluate high-order composite likelihoods. With extensive simulations, we assess the loss of information of composite likelihood estimators with respect to a full likelihood approach for some widely-used multivariate or spatial extreme models, we discuss how to choose composite likelihood truncation to improve the efficiency, and we also provide recommendations for practitioners. This article has supplementary material online.

  15. Supplementary Material for: High-Order Composite Likelihood Inference for Max-Stable Distributions and Processes

    KAUST Repository

    Castruccio, Stefano; Huser, Raphaël; Genton, Marc G.

    2016-01-01

    In multivariate or spatial extremes, inference for max-stable processes observed at a large collection of points is a very challenging problem and current approaches typically rely on less expensive composite likelihoods constructed from small subsets of data. In this work, we explore the limits of modern state-of-the-art computational facilities to perform full likelihood inference and to efficiently evaluate high-order composite likelihoods. With extensive simulations, we assess the loss of information of composite likelihood estimators with respect to a full likelihood approach for some widely used multivariate or spatial extreme models, we discuss how to choose composite likelihood truncation to improve the efficiency, and we also provide recommendations for practitioners. This article has supplementary material online.

  16. Modeling trust context in networks

    CERN Document Server

    Adali, Sibel

    2013-01-01

    We make complex decisions every day, requiring trust in many different entities for different reasons. These decisions are not made by combining many isolated trust evaluations. Many interlocking factors play a role, each dynamically impacting the others. In this brief, 'trust context' is defined as the system-level description of how the trust evaluation process unfolds. Networks today are part of almost all human activity, supporting and shaping it. Applications increasingly incorporate new interdependencies and new trust contexts. Social networks connect people and organizations throughout

  17. Semantic Context Reasoning Using Ontology Based Models

    NARCIS (Netherlands)

    Mantovaneli Pessoa, Rodrigo; Calvi, Camilo Zardo; Pereira Filho, J.G.; Pereira Filho, José Gonçalves; Guareis de farias, Cléver; Neisse, R.; Pras, A.; Pras, Aiko; van Sinderen, M.J.; van Sinderen, Marten J.

    New mobile computing technologies and the increasing use of portable devices have pushed the development of the so-called context-aware applications. This new class of applications aims at improving human-computer interactions by supporting dynamic adaptations according to context changes. This

  18. Maximum Likelihood Estimation and Inference With Examples in R, SAS and ADMB

    CERN Document Server

    Millar, Russell B

    2011-01-01

    This book takes a fresh look at the popular and well-established method of maximum likelihood for statistical estimation and inference. It begins with an intuitive introduction to the concepts and background of likelihood, and moves through to the latest developments in maximum likelihood methodology, including general latent variable models and new material for the practical implementation of integrated likelihood using the free ADMB software. Fundamental issues of statistical inference are also examined, with a presentation of some of the philosophical debates underlying the choice of statis

  19. Who drinks where: youth selection of drinking contexts.

    Science.gov (United States)

    Lipperman-Kreda, Sharon; Mair, Christina F; Bersamin, Melina; Gruenewald, Paul J; Grube, Joel W

    2015-04-01

    Different drinkers may experience specific risks depending on where they consume alcohol. This longitudinal study examined drinking patterns, and demographic and psychosocial characteristics, associated with youth drinking in different contexts. We used survey data from 665 past-year alcohol-using youths (ages 13 to 16 at Wave 1) in 50 midsized California cities. Measures of drinking behaviors and drinking in 7 contexts were obtained at 3 annual time points. Other characteristics included gender, age, race, parental education, weekly disposable income, general deviance, and past-year cigarette smoking. Results of multilevel regression analyses show that more frequent past-year alcohol use was associated with an increased likelihood of drinking at parties and at someone else's home. A greater continued volume of alcohol (i.e., heavier drinking) was associated with an increased likelihood of drinking at parking lots or street corners. Deviance was positively associated with drinking in most contexts, and past-year cigarette smoking was positively associated with drinking at beaches or parks and at someone else's home. Age and deviance were positively associated with drinking in a greater number of contexts. The likelihood of youth drinking at parties and at someone else's home increased over time, whereas the likelihood of drinking at parking lots/street corners decreased. Also, deviant youths progress to drinking in their own home, at beaches or parks, and at restaurants/bars/nightclubs more rapidly. The contexts in which youths consume alcohol change over time. These changes vary by individual characteristics. The redistribution of drinking contexts over the early life course may contribute to specific risks associated with different drinking contexts. Copyright © 2015 by the Research Society on Alcoholism.

  20. Hybrid context aware recommender systems

    Science.gov (United States)

    Jain, Rajshree; Tyagi, Jaya; Singh, Sandeep Kumar; Alam, Taj

    2017-10-01

    Recommender systems and context awareness are currently vital fields of research. Most hybrid recommendation systems implement content-based and collaborative filtering techniques, whereas this work combines context and collaborative filtering. The paper presents a hybrid context-aware recommender system for books and movies that gives recommendations based on the user context as well as user or item similarity. It also addresses the issue of dimensionality reduction using weighted pre-filtering based on a dynamically entered user context and preference of context. This unique step helps to reduce the size of the dataset for collaborative filtering. Bias-subtracted collaborative filtering is used so as to consider the relative rating of a particular user and not the absolute values. Cosine similarity is used as a metric to determine the similarity between users or items. The unknown ratings are calculated and evaluated using MSE (Mean Squared Error) on test and train datasets. The overall process of recommendation has helped to personalize recommendations and give more accurate results with reduced complexity in collaborative filtering.
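
    A minimal sketch of the bias-subtracted, cosine-similarity collaborative filtering step described above; the tiny rating matrix is invented (rows are users, columns items, 0 means unrated), and in the full system the context pre-filter would shrink this matrix first.

        import numpy as np

        R = np.array([[5, 3, 0, 1],
                      [4, 0, 0, 1],
                      [1, 1, 0, 5],
                      [1, 0, 4, 4]], dtype=float)
        rated = R > 0
        user_bias = R.sum(1) / rated.sum(1)                   # each user's mean rating
        resid = np.where(rated, R - user_bias[:, None], 0.0)  # bias-subtracted ratings

        def cosine(a, b):
            denom = np.linalg.norm(a) * np.linalg.norm(b)
            return a @ b / denom if denom else 0.0

        def predict(u, i):
            # item-item similarities restricted to items user u has rated
            sims = np.array([cosine(resid[:, i], resid[:, j]) if rated[u, j] else 0.0
                             for j in range(R.shape[1])])
            if np.abs(sims).sum() == 0.0:
                return user_bias[u]
            return user_bias[u] + sims @ resid[u] / np.abs(sims).sum()

        print(round(predict(1, 1), 2))  # predicted rating for user 1, item 1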

  1. Relative likelihood for life as a function of cosmic time

    Energy Technology Data Exchange (ETDEWEB)

    Loeb, Abraham [Astronomy department, Harvard University, 60 Garden Street, Cambridge, MA 02138 (United States); Batista, Rafael A.; Sloan, David, E-mail: aloeb@cfa.harvard.edu, E-mail: rafael.alvesbatista@physics.ox.ac.uk, E-mail: david.sloan@physics.ox.ac.uk [Department of Physics - Astrophysics, University of Oxford, DWB, Keble Road, OX1 3RH, Oxford (United Kingdom)

    2016-08-01

    Is life most likely to emerge at the present cosmic time near a star like the Sun? We address this question by calculating the relative formation probability per unit time of habitable Earth-like planets within a fixed comoving volume of the Universe, dP(t)/dt, starting from the first stars and continuing to the distant cosmic future. We conservatively restrict our attention to the context of ''life as we know it'' and the standard cosmological model, ΛCDM. We find that unless habitability around low mass stars is suppressed, life is most likely to exist near ∼0.1 M⊙ stars ten trillion years from now. Spectroscopic searches for biosignatures in the atmospheres of transiting Earth-mass planets around low mass stars will determine whether present-day life is indeed premature or typical from a cosmic perspective.

  2. Relative likelihood for life as a function of cosmic time

    International Nuclear Information System (INIS)

    Loeb, Abraham; Batista, Rafael A.; Sloan, David

    2016-01-01

    Is life most likely to emerge at the present cosmic time near a star like the Sun? We address this question by calculating the relative formation probability per unit time of habitable Earth-like planets within a fixed comoving volume of the Universe, dP(t)/dt, starting from the first stars and continuing to the distant cosmic future. We conservatively restrict our attention to the context of ''life as we know it'' and the standard cosmological model, ΛCDM. We find that unless habitability around low mass stars is suppressed, life is most likely to exist near ∼0.1 M⊙ stars ten trillion years from now. Spectroscopic searches for biosignatures in the atmospheres of transiting Earth-mass planets around low mass stars will determine whether present-day life is indeed premature or typical from a cosmic perspective.

  3. Cultural Context and Translation

    Institute of Scientific and Technical Information of China (English)

    张敏

    2009-01-01

    Cultural context plays an important role in translation. Because translation is a cross-cultural activity, the cultural context that influences translating consists of the cultural contexts of both the source language and the target language. This article first analyzes the concepts of context and cultural context, then, following the procedure of translating, classifies cultural context into two stages and discusses how each respectively influences translating.

  4. Enhancing their likelihood for a positive future: the perspective of inner-city youth.

    Science.gov (United States)

    Ginsburg, Kenneth R; Alexander, Penny M; Hunt, Jean; Sullivan, Maisha; Zhao, Huaqing; Cnaan, Avital

    2002-06-01

    Inner-city youth must overcome many environmental challenges as they strive for success. Their outcome is influenced by the interplay of protective forces and risk factors. To learn directly from youth what solutions they believe would most influence their likelihood of achieving a positive future. In-school 8th-, 9th-, and 12th-graders in north Philadelphia generated, prioritized, and explained their own solutions through a 4-stage hierarchical process facilitated by AmeriCorps workers. In Stage 1, 60 randomly selected students participated in 8 focus groups to develop the study question. In Stage 2, youth in Nominal Group Technique sessions generated and prioritized solutions. In Stage 3, a survey for each grade that included their top prioritized ideas was distributed, and youth rated each idea on a Likert scale (5 = definitely would make me more likely to have a positive future; 1 = would definitely not). One thousand twenty-two ninth-graders (69% of in-school youth at 5 high schools) returned usable surveys. Ninety-three percent of responders were 14 to 16 years old, 44% were male, 54% were black, and 32% were Latino. Four hundred seventeen 8th-graders and 322 12th-graders returned usable surveys. In Stage 4, youth in 10 focus groups added meaning and context to the ideas. The highest rated items in all grades were solutions that promoted education or increased job opportunities. Ninth-graders ranked helping youth get into college first by the Marginal Homogeneity Test. The creation of more jobs was ranked second. Third rank was shared by more job training, keeping youth from dropping out of school, and better books for schools. The next tier of items focused mostly on opportunities for youth to spend their free time productively and to have interactions with adults. Many items calling for the reduction of risk behaviors or disruptive surroundings were rated lower. The Kruskal-Wallis test found little variation in rating of the ideas by gender, race, or

  5. Aircraft control surface failure detection and isolation using the OSGLR test. [orthogonal series generalized likelihood ratio

    Science.gov (United States)

    Bonnice, W. F.; Motyka, P.; Wagner, E.; Hall, S. R.

    1986-01-01

    The performance of the orthogonal series generalized likelihood ratio (OSGLR) test in detecting and isolating commercial aircraft control surface and actuator failures is evaluated. A modification to incorporate age-weighting which significantly reduces the sensitivity of the algorithm to modeling errors is presented. The steady-state implementation of the algorithm based on a single linear model valid for a cruise flight condition is tested using a nonlinear aircraft simulation. A number of off-nominal no-failure flight conditions including maneuvers, nonzero flap deflections, different turbulence levels and steady winds were tested. Based on the no-failure decision functions produced by off-nominal flight conditions, the failure detection and isolation performance at the nominal flight condition was determined. The extension of the algorithm to a wider flight envelope by scheduling on dynamic pressure and flap deflection is examined. Based on this testing, the OSGLR algorithm should be capable of detecting control surface failures that would affect the safe operation of a commercial aircraft. Isolation may be difficult if there are several surfaces which produce similar effects on the aircraft. Extending the algorithm over the entire operating envelope of a commercial aircraft appears feasible.

  6. Optimizing Likelihood Models for Particle Trajectory Segmentation in Multi-State Systems.

    Science.gov (United States)

    Young, Dylan Christopher; Scrimgeour, Jan

    2018-06-19

    Particle tracking offers significant insight into the molecular mechanics that govern the behavior of living cells. The analysis of molecular trajectories that transition between different motive states, such as diffusive, driven and tethered modes, is of considerable importance, with even single trajectories containing significant amounts of information about a molecule's environment and its interactions with cellular structures. Hidden Markov models (HMM) have been widely adopted to perform the segmentation of such complex tracks. In this paper, we show that extensive analysis of hidden Markov model outputs using data derived from multi-state Brownian dynamics simulations can be used both for the optimization of the likelihood models used to describe the states of the system and for characterization of the technique's failure mechanisms. This analysis was made possible by the implementation of a parallelized adaptive direct search algorithm on an Nvidia graphics processing unit. This approach provides critical information for the visualization of HMM failure and successful design of particle tracking experiments where trajectories contain multiple mobile states. © 2018 IOP Publishing Ltd.
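
    For readers unfamiliar with HMM-based track segmentation, the following minimal Python sketch decodes a synthetic 1-D trajectory that switches from slow to fast diffusion, using a two-state Viterbi pass. The emission sigmas and the sticky transition matrix are hand-set assumptions; the record's GPU-parallelized likelihood optimization is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D track: slow diffusion (state 0) then fast diffusion (state 1).
steps = np.concatenate([rng.normal(0, 0.5, 200), rng.normal(0, 2.0, 200)])

# Likelihood model: zero-mean Gaussian step sizes, one sigma per state.
sigmas = np.array([0.5, 2.0])          # assumed here; optimised in the paper
log_emit = (-0.5 * (steps[:, None] / sigmas) ** 2
            - np.log(sigmas) - 0.5 * np.log(2 * np.pi))

# Sticky transition matrix: states persist with probability 0.99.
logA = np.log(np.array([[0.99, 0.01], [0.01, 0.99]]))
logpi = np.log(np.array([0.5, 0.5]))

# Viterbi decoding of the most probable state sequence.
T, K = log_emit.shape
delta = np.zeros((T, K))
psi = np.zeros((T, K), dtype=int)
delta[0] = logpi + log_emit[0]
for t in range(1, T):
    scores = delta[t - 1][:, None] + logA   # scores[i, j]: from state i to j
    psi[t] = scores.argmax(0)
    delta[t] = scores.max(0) + log_emit[t]
path = np.zeros(T, dtype=int)
path[-1] = delta[-1].argmax()
for t in range(T - 2, -1, -1):
    path[t] = psi[t + 1, path[t + 1]]

print("fraction of steps labelled correctly:",
      np.mean(path == np.repeat([0, 1], 200)))
```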

  7. Analysis of Pairwise Interactions in a Maximum Likelihood Sense to Identify Leaders in a Group

    Directory of Open Access Journals (Sweden)

    Violet Mwaffo

    2017-07-01

    Full Text Available Collective motion in animal groups manifests itself in the form of highly coordinated maneuvers determined by local interactions among individuals. A particularly critical question in understanding the mechanisms behind such interactions is to detect and classify leader–follower relationships within the group. In the technical literature of coupled dynamical systems, several methods have been proposed to reconstruct interaction networks, including linear correlation analysis, transfer entropy, and event synchronization. While these analyses have been helpful in reconstructing network models from neuroscience to public health, rules on the most appropriate method to use for a specific dataset are lacking. Here, we demonstrate the possibility of detecting leaders in a group from raw positional data in a model-free approach that combines multiple methods in a maximum likelihood sense. We test our framework on synthetic data of groups of self-propelled Vicsek particles, where a single agent acts as a leader and both the size of the interaction region and the level of inherent noise are systematically varied. To assess the feasibility of detecting leaders in real-world applications, we study a synthetic dataset of fish shoaling, generated by using a recent data-driven model for social behavior, and an experimental dataset of pharmacologically treated zebrafish. Not only does our approach offer a robust strategy to detect leaders in synthetic data but it also allows for exploring the role of psychoactive compounds on leader–follower relationships.
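
    The record's framework combines several interaction measures in a maximum likelihood sense; reproducing that is beyond a short sketch, but the underlying intuition of inferring leadership from raw signals can be shown with a simpler proxy, time-delayed correlation: the lag that maximizes the correlation between two agents' headings indicates who leads. All data below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(8)

# Synthetic headings: the follower copies the leader's heading
# after a delay of 5 frames, plus observation noise.
leader = np.cumsum(rng.normal(0, 0.1, 600))
follower = np.roll(leader, 5) + rng.normal(0, 0.05, 600)
follower[:5] = 0.0  # discard the wrapped-around start

def delayed_corr(a, b, max_lag=20):
    """Lag maximising corr(a(t), b(t+lag)); a positive peak => a leads b."""
    lags = range(1, max_lag + 1)
    return max(lags, key=lambda k: np.corrcoef(a[:-k], b[k:])[0, 1])

print("estimated lead of 'leader' over 'follower':",
      delayed_corr(leader, follower), "frames")
```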

  8. A likelihood-based biostatistical model for analyzing consumer movement in simultaneous choice experiments.

    Science.gov (United States)

    Zeilinger, Adam R; Olson, Dawn M; Andow, David A

    2014-08-01

    Consumer feeding preference among resource choices has critical implications for basic ecological and evolutionary processes, and can be highly relevant to applied problems such as ecological risk assessment and invasion biology. Within consumer choice experiments, also known as feeding preference or cafeteria experiments, measures of relative consumption and measures of consumer movement can provide distinct and complementary insights into the strength, causes, and consequences of preference. Despite the distinct value of inferring preference from measures of consumer movement, rigorous and biologically relevant analytical methods are lacking. We describe a simple, likelihood-based, biostatistical model for analyzing the transient dynamics of consumer movement in a paired-choice experiment. With experimental data consisting of repeated discrete measures of consumer location, the model can be used to estimate constant consumer attraction and leaving rates for two food choices, and differences in choice-specific attraction and leaving rates can be tested using model selection. The model enables calculation of transient and equilibrial probabilities of consumer-resource association, which could be incorporated into larger scale movement models. We explore the effect of experimental design on parameter estimation through stochastic simulation and describe methods to check that data meet model assumptions. Using a dataset of modest sample size, we illustrate the use of the model to draw inferences on consumer preference as well as underlying behavioral mechanisms. Finally, we include a user's guide and computer code scripts in R to facilitate use of the model by other researchers.
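
    A hedged reconstruction of the kind of model described above: a three-state continuous-time Markov chain (unassociated, on choice A, on choice B) with constant attraction rates a1, a2 and leaving rates l1, l2, fitted by maximum likelihood from repeated discrete observations of consumer location. Parameter names and data are hypothetical, and this is not the authors' R code.

```python
import numpy as np
from scipy.linalg import expm
from scipy.optimize import minimize

# States: 0 = not on either food, 1 = on choice A, 2 = on choice B.
def Q(a1, a2, l1, l2):
    """Generator matrix with attraction rates a_i and leaving rates l_i."""
    return np.array([[-(a1 + a2), a1,  a2 ],
                     [ l1,       -l1,  0. ],
                     [ l2,        0., -l2 ]])

def nll(logp, obs, dt):
    a1, a2, l1, l2 = np.exp(logp)        # log-parametrised to stay positive
    P = expm(Q(a1, a2, l1, l2) * dt)     # transition probabilities over dt
    return -np.sum(np.log(P[obs[:-1], obs[1:]] + 1e-300))

# Hypothetical data: consumer location recorded every dt = 1 time unit.
rng = np.random.default_rng(1)
true = (0.3, 0.1, 0.2, 0.2)
P_true = expm(Q(*true) * 1.0)
obs = [0]
for _ in range(500):
    p = P_true[obs[-1]]
    obs.append(rng.choice(3, p=p / p.sum()))
obs = np.array(obs)

fit = minimize(nll, np.log([0.1, 0.1, 0.1, 0.1]), args=(obs, 1.0),
               method="Nelder-Mead")
print("MLE rates (a1, a2, l1, l2):", np.exp(fit.x).round(3))
```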

  9. Discrete dynamics versus analytic dynamics

    DEFF Research Database (Denmark)

    Toxværd, Søren

    2014-01-01

    For discrete classical Molecular Dynamics obtained by the “Verlet” algorithm (VA) with the time increment h there exists a shadow Hamiltonian H̃ with energy Ẽ(h), for which the discrete particle positions lie on the analytic trajectories for H̃. Here, we prove that there exists, independent of such an analytic analogy, an exact hidden energy invariance E* for VA dynamics. The fact that the discrete VA dynamics has the same invariances as Newtonian dynamics raises the question of which of the formulations is correct or, alternatively, which is the most appropriate formulation of classical dynamics. In this context the relation between the discrete VA dynamics and the (general) discrete dynamics investigated by Lee [Phys. Lett. B122, 217 (1983)] is presented and discussed.
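
    The Verlet scheme and the bounded energy oscillation behind the shadow-Hamiltonian picture are simple to demonstrate. Below is a minimal velocity-Verlet sketch for a 1-D harmonic oscillator with unit mass and spring constant (assumptions of the toy example): the analytic energy evaluated along the discrete orbit fluctuates with h but shows no secular drift.

```python
import numpy as np

def verlet(x0, v0, h, n):
    """Velocity-Verlet integration of a 1-D harmonic oscillator (m = k = 1)."""
    x, v = x0, v0
    xs, vs = [x], [v]
    for _ in range(n):
        a = -x                          # force/mass for the harmonic potential
        x = x + v * h + 0.5 * a * h * h
        v = v + 0.5 * (a + (-x)) * h    # average of old and new acceleration
        xs.append(x)
        vs.append(v)
    return np.array(xs), np.array(vs)

xs, vs = verlet(1.0, 0.0, h=0.05, n=10000)
E = 0.5 * vs**2 + 0.5 * xs**2           # analytic energy along the discrete orbit
print("energy spread:", E.max() - E.min())  # bounded, no secular drift
```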

  10. Exploring Social Dynamics in School Science Context

    Directory of Open Access Journals (Sweden)

    Mehmet C. Ayar

    2014-09-01

    Full Text Available The purpose of this study was to explore the socio-cultural practices and interactions of learning science in a science classroom within the concept of communities of practice. Our qualitative data were collected through observing, taking field notes, and conducting interviews in a public science classroom during an entire school year. The study occurred in a seventh-grade classroom with a veteran physical science teacher, with more than 10 years of teaching experience, and 22 students. For this article, we presented two classroom vignettes that reflect a sample of the participation, practice, and community that was observed in the science classroom on a daily basis. The first vignette illustrated a typical formula of Initiation–Response–Feedback (I-R-F) that transfers knowledge to students through a teacher-led discussion with the entire class. The second vignette described a laboratory activity designed to allow students to apply or discover knowledge through practical experience, while taking responsibility for their learning through small-group work. The normative practices and routine behaviors of the science classroom are highlighted through the description of material resources, and different modes of participation accompanied by assigned roles and responsibilities. What we observed was that laboratory activities reproduced the epistemic authority of the I-R-F rather than creating collective cognitive responsibility where students have the independence to explore and create authentic science experiences.

  11. Knowledge governance in a dynamic global context

    DEFF Research Database (Denmark)

    Foss, Nicolai Juul

    This paper tells the story of the emergence of distinct research around the theory of the firm at Copenhagen Business School within the last two decades, focussing on elements of continuity in the thinking of key CBS persons in the period. It discusses the current research agenda of the Center fo...

  12. Deconvolving the wedge: maximum-likelihood power spectra via spherical-wave visibility modelling

    Science.gov (United States)

    Ghosh, A.; Mertens, F. G.; Koopmans, L. V. E.

    2018-03-01

    Direct detection of the Epoch of Reionization (EoR) via the red-shifted 21-cm line will have unprecedented implications for the study of structure formation in the infant Universe. To fulfil this promise, current and future 21-cm experiments need to detect this weak EoR signal in the presence of foregrounds that are several orders of magnitude larger. This requires extreme noise control and improved wide-field high dynamic-range imaging techniques. We propose a new imaging method based on a maximum likelihood framework which solves for the interferometric equation directly on the sphere, or equivalently in the uvw-domain. The method uses the one-to-one relation between spherical waves and spherical harmonics (SpH). It consistently handles signals from the entire sky, and does not require a w-term correction. The SpH coefficients represent the sky-brightness distribution and the visibilities in the uvw-domain, and provide a direct estimate of the spatial power spectrum. Using these spectrally smooth SpH coefficients, bright foregrounds can be removed from the signal, including their side-lobe noise, which is one of the limiting factors in high dynamic-range wide-field imaging. Chromatic effects causing the so-called 'wedge' are effectively eliminated (i.e. deconvolved) in the cylindrical (k⊥, k∥) power spectrum, compared to a power spectrum computed directly from the images of the foreground visibilities where the wedge is clearly present. We illustrate our method using simulated Low-Frequency Array observations, finding an excellent reconstruction of the input EoR signal with minimal bias.

  13. Hierarchical Context Modeling for Video Event Recognition.

    Science.gov (United States)

    Wang, Xiaoyang; Ji, Qiang

    2016-10-11

    Current video event recognition research remains largely target-centered. For real-world surveillance videos, target-centered event recognition faces great challenges due to large intra-class target variation, limited image resolution, and poor detection and tracking results. To mitigate these challenges, we introduce a context-augmented video event recognition approach. Specifically, we explicitly capture different types of contexts from three levels including image level, semantic level, and prior level. At the image level, we introduce two types of contextual features including the appearance context features and interaction context features to capture the appearance of context objects and their interactions with the target objects. At the semantic level, we propose a deep model based on the deep Boltzmann machine to learn event object representations and their interactions. At the prior level, we utilize two types of prior-level contexts including scene priming and dynamic cueing. Finally, we introduce a hierarchical context model that systematically integrates the contextual information at different levels. Through the hierarchical context model, contexts at different levels jointly contribute to event recognition. We evaluate the hierarchical context model for event recognition on benchmark surveillance video datasets. Results show that incorporating contexts at each level can improve event recognition performance, and jointly integrating three levels of contexts through our hierarchical model achieves the best performance.

  14. Bias Correction for the Maximum Likelihood Estimate of Ability. Research Report. ETS RR-05-15

    Science.gov (United States)

    Zhang, Jinming

    2005-01-01

    Lord's bias function and the weighted likelihood estimation method are effective in reducing the bias of the maximum likelihood estimate of an examinee's ability under the assumption that the true item parameters are known. This paper presents simulation studies to determine the effectiveness of these two methods in reducing the bias when the item…
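
    For context on what is being bias-corrected, here is a minimal sketch of the maximum likelihood ability estimate itself, in the simpler Rasch setting with known item difficulties; Lord's bias function and the weighted likelihood method of the record are not implemented, and all item parameters are hypothetical. Note that all-correct or all-incorrect response patterns admit no finite MLE, one symptom of the estimator's bias problems.

```python
import numpy as np

def rasch_mle(responses, b, iters=20):
    """Newton-Raphson MLE of ability theta; item difficulties b assumed known."""
    theta = 0.0
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-(theta - b)))   # P(correct | theta)
        grad = np.sum(responses - p)             # d logL / d theta
        hess = -np.sum(p * (1.0 - p))            # d^2 logL / d theta^2
        theta -= grad / hess
    return theta

b = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])        # hypothetical difficulties
resp = np.array([1, 1, 1, 0, 0])                 # one examinee's responses
print("MLE ability:", round(rasch_mle(resp, b), 3))
```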

  15. Bias correction in the hierarchical likelihood approach to the analysis of multivariate survival data.

    Science.gov (United States)

    Jeon, Jihyoun; Hsu, Li; Gorfine, Malka

    2012-07-01

    Frailty models are useful for measuring unobserved heterogeneity in risk of failures across clusters, providing cluster-specific risk prediction. In a frailty model, the latent frailties shared by members within a cluster are assumed to act multiplicatively on the hazard function. In order to obtain parameter and frailty variate estimates, we consider the hierarchical likelihood (H-likelihood) approach (Ha, Lee and Song, 2001. Hierarchical-likelihood approach for frailty models. Biometrika 88, 233-243) in which the latent frailties are treated as "parameters" and estimated jointly with other parameters of interest. We find that the H-likelihood estimators perform well when the censoring rate is low, however, they are substantially biased when the censoring rate is moderate to high. In this paper, we propose a simple and easy-to-implement bias correction method for the H-likelihood estimators under a shared frailty model. We also extend the method to a multivariate frailty model, which incorporates complex dependence structure within clusters. We conduct an extensive simulation study and show that the proposed approach performs very well for censoring rates as high as 80%. We also illustrate the method with a breast cancer data set. Since the H-likelihood is the same as the penalized likelihood function, the proposed bias correction method is also applicable to the penalized likelihood estimators.

  16. Analyzing multivariate survival data using composite likelihood and flexible parametric modeling of the hazard functions

    DEFF Research Database (Denmark)

    Nielsen, Jan; Parner, Erik

    2010-01-01

    In this paper, we model multivariate time-to-event data by composite likelihood of pairwise frailty likelihoods and marginal hazards using natural cubic splines. Both right- and interval-censored data are considered. The suggested approach is applied to two types of family studies using the gamma...

  17. Existence and uniqueness of the maximum likelihood estimator for models with a Kronecker product covariance structure

    NARCIS (Netherlands)

    Ros, B.P.; Bijma, F.; de Munck, J.C.; de Gunst, M.C.M.

    2016-01-01

    This paper deals with multivariate Gaussian models for which the covariance matrix is a Kronecker product of two matrices. We consider maximum likelihood estimation of the model parameters, in particular of the covariance matrix. There is no explicit expression for the maximum likelihood estimator

  18. Use of deterministic sampling for exploring likelihoods in linkage analysis for quantitative traits.

    NARCIS (Netherlands)

    Mackinnon, M.J.; Beek, van der S.; Kinghorn, B.P.

    1996-01-01

    Deterministic sampling was used to numerically evaluate the expected log-likelihood surfaces of QTL-marker linkage models in large pedigrees with simple structures. By calculating the expected values of likelihoods, questions of power of experimental designs, bias in parameter estimates, approximate

  19. Likelihood ratio data to report the validation of a forensic fingerprint evaluation method

    NARCIS (Netherlands)

    Ramos, Daniel; Haraksim, Rudolf; Meuwly, Didier

    2017-01-01

    Data to which the authors refer to throughout this article are likelihood ratios (LR) computed from the comparison of 5–12 minutiae fingermarks with fingerprints. These LRs data are used for the validation of a likelihood ratio (LR) method in forensic evidence evaluation. These data present a

  20. Predictors of Self-Reported Likelihood of Working with Older Adults

    Science.gov (United States)

    Eshbaugh, Elaine M.; Gross, Patricia E.; Satrom, Tatum

    2010-01-01

    This study examined the self-reported likelihood of working with older adults in a future career among 237 college undergraduates at a midsized Midwestern university. Although aging anxiety was not significantly related to likelihood of working with older adults, those students who had a greater level of death anxiety were less likely than other…

  1. Organizational Justice and Men's Likelihood to Sexually Harass: The Moderating Role of Sexism and Personality

    Science.gov (United States)

    Krings, Franciska; Facchin, Stephanie

    2009-01-01

    This study demonstrated relations between men's perceptions of organizational justice and increased sexual harassment proclivities. Respondents reported higher likelihood to sexually harass under conditions of low interactional justice, suggesting that sexual harassment likelihood may increase as a response to perceived injustice. Moreover, the…

  2. Sampling variability in forensic likelihood-ratio computation: A simulation study

    NARCIS (Netherlands)

    Ali, Tauseef; Spreeuwers, Lieuwe Jan; Veldhuis, Raymond N.J.; Meuwly, Didier

    2015-01-01

    Recently, in the forensic biometric community, there is a growing interest to compute a metric called “likelihood-ratio” when a pair of biometric specimens is compared using a biometric recognition system. Generally, a biometric recognition system outputs a score and therefore a likelihood-ratio

  3. Statistical modelling of survival data with random effects h-likelihood approach

    CERN Document Server

    Ha, Il Do; Lee, Youngjo

    2017-01-01

    This book provides a groundbreaking introduction to the likelihood inference for correlated survival data via the hierarchical (or h-) likelihood in order to obtain the (marginal) likelihood and to address the computational difficulties in inferences and extensions. The approach presented in the book overcomes shortcomings in the traditional likelihood-based methods for clustered survival data such as intractable integration. The text includes technical materials such as derivations and proofs in each chapter, as well as recently developed software programs in R (“frailtyHL”), while the real-world data examples together with an R package, “frailtyHL” in CRAN, provide readers with useful hands-on tools. Reviewing new developments since the introduction of the h-likelihood to survival analysis (methods for interval estimation of the individual frailty and for variable selection of the fixed effects in the general class of frailty models) and guiding future directions, the book is of interest to research...

  4. The likelihood principle and its proof – a never-ending story…

    DEFF Research Database (Denmark)

    Jørgensen, Thomas Martini

    2015-01-01

    An ongoing controversy in philosophy of statistics is the so-called “likelihood principle” essentially stating that all evidence which is obtained from an experiment about an unknown quantity θ is contained in the likelihood function of θ. Common classical statistical methodology, such as the use...... of significance tests, and confidence intervals, depends on the experimental procedure and unrealized events and thus violates the likelihood principle. The likelihood principle was identified by that name and proved in a famous paper by Allan Birnbaum in 1962. However, ever since both the principle itself...... as well as the proof has been highly debated. This presentation will illustrate the debate of both the principle and its proof, from 1962 and up to today. An often-used experiment to illustrate the controversy between classical interpretation and evidential confirmation based on the likelihood principle...

  5. Sampling of systematic errors to estimate likelihood weights in nuclear data uncertainty propagation

    International Nuclear Information System (INIS)

    Helgesson, P.; Sjöstrand, H.; Koning, A.J.; Rydén, J.; Rochman, D.; Alhassan, E.; Pomp, S.

    2016-01-01

    In methodologies for nuclear data (ND) uncertainty assessment and propagation based on random sampling, likelihood weights can be used to infer experimental information into the distributions for the ND. As the included number of correlated experimental points grows large, the computational time for the matrix inversion involved in obtaining the likelihood can become a practical problem. There are also other problems related to the conventional computation of the likelihood, e.g., the assumption that all experimental uncertainties are Gaussian. In this study, a way to estimate the likelihood which avoids matrix inversion is investigated; instead, the experimental correlations are included by sampling of systematic errors. It is shown that the model underlying the sampling methodology (using univariate normal distributions for random and systematic errors) implies a multivariate Gaussian for the experimental points (i.e., the conventional model). It is also shown that the likelihood estimates obtained through sampling of systematic errors approach the likelihood obtained with matrix inversion as the sample size for the systematic errors grows large. In studied practical cases, it is seen that the estimates for the likelihood weights converge impractically slowly with the sample size, compared to matrix inversion. The computational time is estimated to be greater than for matrix inversion in cases with more experimental points, too. Hence, the sampling of systematic errors has little potential to compete with matrix inversion in cases where the latter is applicable. Nevertheless, the underlying model and the likelihood estimates can be easier to intuitively interpret than the conventional model and the likelihood function involving the inverted covariance matrix. Therefore, this work can both have pedagogical value and be used to help motivating the conventional assumption of a multivariate Gaussian for experimental data. The sampling of systematic errors could also
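
    The equivalence described above is easy to verify numerically: integrating a shared systematic offset out of a product of univariate Gaussians yields exactly the multivariate Gaussian with covariance σ_r²I + σ_s²J, and a Monte Carlo average over sampled offsets converges (slowly) to the matrix-inversion result. A toy sketch with hypothetical data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Model prediction and experimental points with random + one systematic error.
model = np.array([1.0, 1.2, 0.9, 1.1])
y = np.array([1.15, 1.32, 1.05, 1.28])
sig_r, sig_s = 0.1, 0.15
resid = y - model
n = len(y)

# Exact likelihood: multivariate Gaussian, covariance handled by matrix inversion.
cov = sig_r**2 * np.eye(n) + sig_s**2 * np.ones((n, n))
L_exact = (np.exp(-0.5 * resid @ np.linalg.solve(cov, resid))
           / np.sqrt((2 * np.pi) ** n * np.linalg.det(cov)))

def L_sampled(m):
    """Average the product of univariate Gaussians over sampled systematic offsets."""
    offs = rng.normal(0.0, sig_s, size=m)
    dev = resid[None, :] - offs[:, None]
    per_draw = np.prod(np.exp(-0.5 * (dev / sig_r) ** 2)
                       / (sig_r * np.sqrt(2 * np.pi)), axis=1)
    return per_draw.mean()

for m in (10**2, 10**4, 10**6):
    print(m, L_sampled(m) / L_exact)   # ratio -> 1, but convergence is slow
```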

  6. Quasi-Maximum Likelihood Estimation and Bootstrap Inference in Fractional Time Series Models with Heteroskedasticity of Unknown Form

    DEFF Research Database (Denmark)

    Cavaliere, Giuseppe; Nielsen, Morten Ørregaard; Taylor, Robert

    We consider the problem of conducting estimation and inference on the parameters of univariate heteroskedastic fractionally integrated time series models. We first extend existing results in the literature, developed for conditional sum-of squares estimators in the context of parametric fractional...... time series models driven by conditionally homoskedastic shocks, to allow for conditional and unconditional heteroskedasticity both of a quite general and unknown form. Global consistency and asymptotic normality are shown to still obtain; however, the covariance matrix of the limiting distribution...... of the estimator now depends on nuisance parameters derived both from the weak dependence and heteroskedasticity present in the shocks. We then investigate classical methods of inference based on the Wald, likelihood ratio and Lagrange multiplier tests for linear hypotheses on either or both of the long and short...

  7. Likelihood ratio and posterior odds in forensic genetics: Two sides of the same coin.

    Science.gov (United States)

    Caliebe, Amke; Walsh, Susan; Liu, Fan; Kayser, Manfred; Krawczak, Michael

    2017-05-01

    It has become widely accepted in forensics that, owing to a lack of sensible priors, the evidential value of matching DNA profiles in trace donor identification or kinship analysis is most sensibly communicated in the form of a likelihood ratio (LR). This restraint does not abate the fact that the posterior odds (PO) would be the preferred basis for returning a verdict. A completely different situation holds for Forensic DNA Phenotyping (FDP), which is aimed at predicting externally visible characteristics (EVCs) of a trace donor from DNA left behind at the crime scene. FDP is intended to provide leads for the police investigation, helping to find unknown trace donors that are unidentifiable by DNA profiling. The statistical models underlying FDP typically yield posterior odds (PO) for an individual possessing a certain EVC. This apparent discrepancy has led to confusion as to when LR or PO is the appropriate outcome of forensic DNA analysis to be communicated to the investigating authorities. We thus set out to clarify the distinction between LR and PO in the context of forensic DNA profiling and FDP from a statistical point of view. In so doing, we also addressed the influence of population affiliation on LR and PO. In contrast to the well-known population dependency of the LR in DNA profiling, the PO as obtained in FDP may be widely population-independent. The actual degree of independence, however, is a matter of (i) how much of the causality of the respective EVC is captured by the genetic markers used for FDP and (ii) the extent to which non-genetic causal factors of the same EVC, such as environmental factors, are distributed equally throughout populations. The fact that an LR should be communicated in cases of DNA profiling whereas the PO are suitable for FDP does not conflict with theory, but rather reflects the immanent differences between these two forensic applications of DNA information. Copyright © 2017 Elsevier B.V. All rights reserved.
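
    The arithmetic at the heart of the LR/PO distinction is Bayes' theorem in odds form: posterior odds = LR × prior odds. The toy computation below (all values hypothetical) shows why an LR alone does not determine the posterior:

```python
# Bayes' rule in odds form: posterior odds = likelihood ratio x prior odds.
def posterior_odds(lr, prior_prob):
    prior_odds = prior_prob / (1.0 - prior_prob)
    return lr * prior_odds

# Hypothetical: the same LR = 1e6 under two different priors.
for prior in (1e-6, 1e-3):
    po = posterior_odds(1e6, prior)
    print(f"prior={prior:g}  posterior odds={po:g}  "
          f"posterior prob={po / (1 + po):.4f}")
```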

  8. Climatic and ecological future of the Amazon: likelihood and causes of change

    Science.gov (United States)

    Cook, B.; Zeng, N.; Yoon, J.-H.

    2010-05-01

    Some recent climate modeling results suggested a possible dieback of the Amazon rainforest under future climate change, a prediction that raised considerable interest as well as controversy. To determine the likelihood and causes of such changes, we analyzed the output of 15 models from the Intergovernmental Panel on Climate Change Fourth Assessment Report (IPCC/AR4) and the dynamic vegetation model VEGAS driven by the output of these climate models. Our results suggest that the core of the Amazon rainforest should remain largely stable as rainfall is projected to increase in nearly all models. However, the periphery, notably the southern edge of the Amazon and further south in central Brazil, is in danger of drying out, driven by two main processes. Firstly, a decline in precipitation of 22% in the southern Amazon's dry season (May-September) reduces soil moisture, despite an increase in precipitation during the wet season, due to nonlinear responses in hydrology and ecosystem dynamics. Two dynamical mechanisms may explain the lower dry season rainfall: (1) a general subtropical drying under global warming when, in the dry season, the southern Amazon is under the control of the subtropical high pressure; (2) a stronger north-south tropical Atlantic sea surface temperature gradient, and to a lesser degree a warmer eastern equatorial Pacific. Secondly, evaporation demand will increase due to the general warming, further reducing soil moisture. In terms of ecosystem response, higher maintenance cost and reduced productivity under warming may also have an additional adverse impact. The drying corresponds to a lengthening of the dry season by 11 days. As a consequence, the median of the models projects a reduction of 20% in vegetation carbon stock in the southern Amazon, central Brazil, and parts of the Andean Mountains. Further, VEGAS predicts enhancement of fire risk by 10-15%. The increase in fire is primarily due to the reduction in soil moisture, and the decrease in dry season rainfall, which

  9. The fine-tuning cost of the likelihood in SUSY models

    CERN Document Server

    Ghilencea, D M

    2013-01-01

    In SUSY models, the fine tuning of the electroweak (EW) scale with respect to their parameters gamma_i={m_0, m_{1/2}, mu_0, A_0, B_0,...} and the maximal likelihood L to fit the experimental data are usually regarded as two different problems. We show that, if one regards the EW minimum conditions as constraints that fix the EW scale, this commonly held view is not correct and that the likelihood contains all the information about fine-tuning. In this case we show that the corrected likelihood is equal to the ratio L/Delta of the usual likelihood L and the traditional fine tuning measure Delta of the EW scale. A similar result is obtained for the integrated likelihood over the set {gamma_i}, that can be written as a surface integral of the ratio L/Delta, with the surface in gamma_i space determined by the EW minimum constraints. As a result, a large likelihood actually demands a large ratio L/Delta or equivalently, a small chi^2_{new}=chi^2_{old}+2*ln(Delta). This shows the fine-tuning cost to the likelihood ...

  10. Estimation of Model's Marginal likelihood Using Adaptive Sparse Grid Surrogates in Bayesian Model Averaging

    Science.gov (United States)

    Zeng, X.

    2015-12-01

    A large number of model executions are required to obtain alternative conceptual models' predictions and their posterior probabilities in Bayesian model averaging (BMA). The posterior model probability is estimated through a model's marginal likelihood and prior probability. The heavy computational burden hinders the implementation of BMA prediction, especially for elaborate marginal likelihood estimators. To overcome the computational burden of BMA, an adaptive sparse grid (SG) stochastic collocation method is used to build surrogates for alternative conceptual models through a numerical experiment on a synthetic groundwater model. BMA predictions depend on model posterior weights (or marginal likelihoods), and this study also evaluated four marginal likelihood estimators, including the arithmetic mean estimator (AME), harmonic mean estimator (HME), stabilized harmonic mean estimator (SHME), and thermodynamic integration estimator (TIE). The results demonstrate that TIE is accurate in estimating conceptual models' marginal likelihoods, and BMA-TIE has better predictive performance than the other BMA predictions. TIE is also highly stable: marginal likelihoods repeatedly estimated by TIE show significantly less variability than those estimated by the other estimators. In addition, the SG surrogates efficiently facilitate BMA predictions, especially for BMA-TIE. The number of model executions needed for building surrogates is 4.13%, 6.89%, 3.44%, and 0.43% of the model executions required by BMA-AME, BMA-HME, BMA-SHME, and BMA-TIE, respectively.
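
    Two of the estimators named above are simple enough to sketch directly. For a toy conjugate normal model (not the groundwater model of the record) whose exact marginal likelihood is available in closed form, the arithmetic mean estimator (AME, averaging the likelihood over prior draws) and the harmonic mean estimator (HME, a harmonic mean over posterior draws) can be compared against the exact value; SHME and TIE are omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(3)

# Conjugate normal model: y_i ~ N(mu, sig^2), prior mu ~ N(0, tau^2).
sig, tau = 1.0, 2.0
y = rng.normal(1.5, sig, size=20)
n, ybar = len(y), y.mean()

def loglik(mu):
    """Log-likelihood of the data for an array of mu values."""
    return np.sum(-0.5 * ((y - mu[..., None]) / sig) ** 2
                  - np.log(sig * np.sqrt(2 * np.pi)), axis=-1)

# Exact log marginal likelihood: y is jointly Gaussian after integrating mu out.
cov = sig**2 * np.eye(n) + tau**2 * np.ones((n, n))
_, logdet = np.linalg.slogdet(cov)
log_exact = -0.5 * (y @ np.linalg.solve(cov, y) + logdet + n * np.log(2 * np.pi))

m = 100000
# AME: average the likelihood over prior draws (in log space for stability).
mu_prior = rng.normal(0.0, tau, m)
log_ame = np.logaddexp.reduce(loglik(mu_prior)) - np.log(m)

# HME: harmonic mean of the likelihood over posterior draws.
post_var = 1.0 / (n / sig**2 + 1.0 / tau**2)
post_mean = post_var * n * ybar / sig**2
mu_post = rng.normal(post_mean, np.sqrt(post_var), m)
log_hme = np.log(m) - np.logaddexp.reduce(-loglik(mu_post))

print("exact:", log_exact, " AME:", log_ame, " HME:", log_hme)
```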

  11. Zero-inflated Poisson model based likelihood ratio test for drug safety signal detection.

    Science.gov (United States)

    Huang, Lan; Zheng, Dan; Zalkikar, Jyoti; Tiwari, Ram

    2017-02-01

    In recent decades, numerous methods have been developed for data mining of large drug safety databases, such as the Food and Drug Administration's (FDA's) Adverse Event Reporting System, where data matrices are formed with drugs as columns and adverse events as rows. Often, a large number of cells in these data matrices have zero cell counts; some of them are "true zeros", indicating that the drug-adverse event pairs cannot occur, and these are distinguished from the other, modeled zero counts, which simply indicate that the drug-adverse event pairs have not occurred yet or have not been reported yet. In this paper, a zero-inflated Poisson model based likelihood ratio test method is proposed to identify drug-adverse event pairs that have disproportionately high reporting rates, which are also called signals. The maximum likelihood estimates of the model parameters of the zero-inflated Poisson model based likelihood ratio test are obtained using the expectation and maximization algorithm. The zero-inflated Poisson model based likelihood ratio test is also modified to handle stratified analyses for binary and categorical covariates (e.g. gender and age) in the data. The proposed zero-inflated Poisson model based likelihood ratio test method is shown to asymptotically control the type I error and false discovery rate, and its finite sample performance for signal detection is evaluated through a simulation study. The simulation results show that the zero-inflated Poisson model based likelihood ratio test method performs similarly to the Poisson model based likelihood ratio test method when the estimated percentage of true zeros in the database is small. Both the zero-inflated Poisson model based likelihood ratio test and likelihood ratio test methods are applied to six selected drugs, from the 2006 to 2011 Adverse Event Reporting System database, with varying percentages of observed zero-count cells.
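
    The EM fit of the zero-inflated Poisson model that sits inside the proposed test can be sketched compactly. The toy example below (synthetic counts, hypothetical parameter values) estimates the structural-zero fraction pi and the Poisson mean lambda; the surrounding likelihood ratio test and stratification machinery are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic cell counts: a fraction pi of structural ("true") zeros,
# the rest Poisson(lam).
pi_true, lam_true = 0.3, 2.5
structural = rng.random(5000) < pi_true
counts = np.where(structural, 0, rng.poisson(lam_true, 5000))

# EM iterations for the zero-inflated Poisson MLE.
pi_, lam = 0.5, 1.0
for _ in range(200):
    # E-step: posterior probability that an observed zero is structural.
    p0 = pi_ + (1 - pi_) * np.exp(-lam)
    w = np.where(counts == 0, pi_ / p0, 0.0)
    # M-step: update the mixing weight and the Poisson mean.
    pi_ = w.mean()
    lam = np.sum((1 - w) * counts) / np.sum(1 - w)

print(f"EM estimates: pi={pi_:.3f}, lambda={lam:.3f}")
```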

  12. Massive optimal data compression and density estimation for scalable, likelihood-free inference in cosmology

    Science.gov (United States)

    Alsing, Justin; Wandelt, Benjamin; Feeney, Stephen

    2018-03-01

    Many statistical models in cosmology can be simulated forwards but have intractable likelihood functions. Likelihood-free inference methods allow us to perform Bayesian inference from these models using only forward simulations, free from any likelihood assumptions or approximations. Likelihood-free inference generically involves simulating mock data and comparing to the observed data; this comparison in data-space suffers from the curse of dimensionality and requires compression of the data to a small number of summary statistics to be tractable. In this paper we use massive asymptotically-optimal data compression to reduce the dimensionality of the data-space to just one number per parameter, providing a natural and optimal framework for summary statistic choice for likelihood-free inference. Secondly, we present the first cosmological application of Density Estimation Likelihood-Free Inference (DELFI), which learns a parameterized model for the joint distribution of data and parameters, yielding both the parameter posterior and the model evidence. This approach is conceptually simple, requires less tuning than traditional Approximate Bayesian Computation approaches to likelihood-free inference and can give high-fidelity posteriors from orders of magnitude fewer forward simulations. As an additional bonus, it enables parameter inference and Bayesian model comparison simultaneously. We demonstrate Density Estimation Likelihood-Free Inference with massive data compression on an analysis of the joint light-curve analysis supernova data, as a simple validation case study. We show that high-fidelity posterior inference is possible for full-scale cosmological data analyses with as few as ∼10^4 simulations, with substantial scope for further improvement, demonstrating the scalability of likelihood-free inference to large and complex cosmological datasets.

  13. Service Degradation in Context Management Frameworks

    DEFF Research Database (Denmark)

    Shawky, Ahmed; Olsen, Rasmus Løvenstein; Pedersen, Jens Myrup

    2011-01-01

    Context-aware network services are a new and interesting way to enhance network users' experience. A context-aware application/service enhances network performance in relation to dynamic context information, e.g. mobility, location and device information, as it senses and reacts to environment changes. The reliability of the information accessed is a key factor in achieving reliable context-aware applications. This paper reviews service degradation in Context Management Frameworks (CMF) and the effect of high network utilization, with particular focus on the reliability of the accessed information. The paper considers a framework developed in the ICT project OPEN and investigates the impact of applying Differentiated Services (DiffServ) Quality of Service (QoS). It finally provides insight into how the insight gained can be utilized to ensure reliable remotely accessed context information.

  14. Maximal information analysis: I - various Wayne State plots and the most common likelihood principle

    International Nuclear Information System (INIS)

    Bonvicini, G.

    2005-01-01

    Statistical analysis using all moments of the likelihood L(y|α) (y being the data and α being the fit parameters) is presented. The relevant plots for various data fitting situations are presented. The goodness of fit (GOF) parameter (currently the χ²) is redefined as the isoprobability level in a multidimensional space. Many useful properties of statistical analysis are summarized in a new statistical principle which states that the most common likelihood, and not the tallest, is the best possible likelihood, when comparing experiments or hypotheses

  15. Simplified likelihood for the re-interpretation of public CMS results

    CERN Document Server

    The CMS Collaboration

    2017-01-01

    In this note, a procedure for the construction of simplified likelihoods for the re-interpretation of the results of CMS searches for new physics is presented. The procedure relies on the use of a reduced set of information on the background models used in these searches which can readily be provided by the CMS collaboration. A toy example is used to demonstrate the procedure and its accuracy in reproducing the full likelihood for setting limits in models for physics beyond the standard model. Finally, two representative searches from the CMS collaboration are used to demonstrate the validity of the simplified likelihood approach under realistic conditions.

  16. Efficient Levenberg-Marquardt minimization of the maximum likelihood estimator for Poisson deviates

    International Nuclear Information System (INIS)

    Laurence, T.; Chromy, B.

    2010-01-01

    Histograms of counted events are Poisson distributed, but are typically fitted without justification using nonlinear least squares fitting. The more appropriate maximum likelihood estimator (MLE) for Poisson distributed data is seldom used. We extend the use of the Levenberg-Marquardt algorithm commonly used for nonlinear least squares minimization for use with the MLE for Poisson distributed data. In so doing, we remove any excuse for not using this more appropriate MLE. We demonstrate the use of the algorithm and the superior performance of the MLE using simulations and experiments in the context of fluorescence lifetime imaging. Scientists commonly form histograms of counted events from their data, and extract parameters by fitting to a specified model. Assuming that the probability of occurrence for each bin is small, event counts in the histogram bins will be distributed according to the Poisson distribution. We develop here an efficient algorithm for fitting event counting histograms using the maximum likelihood estimator (MLE) for Poisson distributed data, rather than the nonlinear least squares measure. This algorithm is a simple extension of the common Levenberg-Marquardt (L-M) algorithm, and is simple to implement, quick and robust. Fitting using a least squares measure is most common, but it is the maximum likelihood estimator only for Gaussian-distributed data. Nonlinear least squares methods may be applied to event counting histograms in cases where the number of events is very large, so that the Poisson distribution is well approximated by a Gaussian. However, this criterion is not easy to satisfy in practice, as it requires a large number of events. It has been well known for years that least squares procedures lead to biased results when applied to Poisson-distributed data; a recent paper provides extensive characterization of these biases in exponential fitting. The more appropriate measure based on the maximum likelihood estimator (MLE
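
    The objective being minimized is worth making explicit: up to a data-only constant, the Poisson negative log-likelihood of a histogram is the sum over bins i of [m_i(p) - y_i ln m_i(p)], with model prediction m_i and counts y_i. The sketch below fits a synthetic exponential-decay histogram by minimizing this objective; for brevity it uses SciPy's generic Nelder-Mead minimizer rather than the authors' Levenberg-Marquardt extension, and all values are synthetic.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(5)

# Histogram of simulated fluorescence decay arrival times.
t = rng.exponential(2.0, size=2000)
counts, edges = np.histogram(t, bins=50, range=(0.0, 10.0))
centers = 0.5 * (edges[:-1] + edges[1:])

def model(p):
    A, tau = p
    return A * np.exp(-centers / tau)

def poisson_nll(p):
    """Negative Poisson log-likelihood (dropping the data-only term)."""
    m = np.clip(model(p), 1e-12, None)
    return np.sum(m - counts * np.log(m))

fit = minimize(poisson_nll, x0=[100.0, 1.0], method="Nelder-Mead")
print("A, tau =", fit.x.round(3))   # tau should be close to 2.0
```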

  17. Description logics of context

    CSIR Research Space (South Africa)

    Klarman, S

    2013-05-01

    Full Text Available We introduce Description Logics of Context (DLCs) - an extension of Description Logics (DLs) for context-based reasoning. Our approach descends from J. McCarthy's tradition of treating contexts as formal objects over which one can quantify...

  18. Debris Likelihood, based on GhostNet, NASA Aqua MODIS, and GOES Imager, EXPERIMENTAL

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Debris Likelihood Index (Estimated) is calculated from GhostNet, NASA Aqua MODIS Chl a and NOAA GOES Imager SST data. THIS IS AN EXPERIMENTAL PRODUCT: intended...

  19. A biclustering algorithm for binary matrices based on penalized Bernoulli likelihood

    KAUST Repository

    Lee, Seokho; Huang, Jianhua Z.

    2013-01-01

    We propose a new biclustering method for binary data matrices using the maximum penalized Bernoulli likelihood estimation. Our method applies a multi-layer model defined on the logits of the success probabilities, where each layer represents a

  20. Performances of the likelihood-ratio classifier based on different data modelings

    NARCIS (Netherlands)

    Chen, C.; Veldhuis, Raymond N.J.

    2008-01-01

    The classical likelihood ratio classifier easily collapses in many biometric applications, especially with independent training-test subjects. The reason lies in the inaccurate estimation of the underlying user-specific feature density. Firstly, the feature density estimation suffers from

  1. Finite mixture model: A maximum likelihood estimation approach on time series data

    Science.gov (United States)

    Yen, Phoong Seuk; Ismail, Mohd Tahir; Hamzah, Firdaus Mohamad

    2014-09-01

    Recently, statisticians have emphasized fitting finite mixture models by maximum likelihood estimation, as it provides desirable asymptotic properties. In addition, it is consistent as the sample size increases to infinity, so the maximum likelihood estimator is asymptotically unbiased. Moreover, the parameter estimates obtained by maximum likelihood estimation have the smallest variance among standard statistical methods as the sample size increases. Thus, maximum likelihood estimation is adopted in this paper to fit a two-component mixture model in order to explore the relationship between rubber price and exchange rate for Malaysia, Thailand, the Philippines and Indonesia. The results show a negative relationship between rubber price and exchange rate for all selected countries.
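
    The standard route to such a two-component maximum likelihood fit is the EM algorithm. A minimal sketch on synthetic 1-D data follows; the record's rubber-price/exchange-rate setting is not reproduced and all numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic data from a two-component Gaussian mixture.
x = np.concatenate([rng.normal(-2, 1.0, 300), rng.normal(3, 1.5, 700)])

# EM iterations for the maximum likelihood fit of the mixture.
w, mu, sd = np.array([0.5, 0.5]), np.array([-1.0, 1.0]), np.array([1.0, 1.0])
for _ in range(100):
    # E-step: responsibilities of each component for each point.
    dens = (w / (sd * np.sqrt(2 * np.pi)) *
            np.exp(-0.5 * ((x[:, None] - mu) / sd) ** 2))
    r = dens / dens.sum(axis=1, keepdims=True)
    # M-step: reweighted mixing weights, means and standard deviations.
    nk = r.sum(axis=0)
    w = nk / len(x)
    mu = (r * x[:, None]).sum(axis=0) / nk
    sd = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)

print("weights:", w.round(2), "means:", mu.round(2), "sds:", sd.round(2))
```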

  2. Moral Identity Predicts Doping Likelihood via Moral Disengagement and Anticipated Guilt.

    Science.gov (United States)

    Kavussanu, Maria; Ring, Christopher

    2017-08-01

    In this study, we integrated elements of social cognitive theory of moral thought and action and the social cognitive model of moral identity to better understand doping likelihood in athletes. Participants (N = 398) recruited from a variety of team sports completed measures of moral identity, moral disengagement, anticipated guilt, and doping likelihood. Moral identity predicted doping likelihood indirectly via moral disengagement and anticipated guilt. Anticipated guilt about potential doping mediated the relationship between moral disengagement and doping likelihood. Our findings provide novel evidence to suggest that athletes, who feel that being a moral person is central to their self-concept, are less likely to use banned substances due to their lower tendency to morally disengage and the more intense feelings of guilt they expect to experience for using banned substances.

  3. Parameter estimation in astronomy through application of the likelihood ratio. [satellite data analysis techniques

    Science.gov (United States)

    Cash, W.

    1979-01-01

    Many problems in the experimental estimation of parameters for models can be solved through use of the likelihood ratio test. Applications of the likelihood ratio, with particular attention to photon counting experiments, are discussed. The procedures presented solve a greater range of problems than those currently in use, yet are no more difficult to apply. The procedures are proved analytically, and examples from current problems in astronomy are discussed.

  4. Maximum Likelihood Approach for RFID Tag Set Cardinality Estimation with Detection Errors

    DEFF Research Database (Denmark)

    Nguyen, Chuyen T.; Hayashi, Kazunori; Kaneko, Megumi

    2013-01-01

    Estimation schemes for Radio Frequency IDentification (RFID) tag set cardinality are studied in this paper using a Maximum Likelihood (ML) approach. We consider the estimation problem under the model of multiple independent reader sessions with detection errors due to unreliable radio... is evaluated under different system parameters and compared with that of the conventional method via computer simulations assuming flat Rayleigh fading environments and a framed-slotted ALOHA based protocol. Keywords: RFID tag cardinality estimation; maximum likelihood; detection error
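
    For intuition, the ML step can be sketched for an idealized single reader session without detection errors, a simplification of the paper's model: given the observed numbers of empty, singleton and collision slots in a framed-slotted ALOHA frame, grid-search the tag count N that maximizes the multinomial log-likelihood.

```python
import numpy as np

def slot_logprobs(N, L):
    """Log-probabilities that a slot of a frame of size L is empty, a
    singleton, or a collision, given N tags (framed-slotted ALOHA)."""
    p0 = (1 - 1 / L) ** N
    p1 = N / L * (1 - 1 / L) ** (N - 1)
    pc = max(1.0 - p0 - p1, 1e-300)
    return np.log([p0, p1, pc])

def ml_cardinality(n_empty, n_single, n_coll, L, N_max=2000):
    """Grid-search ML estimate of the tag count from observed slot types."""
    obs = np.array([n_empty, n_single, n_coll])
    lls = [obs @ slot_logprobs(N, L) for N in range(1, N_max + 1)]
    return int(np.argmax(lls)) + 1

# Hypothetical read-out of one frame with 128 slots.
print(ml_cardinality(n_empty=40, n_single=55, n_coll=33, L=128))
```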

  5. Modified Moment, Maximum Likelihood and Percentile Estimators for the Parameters of the Power Function Distribution

    Directory of Open Access Journals (Sweden)

    Azam Zaka

    2014-10-01

    Full Text Available This paper is concerned with modifications of the maximum likelihood, moments and percentile estimators of the two-parameter Power function distribution. The sampling behavior of the estimators is assessed by Monte Carlo simulation. For some combinations of parameter values, some of the modified estimators appear better than the traditional maximum likelihood, moments and percentile estimators with respect to bias, mean square error and total deviation.

  6. Practical Statistics for LHC Physicists: Descriptive Statistics, Probability and Likelihood (1/3)

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    These lectures cover those principles and practices of statistics that are most relevant for work at the LHC. The first lecture discusses the basic ideas of descriptive statistics, probability and likelihood. The second lecture covers the key ideas in the frequentist approach, including confidence limits, profile likelihoods, p-values, and hypothesis testing. The third lecture covers inference in the Bayesian approach. Throughout, real-world examples will be used to illustrate the practical application of the ideas. No previous knowledge is assumed.

  7. Tests and Confidence Intervals for an Extended Variance Component Using the Modified Likelihood Ratio Statistic

    DEFF Research Database (Denmark)

    Christensen, Ole Fredslund; Frydenberg, Morten; Jensen, Jens Ledet

    2005-01-01

    The large deviation modified likelihood ratio statistic is studied for testing a variance component equal to a specified value. Formulas are presented in the general balanced case, whereas in the unbalanced case only the one-way random effects model is studied. Simulation studies are presented, showing that the normal approximation to the large deviation modified likelihood ratio statistic gives confidence intervals for variance components with coverage probabilities very close to the nominal confidence coefficient.

  8. The likelihood ratio as a random variable for linked markers in kinship analysis.

    Science.gov (United States)

    Egeland, Thore; Slooten, Klaas

    2016-11-01

    The likelihood ratio is the fundamental quantity that summarizes the evidence in forensic cases. Therefore, it is important to understand the theoretical properties of this statistic. This paper is the last in a series of three, and the first to study linked markers. We show that for all non-inbred pairwise kinship comparisons, the expected likelihood ratio in favor of a type of relatedness depends on the allele frequencies only via the number of alleles, also for linked markers, and also if the true relationship is another one than is tested for by the likelihood ratio. Exact expressions for the expectation and variance are derived for all these cases. Furthermore, we show that the expected likelihood ratio is a non-increasing function of the recombination rate between 0 and 0.5 when the actual relationship is the one investigated by the LR. Besides being of theoretical interest, exact expressions such as those obtained here can be used for software validation, as they allow correctness to be verified up to arbitrary precision. The paper also presents results and advice of practical importance. For example, we argue that the logarithm of the likelihood ratio behaves in a fundamentally different way than the likelihood ratio itself in terms of expectation and variance, in agreement with its interpretation as weight of evidence. Equipped with the results presented and freely available software, one may check calculations and software and also do power calculations.

  9. Penalised Maximum Likelihood Simultaneous Longitudinal PET Image Reconstruction with Difference-Image Priors.

    Science.gov (United States)

    Ellis, Sam; Reader, Andrew J

    2018-04-26

    Many clinical contexts require the acquisition of multiple positron emission tomography (PET) scans of a single subject, for example to observe and quantify changes in functional behaviour in tumours after treatment in oncology. Typically, the datasets from each of these scans are reconstructed individually, without exploiting the similarities between them. We have recently shown that sharing information between longitudinal PET datasets by penalising voxel-wise differences during image reconstruction can improve reconstructed images by reducing background noise and increasing the contrast-to-noise ratio of high activity lesions. Here we present two additional novel longitudinal difference-image priors and evaluate their performance using 2D simulation studies and a 3D real dataset case study. We have previously proposed a simultaneous difference-image-based penalised maximum likelihood (PML) longitudinal image reconstruction method that encourages sparse difference images (DS-PML), and in this work we propose two further novel prior terms. The priors are designed to encourage longitudinal images with corresponding differences which have i) low entropy (DE-PML), and ii) high sparsity in their spatial gradients (DTV-PML). These two new priors and the originally proposed longitudinal prior were applied to 2D simulated treatment response [18F]fluorodeoxyglucose (FDG) brain tumour datasets and compared to standard maximum likelihood expectation-maximisation (MLEM) reconstructions. These 2D simulation studies explored the effects of penalty strengths, tumour behaviour, and inter-scan coupling on reconstructed images. Finally, a real two-scan longitudinal data series acquired from a head and neck cancer patient was reconstructed with the proposed methods and the results compared to standard reconstruction methods. Using any of the three priors with an appropriate penalty strength produced images with noise levels equivalent to those seen when using standard
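
    For reference, the baseline that penalised methods of this kind are compared against is the standard MLEM update x ← (x / Aᵀ1) ⊙ Aᵀ(y / Ax). The toy sketch below, with a random system matrix, shows only this baseline; the difference-image priors (DS/DE/DTV-PML) of the record are not implemented.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy linear emission model: y ~ Poisson(A x), where A is the system matrix.
A = rng.random((30, 10))
x_true = rng.random(10) * 5
y = rng.poisson(A @ x_true)

# Standard MLEM iterations.
x = np.ones(10)
sens = A.sum(axis=0)                      # sensitivity image A^T 1
for _ in range(200):
    ratio = y / np.clip(A @ x, 1e-12, None)
    x = x / sens * (A.T @ ratio)

print("MLEM estimate:", np.round(x, 2))
print("ground truth: ", np.round(x_true, 2))
```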

  10. Content, Context & Connectivity Persuasive Interplay

    DEFF Research Database (Denmark)

    Sørensen, Christian Grund

    2013-01-01

    The aim of this paper is to discuss the relationship between content, context and connectivity, suggesting a model of Dynamic Interplay. This is done in relation to a specific learning environment concerning cultural mediation, in casu the Kaj Munk Case of the EuroPLOT-project (an EU-supported research project under EACEA). In the development of this project several categories of content have been implemented in technology enhanced learning tools. These have been designed to support learning in different contexts, and eventually the role of the connectivity of these learning objects and tools is discussed. Focus is here on The Kaj Munk Study Edition, The Conceptual Pond, Immersive Layers Design, and Generative Learning Objects (GLOs), which are applications affiliated with the Munk case. This paper explores the persuasive potential of the interplay between the different applications for the benefit

  11. Imagining Another Context during Encoding Offsets Context-Dependent Forgetting

    Science.gov (United States)

    Masicampo, E. J.; Sahakyan, Lili

    2014-01-01

    We tested whether imagining another context during encoding would offset context-dependent forgetting. All participants studied a list of words in Context A. Participants who remained in Context A during the test recalled more than participants who were tested in another context (Context B), demonstrating the standard context-dependent forgetting…

  12. Service Innovation in Industrial Contexts

    OpenAIRE

    Kowalkowski, Christian

    2016-01-01

    Both academics and practitioners emphasize the importance for product firms of pursuing service innovation. Despite a strategic focus on service-led growth, however, many firms struggle to succeed with their service innovation initiatives. In order to increase our understanding of the nature of service innovation in product firms, this chapter discusses the specificities in, and dynamics of, service offerings, service processes, and business models in industrial contexts. First, it outlines k...

  13. The fine-tuning cost of the likelihood in SUSY models

    International Nuclear Information System (INIS)

    Ghilencea, D.M.; Ross, G.G.

    2013-01-01

    In SUSY models, the fine-tuning of the electroweak (EW) scale with respect to their parameters $\gamma_i = \{m_0, m_{1/2}, \mu_0, A_0, B_0, \ldots\}$ and the maximal likelihood $L$ to fit the experimental data are usually regarded as two different problems. We show that, if one regards the EW minimum conditions as constraints that fix the EW scale, this commonly held view is not correct and that the likelihood contains all the information about fine-tuning. In this case we show that the corrected likelihood is equal to the ratio $L/\Delta$ of the usual likelihood $L$ and the traditional fine-tuning measure $\Delta$ of the EW scale. A similar result is obtained for the likelihood integrated over the set $\{\gamma_i\}$, which can be written as a surface integral of the ratio $L/\Delta$, with the surface in $\gamma_i$ space determined by the EW minimum constraints. As a result, a large likelihood actually demands a large ratio $L/\Delta$ or, equivalently, a small $\chi^2_{\mathrm{new}} = \chi^2_{\mathrm{old}} + 2\ln\Delta$. This shows the fine-tuning cost to the likelihood ($\chi^2_{\mathrm{new}}$) of the EW scale stability enforced by SUSY, which is ignored in data fits. A good $\chi^2_{\mathrm{new}}/\mathrm{d.o.f.} \approx 1$ thus demands that SUSY models have a fine-tuning amount $\Delta \ll \exp(\mathrm{d.o.f.}/2)$, which provides a model-independent criterion for acceptable fine-tuning. If this criterion is not met, one can thus rule out SUSY models without a further $\chi^2/\mathrm{d.o.f.}$ analysis. Numerical methods to fit the data can easily be adapted to account for this effect.
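
    For quick reference, the central relations quoted in this record can be set out explicitly (a transcription of the abstract's formulas, not new results):

```latex
% Corrected likelihood and its chi-square, as stated in the abstract:
\[
  L_{\mathrm{corrected}} = \frac{L}{\Delta},
  \qquad
  \chi^2_{\mathrm{new}} = \chi^2_{\mathrm{old}} + 2\ln\Delta,
\]
% so requiring chi^2_new / d.o.f. of order one bounds the fine-tuning:
\[
  \chi^2_{\mathrm{new}}/\mathrm{d.o.f.} \approx 1
  \;\Longrightarrow\;
  \Delta \ll \exp\!\big(\mathrm{d.o.f.}/2\big).
\]
```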

  14. Caliper Context Annotation Library

    Energy Technology Data Exchange (ETDEWEB)

    2015-09-30

    To understand the performance of parallel programs, developers need to be able to relate performance measurement data with context information, such as the call path / line numbers or iteration numbers where measurements were taken. Caliper provides a generic way to specify and collect multi-dimensional context information across the software stack, and to provide it to third-party measurement tools or write it into a file or database in the form of context streams.

  15. Expert elicitation on ultrafine particles: likelihood of health effects and causal pathways

    Directory of Open Access Journals (Sweden)

    Brunekreef Bert

    2009-07-01

    Full Text Available Abstract Background Exposure to fine ambient particulate matter (PM) has consistently been associated with increased morbidity and mortality. The relationship between exposure to ultrafine particles (UFP) and health effects is less firmly established. If UFP cause health effects independently from coarser fractions, this could affect health impact assessment of air pollution, which would possibly lead to alternative policy options being considered to reduce the disease burden of PM. Therefore, we organized an expert elicitation workshop to assess the evidence for a causal relationship between exposure to UFP and health endpoints. Methods An expert elicitation on the health effects of ambient ultrafine particle exposure was carried out, focusing on: (1) the likelihood of causal relationships with key health endpoints, and (2) the likelihood of potential causal pathways for cardiac events. Based on a systematic peer-nomination procedure, fourteen European experts (epidemiologists, toxicologists and clinicians) were selected, of whom twelve attended. They were provided with a briefing book containing key literature. After a group discussion, individual expert judgments in the form of ratings of the likelihood of causal relationships and pathways were obtained using a confidence scheme adapted from the one used by the Intergovernmental Panel on Climate Change. Results The likelihood of an independent causal relationship between increased short-term UFP exposure and increased all-cause mortality, hospital admissions for cardiovascular and respiratory diseases, aggravation of asthma symptoms and lung function decrements was rated medium to high by most experts. The likelihood for long-term UFP exposure to be causally related to all-cause mortality, cardiovascular and respiratory morbidity and lung cancer was rated slightly lower, mostly medium. The experts rated the likelihood of each of the six identified possible causal pathways separately. Out of these…

  16. Context Sensitive Health Informatics

    DEFF Research Database (Denmark)

    Kuziemsky, Craig; Nøhr, Christian; Aarts, Jos

    2013-01-01

    Context is a key consideration when designing and evaluating health information technology (HIT), and its importance cannot be overstated. Unintended consequences are common after HIT implementation, and even well-designed technology may not achieve desired outcomes because of contextual issues. While context should be considered in the design and evaluation of health information systems (HISs), there is a shortage of empirical research on contextual aspects of HIT. This conference integrates the sociotechnical and Human-Centered-Design (HCD) approaches and showcases current research on context-sensitive health informatics. The papers and presentations outline theories and models for studying contextual issues and insights on how we can better design HIT to accommodate different healthcare contexts.

  17. Assessing Compatibility of Direct Detection Data: Halo-Independent Global Likelihood Analyses

    CERN Document Server

    Gelmini, Graciela B.

    2016-10-18

    We present two different halo-independent methods utilizing a global maximum likelihood that can assess the compatibility of dark matter direct detection data given a particular dark matter model. The global likelihood we use comprises at least one extended likelihood and an arbitrary number of Poisson or Gaussian likelihoods. In the first method we find the global best fit halo function and construct a two-sided pointwise confidence band, which can then be compared with those derived from the extended likelihood alone to assess the joint compatibility of the data. In the second method we define a "constrained parameter goodness-of-fit" test statistic, whose $p$-value we then use to define a "plausibility region" (e.g. where $p \geq 10\%$). For any halo function not entirely contained within the plausibility region, the level of compatibility of the data is very low (e.g. $p < 10\%$). As an example we apply these methods to CDMS-II-Si and SuperCDMS data, assuming dark matter particles with elastic s...
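
    The global likelihood described here composes additively in log space; below is a minimal, hypothetical sketch of that composition (function and variable names are ours, not the authors' code):

```python
import numpy as np
from scipy.stats import poisson, norm

def global_log_likelihood(mu_events, observed_energies, rate_density,
                          poisson_counts, poisson_means,
                          gauss_obs, gauss_pred, gauss_sigma):
    """Sum an extended-likelihood term with independent Poisson and
    Gaussian terms, mirroring the composite likelihood described above."""
    # Extended likelihood: Poisson in the total expected count mu_events,
    # times the rate density evaluated at each observed event energy.
    ll = -mu_events + np.sum(np.log(rate_density(observed_energies)))
    # Independent counting experiments enter as Poisson terms.
    ll += np.sum(poisson.logpmf(poisson_counts, poisson_means))
    # Measurements with Gaussian errors enter as normal terms.
    ll += np.sum(norm.logpdf(gauss_obs, loc=gauss_pred, scale=gauss_sigma))
    return ll
```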

  18. Physician Bayesian updating from personal beliefs about the base rate and likelihood ratio.

    Science.gov (United States)

    Rottman, Benjamin Margolin

    2017-02-01

    Whether humans can accurately make decisions in line with Bayes' rule has been one of the most important yet contentious topics in cognitive psychology. Though a number of paradigms have been used for studying Bayesian updating, rarely have subjects been allowed to use their own preexisting beliefs about the prior and the likelihood. A study is reported in which physicians judged the posttest probability of a diagnosis for a patient vignette after receiving a test result, and the physicians' posttest judgments were compared to the normative posttest calculated from their own beliefs in the sensitivity and false positive rate of the test (likelihood ratio) and prior probability of the diagnosis. On the one hand, the posttest judgments were strongly related to the physicians' beliefs about both the prior probability as well as the likelihood ratio, and the priors were used considerably more strongly than in previous research. On the other hand, both the prior and the likelihoods were still not used quite as much as they should have been, and there was evidence of other nonnormative aspects to the updating, such as updating independent of the likelihood beliefs. By focusing on how physicians use their own prior beliefs for Bayesian updating, this study provides insight into how well experts perform probabilistic inference in settings in which they rely upon their own prior beliefs rather than experimenter-provided cues. It suggests that there is reason to be optimistic about experts' abilities, but that there is still considerable need for improvement.
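
    The normative posttest probability the study compares against follows from Bayes' rule in odds form; a minimal worked example with hypothetical numbers:

```python
def posttest_probability(prior, sensitivity, false_positive_rate):
    """Bayes' rule in odds form for a positive test result:
    posttest odds = prior odds * LR, with LR = sensitivity / FPR."""
    lr = sensitivity / false_positive_rate
    prior_odds = prior / (1.0 - prior)
    post_odds = prior_odds * lr
    return post_odds / (1.0 + post_odds)

# Hypothetical beliefs: 10% prior, 90% sensitivity, 15% false-positive rate.
print(posttest_probability(0.10, 0.90, 0.15))  # ~0.40
```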

  19. A Walk on the Wild Side: The Impact of Music on Risk-Taking Likelihood

    Science.gov (United States)

    Enström, Rickard; Schmaltz, Rodney

    2017-01-01

    From a marketing perspective, there has been substantial interest in the role of risk-perception on consumer behavior. Specific ‘problem music’ like rap and heavy metal has long been associated with delinquent behavior, including violence, drug use, and promiscuous sex. Although individuals’ risk preferences have been investigated across a range of decision-making situations, there has been little empirical work demonstrating the direct role music may have on the likelihood of engaging in risky activities. In the exploratory study reported here, we assessed the impact of listening to different styles of music while assessing risk-taking likelihood through a psychometric scale. Risk-taking likelihood was measured across ethical, financial, health and safety, recreational and social domains. Through the means of a canonical correlation analysis, the multivariate relationship between different music styles and individual risk-taking likelihood across the different domains is discussed. Our results indicate that listening to different types of music does influence risk-taking likelihood, though not in areas of health and safety. PMID:28539908

  20. A Walk on the Wild Side: The Impact of Music on Risk-Taking Likelihood.

    Science.gov (United States)

    Enström, Rickard; Schmaltz, Rodney

    2017-01-01

    From a marketing perspective, there has been substantial interest in the role of risk-perception on consumer behavior. Specific 'problem music' like rap and heavy metal has long been associated with delinquent behavior, including violence, drug use, and promiscuous sex. Although individuals' risk preferences have been investigated across a range of decision-making situations, there has been little empirical work demonstrating the direct role music may have on the likelihood of engaging in risky activities. In the exploratory study reported here, we assessed the impact of listening to different styles of music while assessing risk-taking likelihood through a psychometric scale. Risk-taking likelihood was measured across ethical, financial, health and safety, recreational and social domains. Through the means of a canonical correlation analysis, the multivariate relationship between different music styles and individual risk-taking likelihood across the different domains is discussed. Our results indicate that listening to different types of music does influence risk-taking likelihood, though not in areas of health and safety.

  1. Approximate likelihood approaches for detecting the influence of primordial gravitational waves in cosmic microwave background polarization

    Science.gov (United States)

    Pan, Zhen; Anderes, Ethan; Knox, Lloyd

    2018-05-01

    One of the major targets for next-generation cosmic microwave background (CMB) experiments is the detection of the primordial B-mode signal. Planning is under way for Stage-IV experiments that are projected to have instrumental noise small enough to make lensing and foregrounds the dominant source of uncertainty for estimating the tensor-to-scalar ratio r from polarization maps. This makes delensing a crucial part of future CMB polarization science. In this paper we present a likelihood method for estimating the tensor-to-scalar ratio r from CMB polarization observations, which combines the benefits of a full-scale likelihood approach with the tractability of the quadratic delensing technique. This method is a pixel-space, all-order likelihood analysis of the quadratically delensed B modes, and it essentially builds upon the quadratic delenser by taking into account all-order lensing and pixel-space anomalies. Its tractability relies on a crucial factorization of the pixel-space covariance matrix of the polarization observations, which allows one to compute the full Gaussian approximate likelihood profile, as a function of r, at the same computational cost as a single likelihood evaluation.

  2. Susceptibility, likelihood to be diagnosed, worry and fear for contracting Lyme disease.

    Science.gov (United States)

    Fogel, Joshua; Chawla, Gurasees S

    Risk perception and psychological concerns are relevant for understanding how people view Lyme disease. This study investigates the four separate outcomes of susceptibility, likelihood to be diagnosed, worry, and fear for contracting Lyme disease. University students (n=713) were surveyed about demographics, perceived health, Lyme disease knowledge, Lyme disease preventive behaviors, Lyme disease history, and Lyme disease miscellaneous variables. We found that women were associated with increased susceptibility and fear. Asian/Asian-American race/ethnicity was associated with increased worry and fear. Perceived good health was associated with increased likelihood to be diagnosed, worry, and fear. Correct knowledge was associated with increased susceptibility and likelihood to be diagnosed. Those who typically spend a lot of time outdoors were associated with increased susceptibility, likelihood to be diagnosed, worry, and fear. In conclusion, healthcare providers and public health campaigns should address susceptibility, likelihood to be diagnosed, worry, and fear about Lyme disease, and should particularly target women and Asians/Asian-Americans to address any possible misconceptions and/or offer effective coping strategies. Copyright © 2016 King Saud Bin Abdulaziz University for Health Sciences. Published by Elsevier Ltd. All rights reserved.

  3. A Walk on the Wild Side: The Impact of Music on Risk-Taking Likelihood

    Directory of Open Access Journals (Sweden)

    Rickard Enström

    2017-05-01

    Full Text Available From a marketing perspective, there has been substantial interest in the role of risk-perception on consumer behavior. Specific ‘problem music’ like rap and heavy metal has long been associated with delinquent behavior, including violence, drug use, and promiscuous sex. Although individuals’ risk preferences have been investigated across a range of decision-making situations, there has been little empirical work demonstrating the direct role music may have on the likelihood of engaging in risky activities. In the exploratory study reported here, we assessed the impact of listening to different styles of music while assessing risk-taking likelihood through a psychometric scale. Risk-taking likelihood was measured across ethical, financial, health and safety, recreational and social domains. Through the means of a canonical correlation analysis, the multivariate relationship between different music styles and individual risk-taking likelihood across the different domains is discussed. Our results indicate that listening to different types of music does influence risk-taking likelihood, though not in areas of health and safety.

  4. Global Contexts for Learning: Exploring the Relationship between Low-Context Online Learning and High-Context Learners

    Science.gov (United States)

    Westbrook, Timothy Paul

    2014-01-01

    Current research on culture and distance education suggests that cultural variables influence student success online. When online courses are writing-based, they may provide easy information dissemination; however, the low-context medium may restrict the learning experience and class dynamic due to the lack of nonverbal communication. Students who…

  5. Generalized linear models with random effects unified analysis via H-likelihood

    CERN Document Server

    Lee, Youngjo; Pawitan, Yudi

    2006-01-01

    Since their introduction in 1972, generalized linear models (GLMs) have proven useful in the generalization of classical normal models. Presenting methods for fitting GLMs with random effects to data, Generalized Linear Models with Random Effects: Unified Analysis via H-likelihood explores a wide range of applications, including combining information over trials (meta-analysis), analysis of frailty models for survival data, genetic epidemiology, and analysis of spatial and temporal models with correlated errors.Written by pioneering authorities in the field, this reference provides an introduction to various theories and examines likelihood inference and GLMs. The authors show how to extend the class of GLMs while retaining as much simplicity as possible. By maximizing and deriving other quantities from h-likelihood, they also demonstrate how to use a single algorithm for all members of the class, resulting in a faster algorithm as compared to existing alternatives. Complementing theory with examples, many of...

  6. Maximum likelihood estimation and EM algorithm of Copas-like selection model for publication bias correction.

    Science.gov (United States)

    Ning, Jing; Chen, Yong; Piao, Jin

    2017-07-01

    Publication bias occurs when the published research results are systematically unrepresentative of the population of studies that have been conducted, and is a potential threat to meaningful meta-analysis. The Copas selection model provides a flexible framework for correcting estimates and offers considerable insight into the publication bias. However, maximizing the observed likelihood under the Copas selection model is challenging because the observed data contain very little information on the latent variable. In this article, we study a Copas-like selection model and propose an expectation-maximization (EM) algorithm for estimation based on the full likelihood. Empirical simulation studies show that the EM algorithm and its associated inferential procedure perform well and avoid the non-convergence problem when maximizing the observed likelihood. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
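
    As a generic illustration of the E/M alternation used here — on a two-component Gaussian mixture, deliberately not the Copas-like selection model itself:

```python
import numpy as np

def em_gaussian_mixture(x, n_iter=100):
    """Generic EM loop: the E-step computes posterior responsibilities for
    the latent component label; the M-step re-maximizes the expected full
    log-likelihood in closed form."""
    rng = np.random.default_rng(0)
    mu = rng.choice(x, 2)                 # component means
    sigma = np.array([x.std(), x.std()])  # component standard deviations
    pi = np.array([0.5, 0.5])             # mixing weights
    for _ in range(n_iter):
        # E-step: r[i, k] proportional to pi_k * N(x_i | mu_k, sigma_k)
        dens = np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / sigma
        r = pi * dens
        r /= r.sum(axis=1, keepdims=True)
        # M-step: closed-form updates for weights, means, and variances
        nk = r.sum(axis=0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    return pi, mu, sigma
```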

  7. Parallelization of maximum likelihood fits with OpenMP and CUDA

    CERN Document Server

    Jarp, S; Leduc, J; Nowak, A; Pantaleo, F

    2011-01-01

    Data analyses based on maximum likelihood fits are commonly used in the high energy physics community for fitting statistical models to data samples. This technique requires the numerical minimization of the negative log-likelihood function. MINUIT is the most common package used for this purpose in the high energy physics community. The main algorithm in this package, MIGRAD, searches the minimum by using the gradient information. The procedure requires several evaluations of the function, depending on the number of free parameters and their initial values. The whole procedure can be very CPU-time consuming in case of complex functions, with several free parameters, many independent variables and large data samples. Therefore, it becomes particularly important to speed-up the evaluation of the negative log-likelihood function. In this paper we present an algorithm and its implementation which benefits from data vectorization and parallelization (based on OpenMP) and which was also ported to Graphics Processi...
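
    The quantity being parallelized is the negative log-likelihood, a data-parallel sum over events; below is a vectorized NumPy analogue of the OpenMP/CUDA reductions the paper discusses (the Gaussian model here is just a placeholder):

```python
import numpy as np

def negative_log_likelihood(params, data):
    """Data-parallel NLL: each event's log-density is independent, so the
    sum can be vectorized, or split across threads/GPU blocks as in the
    OpenMP and CUDA implementations described above."""
    mu, sigma = params
    z = (data - mu) / sigma
    return 0.5 * np.sum(z * z) + data.size * np.log(sigma * np.sqrt(2 * np.pi))

data = np.random.default_rng(1).normal(0.5, 2.0, 1_000_000)
print(negative_log_likelihood((0.5, 2.0), data))
```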

  8. Maximum likelihood positioning for gamma-ray imaging detectors with depth of interaction measurement

    International Nuclear Information System (INIS)

    Lerche, Ch.W.; Ros, A.; Monzo, J.M.; Aliaga, R.J.; Ferrando, N.; Martinez, J.D.; Herrero, V.; Esteve, R.; Gadea, R.; Colom, R.J.; Toledo, J.; Mateo, F.; Sebastia, A.; Sanchez, F.; Benlloch, J.M.

    2009-01-01

    The center of gravity algorithm leads to strong artifacts for gamma-ray imaging detectors that are based on monolithic scintillation crystals and position sensitive photo-detectors. This is a consequence of using the centroids as position estimates. The fact that charge division circuits can also be used to compute the standard deviation of the scintillation light distribution opens a way out of this drawback. We studied the feasibility of maximum likelihood estimation for computing the true gamma-ray photo-conversion position from the centroids and the standard deviation of the light distribution. The method was evaluated on a test detector that consists of the position sensitive photomultiplier tube H8500 and a monolithic LSO crystal (42 mm × 42 mm × 10 mm). Spatial resolution was measured for the centroids and the maximum likelihood estimates. The results suggest that the maximum likelihood positioning is feasible and partially removes the strong artifacts of the center of gravity algorithm.

  9. Maximum likelihood positioning for gamma-ray imaging detectors with depth of interaction measurement

    Energy Technology Data Exchange (ETDEWEB)

    Lerche, Ch.W. [Grupo de Sistemas Digitales, ITACA, Universidad Politecnica de Valencia, 46022 Valencia (Spain)], E-mail: lerche@ific.uv.es; Ros, A. [Grupo de Fisica Medica Nuclear, IFIC, Universidad de Valencia-Consejo Superior de Investigaciones Cientificas, 46980 Paterna (Spain); Monzo, J.M.; Aliaga, R.J.; Ferrando, N.; Martinez, J.D.; Herrero, V.; Esteve, R.; Gadea, R.; Colom, R.J.; Toledo, J.; Mateo, F.; Sebastia, A. [Grupo de Sistemas Digitales, ITACA, Universidad Politecnica de Valencia, 46022 Valencia (Spain); Sanchez, F.; Benlloch, J.M. [Grupo de Fisica Medica Nuclear, IFIC, Universidad de Valencia-Consejo Superior de Investigaciones Cientificas, 46980 Paterna (Spain)

    2009-06-01

    The center of gravity algorithm leads to strong artifacts for gamma-ray imaging detectors that are based on monolithic scintillation crystals and position sensitive photo-detectors. This is a consequence of using the centroids as position estimates. The fact that charge division circuits can also be used to compute the standard deviation of the scintillation light distribution opens a way out of this drawback. We studied the feasibility of maximum likelihood estimation for computing the true gamma-ray photo-conversion position from the centroids and the standard deviation of the light distribution. The method was evaluated on a test detector that consists of the position sensitive photomultiplier tube H8500 and a monolithic LSO crystal (42 mm × 42 mm × 10 mm). Spatial resolution was measured for the centroids and the maximum likelihood estimates. The results suggest that the maximum likelihood positioning is feasible and partially removes the strong artifacts of the center of gravity algorithm.

  10. Maximum Likelihood Blind Channel Estimation for Space-Time Coding Systems

    Directory of Open Access Journals (Sweden)

    Hakan A. Çırpan

    2002-05-01

    Full Text Available Sophisticated signal processing techniques have to be developed for capacity enhancement of future wireless communication systems. In recent years, space-time coding is proposed to provide significant capacity gains over the traditional communication systems in fading wireless channels. Space-time codes are obtained by combining channel coding, modulation, transmit diversity, and optional receive diversity in order to provide diversity at the receiver and coding gain without sacrificing the bandwidth. In this paper, we consider the problem of blind estimation of space-time coded signals along with the channel parameters. Both conditional and unconditional maximum likelihood approaches are developed and iterative solutions are proposed. The conditional maximum likelihood algorithm is based on iterative least squares with projection whereas the unconditional maximum likelihood approach is developed by means of finite state Markov process modelling. The performance analysis issues of the proposed methods are studied. Finally, some simulation results are presented.

  11. Uncertainty about the true source. A note on the likelihood ratio at the activity level.

    Science.gov (United States)

    Taroni, Franco; Biedermann, Alex; Bozza, Silvia; Comte, Jennifer; Garbolino, Paolo

    2012-07-10

    This paper focuses on likelihood ratio based evaluations of fibre evidence in cases in which there is uncertainty about whether or not the reference item available for analysis - that is, an item typically taken from the suspect or seized at his home - is the item actually worn at the time of the offence. A likelihood ratio approach is proposed that, for situations in which certain categorical assumptions can be made about additionally introduced parameters, converges to formulae described in the existing literature. The properties of the proposed likelihood ratio approach are analysed through sensitivity analyses and discussed with respect to possible argumentative implications that arise in practice. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
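
    In generic likelihood-ratio notation (symbols ours, not the paper's exact parametrization), the evaluation weighs the evidence under the prosecution and defence propositions, marginalizing over candidate source items when the true source is uncertain:

```latex
\[
  \mathrm{LR} = \frac{\Pr(E \mid H_p, I)}{\Pr(E \mid H_d, I)},
  \qquad
  \mathrm{LR}_{\text{source-uncertain}}
  = \frac{\sum_j \Pr(E \mid S_j, H_p, I)\,\Pr(S_j \mid I)}
         {\Pr(E \mid H_d, I)},
\]
% where S_j ranges over the candidate reference items and I is the
% background information.
```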

  12. LASER: A Maximum Likelihood Toolkit for Detecting Temporal Shifts in Diversification Rates From Molecular Phylogenies

    Directory of Open Access Journals (Sweden)

    Daniel L. Rabosky

    2006-01-01

    Full Text Available Rates of species origination and extinction can vary over time during evolutionary radiations, and it is possible to reconstruct the history of diversification using molecular phylogenies of extant taxa only. Maximum likelihood methods provide a useful framework for inferring temporal variation in diversification rates. LASER is a package for the R programming environment that implements maximum likelihood methods based on the birth-death process to test whether diversification rates have changed over time. LASER contrasts the likelihood of phylogenetic data under models where diversification rates have changed over time with alternative models where rates have remained constant over time. Major strengths of the package include the ability to detect temporal increases in diversification rates and the inference of diversification parameters under multiple rate-variable models of diversification. The program and associated documentation are freely available from the R package archive at http://cran.r-project.org.
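
    LASER itself is an R package; purely as a language-agnostic illustration of the likelihood contrast it performs, here is a toy constant-rate (Yule) log-likelihood over inter-speciation waiting times, against which a rate-shift variant would be compared (hypothetical helper, not LASER code):

```python
import numpy as np

def yule_log_likelihood(waiting_times, lam):
    """Pure-birth log-likelihood: while k lineages exist, the waiting time
    to the next speciation event is exponential with rate k * lam."""
    k = np.arange(2, len(waiting_times) + 2)  # lineage counts 2, 3, ...
    rates = k * lam
    return np.sum(np.log(rates) - rates * waiting_times)

# A likelihood ratio test would contrast one global rate lam against a
# model where lam is allowed to shift at some breakpoint in time.
```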

  13. Contexts as Shared Commitments

    Directory of Open Access Journals (Sweden)

    Manuel eGarcía-Carpintero

    2015-12-01

    Full Text Available Contemporary semantics assumes two different notions of context: one coming from Kaplan (1989), on which contexts are sets of predetermined parameters, and another originating in Stalnaker (1978), on which contexts are sets of propositions that are common ground. The latter is deservedly more popular, given its flexibility to account for context-dependent aspects of language beyond manifest indexicals, such as epistemic modals, predicates of taste, and so on; in fact, properly dealing with demonstratives (perhaps ultimately all indexicals) requires that further flexibility. Even if we acknowledge Lewis's (1980) point that, in a sense, Kaplanian contexts already include common-ground contexts, it is better to be clear and explicit about what contexts constitutively are. Now, Stalnaker (1978, 2002, 2014) defines context-as-common-ground as a set of propositions, but recent work shows that this is not an accurate conception. The paper explains why, and provides an alternative. The main reason is that several phenomena (presuppositional treatments of pejoratives and predicates of taste, forces other than assertion) require that the common ground include non-doxastic attitudes such as appraisals, emotions, etc. Hence the common ground should not be taken to include merely contents (propositions), but those together with attitudes concerning them: shared commitments, as I will defend.

  14. Talking wearables exploit context

    NARCIS (Netherlands)

    Geldof, S.; Terken, J.M.B.

    2001-01-01

    This paper addresses the issue of how natural language generation technology can contribute to less intrusive wearable devices. Based on the investigation of how humans adapt the form of their utterances to the context of their hearer, we propose a strategy to relate (physical) context to the…

  15. Average Revisited in Context

    Science.gov (United States)

    Watson, Jane; Chick, Helen

    2012-01-01

    This paper analyses the responses of 247 middle school students to items requiring the concept of average in three different contexts: a city's weather reported in maximum daily temperature, the number of children in a family, and the price of houses. The mixed but overall disappointing performance on the six items in the three contexts indicates…

  16. Likelihood for transcriptions in a genetic regulatory system under asymmetric stable Lévy noise.

    Science.gov (United States)

    Wang, Hui; Cheng, Xiujun; Duan, Jinqiao; Kurths, Jürgen; Li, Xiaofan

    2018-01-01

    This work is devoted to investigating the evolution of concentration in a genetic regulation system, when the synthesis reaction rate is under additive and multiplicative asymmetric stable Lévy fluctuations. By focusing on the impact of skewness (i.e., non-symmetry) in the probability distributions of noise, we find that via examining the mean first exit time (MFET) and the first escape probability (FEP), the asymmetric fluctuations, interacting with nonlinearity in the system, lead to peculiar likelihood for transcription. This includes, in the additive noise case, realizing higher likelihood of transcription for larger positive skewness (i.e., asymmetry) index β, causing a stochastic bifurcation at the non-Gaussianity index value α = 1 (i.e., it is a separating point or line for the likelihood for transcription), and achieving a turning point at the threshold value β≈-0.5 (i.e., beyond which the likelihood for transcription suddenly reversed for α values). The stochastic bifurcation and turning point phenomena do not occur in the symmetric noise case (β = 0). While in the multiplicative noise case, non-Gaussianity index value α = 1 is a separating point or line for both the MFET and the FEP. We also investigate the noise enhanced stability phenomenon. Additionally, we are able to specify the regions in the whole parameter space for the asymmetric noise, in which we attain desired likelihood for transcription. We have conducted a series of numerical experiments in "regulating" the likelihood of gene transcription by tuning asymmetric stable Lévy noise indexes. This work offers insights for possible ways of achieving gene regulation in experimental research.

  17. Evaluation in Context

    DEFF Research Database (Denmark)

    Jaap, Kamps; Lalmas, Mounia; Larsen, Birger

    All search happens in a particular context - such as the particular collection of a digital library, its associated search tasks, and its associated users. Information retrieval researchers usually agree on the importance of context, but they rarely address the issue. In particular, evaluation … /assessors - by designing targeted questionnaires. The questionnaire data becomes part of the evaluation test-suite as valuable data on the context of the search requests. We have experimented with this questionnaire approach during the evaluation campaign of the INitiative for the Evaluation of XML Retrieval (INEX) …, e.g. those that suit a particular task scenario, and zoom in on the relative performance for such a group of topics.

  18. Generalized Empirical Likelihood-Based Focused Information Criterion and Model Averaging

    Directory of Open Access Journals (Sweden)

    Naoya Sueishi

    2013-07-01

    Full Text Available This paper develops model selection and averaging methods for moment restriction models. We first propose a focused information criterion based on the generalized empirical likelihood estimator. We address the issue of selecting an optimal model, rather than a correct model, for estimating a specific parameter of interest. Then, this study investigates a generalized empirical likelihood-based model averaging estimator that minimizes the asymptotic mean squared error. A simulation study suggests that our averaging estimator can be a useful alternative to existing post-selection estimators.

  19. Maximum likelihood estimation of the parameters of nonminimum phase and noncausal ARMA models

    DEFF Research Database (Denmark)

    Rasmussen, Klaus Bolding

    1994-01-01

    The well-known prediction-error-based maximum likelihood (PEML) method can only handle minimum phase ARMA models. This paper presents a new method known as the back-filtering-based maximum likelihood (BFML) method, which can handle nonminimum phase and noncausal ARMA models. The BFML method is identical to the PEML method in the case of a minimum phase ARMA model, and it turns out that the BFML method incorporates a noncausal ARMA filter with poles outside the unit circle for estimation of the parameters of a causal, nonminimum phase ARMA model.

  20. Maximum likelihood estimation of the position of a radiating source in a waveguide

    International Nuclear Information System (INIS)

    Hinich, M.J.

    1979-01-01

    An array of sensors is receiving radiation from a source of interest. The source and the array are in a one- or two-dimensional waveguide. The maximum-likelihood estimators of the coordinates of the source are analyzed under the assumption that the noise field is Gaussian. The Cramer-Rao lower bound is of the order of the number of modes which define the source excitation function. The results show that the accuracy of the maximum likelihood estimator of source depth using a vertical array in an infinite horizontal waveguide (such as the ocean) is limited by the number of modes detected by the array, regardless of the array size.

  1. A maximum pseudo-likelihood approach for estimating species trees under the coalescent model

    Directory of Open Access Journals (Sweden)

    Edwards Scott V

    2010-10-01

    Full Text Available Abstract Background Several phylogenetic approaches have been developed to estimate species trees from collections of gene trees. However, maximum likelihood approaches for estimating species trees under the coalescent model are limited. Although the likelihood of a species tree under the multispecies coalescent model has already been derived by Rannala and Yang, it can be shown that the maximum likelihood estimate (MLE) of the species tree (topology, branch lengths, and population sizes) from gene trees under this formula does not exist. In this paper, we develop a pseudo-likelihood function of the species tree to obtain maximum pseudo-likelihood estimates (MPE) of species trees, with branch lengths of the species tree in coalescent units. Results We show that the MPE of the species tree is statistically consistent as the number M of genes goes to infinity. In addition, the probability that the MPE of the species tree matches the true species tree converges to 1 at rate $O(M^{-1})$. The simulation results confirm that the maximum pseudo-likelihood approach is statistically consistent even when the species tree is in the anomaly zone. We applied our method, Maximum Pseudo-likelihood for Estimating Species Trees (MP-EST), to a mammal dataset. The four major clades found in the MP-EST tree are consistent with those in the Bayesian concatenation tree. The bootstrap supports for the species tree estimated by the MP-EST method are more reasonable than the posterior probability supports given by the Bayesian concatenation method in reflecting the level of uncertainty in gene trees and controversies over the relationship of four major groups of placental mammals. Conclusions MP-EST can consistently estimate the topology and branch lengths (in coalescent units) of the species tree. Although the pseudo-likelihood is derived from coalescent theory, and assumes no gene flow or horizontal gene transfer (HGT), the MP-EST method is robust to a small amount of HGT in the…

  2. Regularization parameter selection methods for ill-posed Poisson maximum likelihood estimation

    International Nuclear Information System (INIS)

    Bardsley, Johnathan M; Goldes, John

    2009-01-01

    In image processing applications, image intensity is often measured via the counting of incident photons emitted by the object of interest. In such cases, image data noise is accurately modeled by a Poisson distribution. This motivates the use of Poisson maximum likelihood estimation for image reconstruction. However, when the underlying model equation is ill-posed, regularization is needed. Regularized Poisson likelihood estimation has been studied extensively by the authors, though a problem of high importance remains: the choice of the regularization parameter. We will present three statistically motivated methods for choosing the regularization parameter, and numerical examples will be presented to illustrate their effectiveness
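
    The estimation problem described — a Poisson likelihood plus regularization — can be written as a penalized objective; below is a minimal sketch with a Tikhonov penalty and hypothetical data (the paper's contribution concerns choosing the parameter alpha, which is simply fixed by hand here):

```python
import numpy as np
from scipy.optimize import minimize

def penalized_poisson_nll(x, A, y, alpha):
    """Negative Poisson log-likelihood (up to a constant) for counts y
    with mean A @ x, plus a Tikhonov regularization term weighted by alpha."""
    z = A @ x
    return np.sum(z - y * np.log(z + 1e-12)) + alpha * np.sum(x ** 2)

# Simulate a small ill-posed problem and fit with a hand-picked alpha.
rng = np.random.default_rng(0)
A = rng.uniform(0.0, 1.0, (50, 20))
x_true = rng.uniform(0.5, 2.0, 20)
y = rng.poisson(A @ x_true)
fit = minimize(penalized_poisson_nll, np.ones(20), args=(A, y, 0.1),
               bounds=[(0, None)] * 20)
print(fit.x[:5])
```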

  3. Planck 2015 results: XI. CMB power spectra, likelihoods, and robustness of parameters

    DEFF Research Database (Denmark)

    Aghanim, N.; Arnaud, M.; Ashdown, M.

    2016-01-01

    This paper presents the Planck 2015 likelihoods, statistical descriptions of the 2-point correlation functions of the cosmic microwave background (CMB) temperature and polarization fluctuations that account for relevant uncertainties, both instrumental and astrophysical in nature. They are based on the same hybrid approach used for the previous release, i.e., a pixel-based likelihood at low multipoles (ℓ < 30) …, using more data and Planck polarization information, along with more detailed models of foregrounds and instrumental uncertainties. The increased redundancy brought by more than doubling the amount of data analysed enables further consistency checks and enhanced immunity to systematic effects. It also improves the constraining power of Planck…

  4. Average Likelihood Methods of Classification of Code Division Multiple Access (CDMA)

    Science.gov (United States)

    2016-05-01

    …subject to code matrices that follow the structure given by (113):

    $$\begin{bmatrix} \vec{y}_R \\ \vec{y}_I \end{bmatrix} = \sqrt{\frac{E_s}{2L}} \begin{bmatrix} G_{R1} & -G_{I1} \\ G_{I2} & G_{R2} \end{bmatrix} \begin{bmatrix} Q_R & -Q_I \\ Q_I & Q_R \end{bmatrix} \begin{bmatrix} \vec{b}_R \\ \vec{b}_I \end{bmatrix} + \begin{bmatrix} \vec{n}_R \\ \vec{n}_I \end{bmatrix}$$

    $$\cdots \begin{bmatrix} \cdots & \cdots \\ \cdots & Q_R \end{bmatrix} \begin{bmatrix} \vec{b}_+ \\ \vec{b}_- \end{bmatrix} + \begin{bmatrix} \vec{n}_+ \\ \vec{n}_- \end{bmatrix} \quad (115)$$

    (The interior of the second equation is not recoverable from the extracted text.) The average likelihood for type 4 CDMA (116) is a special case of type 1 CDMA with twice the code length and…

  5. Block Empirical Likelihood for Longitudinal Single-Index Varying-Coefficient Model

    Directory of Open Access Journals (Sweden)

    Yunquan Song

    2013-01-01

    Full Text Available In this paper, we consider a single-index varying-coefficient model with application to longitudinal data. In order to accommodate the within-group correlation, we apply the block empirical likelihood procedure to longitudinal single-index varying-coefficient model, and prove a nonparametric version of Wilks’ theorem which can be used to construct the block empirical likelihood confidence region with asymptotically correct coverage probability for the parametric component. In comparison with normal approximations, the proposed method does not require a consistent estimator for the asymptotic covariance matrix, making it easier to conduct inference for the model's parametric component. Simulations demonstrate how the proposed method works.

  6. Design of Simplified Maximum-Likelihood Receivers for Multiuser CPM Systems

    Directory of Open Access Journals (Sweden)

    Li Bing

    2014-01-01

    Full Text Available A class of simplified maximum-likelihood receivers designed for continuous phase modulation based multiuser systems is proposed. The presented receiver is built upon a front end employing mismatched filters and a maximum-likelihood detector defined in a low-dimensional signal space. The performance of the proposed receivers is analyzed and compared to some existing receivers. Some schemes are designed to implement the proposed receivers and to reveal the roles of different system parameters. Analysis and numerical results show that the proposed receivers can approach the optimum multiuser receivers with significantly (even exponentially in some cases) reduced complexity and marginal performance degradation.

  7. Design of simplified maximum-likelihood receivers for multiuser CPM systems.

    Science.gov (United States)

    Bing, Li; Bai, Baoming

    2014-01-01

    A class of simplified maximum-likelihood receivers designed for continuous phase modulation based multiuser systems is proposed. The presented receiver is built upon a front end employing mismatched filters and a maximum-likelihood detector defined in a low-dimensional signal space. The performance of the proposed receivers is analyzed and compared to some existing receivers. Some schemes are designed to implement the proposed receivers and to reveal the roles of different system parameters. Analysis and numerical results show that the proposed receivers can approach the optimum multiuser receivers with significantly (even exponentially in some cases) reduced complexity and marginal performance degradation.

  8. CONTEXT 2015 Doctoral Symposium

    DEFF Research Database (Denmark)

    Eklund, Peter; wegener, rebekah

    2015-01-01

    What is the CONTEXT 2015 Doctoral Symposium? The CONTEXT 2015 Doctoral Symposium is an opportunity for doctoral researchers to showcase their work and discuss problems, challenges, and ideas in an open and collegial environment with expert feedback. The Doctoral Symposium is a workshop for doctoral researchers offering feedback and general advice in a constructive atmosphere. Doctoral researchers will present and discuss their research in a supportive atmosphere with other doctoral researchers and an international panel of established researchers who provide expert feedback. The workshop will take place on a single full day, Monday November 2, 2015, the day prior to the start of the main CONTEXT 2015 conference.

  9. Measuring School Contexts

    Directory of Open Access Journals (Sweden)

    Chandra L. Muller PhD

    2015-11-01

    Full Text Available This article describes issues in measuring school contexts with an eye toward understanding students’ experiences and outcomes. I begin with an overview of the conceptual underpinnings related to measuring contexts, briefly describe the initiatives at the National Center for Education Statistics to measure school contexts, and identify possible gaps in those initiatives that if filled could provide valuable new data for researchers. Next, I discuss new approaches and opportunities for measurement, and special considerations related to diverse populations and youth development. I conclude with recommendations for future priorities.

  10. Derivation of LDA log likelihood ratio one-to-one classifier

    NARCIS (Netherlands)

    Spreeuwers, Lieuwe Jan

    2014-01-01

    The common expression for the Likelihood Ratio classifier using LDA assumes that the reference class mean is available. In biometrics, this is often not the case and only a single sample of the reference class is available. In this paper expressions are derived for biometric comparison between

  11. Comparison of IRT Likelihood Ratio Test and Logistic Regression DIF Detection Procedures

    Science.gov (United States)

    Atar, Burcu; Kamata, Akihito

    2011-01-01

    The Type I error rates and the power of IRT likelihood ratio test and cumulative logit ordinal logistic regression procedures in detecting differential item functioning (DIF) for polytomously scored items were investigated in this Monte Carlo simulation study. For this purpose, 54 simulation conditions (combinations of 3 sample sizes, 2 sample…
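
    The likelihood ratio test underlying the IRT procedure reduces to a chi-square comparison of nested fits; schematically, with hypothetical log-likelihood values:

```python
from scipy.stats import chi2

def lr_test(loglik_compact, loglik_augmented, df):
    """-2 times the log-likelihood difference of nested models is
    asymptotically chi-square; a small p-value flags DIF for the item."""
    stat = -2.0 * (loglik_compact - loglik_augmented)
    return stat, chi2.sf(stat, df)

print(lr_test(-1204.7, -1198.2, df=2))  # hypothetical fitted values
```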

  12. Imagination perspective affects ratings of the likelihood of occurrence of autobiographical memories.

    Science.gov (United States)

    Marsh, Benjamin U; Pezdek, Kathy; Lam, Shirley T

    2014-07-01

    Two experiments tested and confirmed the hypothesis that when the phenomenological characteristics of imagined events are more similar to those of related autobiographical memories, the imagined event is more likely to be considered to have occurred. At Time 1 and 2-weeks later, individuals rated the likelihood of occurrence for 20 life events. In Experiment 1, 1-week after Time 1, individuals imagined 3 childhood events from a first-person or third-person perspective. There was a no-imagination control. An increase in likelihood ratings from Time 1 to Time 2 resulted when imagination was from the third-person but not first-person perspective. In Experiment 2, childhood and recent events were imagined from a third- or first-person perspective. A significant interaction resulted. For childhood events, likelihood change scores were greater for third-person than first-person perspective; for recent adult events, likelihood change scores were greater for first-person than third-person perspective, although this latter trend was not significant. Copyright © 2014 Elsevier B.V. All rights reserved.

  13. Supervisor Autonomy and Considerate Leadership Style are Associated with Supervisors' Likelihood to Accommodate Back Injured Workers.

    Science.gov (United States)

    McGuire, Connor; Kristman, Vicki L; Shaw, William; Williams-Whitt, Kelly; Reguly, Paula; Soklaridis, Sophie

    2015-09-01

    To determine the association between supervisors' leadership style and autonomy and supervisors' likelihood of supporting job accommodations for back-injured workers. A cross-sectional study of supervisors from Canadian and US employers was conducted using a web-based, self-report questionnaire that included a case vignette of a back-injured worker. Autonomy and two dimensions of leadership style (considerate and initiating structure) were included as exposures. The outcome, supervisors' likeliness to support job accommodation, was measured with the Job Accommodation Scale (JAS). We conducted univariate analyses of all variables and bivariate analyses of the JAS score with each exposure and potential confounding factor. We used multivariable generalized linear models to control for confounding factors. A total of 796 supervisors participated. Considerate leadership style (β = .012; 95% CI .009-.016) and autonomy (β = .066; 95% CI .025-.11) were positively associated with supervisors' likelihood to accommodate after adjusting for appropriate confounding factors. An initiating structure leadership style was not significantly associated with supervisors' likelihood to accommodate (β = .0018; 95% CI -.0026 to .0061) after adjusting for appropriate confounders. Autonomy and a considerate leadership style were positively associated with supervisors' likelihood to accommodate a back-injured worker. Providing supervisors with more autonomy over decisions of accommodation and developing their considerate leadership style may aid in increasing work accommodation for back-injured workers and preventing prolonged work disability.

  14. Supervisor Autonomy and Considerate Leadership Style are Associated with Supervisors’ Likelihood to Accommodate Back Injured Workers

    Science.gov (United States)

    McGuire, Connor; Kristman, Vicki L; Williams-Whitt, Kelly; Reguly, Paula; Shaw, William; Soklaridis, Sophie

    2015-01-01

    PURPOSE To determine the association between supervisors’ leadership style and autonomy and supervisors’ likelihood of supporting job accommodations for back-injured workers. METHODS A cross-sectional study of supervisors from Canadian and US employers was conducted using a web-based, self-report questionnaire that included a case vignette of a back-injured worker. Autonomy and two dimensions of leadership style (considerate and initiating structure) were included as exposures. The outcome, supervisors’ likeliness to support job accommodation, was measured with the Job Accommodation Scale. We conducted univariate analyses of all variables and bivariate analyses of the JAS score with each exposure and potential confounding factor. We used multivariable generalized linear models to control for confounding factors. RESULTS A total of 796 supervisors participated. Considerate leadership style (β= .012; 95% CI: .009–.016) and autonomy (β= .066; 95% CI: .025–.11) were positively associated with supervisors’ likelihood to accommodate after adjusting for appropriate confounding factors. An initiating structure leadership style was not significantly associated with supervisors’ likelihood to accommodate (β = .0018; 95% CI: −.0026–.0061) after adjusting for appropriate confounders. CONCLUSIONS Autonomy and a considerate leadership style were positively associated with supervisors’ likelihood to accommodate a back-injured worker. Providing supervisors with more autonomy over decisions of accommodation and developing their considerate leadership style may aid in increasing work accommodation for back-injured workers and preventing prolonged work disability. PMID:25595332

  15. Maximum likelihood estimation of ancestral codon usage bias parameters in Drosophila

    DEFF Research Database (Denmark)

    Nielsen, Rasmus; Bauer DuMont, Vanessa L; Hubisz, Melissa J

    2007-01-01

    …the selection coefficient for optimal codon usage (S), allowing joint maximum likelihood estimation of S and the dN/dS ratio. We apply the method to previously published data from Drosophila melanogaster, Drosophila simulans, and Drosophila yakuba and show, in accordance with previous results, that the D…

  16. Robust Biometric Score Fusion by Naive Likelihood Ratio via Receiver Operating Characteristics

    NARCIS (Netherlands)

    Tao, Q.; Veldhuis, Raymond N.J.

    This paper presents a novel method of fusing multiple biometrics on the matching score level. We estimate the likelihood ratios of the fused biometric scores via individual receiver operating characteristic (ROC) curves, which construct the Naive Bayes classifier. Using a limited number of operation…

  17. Epilepsy and Intellectual Disability: Does Epilepsy Increase the Likelihood of Co-Morbid Psychopathology?

    Science.gov (United States)

    Arshad, Saadia; Winterhalder, Robert; Underwood, Lisa; Kelesidi, Katerina; Chaplin, Eddie; Kravariti, Eugenia; Anagnostopoulos, Dimitrios; Bouras, Nick; McCarthy, Jane; Tsakanikos, Elias

    2011-01-01

    Although epilepsy is particularly common among people with intellectual disability (ID) it remains unclear whether it is associated with an increased likelihood of co-morbid psychopathology. We therefore investigated rates of mental health problems and other clinical characteristics in patients with ID and epilepsy (N=156) as compared to patients…

  18. Heterogeneity in the Likelihood of Market Advisory Service Use by U.S. Crop Producers

    NARCIS (Netherlands)

    Pennings, J.M.E.; Irwin, S.; Good, D.; Isengildina, O.

    2005-01-01

    Abstract Analysis of a unique data set of 1,400 U.S. crop producers using a mixture-modeling framework shows that the likelihood of Marketing Advisory Services (MAS) use is driven by, among other factors, the perceived performance of MAS in terms of return and risk reduction, the match between the MAS and

  19. The Influence of criminal history on the likelihood of committing lethal versus nonlethal violence

    NARCIS (Netherlands)

    Ganpat, Soenita M.; Liem, Marieke; van der Leun, Joanne; Nieuwbeerta, Paul

    2014-01-01

    This study focuses on the criminal history of serious violent offenders. Our aim is to determine: (a) to what extent the criminal history of lethally violent offenders differs from nonlethally violent offenders and (b) to what extent one's criminal history influences the likelihood that violence

  20. Application of the Method of Maximum Likelihood to Identification of Bipedal Walking Robots

    Czech Academy of Sciences Publication Activity Database

    Dolinský, Kamil; Čelikovský, Sergej

    (2017) ISSN 1063-6536 R&D Projects: GA ČR(CZ) GA17-04682S Institutional support: RVO:67985556 Keywords : Control * identification * maximum likelihood (ML) * walking robots Subject RIV: BC - Control Systems Theory Impact factor: 3.882, year: 2016 http://ieeexplore.ieee.org/document/7954032/