WorldWideScience

Sample records for dynamic context likelihood

  1. Inference in HIV dynamics models via hierarchical likelihood

    OpenAIRE

    2010-01-01

    HIV dynamical models are often based on non-linear systems of ordinary differential equations (ODE), which have no analytical solution. Introducing random effects in such models leads to very challenging non-linear mixed-effects models. To avoid the numerical computation of multiple integrals involved in the likelihood, we propose a hierarchical likelihood (h-likelihood) approach, treated in the spirit of a penalized likelihood. We give the asymptotic distribution of the maximum h-likelih...

  2. Inference in HIV dynamics models via hierarchical likelihood

    CERN Document Server

    Commenges, D; Putter, H; Thiebaut, R

    2010-01-01

    HIV dynamical models are often based on non-linear systems of ordinary differential equations (ODE), which have no analytical solution. Introducing random effects in such models leads to very challenging non-linear mixed-effects models. To avoid the numerical computation of multiple integrals involved in the likelihood, we propose a hierarchical likelihood (h-likelihood) approach, treated in the spirit of a penalized likelihood. We give the asymptotic distribution of the maximum h-likelihood estimators (MHLE) for fixed effects, a result that may be relevant in a more general setting. The MHLE are slightly biased but the bias can be made negligible by using a parametric bootstrap procedure. We propose an efficient algorithm for maximizing the h-likelihood. A simulation study, based on a classical HIV dynamical model, confirms the good properties of the MHLE. We apply it to the analysis of a clinical trial.
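
    As an illustration of the h-likelihood idea in the abstract above, the following minimal Python sketch jointly maximizes the data log-likelihood and the random-effects log-density for a toy nonlinear mixed-effects model (exponential decay with a subject-specific random rate). The model, variance values, and optimizer are invented for illustration; this is not the HIV ODE model of the paper, and the variance components are held fixed, whereas the paper also estimates them.

        # Hypothetical sketch of a hierarchical (h-)likelihood fit for a toy
        # nonlinear mixed-effects model; not the authors' HIV ODE model.
        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(0)
        n_subj, t = 8, np.linspace(0.1, 3.0, 12)
        sigma, sigma_b, beta_true = 0.05, 0.3, 1.0   # variances treated as known here
        b_true = rng.normal(0.0, sigma_b, n_subj)
        y = np.exp(-np.outer(beta_true + b_true, t)) \
            + rng.normal(0, sigma, (n_subj, t.size))

        def neg_h_loglik(theta):
            beta, b = theta[0], theta[1:]
            mu = np.exp(-np.outer(beta + b, t))
            # data log-likelihood plus the random-effects "penalty" log f(b)
            ll_data = -0.5 * np.sum((y - mu) ** 2) / sigma ** 2
            ll_ranef = -0.5 * np.sum(b ** 2) / sigma_b ** 2
            return -(ll_data + ll_ranef)

        theta0 = np.zeros(1 + n_subj)
        fit = minimize(neg_h_loglik, theta0, method="L-BFGS-B",
                       bounds=[(-5.0, 5.0)] * (1 + n_subj))
        print("MHLE of beta:", fit.x[0])   # random effects are fit.x[1:]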

  3. Estimating nonlinear dynamic equilibrium economies: a likelihood approach

    OpenAIRE

    2004-01-01

    This paper presents a framework to undertake likelihood-based inference in nonlinear dynamic equilibrium economies. The authors develop a sequential Monte Carlo algorithm that delivers an estimate of the likelihood function of the model using simulation methods. This likelihood can be used for parameter estimation and for model comparison. The algorithm can deal both with nonlinearities of the economy and with the presence of non-normal shocks. The authors show consistency of the estimate and...
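
    The sequential Monte Carlo idea described above can be sketched with a bootstrap particle filter: particles are propagated through the nonlinear transition, weighted by the measurement density, and resampled, and the average weight at each step yields a simulation-based likelihood estimate. The state-space model, parameter values, and particle count below are toy choices, not the authors' economic model.

        # Minimal bootstrap particle filter estimating the log-likelihood of a
        # nonlinear state-space model by simulation (toy model, not the paper's).
        import numpy as np

        rng = np.random.default_rng(1)
        T, N = 100, 2000                      # time steps, particles
        sig_x, sig_y = 1.0, 1.0

        # simulate data: x_t = 0.9 x_{t-1} + cos(x_{t-1}) + v_t,  y_t = x_t + w_t
        x = np.zeros(T)
        for t in range(1, T):
            x[t] = 0.9 * x[t-1] + np.cos(x[t-1]) + sig_x * rng.normal()
        y = x + sig_y * rng.normal(size=T)

        def loglik(rho):
            """Simulation-based estimate of log p(y_{1:T} | rho)."""
            particles = rng.normal(0.0, 1.0, N)
            ll = 0.0
            for t in range(T):
                if t > 0:   # propagate through the (nonlinear) transition
                    particles = (rho * particles + np.cos(particles)
                                 + sig_x * rng.normal(size=N))
                w = (np.exp(-0.5 * ((y[t] - particles) / sig_y) ** 2)
                     / (np.sqrt(2 * np.pi) * sig_y))
                ll += np.log(w.mean() + 1e-300)
                # multinomial resampling
                particles = rng.choice(particles, size=N, p=w / w.sum())
            return ll

        print(loglik(0.9), loglik(0.5))       # compare candidate parameters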

  4. Estimating dynamic equilibrium economies: linear versus nonlinear likelihood

    OpenAIRE

    2004-01-01

    This paper compares two methods for undertaking likelihood-based inference in dynamic equilibrium economies: a sequential Monte Carlo filter proposed by Fernández-Villaverde and Rubio-Ramírez (2004) and the Kalman filter. The sequential Monte Carlo filter exploits the nonlinear structure of the economy and evaluates the likelihood function of the model by simulation methods. The Kalman filter estimates a linearization of the economy around the steady state. The authors report two main results...
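
    For the linear side of the comparison, the Kalman filter evaluates the likelihood of a linear(ized) Gaussian state-space model exactly via the prediction-error decomposition. Below is a minimal scalar version; the system coefficients and noise variances are illustrative only, not the linearized economy of the paper.

        # Sketch: exact log-likelihood of a scalar linear Gaussian state-space
        # model via the Kalman filter (the linear counterpart of the SMC filter).
        import numpy as np

        def kalman_loglik(y, a, c, q, r, m0=0.0, p0=1.0):
            """log p(y_{1:T}) for x_t = a x_{t-1} + v_t, y_t = c x_t + w_t."""
            m, p, ll = m0, p0, 0.0
            for obs in y:
                m, p = a * m, a * a * p + q          # predict
                v, s = obs - c * m, c * c * p + r    # innovation and its variance
                ll += -0.5 * (np.log(2 * np.pi * s) + v * v / s)
                k = p * c / s                        # Kalman gain
                m, p = m + k * v, (1 - k * c) * p    # update
            return ll

        rng = np.random.default_rng(2)
        x, ys = 0.0, []
        for _ in range(200):
            x = 0.8 * x + rng.normal()
            ys.append(x + 0.5 * rng.normal())
        print(kalman_loglik(np.array(ys), a=0.8, c=1.0, q=1.0, r=0.25))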

  5. On the Sampling Interpretation of Confidence Intervals and Hypothesis Tests in the Context of Conditional Maximum Likelihood Estimation.

    Science.gov (United States)

    Maris, E.

    1998-01-01

    The sampling interpretation of confidence intervals and hypothesis tests is discussed in the context of conditional maximum likelihood estimation. Three different interpretations are discussed, and it is shown that confidence intervals constructed from the asymptotic distribution under the third sampling scheme discussed are valid for the first…

  6. Constrained maximum likelihood modal parameter identification applied to structural dynamics

    Science.gov (United States)

    El-Kafafy, Mahmoud; Peeters, Bart; Guillaume, Patrick; De Troyer, Tim

    2016-05-01

    A new modal parameter estimation method to directly establish modal models of structural dynamic systems satisfying two physically motivated constraints will be presented. The constraints imposed in the identified modal model are the reciprocity of the frequency response functions (FRFs) and the estimation of normal (real) modes. The motivation behind the first constraint (i.e. reciprocity) comes from the fact that modal analysis theory shows that the FRF matrix, and therefore the residue matrices, are symmetric for non-gyroscopic, non-circulatory, and passive mechanical systems. In other words, such systems are expected to obey Maxwell-Betti's reciprocity principle. The second constraint (i.e. real mode shapes) is motivated by the fact that analytical models of structures are assumed to be either undamped or proportionally damped. Therefore, normal (real) modes are needed for comparison with these analytical models. The work presented in this paper is a further development of a recently introduced modal parameter identification method called ML-MM, which enables us to establish modal models that satisfy such constraints. The proposed constrained ML-MM method is applied to two real experimental datasets measured on fully trimmed cars. This type of data is still considered a significant challenge in modal analysis. The results clearly demonstrate the applicability of the method to real structures with significant non-proportional damping and high modal densities.

  7. Improved variance estimation of maximum likelihood estimators in stable first-order dynamic regression models

    NARCIS (Netherlands)

    Kiviet, J.F.; Phillips, G.D.A.

    2014-01-01

    In dynamic regression models conditional maximum likelihood (least-squares) coefficient and variance estimators are biased. Using expansion techniques an approximation is obtained to the bias in variance estimation yielding a bias corrected variance estimator. This is achieved for both the standard
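
    A simulation-based way to see the bias discussed above (the paper itself derives analytic, expansion-based corrections) is to compare the textbook variance estimator for the AR(1) coefficient with the sampling variance obtained from a parametric bootstrap. The sample size and true coefficient below are arbitrary toy settings.

        # Gauging the finite-sample bias of the usual variance estimator for the
        # AR(1) coefficient via parametric bootstrap (illustration only; the
        # paper uses analytic expansions rather than simulation).
        import numpy as np

        rng = np.random.default_rng(3)

        def simulate_ar1(rho, T):
            y = np.zeros(T)
            for t in range(1, T):
                y[t] = rho * y[t-1] + rng.normal()
            return y

        def fit_ar1(y):
            x, z = y[:-1], y[1:]
            rho = x @ z / (x @ x)
            resid = z - rho * x
            var_hat = (resid @ resid / (len(z) - 1)) / (x @ x)  # textbook formula
            return rho, var_hat

        y = simulate_ar1(0.7, T=50)
        rho_hat, var_hat = fit_ar1(y)

        # bootstrap the actual sampling variance of rho_hat at rho = rho_hat
        boot = [fit_ar1(simulate_ar1(rho_hat, 50))[0] for _ in range(2000)]
        print("variance estimate:", var_hat, "bootstrap variance:", np.var(boot))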

  8. Maximum Likelihood Dynamic Factor Modeling for Arbitrary "N" and "T" Using SEM

    Science.gov (United States)

    Voelkle, Manuel C.; Oud, Johan H. L.; von Oertzen, Timo; Lindenberger, Ulman

    2012-01-01

    This article has 3 objectives that build on each other. First, we demonstrate how to obtain maximum likelihood estimates for dynamic factor models (the direct autoregressive factor score model) with arbitrary "T" and "N" by means of structural equation modeling (SEM) and compare the approach to existing methods. Second, we go beyond standard time…

  9. Efficient estimators for likelihood ratio sensitivity indices of complex stochastic dynamics

    Science.gov (United States)

    Arampatzis, Georgios; Katsoulakis, Markos A.; Rey-Bellet, Luc

    2016-03-01

    We demonstrate that centered likelihood ratio estimators for the sensitivity indices of complex stochastic dynamics are highly efficient, with low, constant-in-time variance, and consequently they are suitable for sensitivity analysis in long-time and steady-state regimes. These estimators rely on a new covariance formulation of the likelihood ratio that includes as a submatrix a Fisher information matrix for stochastic dynamics and can also be used for fast screening of insensitive parameters and parameter combinations. The proposed methods are applicable to broad classes of stochastic dynamics such as chemical reaction networks, Langevin-type equations and stochastic models in finance, including systems with a high dimensional parameter space and/or disparate decorrelation times between different observables. Furthermore, they are simple to implement as a standard observable in any existing simulation algorithm without additional modifications.

  10. Elaboration Likelihood Model and an Analysis of the Contexts of Its Application

    Directory of Open Access Journals (Sweden)

    Aslıhan Kıymalıoğlu

    2014-12-01

    Elaboration Likelihood Model (ELM), which posits the existence of two routes to persuasion (central and peripheral), has been one of the major models of persuasion. As the number of studies in the Turkish literature on ELM is limited, a detailed explanation of the model together with a comprehensive literature review was considered a contribution towards filling this gap. The findings of the review reveal that the model was mostly used in marketing and advertising research, that the concept most frequently used in the elaboration process was involvement, and that argument quality and endorser credibility were the factors most often employed in measuring their effect on the dependent variables. The review provides valuable insights as it presents a holistic view of the model and the variables used in the model.

  11. Laser-Based Slam with Efficient Occupancy Likelihood Map Learning for Dynamic Indoor Scenes

    Science.gov (United States)

    Li, Li; Yao, Jian; Xie, Renping; Tu, Jinge; Feng, Chen

    2016-06-01

    Location-Based Services (LBS) have attracted growing attention in recent years, especially in indoor environments. The fundamental technique of LBS is map building for unknown environments, also known as simultaneous localization and mapping (SLAM) in the robotics community. In this paper, we propose a novel approach for SLAM in dynamic indoor scenes based on a 2D laser scanner mounted on a mobile Unmanned Ground Vehicle (UGV), with the help of a grid-based occupancy likelihood map. Instead of applying scan matching to two adjacent scans, we propose to match the current scan against the occupancy likelihood map learned from all previous scans at multiple scales, to avoid the accumulation of matching errors. Because the points in a scan are acquired sequentially rather than simultaneously, scan distortion inevitably exists to varying extents. To compensate for the scan distortion caused by the motion of the UGV, we propose to integrate the velocity of the laser range finder (LRF) into the scan matching optimization framework. Besides, to reduce as much as possible the effect of dynamic objects, such as the walking pedestrians that often appear in indoor scenes, we propose a new occupancy likelihood map learning strategy that increases or decreases the probability of each occupancy grid cell after each scan matching. Experimental results in several challenging indoor scenes demonstrate that our proposed approach is capable of providing high-precision SLAM results.
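
    The increase/decrease strategy for the occupancy likelihood map can be sketched as a standard log-odds update over a grid: cells containing a scan return gain occupancy evidence, cells traversed by the beam lose it. The update constants, clamping bounds, and grid size below are hypothetical.

        # Minimal occupancy-likelihood update in log-odds form, echoing the
        # paper's increase/decrease strategy (constants are made up).
        import numpy as np

        L_OCC, L_FREE, L_MIN, L_MAX = 0.85, -0.4, -4.0, 4.0
        log_odds = np.zeros((100, 100))            # 100 x 100 grid map

        def update(scan_hits, scan_free):
            """scan_hits/scan_free: lists of (row, col) cells from one scan."""
            for r, c in scan_hits:                 # returns: raise occupancy
                log_odds[r, c] = min(log_odds[r, c] + L_OCC, L_MAX)
            for r, c in scan_free:                 # free space along the beam
                log_odds[r, c] = max(log_odds[r, c] + L_FREE, L_MIN)

        def occupancy_prob():
            return 1.0 / (1.0 + np.exp(-log_odds))  # per-cell P(occupied)

        update(scan_hits=[(50, 60), (50, 61)],
               scan_free=[(50, c) for c in range(40, 60)])
        print(occupancy_prob()[50, 58:62])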

  12. Maximum-likelihood, self-consistent side chain free energies with applications to protein molecular dynamics

    CERN Document Server

    Jumper, John M; Sosnick, Tobin R

    2016-01-01

    To address the large gap between time scales that can be easily reached by molecular simulations and those required to understand protein dynamics, we propose a new methodology that computes a self-consistent approximation of the side chain free energy at every integration step. In analogy with the adiabatic Born-Oppenheimer approximation, in which the nuclear dynamics are governed by the energy of the instantaneously-equilibrated electronic degrees of freedom, the protein backbone dynamics are simulated as proceeding according to the dictates of the free energy of an instantaneously-equilibrated side chain potential. The side chain free energy is computed on the fly; hence, the protein backbone dynamics traverse a greatly smoothed energetic landscape, resulting in extremely rapid equilibration and sampling of the Boltzmann distribution. Because our method employs a reduced model involving single-bead side chains, we also provide a novel, maximum-likelihood type method to parameterize the side chain model using...

  13. Evaluation of dynamic coastal response to sea-level rise modifies inundation likelihood

    Science.gov (United States)

    Lentz, Erika E.; Thieler, E. Robert; Plant, Nathaniel G.; Stippa, Sawyer R.; Horton, Radley M.; Gesch, Dean B.

    2016-01-01

    Sea-level rise (SLR) poses a range of threats to natural and built environments [1, 2], making assessments of SLR-induced hazards essential for informed decision making [3]. We develop a probabilistic model that evaluates the likelihood that an area will inundate (flood) or dynamically respond (adapt) to SLR. The broad-area applicability of the approach is demonstrated by producing 30 × 30 m resolution predictions for more than 38,000 km² of diverse coastal landscape in the northeastern United States. Probabilistic SLR projections, coastal elevation and vertical land movement are used to estimate likely future inundation levels. Then, conditioned on future inundation levels and the current land-cover type, we evaluate the likelihood of dynamic response versus inundation. We find that nearly 70% of this coastal landscape has some capacity to respond dynamically to SLR, and we show that inundation models over-predict land likely to submerge. This approach is well suited to guiding coastal resource management decisions that weigh future SLR impacts and uncertainty against ecological targets and economic constraints.

  14. Emigration dynamics: the Indian context.

    Science.gov (United States)

    Premi, M K; Mathur, M D

    1995-01-01

    This report on emigration dynamics in India opens by providing background on short- and long-distance migration to and from India in response to such events as the formation of Pakistan as well as to the policies of the British Empire and Commonwealth. Section 2 discusses India's demographic and sociocultural setting in terms of population growth, urbanization, patterns of internal migration, growth of the labor force, economic growth, poverty alleviation, health, and education. The third section describes the lack of data on international migration. Some data are available on emigrants, but the only information on return migration is that gleaned from surveys in Kerala. Section 4 considers emigration to industrialized countries and notes that it is almost exclusively permanent and largely composed of individuals with professional, technical, or managerial skills. The resulting brain drain is described, as is the incidence of illegal migration. India does not create conditions from which citizens must seek asylum; rather, the country has absorbed flows of refugees from Pakistan, Tibet, Bangladesh, Afghanistan, and Sri Lanka. Available data on the characteristics of emigrants and return migrants are reviewed in the next two sections, and section 7 looks at the data on financial flows gathered from macro-level estimates of remittances. Section 8 is devoted to the community, family, and individual factors which influence emigration, including the networks that facilitate migration and means of meeting migration costs. The ninth section summarizes the political setting with an emphasis on the adverse reaction of Nepal to population movement from India. The final section of the report projects future population movements. It is noted that if there were no restrictions on migration, millions of Indians would emigrate to the Americas, Africa, and Australia. Whereas poverty, unemployment, and population growth will likely erode living conditions in India, the government has

  15. Likelihood based observability analysis and confidence intervals for predictions of dynamic models

    CERN Document Server

    Kreutz, Clemens; Timmer, Jens

    2011-01-01

    Mechanistic dynamic models of biochemical networks such as Ordinary Differential Equations (ODEs) contain unknown parameters like the reaction rate constants and the initial concentrations of the compounds. The large number of parameters as well as their nonlinear impact on the model responses hamper the determination of confidence regions for parameter estimates. At the same time, classical approaches translating the uncertainty of the parameters into confidence intervals for model predictions are hardly feasible. In this article it is shown that a so-called prediction profile likelihood yields reliable confidence intervals for model predictions, despite arbitrarily complex and high-dimensional shapes of the confidence regions for the estimated parameters. Prediction confidence intervals of the dynamic states allow a data-based observability analysis. The approach translates the issue of sampling a high-dimensional parameter space into evaluating one-dimensional prediction spaces. The method is also applicable ...
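
    The prediction profile likelihood can be illustrated on a toy exponential-decay model: fix the model prediction at a time point of interest to a trial value, re-optimize the remaining parameters under that constraint, and threshold the resulting profile at a chi-square quantile. The model, noise level, and grid below are invented for this sketch.

        # Sketch of a prediction profile likelihood for y = A exp(-k t):
        # the prediction at t* is pinned to z via A = z exp(k t*), k is
        # re-optimized, and the profile is cut at the 95% chi-square(1) level.
        import numpy as np
        from scipy.optimize import minimize_scalar

        rng = np.random.default_rng(4)
        t = np.linspace(0, 2, 15); sigma = 0.05; t_star = 3.0
        y = 2.0 * np.exp(-1.2 * t) + rng.normal(0, sigma, t.size)

        def chi2(A, k):
            return np.sum((y - A * np.exp(-k * t)) ** 2) / sigma ** 2

        def profile(z):
            # constrain the prediction A*exp(-k*t*) = z, i.e. A = z*exp(k*t*)
            res = minimize_scalar(lambda k: chi2(z * np.exp(k * t_star), k),
                                  bounds=(0.01, 5.0), method="bounded")
            return res.fun

        zs = np.linspace(0.01, 0.2, 60)
        prof = np.array([profile(z) for z in zs])
        inside = prof - prof.min() <= 3.84        # 95% chi-square(1) cutoff
        print("95% prediction interval at t*:", zs[inside][0], zs[inside][-1])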

  16. Complex DNA mixture analysis in a forensic context: evaluating the probative value using a likelihood ratio model.

    Science.gov (United States)

    Haned, Hinda; Benschop, Corina C G; Gill, Peter D; Sijen, Titia

    2015-05-01

    The interpretation of mixed DNA profiles obtained from low template DNA samples has proven to be a particularly difficult task in forensic casework. Newly developed likelihood ratio (LR) models that account for PCR-related stochastic effects, such as allelic drop-out, drop-in and stutters, have enabled the analysis of complex cases that would otherwise have been reported as inconclusive. In such samples, there are uncertainties about the number of contributors and the correct sets of propositions to consider. Using experimental samples, where the genotypes of the donors are known, we evaluated the feasibility and the relevance of the interpretation of high order mixtures of three, four and five donors. The relative risks of analyzing high order mixtures of three, four, and five donors were established by comparison of a 'gold standard' LR to the LR that would be obtained in casework. The 'gold standard' LR is the ideal LR: since the genotypes and number of contributors are known, it follows that the parameters needed to compute the LR can be determined per contributor. The 'casework LR' was calculated as in standard practice, where unknown donors are assumed and the parameters are estimated from the available data. Both LRs were calculated using the basic standard model, also termed the drop-out/drop-in model, implemented in the LRmix module of the R package Forensim. We show how our results furthered the understanding of the relevance of analyzing high order mixtures in a forensic context. Limitations are highlighted, and it is illustrated how our study serves as a guide to implement likelihood ratio interpretation of complex DNA profiles in forensic casework.
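
    The basic drop-out/drop-in model can be sketched at a single locus for a single contributor: each donor allele either appears (probability 1-d) or drops out (d), and at most one extra allele can drop in with probability c times its frequency. The allele frequencies and rates below are made up, and the real LRmix implementation handles multiple contributors, stutter, and more.

        # Toy single-locus likelihood ratio with drop-out and drop-in, in the
        # spirit of the basic model (frequencies and rates are hypothetical).
        from itertools import combinations_with_replacement

        freqs = {"a": 0.1, "b": 0.3, "c": 0.6}   # hypothetical allele frequencies
        d, c_in = 0.2, 0.05                      # drop-out / drop-in probabilities

        def p_evidence(evidence, genotype):
            """P(observed allele set | one donor with this genotype), allowing
            drop-out of donor alleles and at most one drop-in allele."""
            p = 1.0
            for allele in set(genotype):
                p *= (1 - d) if allele in evidence else d
            extra = evidence - set(genotype)
            if len(extra) == 0:
                p *= 1 - c_in
            elif len(extra) == 1:
                p *= c_in * freqs[next(iter(extra))]
            else:
                p = 0.0                          # only one drop-in allowed here
            return p

        evidence, suspect = {"a"}, ("a", "b")
        num = p_evidence(evidence, suspect)      # Hp: suspect is the donor
        den = sum(freqs[g1] * freqs[g2] * (2 - (g1 == g2)) *
                  p_evidence(evidence, (g1, g2))
                  for g1, g2 in combinations_with_replacement(freqs, 2))
        print("LR =", num / den)                 # Hd: an unknown person is the donor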

  17. Dynamic context bindings : infrastructural support for context-aware applications

    OpenAIRE

    Broens, Tom Henri Ferdinand

    2008-01-01

    The world is increasingly equipped with high-capacity, interconnected, mobile and embedded computing devices. Context-awareness provides an attractive approach to personalize applications such that they better suit the user’s needs in this rich computing environment. Context-aware applications use context information, offered by context sources, to adapt their behavior to the situation at hand. The exchange of context information requires an association between a context consuming context-awa...

  18. Efficient and exact maximum likelihood quantisation of genomic features using dynamic programming.

    Science.gov (United States)

    Song, Mingzhou; Haralick, Robert M; Boissinot, Stéphane

    2010-01-01

    An efficient and exact dynamic programming algorithm is introduced to quantise a continuous random variable into a discrete random variable that maximises the likelihood of the quantised probability distribution for the original continuous random variable. Quantisation is often useful before statistical analysis and modelling of large discrete network models from observations of multiple continuous random variables. The quantisation algorithm is applied to genomic features including the recombination rate distribution across the chromosomes and the non-coding transposable element LINE-1 in the human genome. The association pattern is studied between the recombination rate, obtained by quantisation at genomic locations around LINE-1 elements, and the length groups of LINE-1 elements, also obtained by quantisation on LINE-1 length. The exact and density-preserving quantisation approach provides an alternative superior to the inexact and distance-based univariate iterative k-means clustering algorithm for discretisation.
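
    A dynamic programming version of the maximum-likelihood quantisation can be sketched as follows: with bin edges restricted to midpoints between sorted observations, each candidate bin contributes n_i log(n_i / (N w_i)) to the log-likelihood, and the optimal K-bin partition is found by DP. This O(K n²) sketch follows the criterion described above, not the authors' exact implementation.

        # Maximum-likelihood quantisation of a 1-D sample into K bins by DP.
        import numpy as np

        def ml_quantise(x, K):
            x = np.sort(np.asarray(x, dtype=float))
            n = x.size
            edges = np.concatenate(([x[0]], (x[:-1] + x[1:]) / 2, [x[-1]]))

            def cost(i, j):
                # log-likelihood of placing points i..j-1 into one bin
                width, cnt = edges[j] - edges[i], j - i
                return -np.inf if width <= 0 else cnt * np.log(cnt / (n * width))

            dp = np.full((K + 1, n + 1), -np.inf); dp[0, 0] = 0.0
            cut = np.zeros((K + 1, n + 1), dtype=int)
            for k in range(1, K + 1):
                for j in range(k, n + 1):
                    for i in range(k - 1, j):
                        val = dp[k - 1, i] + cost(i, j)
                        if val > dp[k, j]:
                            dp[k, j], cut[k, j] = val, i
            cuts, j = [], n                      # backtrack optimal bin edges
            for k in range(K, 0, -1):
                cuts.append(cut[k, j]); j = cut[k, j]
            return [edges[i] for i in reversed(cuts)] + [edges[-1]]

        rng = np.random.default_rng(5)
        data = np.concatenate([rng.normal(0, 1, 60), rng.normal(6, 0.5, 40)])
        print(ml_quantise(data, K=3))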

  19. Fluid Dynamics in an Ecological Context

    Science.gov (United States)

    Denny, Mark

    2007-11-01

    Fluid dynamics has long been an invaluable tool in the study of biological mechanics, helping to explain how animals swim and fly, how blood is pumped, gases are exchanged, and propagules are dispersed. The goal of understanding how the physics of fluids has affected the evolution of individual organisms provides strong impetus for teaching and learning fluid mechanics; a viable alternative to the more traditional goals of engineering. In recent years, a third alternative has arisen. The principles of fluid dynamics can be used to specify when and where individual organisms will exceed their physical capabilities, information that can in turn be used to predict species-specific survivorship in a given environment. In other words, biological fluid dynamics can be extended beyond the study of individual organisms to play an important role in our understanding of ecological dynamics. In a world where environmental change is of increasing concern, the fluid dynamic aspects of "ecomechanics" may be of considerable practical importance. Teaching fluid mechanics in ecology will be discussed in the context of wave-swept rocky shores. Various wave theories can be used to predict the maximum water velocities and accelerations impinging on specific surf-zone plants and animals. Theories of lift, drag, and accelerational forces can then be used to predict the maximum loads imposed on these organisms, loads that can be compared to the organisms' structural limits to predict the fraction of the species that will be dislodged or damaged. Taken across relevant species, this information goes far towards explaining shoreline community dynamics.
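
    The dislodgement logic described above reduces to comparing a predicted hydrodynamic load against the distribution of attachment strengths in a population. All numbers below (drag coefficient, area, velocity, strength distribution) are invented for illustration.

        # Worked example: drag on an intertidal organism and the fraction of a
        # population dislodged, assuming a log-normal spread of attachment
        # strengths (illustrative values, not measured data).
        import numpy as np
        from scipy.stats import lognorm

        rho, Cd, A = 1025.0, 0.5, 1e-3   # seawater density, drag coeff, area (m^2)
        u_max = 12.0                     # peak wave-driven water velocity (m/s)

        drag = 0.5 * rho * Cd * A * u_max ** 2   # classic drag law, in newtons

        # attachment strength across the population: log-normal, median 60 N
        strength = lognorm(s=0.4, scale=60.0)
        frac_dislodged = strength.cdf(drag)
        print(f"drag = {drag:.1f} N, fraction dislodged = {frac_dislodged:.2%}")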

  20. Maximum-Likelihood Adaptive Filter for Partially Observed Boolean Dynamical Systems

    Science.gov (United States)

    Imani, Mahdi; Braga-Neto, Ulisses M.

    2017-01-01

    Partially-observed Boolean dynamical systems (POBDS) are a general class of nonlinear models with application in estimation and control of Boolean processes based on noisy and incomplete measurements. The optimal minimum mean square error (MMSE) algorithms for POBDS state estimation, namely, the Boolean Kalman filter (BKF) and Boolean Kalman smoother (BKS), are intractable in the case of large systems, due to computational and memory requirements. To address this, we propose approximate MMSE filtering and smoothing algorithms based on the auxiliary particle filter (APF) method from sequential Monte-Carlo theory. These algorithms are used jointly with maximum-likelihood (ML) methods for simultaneous state and parameter estimation in POBDS models. In the presence of continuous parameters, ML estimation is performed using the expectation-maximization (EM) algorithm; we develop for this purpose a special smoother which reduces the computational complexity of the EM algorithm. The resulting particle-based adaptive filter is applied to a POBDS model of Boolean gene regulatory networks observed through noisy RNA-Seq time series data, and performance is assessed through a series of numerical experiments using the well-known cell cycle gene regulatory model.

  1. Comparative behaviour of the Dynamically Penalized Likelihood algorithm in inverse radiation therapy planning

    Energy Technology Data Exchange (ETDEWEB)

    Llacer, Jorge [EC Engineering Consultants, LLC, Los Gatos, CA (United States)]. E-mail: jllacer@home.com; Solberg, Timothy D. [Department of Radiation Oncology, University of California, Los Angeles, CA (United States)]. E-mail: Solberg@radonc.ucla.edu; Promberger, Claus [BrainLAB AG, Heimstetten (Germany)]. E-mail: promberg@brainlab.com

    2001-10-01

    This paper presents a description of tests carried out to compare the behaviour of five algorithms in inverse radiation therapy planning: (1) The Dynamically Penalized Likelihood (DPL), an algorithm based on statistical estimation theory; (2) an accelerated version of the same algorithm; (3) a new fast adaptive simulated annealing (ASA) algorithm; (4) a conjugate gradient method; and (5) a Newton gradient method. A three-dimensional mathematical phantom and two clinical cases have been studied in detail. The phantom consisted of a U-shaped tumour with a partially enclosed 'spinal cord'. The clinical examples were a cavernous sinus meningioma and a prostate case. The algorithms have been tested in carefully selected and controlled conditions so as to ensure fairness in the assessment of results. It has been found that all five methods can yield relatively similar optimizations, except when a very demanding optimization is carried out. For the easier cases, the differences are principally in robustness, ease of use and optimization speed. In the more demanding case, there are significant differences in the resulting dose distributions. The accelerated DPL emerges as possibly the algorithm of choice for clinical practice. An appendix describes the differences in behaviour between the new ASA method and the one based on a patent by the Nomos Corporation. (author)
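
    Although none of the five algorithms is reproduced here, the underlying inverse-planning problem they all address can be stated compactly: choose non-negative beam weights so the delivered dose approaches the prescription, with harsher penalties on organ-at-risk voxels. The dose matrix and weights below are toy values, not a clinical configuration.

        # Minimal statement of the inverse planning problem as a weighted
        # non-negative least-squares fit (toy matrices throughout).
        import numpy as np
        from scipy.optimize import nnls

        rng = np.random.default_rng(6)
        n_vox, n_beams = 40, 10
        A = rng.random((n_vox, n_beams))         # dose deposition matrix
        d = np.ones(n_vox)                       # prescribed dose, tumour voxels
        oar = np.arange(10)                      # first 10 voxels: organ at risk
        d[oar] = 0.2                             # keep dose low there

        w = np.ones(n_vox); w[oar] = 5.0         # penalize OAR violations harder
        x, resid = nnls(w[:, None] * A, w * d)   # weighted non-negative LSQ
        print("beam weights:", x.round(2), "residual:", round(resid, 3))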

  2. Comparative behaviour of the Dynamically Penalized Likelihood algorithm in inverse radiation therapy planning

    Science.gov (United States)

    Llacer, Jorge; Solberg, Timothy D.; Promberger, Claus

    2001-10-01

    This paper presents a description of tests carried out to compare the behaviour of five algorithms in inverse radiation therapy planning: (1) The Dynamically Penalized Likelihood (DPL), an algorithm based on statistical estimation theory; (2) an accelerated version of the same algorithm; (3) a new fast adaptive simulated annealing (ASA) algorithm; (4) a conjugate gradient method; and (5) a Newton gradient method. A three-dimensional mathematical phantom and two clinical cases have been studied in detail. The phantom consisted of a U-shaped tumour with a partially enclosed 'spinal cord'. The clinical examples were a cavernous sinus meningioma and a prostate case. The algorithms have been tested in carefully selected and controlled conditions so as to ensure fairness in the assessment of results. It has been found that all five methods can yield relatively similar optimizations, except when a very demanding optimization is carried out. For the easier cases, the differences are principally in robustness, ease of use and optimization speed. In the more demanding case, there are significant differences in the resulting dose distributions. The accelerated DPL emerges as possibly the algorithm of choice for clinical practice. An appendix describes the differences in behaviour between the new ASA method and the one based on a patent by the Nomos Corporation.

  3. Philosophy and phylogenetic inference: a comparison of likelihood and parsimony methods in the context of Karl Popper's writings on corroboration.

    Science.gov (United States)

    de Queiroz, K; Poe, S

    2001-06-01

    Advocates of cladistic parsimony methods have invoked the philosophy of Karl Popper in an attempt to argue for the superiority of those methods over phylogenetic methods based on Ronald Fisher's statistical principle of likelihood. We argue that the concept of likelihood in general, and its application to problems of phylogenetic inference in particular, are highly compatible with Popper's philosophy. Examination of Popper's writings reveals that his concept of corroboration is, in fact, based on likelihood. Moreover, because probabilistic assumptions are necessary for calculating the probabilities that define Popper's corroboration, likelihood methods of phylogenetic inference--with their explicit probabilistic basis--are easily reconciled with his concept. In contrast, cladistic parsimony methods, at least as described by certain advocates of those methods, are less easily reconciled with Popper's concept of corroboration. If those methods are interpreted as lacking probabilistic assumptions, then they are incompatible with corroboration. Conversely, if parsimony methods are to be considered compatible with corroboration, then they must be interpreted as carrying implicit probabilistic assumptions. Thus, the non-probabilistic interpretation of cladistic parsimony favored by some advocates of those methods is contradicted by an attempt by the same authors to justify parsimony methods in terms of Popper's concept of corroboration. In addition to being compatible with Popperian corroboration, the likelihood approach to phylogenetic inference permits researchers to test the assumptions of their analytical methods (models) in a way that is consistent with Popper's ideas about the provisional nature of background knowledge.

  4. Using Data to Tune Nearshore Dynamics Models: A Bayesian Approach with Parametric Likelihood

    CERN Document Server

    Balci, Nusret; Venkataramani, Shankar C

    2013-01-01

    We propose a modification of a maximum likelihood procedure for tuning parameter values in models, based upon the comparison of their output to field data. Our methodology, which uses polynomial approximations of the sample space to increase computational efficiency, differs from similar Bayesian estimation frameworks in its use of an alternative likelihood distribution, and is shown to better address problems in which covariance information is lacking than its more conventional counterpart. Lack of covariance information is a frequent challenge in large-scale geophysical estimation, and it arises in the geophysical problem considered here. We use a nearshore model for longshore currents and observational data of the same to show the contrast between the two maximum likelihood methodologies. Beyond a methodological comparison, this study gives estimates of parameter values for the bottom drag and surface forcing that make the particular model most consistent with data; furthermore, we also derive sensitivit...

  5. Regions of constrained maximum likelihood parameter identifiability. [of discrete-time nonlinear dynamic systems with white measurement errors

    Science.gov (United States)

    Lee, C.-H.; Herget, C. J.

    1976-01-01

    This short paper considers the parameter-identification problem of general discrete-time, nonlinear, multiple input-multiple output dynamic systems with Gaussian white distributed measurement errors. Knowledge of the system parameterization is assumed to be available. Regions of constrained maximum likelihood (CML) parameter identifiability are established. A computation procedure employing interval arithmetic is proposed for finding explicit regions of parameter identifiability for the case of linear systems.

  6. A penalized likelihood approach for bivariate conditional normal models for dynamic co-expression analysis.

    Science.gov (United States)

    Chen, Jun; Xie, Jichun; Li, Hongzhe

    2011-03-01

    Gene co-expressions have been widely used in the analysis of microarray gene expression data. However, the co-expression patterns between two genes can be mediated by cellular states, as reflected by expression of other genes, single nucleotide polymorphisms, and activity of protein kinases. In this article, we introduce a bivariate conditional normal model for identifying the variables that can mediate the co-expression patterns between two genes. Based on this model, we introduce a likelihood ratio (LR) test and a penalized likelihood procedure for identifying the mediators that affect gene co-expression patterns. We propose an efficient computational algorithm based on iterative reweighted least squares and cyclic coordinate descent and have shown that when the tuning parameter in the penalized likelihood is appropriately selected, such a procedure has the oracle property in selecting the variables. We present simulation results to compare with existing methods and show that the LR-based approach can perform similarly or better than the existing method of liquid association and the penalized likelihood procedure can be quite effective in selecting the mediators. We apply the proposed method to yeast gene expression data in order to identify the kinases or single nucleotide polymorphisms that mediate the co-expression patterns between genes.

  7. Building a Context World for Dynamic Service Composition

    DEFF Research Database (Denmark)

    Yu, Lian; Glenstrup, Arne John; Su, Shuang

    the physical contexts of the computing environment, user profiles and computed results of services as well. We use ontology techniques to model the domain concepts of application contexts. Context Condition/Effect Description Language is designed to describe the dynamic semantics of the requirements...... and capabilities of goals and services in a concise and editable manner. Goal-driven and planning techniques are used to dynamically implement the service composition according to the domain knowledge and facts in the context world....

  8. Modelling dynamics with context-free grammars

    Science.gov (United States)

    García-Huerta, Juan-M.; Jiménez-Hernández, Hugo; Herrera-Navarro, Ana-M.; Hernández-Díaz, Teresa; Terol-Villalobos, Ivan

    2014-03-01

    This article presents a strategy to model the dynamics of vehicles on a freeway. The proposal consists of encoding the movement as a set of finite states. A watershed-based segmentation is used to localize regions with a high probability of motion. Each state represents a proportion of a camera projection in a two-dimensional space, and each state is associated with a symbol, such that any combination of symbols is expressed as a language. From a sequence of symbols, a context-free grammar is inferred through a linear algorithm. This grammar represents a hierarchical view of common sequences observed in the scene. The most probable grammar rules express rules associated with normal movement behavior. Less probable rules provide a way to quantify uncommon behaviors that may need more attention. Finally, any sequence of symbols that does not match the grammar rules may express uncommon (abnormal) behaviors. The grammar inference is built from several sequences of images taken from a freeway. The testing process uses the sequence of symbols emitted by the scenario, matching the grammar rules with common freeway behaviors. The detection of abnormal/normal behaviors is managed as the task of verifying whether a word generated by the scenario is recognized by the grammar.

  9. The relationship between the neural computations for speech and music perception is context-dependent: an activation likelihood estimate study

    Science.gov (United States)

    LaCroix, Arianna N.; Diaz, Alvaro F.; Rogalsky, Corianne

    2015-01-01

    The relationship between the neurobiology of speech and music has been investigated for more than a century. There remains no widespread agreement regarding how (or to what extent) music perception utilizes the neural circuitry that is engaged in speech processing, particularly at the cortical level. Prominent models such as Patel's Shared Syntactic Integration Resource Hypothesis (SSIRH) and Koelsch's neurocognitive model of music perception suggest a high degree of overlap, particularly in the frontal lobe, but also perhaps more distinct representations in the temporal lobe with hemispheric asymmetries. The present meta-analysis study used activation likelihood estimate analyses to identify the brain regions consistently activated for music as compared to speech across the functional neuroimaging (fMRI and PET) literature. Eighty music and 91 speech neuroimaging studies of healthy adult control subjects were analyzed. Peak activations reported in the music and speech studies were divided into four paradigm categories: passive listening, discrimination tasks, error/anomaly detection tasks and memory-related tasks. We then compared activation likelihood estimates within each category for music vs. speech, and each music condition with passive listening. We found that listening to music and to speech preferentially activate distinct temporo-parietal bilateral cortical networks. We also found music and speech to have shared resources in the left pars opercularis but speech-specific resources in the left pars triangularis. The extent to which music recruited speech-activated frontal resources was modulated by task. While there are certainly limitations to meta-analysis techniques particularly regarding sensitivity, this work suggests that the extent of shared resources between speech and music may be task-dependent and highlights the need to consider how task effects may be affecting conclusions regarding the neurobiology of speech and music. PMID:26321976

  10. The relationship between the neural computations for speech and music perception is context-dependent: an activation likelihood estimate study.

    Science.gov (United States)

    LaCroix, Arianna N; Diaz, Alvaro F; Rogalsky, Corianne

    2015-01-01

    The relationship between the neurobiology of speech and music has been investigated for more than a century. There remains no widespread agreement regarding how (or to what extent) music perception utilizes the neural circuitry that is engaged in speech processing, particularly at the cortical level. Prominent models such as Patel's Shared Syntactic Integration Resource Hypothesis (SSIRH) and Koelsch's neurocognitive model of music perception suggest a high degree of overlap, particularly in the frontal lobe, but also perhaps more distinct representations in the temporal lobe with hemispheric asymmetries. The present meta-analysis study used activation likelihood estimate analyses to identify the brain regions consistently activated for music as compared to speech across the functional neuroimaging (fMRI and PET) literature. Eighty music and 91 speech neuroimaging studies of healthy adult control subjects were analyzed. Peak activations reported in the music and speech studies were divided into four paradigm categories: passive listening, discrimination tasks, error/anomaly detection tasks and memory-related tasks. We then compared activation likelihood estimates within each category for music vs. speech, and each music condition with passive listening. We found that listening to music and to speech preferentially activate distinct temporo-parietal bilateral cortical networks. We also found music and speech to have shared resources in the left pars opercularis but speech-specific resources in the left pars triangularis. The extent to which music recruited speech-activated frontal resources was modulated by task. While there are certainly limitations to meta-analysis techniques particularly regarding sensitivity, this work suggests that the extent of shared resources between speech and music may be task-dependent and highlights the need to consider how task effects may be affecting conclusions regarding the neurobiology of speech and music.

  11. The relationship between the neural computations for speech and music perception is context-dependent: an activation likelihood estimate study

    Directory of Open Access Journals (Sweden)

    Arianna eLaCroix

    2015-08-01

    The relationship between the neurobiology of speech and music has been investigated for more than a century. There remains no widespread agreement regarding how (or to what extent) music perception utilizes the neural circuitry that is engaged in speech processing, particularly at the cortical level. Prominent models such as Patel's Shared Syntactic Integration Resource Hypothesis (SSIRH) and Koelsch's neurocognitive model of music perception suggest a high degree of overlap, particularly in the frontal lobe, but also perhaps more distinct representations in the temporal lobe with hemispheric asymmetries. The present meta-analysis study used activation likelihood estimate analyses to identify the brain regions consistently activated for music as compared to speech across the functional neuroimaging (fMRI and PET) literature. Eighty music and 91 speech neuroimaging studies of healthy adult control subjects were analyzed. Peak activations reported in the music and speech studies were divided into four paradigm categories: passive listening, discrimination tasks, error/anomaly detection tasks and memory-related tasks. We then compared activation likelihood estimates within each category for music versus speech, and each music condition with passive listening. We found that listening to music and to speech preferentially activate distinct temporo-parietal bilateral cortical networks. We also found music and speech to have shared resources in the left pars opercularis but speech-specific resources in the left pars triangularis. The extent to which music recruited speech-activated frontal resources was modulated by task. While there are certainly limitations to meta-analysis techniques particularly regarding sensitivity, this work suggests that the extent of shared resources between speech and music may be task-dependent and highlights the need to consider how task effects may be affecting conclusions regarding the neurobiology of speech and music.

  12. Learning Dynamics for Robot Control under Varying Contexts

    OpenAIRE

    Petkos, Georgios

    2008-01-01

    High fidelity, compliant robot control requires a sufficiently accurate dynamics model. Often though, it is not possible to obtain a dynamics model sufficiently accurately or at all using analytical methods. In such cases, an alternative is to learn the dynamics model from movement data. This thesis discusses the problems specific to dynamics learning for control under nonstationarity of the dynamics. We refer to the cause of the nonstationarity as the context of the dynamics. ...

  13. DREAM3: Network Inference Using Dynamic Context Likelihood of Relatedness and the Inferelator

    Science.gov (United States)

    2010-03-22

    Department of Mathematics, Courant Institute of Mathematical Sciences, New York University, New York, New York, United States of America; Department of Computer Science, Courant Institute of Mathematical Sciences, New York University, New York, New York, United States of America. Abstract Background: Many current works

  14. Calibrating floor field cellular automaton models for pedestrian dynamics by using likelihood function optimization

    Science.gov (United States)

    Lovreglio, Ruggiero; Ronchi, Enrico; Nilsson, Daniel

    2015-11-01

    The formulation of pedestrian floor field cellular automaton models is generally based on hypothetical assumptions to represent reality. This paper proposes a novel methodology to calibrate these models using experimental trajectories. The methodology is based on likelihood function optimization and allows verifying whether the parameters defining a model statistically affect pedestrian navigation. Moreover, it allows comparing different model specifications, or the parameters of the same model estimated using different data collection techniques (e.g. virtual reality experiments, real data, etc.). The methodology is implemented here using navigation data collected in a virtual reality tunnel evacuation experiment involving 96 participants. A trajectory dataset in the proximity of an emergency exit is used to test and compare different metrics, i.e. Euclidean and modified Euclidean distance, for the static floor field. In the present case study, modified Euclidean metrics provide a better fit to the data. A new formulation using random parameters for pedestrian cellular automaton models is also defined and tested.
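
    The calibration idea can be sketched by writing the cell-choice probability as a softmax over neighbouring cells scored by the static floor field, then maximizing the log-likelihood of observed moves with respect to the sensitivity parameter. The observations and the single-parameter model below are hypothetical simplifications of the paper's formulation.

        # Likelihood-based calibration of a toy floor field CA: estimate the
        # static-field sensitivity k_S from observed cell-to-cell moves.
        import numpy as np
        from scipy.optimize import minimize_scalar

        # each observation: static-field values (distance to exit) of the
        # available neighbour cells, and the index of the cell actually chosen
        observations = [
            (np.array([3.0, 2.0, 2.5, 4.0]), 1),
            (np.array([2.0, 1.0, 2.5]), 1),
            (np.array([1.0, 0.0, 1.5, 2.0]), 1),
            (np.array([2.5, 3.0, 1.5]), 2),
        ]

        def neg_loglik(kS):
            ll = 0.0
            for fields, chosen in observations:
                logits = -kS * fields            # prefer cells nearer the exit
                logp = logits - np.log(np.exp(logits).sum())
                ll += logp[chosen]
            return -ll

        fit = minimize_scalar(neg_loglik, bounds=(0.0, 10.0), method="bounded")
        print("estimated k_S:", fit.x)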

  15. Context-aware Authorization in Highly Dynamic Environments

    CERN Document Server

    Tigli, Jean-Yves; Rey, Gaetan; Hourdin, Vincent; Riveill, Michel

    2011-01-01

    Highly dynamic computing environments, like ubiquitous and pervasive computing environments, require frequent adaptation of applications. Context is key to adapting applications to user needs in such settings. On the other hand, standard access control trusts users once they have authenticated, despite the fact that they may reach unauthorized contexts. We analyse how taking dynamic information like context into account in the authorization subsystem can improve security, and how this new access control applies to interaction patterns, like messaging or eventing. We experiment with and validate our approach using context as an authorization factor for eventing in Web services for devices (like UPnP or DPWS), in smart home security.

  16. Context-aware Authorization in Highly Dynamic Environments

    Directory of Open Access Journals (Sweden)

    Vincent Hourdin

    2009-09-01

    Highly dynamic computing environments, like ubiquitous and pervasive computing environments, require frequent adaptation of applications. Context is key to adapting applications to user needs in such settings. On the other hand, standard access control trusts users once they have authenticated, despite the fact that they may reach unauthorized contexts. We analyse how taking dynamic information like context into account in the authorization subsystem can improve security, and how this new access control applies to interaction patterns, like messaging or eventing. We experiment with and validate our approach using context as an authorization factor for eventing in Web services for devices (like UPnP or DPWS), in smart home security.

  17. Approximate Likelihood

    CERN Document Server

    CERN. Geneva

    2015-01-01

    Most physics results at the LHC end in a likelihood ratio test. This includes discovery and exclusion for searches as well as mass, cross-section, and coupling measurements. The use of Machine Learning (multivariate) algorithms in HEP is mainly restricted to searches, which can be reduced to classification between two fixed distributions: signal vs. background. I will show how we can extend the use of ML classifiers to distributions parameterized by physical quantities like masses and couplings, as well as nuisance parameters associated with systematic uncertainties. This allows one to approximate the likelihood ratio while still using a high dimensional feature vector for the data. Both the MEM and ABC approaches mentioned above aim to provide inference on model parameters (like cross-sections, masses, couplings, etc.). ABC is fundamentally tied to Bayesian inference and focuses on the “likelihood free” setting where only a simulator is available and one cannot directly compute the likelihood for the dat...
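
    The classifier-based approximation mentioned in the abstract rests on the standard likelihood-ratio trick: a probabilistic classifier trained on samples simulated under two hypotheses recovers their density ratio as s/(1-s). The Gaussian toy simulator and scikit-learn classifier below are illustrative stand-ins for a real detector simulation.

        # Likelihood-ratio trick: approximate p(x|theta1)/p(x|theta0) from a
        # classifier's calibrated score (toy Gaussians; scikit-learn assumed).
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(7)
        theta0, theta1 = 0.0, 1.0                 # two parameter hypotheses
        x0 = rng.normal(theta0, 1.0, (5000, 1))   # simulator output under theta0
        x1 = rng.normal(theta1, 1.0, (5000, 1))   # ... and under theta1

        X = np.vstack([x0, x1])
        y = np.concatenate([np.zeros(5000), np.ones(5000)])
        clf = LogisticRegression().fit(X, y)

        x_obs = 0.8
        s = clf.predict_proba([[x_obs]])[0, 1]
        lr_approx = s / (1 - s)                   # density-ratio estimate
        # exact ratio for this Gaussian toy, for comparison:
        lr_exact = np.exp(-0.5 * ((x_obs - theta1) ** 2 - (x_obs - theta0) ** 2))
        print(lr_approx, lr_exact)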

  18. Empirical likelihood

    CERN Document Server

    Owen, Art B

    2001-01-01

    Empirical likelihood provides inferences whose validity does not depend on specifying a parametric model for the data. Because it uses a likelihood, the method has certain inherent advantages over resampling methods: it uses the data to determine the shape of the confidence regions, and it makes it easy to combine data from multiple sources. It also facilitates incorporating side information, and it simplifies accounting for censored, truncated, or biased sampling. One of the first books published on the subject, Empirical Likelihood offers an in-depth treatment of this method for constructing confidence regions and testing hypotheses. The author applies empirical likelihood to a range of problems, from those as simple as setting a confidence region for a univariate mean under IID sampling, to problems defined through smooth functions of means, regression models, generalized linear models, estimating equations, or kernel smooths, and to sampling with non-identically distributed data. Abundant figures offer vi...
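
    For the simplest case mentioned above, a confidence region for a univariate mean, the empirical likelihood ratio has a compact computational form: a single Lagrange multiplier is found by root-finding and -2 log R(mu) is compared to a chi-square quantile. The sketch below follows that standard construction; the data and test points are arbitrary, and this is not code from the book.

        # Empirical likelihood for a univariate mean: w_i maximizing
        # sum log(n w_i) subject to sum w_i = 1 and sum w_i (x_i - mu) = 0
        # give -2 log R(mu) = 2 sum log(1 + lambda (x_i - mu)).
        import numpy as np
        from scipy.optimize import brentq

        def neg2_log_elr(x, mu):
            z = x - mu
            if z.min() >= 0 or z.max() <= 0:
                return np.inf                    # mu outside the convex hull
            # lambda solves sum z_i / (1 + lambda z_i) = 0 on this bracket
            lo = (-1 + 1e-10) / z.max()
            hi = (-1 + 1e-10) / z.min()
            lam = brentq(lambda l: np.sum(z / (1 + l * z)), lo, hi)
            return 2 * np.sum(np.log1p(lam * z))

        rng = np.random.default_rng(8)
        x = rng.exponential(2.0, 80)
        for mu in (1.6, 2.0, 2.6):
            stat = neg2_log_elr(x, mu)
            print(mu, stat, "inside 95% region" if stat <= 3.84 else "outside")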

  19. The Use of Dynamic Stochastic Social Behavior Models to Produce Likelihood Functions for Risk Modeling of Proliferation and Terrorist Attacks

    Energy Technology Data Exchange (ETDEWEB)

    Young, Jonathan; Thompson, Sandra E.; Brothers, Alan J.; Whitney, Paul D.; Coles, Garill A.; Henderson, Cindy L.; Wolf, Katherine E.; Hoopes, Bonnie L.

    2008-12-01

    The ability to estimate the likelihood of future events based on current and historical data is essential to the decision making process of many government agencies. Successful predictions related to terror events and characterizing the risks will support development of options for countering these events. The predictive tasks involve both technical and social component models. The social components have presented a particularly difficult challenge. This paper outlines some technical considerations of this modeling activity. Both data and predictions associated with the technical and social models will likely be known with differing certainties or accuracies – a critical challenge is linking across these model domains while respecting this fundamental difference in certainty level. This paper will describe the technical approach being taken to develop the social model and identification of the significant interfaces between the technical and social modeling in the context of analysis of diversion of nuclear material.

  20. Stochastic identification using the maximum likelihood method and a statistical reduction: application to drilling dynamics

    OpenAIRE

    2010-01-01

    A drill-string is a slender structure that drills rock to search for oil. The nonlinear interaction between the bit and the rock is of great importance for the drill-string dynamics. The interaction model has uncertainties, which are modeled using the nonparametric probabilistic approach. This paper deals with a procedure to perform the identification of the dispersion parameter of the probabilistic model of uncertainties of a bit-rock interaction model. The bit-rock i...

  1. Maximum-Likelihood Sequence Detector for Dynamic Mode High Density Probe Storage

    CERN Document Server

    Kumar, Naveen; Ramamoorthy, Aditya; Salapaka, Murti

    2009-01-01

    There is an ever increasing need for storing data in smaller and smaller form factors, driven by the ubiquitous use and increased demands of consumer electronics. A new approach to achieving areal densities of a few Tb per in² utilizes a cantilever probe with a sharp tip that can be used to deform and assess the topography of the material. The information may be encoded by means of topographic profiles on a polymer medium. The prevalent mode of using the cantilever probe is the static mode, which is known to be harsh on the probe and the media. In this paper, the high quality factor dynamic mode operation, which is known to be less harsh on the media and the probe, is analyzed for probe based high density data storage purposes. It is demonstrated that an appropriate level of abstraction is possible that obviates the need for an involved physical model. The read operation is modeled as a communication channel which incorporates the inherent system memory due to intersymbol interference and the cantilever state ...
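
    Once the read operation is abstracted as a communication channel with intersymbol interference, maximum-likelihood sequence detection is the textbook Viterbi algorithm. The two-tap channel, noise level, and initial-state assumption below are invented for illustration and are not the paper's channel model.

        # Viterbi ML sequence detector for a toy two-tap ISI channel:
        # r_t = h0 b_t + h1 b_{t-1} + n_t, with bits b in {0, 1}.
        import numpy as np

        h = np.array([1.0, 0.6])                 # hypothetical channel taps
        sigma = 0.3

        def viterbi_detect(r):
            T = len(r)
            # state = previous bit; metric[s] = best path cost ending in s
            metric = np.array([0.0, np.inf])     # assume the channel starts at b=0
            back = np.zeros((T, 2), dtype=int)
            for t in range(T):
                new = np.full(2, np.inf)
                for prev in (0, 1):
                    for bit in (0, 1):
                        cost = metric[prev] + (r[t] - h[0]*bit - h[1]*prev) ** 2
                        if cost < new[bit]:
                            new[bit], back[t, bit] = cost, prev
                metric = new
            bits = np.zeros(T, dtype=int)
            bits[-1] = int(np.argmin(metric))
            for t in range(T - 1, 0, -1):        # trace the surviving path back
                bits[t - 1] = back[t, bits[t]]
            return bits

        rng = np.random.default_rng(9)
        b = rng.integers(0, 2, 30)
        r = h[0]*b + h[1]*np.concatenate(([0], b[:-1])) + sigma*rng.normal(size=30)
        print("bit errors:", int(np.sum(viterbi_detect(r) != b)))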

  2. On Sustaining Dynamic Adaptation of Context-Aware Services

    Directory of Open Access Journals (Sweden)

    Boudjemaa Boudaa

    2015-03-01

    The modern human is increasingly mobile, with access to online services through cutting-edge mobile computing devices. In the last decade, the field of context-aware services has given rise to several works. However, most of the proposed approaches have not provided clear adaptation strategies for unforeseen contexts. Dealing with the latter at runtime is another crucial need that has been ignored in their proposals. This paper proposes a generic dynamic adaptation process, as a phase in a model-driven development life-cycle for context-aware services, using the MAPE-K control loop to meet the runtime adaptation need. This process is validated by implementing an illustrative application on the FraSCAti platform. The main benefit of the proposed process is to sustain the self-reconfiguration of such services at the model and code levels by enabling successive dynamic adaptations depending on the changing context.

  3. GROUP DYNAMICS AND TEAM FUNCTIONING IN ORGANIZATIONAL CONTEXT

    Directory of Open Access Journals (Sweden)

    Raluca ZOLTAN

    2015-07-01

    In all kinds of organizations, many activities are carried out by groups and teams. But how are they formed? What factors influence their existence and development? How are members of groups and teams selected? What are the consequences in an organizational context? In order to answer these questions, in the present paper we describe and analyze the main approaches to the formation of work groups and work teams (the sociometric approach and the group dynamics approach), the main factors that affect group dynamics, and the FIRO model for evaluating team members' needs.

  4. Measurement of the Top Quark Mass by Dynamical Likelihood Method using the Lepton + Jets Events with the Collider Detector at Fermilab

    Energy Technology Data Exchange (ETDEWEB)

    Kubo, Taichi [Univ. of Tsukuba (Japan)

    2008-02-01

    We have measured the top quark mass with the dynamical likelihood method. The data, corresponding to an integrated luminosity of 1.7 fb⁻¹, were collected in proton-antiproton collisions at a center of mass energy of 1.96 TeV with the CDF detector at the Fermilab Tevatron during the period March 2002-March 2007. We select t$\bar{t}$ pair production candidates by requiring one high energy lepton and four jets, of which at least one must be tagged as a b-jet. In order to reconstruct the top quark mass, we use the dynamical likelihood method based on the maximum likelihood method, where the likelihood is defined as the differential cross section multiplied by the transfer function from observed quantities to parton quantities, as a function of the top quark mass and the jet energy scale (JES). With this method, we measure the top quark mass to be 171.6 ± 2.0 (stat. + JES) ± 1.3 (syst.) = 171.6 ± 2.4 GeV/c².

  5. Measurement of the Top Quark Mass by Dynamical Likelihood Method using the Lepton plus Jets Events in 1.96 Tev Proton-Antiproton Collisions

    Energy Technology Data Exchange (ETDEWEB)

    Yorita, Kohei [Waseda Univ., Shinjuku (Japan)

    2005-03-01

    We have measured the top quark mass with the dynamical likelihood method (DLM) using the CDF II detector at the Fermilab Tevatron. The Tevatron produces top-antitop pairs in proton-antiproton collisions at a center of mass energy of 1.96 TeV. The data sample used in this paper was accumulated from March 2002 through August 2003, corresponding to an integrated luminosity of 162 pb⁻¹.

  6. Using likelihood-free inference to compare evolutionary dynamics of the protein networks of H. pylori and P. falciparum.

    Directory of Open Access Journals (Sweden)

    Oliver Ratmann

    2007-11-01

    Gene duplication with subsequent interaction divergence is one of the primary driving forces in the evolution of genetic systems. Yet little is known about the precise mechanisms and the role of duplication divergence in the evolution of protein networks from the prokaryote and eukaryote domains. We developed a novel, model-based approach for Bayesian inference on biological network data that centres on approximate Bayesian computation, or likelihood-free inference. Instead of computing the intractable likelihood of the protein network topology, our method summarizes key features of the network and, based on these, uses a MCMC algorithm to approximate the posterior distribution of the model parameters. This allowed us to reliably fit a flexible mixture model that captures hallmarks of evolution by gene duplication and subfunctionalization to protein interaction network data of Helicobacter pylori and Plasmodium falciparum. The 80% credible intervals for the duplication-divergence component are [0.64, 0.98] for H. pylori and [0.87, 0.99] for P. falciparum. The remaining parameter estimates are not inconsistent with sequence data. An extensive sensitivity analysis showed that incompleteness of PIN data does not largely affect the analysis of models of protein network evolution, and that the degree sequence alone barely captures the evolutionary footprints of protein networks relative to other statistics. Our likelihood-free inference approach enables a fully Bayesian analysis of a complex and highly stochastic system that is otherwise intractable at present. Modelling the evolutionary history of PIN data, it transpires that only the simultaneous analysis of several global aspects of protein networks enables credible and consistent inference to be made from available datasets. Our results indicate that gene duplication has played a larger part in the network evolution of the eukaryote than in the prokaryote, and suggests that single gene
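
    The likelihood-free core of the approach can be sketched with the simplest ABC variant, rejection sampling: draw parameters from the prior, simulate, and keep draws whose summary statistics land within a tolerance of the observed ones. The scalar Poisson simulator, summaries, and tolerance below are toy stand-ins for the paper's network model and its MCMC-based ABC.

        # Minimal ABC rejection sampler: no likelihood evaluation, only a
        # simulator and summary statistics (toy model, not the paper's).
        import numpy as np

        rng = np.random.default_rng(10)

        def simulate(theta, n=200):
            """Toy simulator standing in for duplication-divergence growth."""
            return rng.poisson(theta, n)

        obs = simulate(3.0)
        s_obs = np.array([obs.mean(), obs.var()])   # summary statistics

        accepted = []
        for _ in range(20000):
            theta = rng.uniform(0.1, 10.0)          # prior draw
            sim = simulate(theta)
            s_sim = np.array([sim.mean(), sim.var()])
            if np.linalg.norm(s_sim - s_obs) < 0.5: # keep draws near the data
                accepted.append(theta)

        print("posterior mean ~", np.mean(accepted), "n =", len(accepted))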

  7. Approximate group context tree: applications to dynamic programming and dynamic choice models

    CERN Document Server

    Belloni, Alexandre

    2011-01-01

    The paper considers a variable length Markov chain model associated with a group of stationary processes that share the same context tree but potentially different conditional probabilities. We propose a new model selection and estimation method, and develop oracle inequalities and model selection properties for the estimator. These results also provide conditions under which the use of the group structure can lead to improvements in the overall estimation. Our work is also motivated by two methodological applications: discrete stochastic dynamic programming and dynamic discrete choice models. We analyze the uniform estimation of the value function for dynamic programming and the uniform estimation of average dynamic marginal effects for dynamic discrete choice models, accounting for possible imperfect model selection. We also derive the typical behavior of our estimator when applied to polynomially $\beta$-mixing stochastic processes. For parametric models, we derive uniform rates of convergence for the estimation...
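
    To make the context tree concrete, here is a minimal sketch of variable-length Markov chain estimation by suffix counting with a crude KL-based pruning rule; the threshold is an illustrative placeholder, not the penalized criterion studied in the paper.

      from collections import Counter, defaultdict
      import math

      seq = "abaababaabaababaababaabab" * 20
      alphabet = sorted(set(seq))
      max_depth = 3

      counts = defaultdict(Counter)        # suffix (context) -> next-symbol counts
      for i in range(max_depth, len(seq)):
          for d in range(max_depth + 1):
              counts[seq[i - d:i]][seq[i]] += 1

      def dist(ctx):
          c = counts[ctx]
          n = sum(c.values())
          return {a: c[a] / n for a in alphabet}, n

      def keep(ctx, thresh=0.05):
          # Keep a context only if its predictive law differs from its parent's.
          child, n = dist(ctx)
          parent, _ = dist(ctx[1:])
          kl = sum(p * math.log(p / max(parent[a], 1e-12))
                   for a, p in child.items() if p > 0)
          return n * kl > thresh * math.log(len(seq))

      tree = [ctx for ctx in counts if ctx and keep(ctx)]
      print("retained contexts:", sorted(tree, key=len))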

  8. Beyond 'vulnerable groups': contexts and dynamics of vulnerability.

    Science.gov (United States)

    Zarowsky, Christina; Haddad, Slim; Nguyen, Vinh-Kim

    2013-03-01

    This paper reviews approaches to vulnerability in public health, introducing a series of 10 papers addressing vulnerability in health in Africa. We understand vulnerability as simultaneously a condition and a process. Social inequalities are manifest in and exacerbate three key dimensions of vulnerability: the initial level of wellbeing, the degree of exposure to risk, and the capacity to manage risk effectively. We stress the dynamic interactions linking material and social deprivation, poverty, powerlessness and ill health: risks or shocks and their health impacts are intimately interconnected and reinforce each other in a cycle which, in the absence of effective interventions, increases vulnerability. An inductive process which does not begin with an a priori definition or measurement of 'vulnerability' and which does not assume the existence of fixed 'vulnerable groups' allowed us both to re-affirm core aspects of existing conceptual frameworks, and to engage in new ways with literature specifically addressing vulnerability and resilience at the population level as well as with literature - for example in ecology, and on the concept of frailty in research on aging - with which researchers on health and poverty in Africa may not be familiar. We invite conceptual and empirical work on vulnerability in complex systems frameworks. These perspectives emphasize contexts and nonlinear causality, thus supporting analyses of vulnerability and resilience as both markers and emergent properties of dynamic interactions. We accept a working definition of vulnerability, and recognize that some definable groups of people are more likely than others to suffer harm from exposure to health risks. But we suggest that the real work - at both intellectual and policy/political levels - lies in understanding and responding to the dynamics, meanings and power relations underlying actual instances and processes of vulnerability and harm.

  9. Identifying change in the likelihood of violent recidivism: causal dynamic risk factors in the OASys violence predictor.

    Science.gov (United States)

    Howard, Philip D; Dixon, Louise

    2013-06-01

    Recent studies of multiwave risk assessment have investigated the association between changes in risk factors and violent recidivism. This study analyzed a large multiwave data set of English and Welsh offenders (N = 196,493), assessed in realistic correctional conditions using the static/dynamic Offender Assessment System (OASys). It aimed to compare the predictive validity of the OASys Violence Predictor (OVP) under mandated repeated assessment and one-time initial assessment conditions. Scores on 5 of OVP's 7 purportedly dynamic risk factors changed in 6 to 15% of pairs of successive assessments, whereas the other 2 seldom changed. Violent reoffenders had higher initial total and dynamic OVP scores than nonreoffenders, yet nonreoffenders' dynamic scores fell by significantly more between initial and final assessment. OVP scores from the current assessment achieved greater predictive validity than those from the initial assessment. Cox regression models showed that, for total OVP scores and most risk factors, both the initial score and the change in score from initial to current assessment significantly predicted reoffending. These results consistently showed that OVP includes several causal dynamic risk factors for violent recidivism, which can be measured reliably in operational settings. This adds to the evidence base that links changes in risk factors to changes in future reoffending risk and links the use of repeated assessments to incremental improvements in predictive validity. Further research could quantify the costs and benefits of reassessment in correctional practice, study associations between treatment and dynamic risk factors, and separate the effects of improvements and deteriorations in dynamic risk.

  10. Dynamic context discrimination: psychological evidence for the Sandia Cognitive Framework.

    Energy Technology Data Exchange (ETDEWEB)

    Speed, Ann Elizabeth

    2004-09-01

    Human behavior is a function of an iterative interaction between the stimulus environment and past experience. It is not simply a matter of the current stimulus environment activating the appropriate experience or rule from memory (e.g., if it is dark and I hear a strange noise outside, then I turn on the outside lights and investigate). Rather, it is a dynamic process that takes into account not only things one would generally do in a given situation, but things that have recently become known (e.g., there have recently been coyotes seen in the area and one is known to be rabid), as well as other immediate environmental characteristics (e.g., it is snowing outside, I know my dog is outside, I know the police are already outside, etc.). All of these factors combine to inform me of the most appropriate behavior for the situation. If it were the case that humans had a rule for every possible contingency, the amount of storage that would be required to enable us to fluidly deal with most situations we encounter would rapidly become biologically untenable. We can all deal with contingencies like the one above with fairly little effort, but if it isn't based on rules, what is it based on? The assertion of the Cognitive Systems program at Sandia for the past 5 years is that at the heart of this ability to effectively navigate the world is an ability to discriminate between different contexts (i.e., Dynamic Context Discrimination, or DCD). While this assertion in and of itself might not seem earthshaking, it is compelling that this ability and its components show up in a wide variety of paradigms across different subdisciplines in psychology. We begin by outlining, at a high functional level, the basic ideas of DCD. We then provide evidence from several different literatures and paradigms that support our assertion that DCD is a core aspect of cognitive functioning. Finally, we discuss DCD and the computational model that we have developed as an instantiation of DCD

  11. A Dynamic Ubiquitous Learning Resource Model with Context and Its Effects on Ubiquitous Learning

    Science.gov (United States)

    Chen, Min; Yu, Sheng Quan; Chiang, Feng Kuang

    2017-01-01

    Most ubiquitous learning researchers use resource recommendation and retrieving based on context to provide contextualized learning resources, but it is the kind of one-way context matching. Learners always obtain fixed digital learning resources, which present all learning contents in any context. This study proposed a dynamic ubiquitous learning…

  13. An Approach on Dynamic Geospatial Information Service Composition Based on Context Relationship

    Science.gov (United States)

    Cheng, D.; Wang, F.

    2011-08-01

    Geo-collaboration systems, such as on-line consultation meetings, create new demands for the dynamic integration of spatial data, processing models and the collaboration of processing functions, driven by the dynamic interaction among participants and their unpredictable demands for information. To address these demands, this paper presents an approach to dynamic geospatial information (GI) service composition that takes context relationships into account. First, it introduces a method for partitioning context relations in a consultation from three perspectives: between users and GI services, between services, and within a single service; it then constructs the GI service context relation and establishes an OWL-S-based GI service description model that incorporates it. On this basis, the paper proposes an approach to dynamic GI service composition driven by context relationships, provides a framework for such composition, and discusses each of its major components. Finally, an experiment on checking illegal construction along a boundary illustrates the concepts and ideas discussed in the paper.

  14. Orders of magnitude extension of the effective dynamic range of TDC-based TOFMS data through maximum likelihood estimation.

    Science.gov (United States)

    Ipsen, Andreas; Ebbels, Timothy M D

    2014-10-01

    In a recent article, we derived a probability distribution that was shown to closely approximate that of the data produced by liquid chromatography time-of-flight mass spectrometry (LC/TOFMS) instruments employing time-to-digital converters (TDCs) as part of their detection system. The approach of formulating detailed and highly accurate mathematical models of LC/MS data via probability distributions that are parameterized by quantities of analytical interest does not appear to have been fully explored before. However, we believe it could lead to a statistically rigorous framework for addressing many of the data analytical problems that arise in LC/MS studies. In this article, we present new procedures for correcting for TDC saturation using such an approach and demonstrate that there is potential for significant improvements in the effective dynamic range of TDC-based mass spectrometers, which could make them much more competitive with the alternative analog-to-digital converters (ADCs). The degree of improvement depends on our ability to generate mass and chromatographic peaks that conform to known mathematical functions and our ability to accurately describe the state of the detector dead time, tasks that may be best addressed through engineering efforts.
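
    The basic saturation correction is compact enough to sketch: for a non-paralyzable TDC that records at most one (the earliest) hit per extraction, the per-bin Poisson rate has the closed-form maximum likelihood estimate -ln(1 - k_i/n_i), where n_i is the number of extractions still "alive" at bin i. The rates below are invented, and the article's full procedure additionally fits known peak shapes, which is not reproduced here.

      import numpy as np

      rng = np.random.default_rng(2)
      pushes = 20000
      lam = np.array([0.02, 0.2, 1.5, 0.2, 0.02])   # true ions/push per time bin

      # Simulate the TDC: per push, only the earliest bin with a hit is recorded.
      arrivals = rng.poisson(lam, size=(pushes, lam.size)) > 0
      hit = arrivals.any(axis=1)
      first = np.argmax(arrivals, axis=1)
      k = np.bincount(first[hit], minlength=lam.size)     # saturated histogram

      # Bin i only "sees" pushes not already consumed by earlier bins.
      alive = pushes - np.concatenate(([0], np.cumsum(k)[:-1]))
      lam_hat = -np.log1p(-k / alive)                     # per-bin rate MLE
      print(np.round(lam_hat, 3), "vs true", lam)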

  16. On divergence tests for composite hypotheses under composite likelihood

    OpenAIRE

    Martin, Nirian; Pardo, Leandro; Zografos, Konstantinos

    2016-01-01

    It is well-known that in some situations it is not easy to compute the likelihood function, as the datasets might be large or the model too complex. In such contexts the composite likelihood, derived by multiplying the likelihoods of subsets of the variables, may be useful. The extension of the classical likelihood ratio test statistic to the framework of composite likelihoods provides a procedure for testing in the composite likelihood context. In this paper we intro...
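
    A tiny worked example of the construction: for an equicorrelated trivariate normal, the full likelihood is replaced by the product of the three bivariate pair likelihoods, and the common correlation is estimated by maximizing that product. The model and sample size below are arbitrary.

      import numpy as np
      from scipy.optimize import minimize_scalar

      rng = np.random.default_rng(3)
      rho_true, n = 0.6, 2000
      cov = np.full((3, 3), rho_true)
      np.fill_diagonal(cov, 1.0)
      x = rng.multivariate_normal(np.zeros(3), cov, size=n)

      def neg_pairwise_cl(rho):
          # Sum of bivariate standard-normal log-likelihoods, constants dropped.
          ll = 0.0
          for i, j in [(0, 1), (0, 2), (1, 2)]:
              a, b = x[:, i], x[:, j]
              q = (a**2 - 2*rho*a*b + b**2) / (1 - rho**2)
              ll += np.sum(-0.5 * q - 0.5 * np.log(1 - rho**2))
          return -ll

      rho_hat = minimize_scalar(neg_pairwise_cl, bounds=(-0.95, 0.95),
                                method="bounded").x
      print(f"composite likelihood estimate: {rho_hat:.3f} (true {rho_true})")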

  17. Optimizing the Quality of Dynamic Context Subscriptions for Scarce Network Resources

    DEFF Research Database (Denmark)

    Shawky, Ahmed; Olsen, Rasmus Løvenstein; Pedersen, Jens Myrup

    2012-01-01

    Scalable access to dynamic context information is a key challenge for future context-sensitive systems. When increasing the access frequency, the information accuracy can improve, but at the same time the additional context management traffic may reduce network performance, which creates the opposite effect on information reliability. In order to understand and control this trade-off, this paper develops a model that allows context reliability, captured by the so-called mismatch probability, to be calculated in relation to the network load. The model is subsequently used for a real time algorithm...
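
    The trade-off can be illustrated with a toy Monte Carlo estimate of a mismatch probability: a context element changes as a Poisson process and is read periodically over a network with delay, and mmPr is the chance that the cached value is stale when used. All rates and delays below are made up; the paper derives the quantity analytically and couples it to network load.

      import numpy as np

      rng = np.random.default_rng(4)
      change_rate = 0.5          # context changes per second
      poll_period = 1.0          # seconds between periodic reads
      delay = 0.2                # network delay of a read, seconds

      def mismatch_probability(trials=200_000):
          # Age of the cached value at a random use: time since the last
          # poll response, uniform over the period, plus the network delay.
          age = delay + rng.uniform(0.0, poll_period, size=trials)
          # Stale iff at least one change occurred within that age window.
          return np.mean(rng.random(trials) < 1.0 - np.exp(-change_rate * age))

      print(f"mmPr ~ {mismatch_probability():.3f}")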

  18. An Analysis of Discourse Production and Cognitive Context Dynamism Based on Relevance-Adaptation Model

    Institute of Scientific and Technical Information of China (English)

    YANG Jing

    2014-01-01

    Taking discourse production as its research object, this paper holds that discourse production is dynamic in human communication. It analyzes these dynamics on the basis of the Relevance-Adaptation model from the perspective of cognitive pragmatics and explains the role that cognitive context dynamism plays in discourse production.

  19. Rising Above Chaotic Likelihoods

    CERN Document Server

    Du, Hailiang

    2014-01-01

    Berliner (Likelihood and Bayesian prediction for chaotic systems, J. Am. Stat. Assoc. 1991) identified a number of difficulties in using the likelihood function within the Bayesian paradigm for state estimation and parameter estimation of chaotic systems. Even when the equations of the system are given, he demonstrated "chaotic likelihood functions" of initial conditions and parameter values in the 1-D Logistic Map. Chaotic likelihood functions, while ultimately smooth, have such complicated small scale structure as to cast doubt on the possibility of identifying high likelihood estimates in practice. In this paper, the challenge of chaotic likelihoods is overcome by embedding the observations in a higher dimensional sequence-space, which is shown to allow good state estimation with finite computational power. An Importance Sampling approach is introduced, where Pseudo-orbit Data Assimilation is employed in the sequence-space in order first to identify relevant pseudo-orbits and then relevant trajectories. Es...
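
    Berliner's difficulty is easy to reproduce numerically: with the Logistic Map at a = 4 and a little Gaussian observation noise, the log-likelihood of the initial condition is wildly jagged even across a tiny interval, so nearby grid points can differ enormously in likelihood. The constants below are illustrative only.

      import numpy as np

      rng = np.random.default_rng(5)
      a, n_steps, sigma = 4.0, 20, 0.01

      def orbit(x0, n):
          xs = [x0]
          for _ in range(n - 1):
              xs.append(a * xs[-1] * (1.0 - xs[-1]))
          return np.array(xs)

      obs = orbit(0.3, n_steps) + sigma * rng.standard_normal(n_steps)

      grid = np.linspace(0.29, 0.31, 20001)
      ll = np.array([-0.5 * np.sum((orbit(x0, n_steps) - obs) ** 2) / sigma**2
                     for x0 in grid])
      top = grid[np.argsort(ll)[-100:]]
      print("max-likelihood x0:", grid[np.argmax(ll)],
            "| spread of the 100 best candidates:", np.ptp(top))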

  20. Motivational Dynamics in Language Learning: Change, Stability, and Context

    Science.gov (United States)

    Waninge, Freerkien; Dörnyei, Zoltán; De Bot, Kees

    2014-01-01

    Motivation as a variable in L2 development is no longer seen as the stable individual difference factor it was once believed to be: Influenced by process-oriented models and principles, and especially by the growing understanding of how complex dynamic systems work, researchers have been focusing increasingly on the dynamic and changeable nature…

  1. Participatory Climate Research in a Dynamic Urban Context

    Science.gov (United States)

    Horton, R. M.

    2016-12-01

    The Consortium for Climate Risk in the Urban Northeast (CCRUN), one of ten NOAA RISAs, supports resilience efforts in the urban corridor stretching from Philadelphia to Boston. Challenges and opportunities include the diverse set of needs in broad urban contexts, as well as the integration of interdisciplinary perspectives. CCRUN is addressing these challenges through 1) stakeholder surveys, 2) webinar series that enable scientists to engage with stakeholders, 3) leveraging extreme events as focusing opportunities, and 4) the development of an integrated project framework. Moving forward, increasingly frequent extreme events can lead to unexpected detours, and further effort is needed to facilitate place-based research in an interdisciplinary context.

  2. Relevance of context and time-frame in bursty dynamics

    CERN Document Server

    Jo, Hang-Hyun; Perotti, Juan I; Kaski, Kimmo

    2013-01-01

    Inhomogeneous temporal processes in natural and social phenomena have been described in terms of bursts, i.e., rapidly occurring events within short periods alternating with long periods of low activity. Such a temporal process can be decomposed into sub-processes according to the contexts, i.e., the circumstances in which the events occur. Contextual bursts for each sub-process can then be related to context-blind bursts for the original process. This requires studying contextual bursts in the real time-frame as well as in the ordinal time-frame, where the real timings of events are replaced by their orders in the event sequence. By analyzing a model of uncorrelated inter-event times, we find that contextual bursts in the real time-frame can be dominated by either context-blind bursts or contextual bursts in the ordinal time-frame, or be characterized by both. These results on the relevance of context and time-frame give insight into the origin of bursts.
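
    A toy decomposition illustrates the vocabulary: label each event with a context, then compare the burstiness coefficient B = (s - m)/(s + m) of inter-event times (standard deviation s, mean m) in the real and ordinal time-frames. The event generator and labels below are arbitrary stand-ins, not the paper's model.

      import numpy as np

      rng = np.random.default_rng(6)
      n = 5000
      times = np.cumsum(rng.pareto(1.5, n) + 0.01)    # bursty event timings
      contexts = rng.integers(0, 3, n)                # context label per event

      def burstiness(iet):
          m, s = iet.mean(), iet.std()
          return (s - m) / (s + m)

      print("context-blind, real time-frame:", round(burstiness(np.diff(times)), 3))
      for c in range(3):
          real = np.diff(times[contexts == c])
          ordinal = np.diff(np.flatnonzero(contexts == c)).astype(float)
          print(f"context {c}: real {burstiness(real):+.3f}, "
                f"ordinal {burstiness(ordinal):+.3f}")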

  3. Product of Likelihood Ratio Scores Fusion of Dynamic Face, Text Independent Speech and On-line Signature Based Biometrics Verification Application Systems

    Directory of Open Access Journals (Sweden)

    Mohamed SOLTANE

    2015-09-01

    Full Text Available In this paper, the use of a finite Gaussian mixture model (GMM), tuned using the Expectation Maximization (EM) algorithm, for score-level data fusion is proposed. Automated biometric systems for human identification measure a “signature” of the human body, compare the resulting characteristic to a database, and render an application dependent decision. These biometric systems for personal authentication and identification are based upon physiological or behavioral features which are typically distinctive. Multi-biometric systems, which consolidate information from multiple biometric sources, are gaining popularity because they are able to overcome limitations such as non-universality, noisy sensor data, large intra-user variations and susceptibility to spoof attacks that are commonly encountered in mono-modal biometric systems. Simulation results show that the finite Gaussian mixture model is quite effective in modelling the genuine and impostor score densities, and that fusion based on the product of likelihood ratios achieves a significant performance on the eNTERFACE 2005 multi-biometric database based on dynamic face, on-line signature and text-independent speech modalities.
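
    A compact sketch of the fusion rule: per modality, genuine and impostor score densities are modelled by GMMs fitted with EM, and the fused statistic is the product of per-modality likelihood ratios (a sum of log-ratios). The score distributions below are synthetic placeholders, not eNTERFACE 2005 data.

      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(7)
      modalities = 3                                   # e.g. face, speech, signature
      gen = rng.normal(2.0, 1.0, (500, modalities))    # toy genuine training scores
      imp = rng.normal(0.0, 1.2, (500, modalities))    # toy impostor training scores

      test = np.vstack([rng.normal(2.0, 1.0, (200, modalities)),   # genuine trials
                        rng.normal(0.0, 1.2, (200, modalities))])  # impostor trials
      llr = np.zeros(len(test))
      for m in range(modalities):
          g = GaussianMixture(2, random_state=0).fit(gen[:, [m]])
          i = GaussianMixture(2, random_state=0).fit(imp[:, [m]])
          llr += g.score_samples(test[:, [m]]) - i.score_samples(test[:, [m]])

      accept = llr > 0.0
      print("genuine accept rate:", accept[:200].mean(),
            "| impostor accept rate:", accept[200:].mean())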

  4. UNESCO and World Heritage: National Contexts, International Dynamics

    DEFF Research Database (Denmark)

    Andersen, Casper; Kozymka, Irena

    … contexts that shape the ideas and practices of the World Heritage system. This includes examining the influence of the World Heritage system on the behaviour of nation states, both domestically and internationally, but also shedding light on how national traditions of heritage management and national … particularly patchy. This includes studies of East and North Africa, Asia and the Pacific, Europe and North America, and the former Soviet Union.

  5. DynamicSchema: a lightweight persistency framework for context-oriented data management

    OpenAIRE

    Castro, Sergio; González, Sebastián; Mens, Kim; Denker, Marcus

    2012-01-01

    While context-oriented programming technology so far has focused mostly on behavioral adaptation, context-oriented data management has received much less attention. In this paper we make a case for the problem of context-oriented data management, using a concrete example of a mobile application. We illustrate some of the issues involved and propose a lightweight persistency framework, called DynamicSchema, that resolves some of these issues. The solution consists in a ...

  6. Dynamic Planet Mercury in the Context of Its Environment

    CERN Document Server

    Clark, Pamela Elizabeth

    2007-01-01

    We are in a time of transition in our understanding of Mercury. Of particular interest here is the emerging picture of the planet as a system, with interactions between interior, surface, exosphere, and magnetosphere that have influenced and constrained the evolution of each part of the system. Previous books have emphasized the results of Mariner 10 and current ground-based measurements, with very little discussion of the nature and influence of the magnetosphere. This book will present the planet in the context of its surroundings, thus providing a foundation for the next major influx of information from Mercury and contributing to the planning for future missions.

  7. Algorithm to illustrate context using dynamic lighting effects

    Science.gov (United States)

    John, Roshy M.; Balasubramanian, T.

    2007-09-01

    With the invention of the ultra-bright LED, solid-state lighting has become far more efficient and energy-saving than conventional incandescent or fluorescent lighting. With proper driver electronics, it is now possible to install solid-state lighting systems at a cost comparable to that of any other lighting technology. This paper is part of a research project in our lab on using ultra-bright LEDs of different colors for lighting applications. The driver electronics are designed so that the color and brightness of the lights change according to context. For instance, if a user is reading a story or listening to music on a personal computer or a handheld device such as a PDA, the lighting and HVAC (heating, ventilation and air-conditioning) systems change dramatically according to the content of the story or the music. The versatility of solid-state lighting makes such an effect possible, helping the reader to experience the story both mentally and physically. We developed complete driver electronics for the system using multiple microcomputers, together with a full software suite that uses complex algorithms to infer context from text or music and synchronize it with the lighting and HVAC information. The paper also presents case-study statistics which show the advantages of using the system to teach kindergarten children and deaf and mute children, and in language learning classes.

  8. Dynamic landscapes: a model of context and contingency in evolution.

    Science.gov (United States)

    Foster, David V; Rorick, Mary M; Gesell, Tanja; Feeney, Laura M; Foster, Jacob G

    2013-10-01

    Although the basic mechanics of evolution have been understood since Darwin, debate continues over whether macroevolutionary phenomena are driven by the fitness structure of genotype space or by ecological interaction. In this paper we propose a simple model capturing key features of fitness-landscape and ecological models of evolution. Our model describes evolutionary dynamics in a high-dimensional, structured genotype space with interspecies interaction. We find promising qualitative similarity with the empirical facts about macroevolution, including broadly distributed extinction sizes and realistic exploration of the genotype space. The abstraction of our model permits numerous applications beyond macroevolution, including protein and RNA evolution.

  9. Dynamic Bayesian Networks for Context-Aware Fall Risk Assessment

    Directory of Open Access Journals (Sweden)

    Gregory Koshmak

    2014-05-01

    Full Text Available Fall incidents among the elderly often occur in the home and can cause serious injuries affecting their independent living. This paper presents an approach where data from wearable sensors integrated in a smart home environment are combined using a dynamic Bayesian network. The smart home environment provides contextual data, obtained from environmental sensors, and contributes to assessing a fall risk probability. The evaluation of the developed system is performed through simulation. Each time step is represented by a single user activity and a reading from a fall sensor located on a mobile device. A posterior probability is calculated for each recognized activity or piece of contextual information. The output of the system provides a total risk assessment of falling given a response from the fall sensor.
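
    The flavour of the fusion can be shown with a plain discrete Bayes update over a binary fall-risk state, combining an activity context with a wearable fall-sensor reading. Every probability below is an invented placeholder, not a value from the paper's network.

      risk_prior = {"low": 0.9, "high": 0.1}

      # P(observation | risk); an observation is (recognized activity, alarm?).
      likelihood = {
          ("walking", False):  {"low": 0.70, "high": 0.20},
          ("walking", True):   {"low": 0.05, "high": 0.30},
          ("on_stairs", True): {"low": 0.02, "high": 0.45},
      }

      def update(prior, obs):
          unnorm = {r: prior[r] * likelihood[obs][r] for r in prior}
          z = sum(unnorm.values())
          return {r: p / z for r, p in unnorm.items()}

      belief = risk_prior
      for obs in [("walking", False), ("on_stairs", True)]:
          belief = update(belief, obs)
          print(obs, "->", {r: round(p, 3) for r, p in belief.items()})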

  10. A single-rate context-dependent learning process underlies rapid adaptation to familiar object dynamics.

    Directory of Open Access Journals (Sweden)

    James N Ingram

    2011-09-01

    Full Text Available Motor learning has been extensively studied using dynamic (force-field perturbations. These induce movement errors that result in adaptive changes to the motor commands. Several state-space models have been developed to explain how trial-by-trial errors drive the progressive adaptation observed in such studies. These models have been applied to adaptation involving novel dynamics, which typically occurs over tens to hundreds of trials, and which appears to be mediated by a dual-rate adaptation process. In contrast, when manipulating objects with familiar dynamics, subjects adapt rapidly within a few trials. Here, we apply state-space models to familiar dynamics, asking whether adaptation is mediated by a single-rate or dual-rate process. Previously, we reported a task in which subjects rotate an object with known dynamics. By presenting the object at different visual orientations, adaptation was shown to be context-specific, with limited generalization to novel orientations. Here we show that a multiple-context state-space model, with a generalization function tuned to visual object orientation, can reproduce the time-course of adaptation and de-adaptation as well as the observed context-dependent behavior. In contrast to the dual-rate process associated with novel dynamics, we show that a single-rate process mediates adaptation to familiar object dynamics. The model predicts that during exposure to the object across multiple orientations, there will be a degree of independence for adaptation and de-adaptation within each context, and that the states associated with all contexts will slowly de-adapt during exposure in one particular context. We confirm these predictions in two new experiments. Results of the current study thus highlight similarities and differences in the processes engaged during exposure to novel versus familiar dynamics. In both cases, adaptation is mediated by multiple context-specific representations. In the case of familiar
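
    A minimal sketch of such a single-rate, multiple-context learner: one adaptation state per trained visual orientation, a Gaussian generalization kernel over orientation, and the usual error-driven state-space update. Retention, learning rate and kernel width below are invented, not the fitted values from the study.

      import numpy as np

      orientations = np.arange(0, 360, 45)          # trained contexts, degrees
      A, B, width = 0.99, 0.2, 40.0                 # retention, learning, tuning
      state = np.zeros(orientations.size)

      def kernel(d):
          d = np.minimum(np.abs(d), 360 - np.abs(d))    # circular distance
          return np.exp(-0.5 * (d / width) ** 2)

      def trial(theta, perturbation):
          global state
          k = kernel(orientations - theta)
          error = perturbation - state @ k          # net compensation at theta
          state = A * state + B * error * k         # single-rate, context-weighted
          return error

      for _ in range(30):                           # adapt at 0 degrees
          trial(0, 1.0)
      for probe in (0, 45, 90, 180):                # probe generalization
          print(probe, "deg residual error:", round(trial(probe, 1.0), 3))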

  11. Measurement of the Top Quark Mass with the Dynamical Likelihood Method using Lepton plus Jets Events with b-tags in ppbar Collisions at s**(1/2) = 1.96 TeV

    CERN Document Server

    Abulencia, A; Adelman, J; Affolder, T; Akimoto, T; Albrow, M G; Ambrose, D; Amerio, S; Amidei, D; Anastassov, A; Anikeev, K; Annovi, A; Antos, J; Aoki, M; Apollinari, G; Arguin, J F; Arisawa, T; Artikov, A; Ashmanskas, W; Attal, A; Azfar, F; Azzi-Bacchetta, P; Azzurri, P; Bacchetta, N; Bachacou, H; Badgett, W; Barbaro-Galtieri, A; Barnes, V E; Barnett, B A; Baroiant, S; Bartsch, V; Bauer, G; Bedeschi, F; Behari, S; Belforte, S; Bellettini, G; Bellinger, J; Belloni, A; Ben-Haim, E; Benjamin, D; Beretvas, A; Beringer, J; Berry, T; Bhatti, A; Binkley, M; Bisello, D; Bishai, M; Blair, R E; Blocker, C; Bloom, K; Blumenfeld, B; Bocci, A; Bodek, A; Boisvert, V; Bölla, G; Bolshov, A; Bortoletto, D; Boudreau, J; Bourov, S; Boveia, A; Brau, B; Bromberg, C; Brubaker, E; Budagov, Yu A; Budd, H S; Budd, S; Burkett, K; Busetto, G; Bussey, P; Byrum, K L; Cabrera, S; Campanelli, M; Campbell, M; Canelli, F; Canepa, A; Carlsmith, D; Carosi, R; Carron, S; Casarsa, M; Castro, A; Catastini, P; Cauz, D; Cavalli-Sforza, M; Cerri, A; Cerrito, L; Chang, S H; Chapman, J; Chen, Y C; Chertok, M; Chiarelli, G; Chlachidze, G; Chlebana, F; Cho, I; Cho, K; Chokheli, D; Chou, J P; Chu, P H; Chuang, S H; Chung, K; Chung, W H; Chung, Y S; Ciljak, M; Ciobanu, C I; Ciocci, M A; Clark, A; Clark, D; Coca, M; Connolly, A; Convery, M E; Conway, J; Cooper, B; Copic, K; Cordelli, M; Cortiana, G; Cruz, A; Cuevas-Maestro, J; Culbertson, R; Cyr, D; D'Auria, S; D'onofrio, M; Da Ronco, S; Dagenhart, D; De Barbaro, P; De Cecco, S; De Lentdecker, G; De Pedis, D; Deisher, A; Dell'Orso, Mauro; Demers, S; Demortier, L; Deng, J; Deninno, M; Derwent, P F; Di Giovanni, G P; Di Turo, P; Dionisi, C; Dittmann, J R; Dominguez, A; Donati, S; Donega, M; Dong, P; Donini, J; Dorigo, T; Dorr, C; Dube, S; Ebina, K; Efron, J; Ehlers, J; Erbacher, R; Errede, D; Errede, S; Eusebi, R; Fang, H C; Farrington, S; Fedorko, I; Fedorko, W T; Feild, R G; Feindt, M; Fernández, J P; Field, R; Flanagan, G; Flores-Castillo, L R; Foland, A; Forrester, S; Foster, G W; Franklin, M; Freeman, J C; Fujii, Y; Furic, I; Gajjar, A; Gallinaro, M; Galyardt, J; García, J E; García-Sciveres, M; Garfinkel, A F; Gay, C; Gerberich, H; Gerchtein, E; Gerdes, D; Giagu, S; Giannetti, P; Gibson, A; Gibson, K; Ginsburg, C; Giokaris, N; Giolo, K; Giordani, M; Giunta, M; Giurgiu, G; Glagolev, V; Glenzinski, D A; Gold, M; Goldschmidt, N; Goldstein, J; Goncharov, M; González, O; Gorelov, I; Goshaw, A T; Gotra, Yu; Goulianos, K; Gresele, A; Griffiths, M; Grinstein, S; Grosso-Pilcher, C; Grundler, U; Guimarães da Costa, J; Gómez, G; Gómez-Ceballos, G; Haber, C; Hahn, K; Hahn, S R; Halkiadakis, E; Hamilton, A; Han, B Y; Handler, R; Happacher, F; Hara, K; Hare, M; Harper, S; Harr, R F; Harris, R M; Hatakeyama, K; Hauser, J; Hays, C; Hayward, H; Heijboer, A; Heinemann, B; Heinrich, J; Hennecke, M; Herndon, M; Heuser, J; Hidas, D; Hill, C S; Hirschbuehl, D; Holloway, A; Hou, S; Houlden, M; Hsu, S C; Huffman, B T; Hughes, R E; Huston, J; Höcker, A; Ikado, K; Incandela, J R; Introzzi, G; Iori, M; Ishizawa, Y; Ivanov, A; Iyutin, B; James, E; Jang, D; Jayatilaka, B; Jeans, D; Jensen, H; Jeon, E J; Jones, M; Joo, K K; Jun, S Y; Junk, T R; Kamon, T; Kang, J; Karagoz-Unel, M; Karchin, P E; Kato, Y; Kemp, Y; Kephart, R; Kerzel, U; Khotilovich, V; Kilminster, B; Kim, D H; Kim, H S; Kim, J E; Kim, M J; Kim, M S; Kim, S B; Kim, S H; Kim, Y K; Kirby, M; Kirsch, L; Klimenko, S; Klute, M; Knuteson, B; Ko, B R; Kobayashi, H; Kondo, K; Kong, D J; Konigsberg, J; Kordas, K; Korytov, A; Kotwal, A V; Kovalev, A; 
Kraus, J; Kravchenko, I; Kreps, M; Kreymer, A; Kroll, J; Krumnack, N; Kruse, M; Krutelyov, V; Kuhlmann, S E; Kusakabe, Y; Kwang, S; Laasanen, A T; Lai, S; Lami, S; Lammel, S; Lancaster, M; Lander, R L; Lannon, K; Lath, A; Latino, G; Lazzizzera, I; LeCompte, T; Lecci, C; Lee, J; Lee, S W; Lefèvre, R; Leonardo, N; Leone, S; Levy, S; Lewis, J D; Li, K; Lin, C; Lin, C S; Lindgren, M; Lipeles, E; Liss, T M; Lister, A; Litvintsev, D O; Liu, T; Liu, Y; Lockyer, N S; Loginov, A; Loreti, M; Loverre, P F; Lu, R S; Lucchesi, D; Lujan, P; Lukens, P; Lungu, G; Lyons, L; Lys, J; Lysak, R; Lytken, E; MacQueen, D; Mack, P; Madrak, R; Maeshima, K; Maksimovic, P; Manca, G; Margaroli, F; Marginean, R; Marino, C; Martin, A; Martin, M; Martin, V; Martínez, M; Maruyama, T; Matsunaga, H; Mattson, M E; Mazini, R; Mazzanti, P; McFarland, K S; McGivern, D; McIntyre, P; McNamara, P; McNulty, R; Mehta, A; Menzemer, S; Menzione, A; Merkel, P; Mesropian, C; Messina, A; Miao, T; Miladinovic, N; Miles, J; Miller, J S; Miller, R; Mills, C; Milnik, M; Miquel, R; Miscetti, S; Mitselmakher, G; Miyamoto, A; Moggi, N; Mohr, B; Moore, R; Morello, M; Movilla-Fernández, P A; Mukherjee, A; Mulhearn, M; Mumford, R; Murat, P; Müller, T; Mülmenstädt, J; Nachtman, J; Nahn, S; Nakano, I; Napier, A; Naumov, D; Necula, V; Neu, C; Neubauer, M S; Nielsen, J; Nigmanov, T; Nodulman, L; Norniella, O; Ogawa, T; Oh, S H; Oh, Y D; Okusawa, T; Oldeman, R; Orava, R; Pagliarone, C; Palencia, E; Paoletti, R; Papadimitriou, V; Papikonomou, A; Paramonov, A A; Parks, B; Pashapour, S; Patrick, J; Pauletta, G; Paulini, M; Paus, C; Pellett, D E; Penzo, Aldo L; Phillips, T J; Piacentino, G; Piedra, J; Pitts, K; Plager, C; Pondrom, L; Pope, G; Portell, X; Poukhov, O; Pounder, N; Prakoshyn, F; Pronko, A; Proudfoot, J; Ptohos, F; Punzi, G; Pursley, J; Rademacker, J; Rahaman, A; Rakitine, A; Rappoccio, S; Ratnikov, F; Reisert, B; Rekovic, V; Renton, P B; Rescigno, M; Richter, S; Rimondi, F; Rinnert, K; Ristori, L; Robertson, W J; Robson, A; Rodrigo, T; Rogers, E; Rolli, S; Roser, R; Rossi, M; Rossin, R; Rott, C; Ruiz, A; Russ, J; Rusu, V; Ryan, D; Saarikko, H; Sabik, S; Safonov, A; Saint-Denis, R; Sakumoto, W K; Salamanna, G; Salto, O; Saltzberg, D; Santi, L; Sarkar, S; Sato, K; Savard, P; Savoy-Navarro, A; Scheidle, T; Schlabach, P; Schmidt, E E; Schmidt, M P; Schmitt, M; Schwarz, T; Scodellaro, L; Scott, A L; Scribano, A; Scuri, F; Sedov, A; Seidel, S; Seiya, Y; Semenov, A; Semeria, F; Sexton-Kennedy, L; Sfiligoi, I; Shapiro, M D; Shears, T G; Shepard, P F; Sherman, D; Shimojima, M; Shochet, M; Shon, Y; Shreyber, I; Sidoti, A; Sill, A; Sinervo, P; Sisakian, A; Sjölin, J; Skiba, A; Slaughter, A J; Sliwa, K; Smirnov, D; Smith, J R; Snider, F D; Snihur, R; Soha, A; Somalwar, S; Sorin, V; Spalding, J; Spinella, F; Squillacioti, P; Stanitzki, M; Staveris-Polykalas, A; Stelzer, B; Stelzer-Chilton, O; Stentz, D; Strologas, J; Stuart, D; Suh, J S; Sukhanov, A; Sumorok, K; Sun, H; Suzuki, T; Sánchez, C; Söderberg, M; Taffard, A; Tafirout, R; Takashima, R; Takeuchi, Y; Takikawa, K; Tanaka, M; Tanaka, R; Tecchio, M; Teng, P K; Terashi, K; Tether, S; Thom, J; Thompson, A S; Thomson, E; Tipton, P; Tiwari, V; Tkaczyk, S; Toback, D; Tokar, S; Tollefson, K; Tomura, T; Tonelli, D; Tonnesmann, M; Torre, S; Torretta, D; Tourneur, S; Trischuk, W; Tsuchiya, R; Tsuno, S; Turini, N; Ukegawa, F; Unverhau, T; Uozumi, S; Usynin, D; Vacavant, L; Vaiciulis, A W; Vallecorsa, S; Van Remortel, N; Varganov, A; Vataga, E; Velev, G; Veramendi, G; Veszpremi, V; Vickey, T; Vidal, R; Vila, I; 
Vilar, R; Vollrath, I; Volobuev, I P; Von der Mey, M; Wagner, P; Wagner, R G; Wagner, R L; Wagner, W; Wallny, R; Walter, T; Wan, Z; Wang, M J; Wang, S M; Warburton, A; Ward, B; Waschke, S; Waters, D; Watts, T; Weber, M; Wester, W C; Whitehouse, B; Whiteson, D; Wicklund, A B; Wicklund, E; Williams, H H; Wilson, P; Winer, B L; Wittich, P; Wolbers, S; Wolfe, C; Worm, S; Wright, T; Wu, X; Wynne, S M; Würthwein, F; Yagil, A; Yamamoto, K; Yamaoka, J; Yamashita, Y; Yang, C; Yang, U K; Yao, W M; Yeh, G P; Yoh, J; Yorita, K; Yoshida, T; Yu, I; Yu, S S; Yun, J C; Zanello, L; Zanetti, A; Zaw, I; Zetti, F; Zhang, X; Zhou, J; Zucchelli, S; Österberg, K

    2006-01-01

    This report describes a measurement of the top quark mass, M_{top}, with the dynamical likelihood method (DLM) using the CDF II detector at the Fermilab Tevatron. The Tevatron produces top/anti-top pairs in proton-antiproton collisions at a center-of-mass energy of 1.96 TeV. The data sample used in this analysis was accumulated from March 2002 through August 2004, which corresponds to an integrated luminosity of 318 pb^{-1}. We use the top/anti-top candidates in the "lepton+jets" decay channel, requiring at least one jet identified as a b quark by finding a displaced secondary vertex. The DLM defines a likelihood for each event based on the differential cross section as a function of M_{top} per unit phase space volume of the final partons, multiplied by the transfer functions from jet to parton energies. The method takes into account all possible jet combinations in an event, and the likelihood is multiplied event by event to derive the top quark mass by the maximum likelihood method. Using 63 top q...

  12. Enhancing Learners' Emotions in an L2 Context through Emotionalized Dynamic Assessment

    Science.gov (United States)

    Abdolrezapour, Parisa; Tavakoli, Mansoor; Ketabi, Saeed

    2013-01-01

    The aim of this study was to gain a more in-depth understanding of students' emotions in an EFL context by applying dynamic assessment (DA) procedures to the development of learners' emotional intelligence. The study, with 50 intermediate learners aged 12-15, used three modalities: a control group, which was taught under the institute's normal procedures;…

  13. Evaluation of the inelastic heat fraction in the context of microstructure supported dynamic plasticity modelling

    OpenAIRE

    Longère, Patrice; Dragon, A. André

    2008-01-01

  14. Equalized near maximum likelihood detector

    OpenAIRE

    2012-01-01

    This paper presents a new detector used to mitigate the intersymbol interference introduced by bandlimited channels. The detector, named the equalized near maximum likelihood detector, combines a nonlinear equalizer with a near maximum likelihood detector. Simulation results show that the performance of the equalized near maximum likelihood detector is better than that of the nonlinear equalizer alone but worse than that of the near maximum likelihood detector.
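
    A toy comparison makes the idea concrete: on a short BPSK block through an invented ISI channel, a linear least-squares equalizer is cheap but suboptimal, while exact ML sequence detection searches every candidate sequence; a "near maximum likelihood" detector would prune that search to a few survivors.

      import numpy as np
      from itertools import product

      rng = np.random.default_rng(8)
      h = np.array([1.0, 0.6, 0.3])                 # toy channel impulse response
      bits = rng.choice([-1.0, 1.0], size=10)
      y = np.convolve(bits, h)[:len(bits)] + 0.4 * rng.standard_normal(len(bits))

      # Linear equalizer: invert the channel matrix in the least-squares sense.
      H = np.array([[h[i - j] if 0 <= i - j < len(h) else 0.0
                     for j in range(len(bits))] for i in range(len(bits))])
      zf = np.sign(np.linalg.lstsq(H, y, rcond=None)[0])

      # Exhaustive ML sequence detection: minimise ||y - H s||^2 over all s.
      best, best_cost = None, np.inf
      for cand in product([-1.0, 1.0], repeat=len(bits)):
          cost = np.sum((y - H @ np.array(cand)) ** 2)
          if cost < best_cost:
              best, best_cost = np.array(cand), cost

      print("equalizer bit errors:", int(np.sum(zf != bits)),
            "| ML bit errors:", int(np.sum(best != bits)))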

  15. Dynamic Shared Context Processing in an E-Collaborative Learning Environment

    CERN Document Server

    Peng, Jing; Deniaud, Samuel; Ferney, Michel

    2012-01-01

    In this paper, we propose a dynamic shared context processing method based on the DSC (Dynamic Shared Context) model, applied in an e-collaborative learning environment. Firstly, we present the model, which provides a way to measure the relevance between events and roles in collaborative environments. With this method, we can share the most appropriate event information with each role instead of sharing all information with all roles in a collaborative work environment. Then, we apply and verify this method in our project with a Google Apps-supported e-collaborative learning environment. During this experiment, we compared the relevance of events and roles measured by the DSC method to manually measured relevance, and we describe the favorable points and findings from this comparison. Finally, we discuss our future research on a hybrid DSC method to make dynamic information sharing more effective in a collaborative work environment.

  16. Temporal dynamics of Arc gene induction in hippocampus: relationship to context memory formation.

    Science.gov (United States)

    Pevzner, Aleksandr; Miyashita, Teiko; Schiffman, Aaron J; Guzowski, John F

    2012-03-01

    Past studies have proposed a role for the hippocampus in the rapid encoding of context memories. Despite this, there is little data regarding the molecular processes underlying the stable formation of a context representation that occurs in the time window established through such behavioral studies. One task that is useful for investigating the rapid encoding of context is contextual fear conditioning (CFC). Behavioral studies demonstrate that animals require approximately 30 s of exploration prior to a footshock to form a contextual representation supporting CFC. Thus, any potential molecular process required for the stabilization of the cellular representation for context must be activated within this narrow and behaviorally defined time window. Detection of the immediate-early gene Arc presents an ideal method to assess the activation of specific neuronal ensembles, given past studies showing the context specific expression of Arc in CA3 and CA1 subfields and the role of Arc in hippocampal long-term synaptic plasticity. Therefore, we examined the temporal dynamics of Arc induction within the hippocampus after brief context exposure to determine whether experience-dependent Arc expression could be involved in the rapid encoding of incidental context memories. We found that the duration of context exposure differentially activated Arc expression in hippocampal subfields, with CA3 showing rapid engagement within as little as 3 s of exposure. By contrast, Arc induction in CA1 required 30 s of context exposure to reach maximal levels. A parallel behavioral experiment revealed that 30 s, but not 3 s, exposure to a context resulted in strong conditioned freezing 24 h later, consistent with past studies from other laboratories. The current study is the first to examine the rapid temporal dynamics of Arc induction in hippocampus in a well-defined context memory paradigm. These studies demonstrate within 30 s of context exposure Arc is fully activated in CA3 and CA1

  17. Context and group dynamics in a CBPR-developed HIV prevention intervention.

    Science.gov (United States)

    Dickson-Gomez, Julia; Corbett, A Michelle; Bodnar, Gloria; Zuniga, Maria Ofelia; Guevara, Carmen Eugenia; Rodriguez, Karla; Navas, Verónica

    2016-03-01

    This paper will explore in detail the effects of context and group dynamics on the development of a multi-level community-based HIV prevention intervention for crack cocaine users in the San Salvador Metropolitan Area, El Salvador. Community partners included residents from marginal communities, service providers from the historic center of San Salvador and research staff from a non-profit organization. The community contexts from which partners came varied considerably and affected structural group dynamics, i.e. who was identified as community partners, their research and organizational capacity, and their ability to represent their communities, with participants from marginal communities most likely to hold community leadership positions and be residents, and those from the center of San Salvador most likely to work in religious organizations dedicated to HIV prevention or feeding indigent drug users. These differences also affected the intervention priorities of different partners. The context of communities changed over time, particularly levels of violence, and affected group dynamics and the intervention developed. Finally, strategies were needed to elicit input from stakeholders under-represented in the community advisory board, in particular active crack users, in order to check the feasibility of the proposed intervention and revise it as necessary. Because El Salvador is a very different context than that in which most CBPR studies have been conducted, our results reveal important contextual factors and their effects on partnerships not often considered in the literature.

  18. Ubiquitous Geo-Sensing for Context-Aware Analysis: Exploring Relationships between Environmental and Human Dynamics

    Directory of Open Access Journals (Sweden)

    Euro Beinat

    2012-07-01

    Full Text Available Ubiquitous geo-sensing enables context-aware analyses of physical and social phenomena, i.e., analyzing one phenomenon in the context of another. Although such context-aware analysis can potentially enable a more holistic understanding of spatio-temporal processes, it has rarely been documented in the scientific literature so far. In this paper we analyzed collective human behavior in the context of the weather. We therefore explored the complex relationships between these two spatio-temporal phenomena to provide novel insights into the dynamics of urban systems. Aggregated mobile phone data, which served as a proxy for collective human behavior, was linked with the weather data from climate stations in the case study area, the city of Udine, Northern Italy. To identify and characterize potential patterns within the weather-human relationships, we developed a hybrid approach which integrates several spatio-temporal statistical analysis methods. We show that exploratory factor analysis, when applied to a number of meteorological variables, can be used to differentiate between normal and adverse weather conditions. Further, we measured the strength of the relationship between the ‘global’ adverse weather conditions and the spatially explicit effective variations in user-generated mobile network traffic for three distinct periods using the Maximal Information Coefficient (MIC). The analyses result in three spatially referenced maps of MICs which reveal interesting insights into collective human dynamics in the context of weather, but also initiate several new scientific challenges.

  19. Maximum Likelihood Associative Memories

    OpenAIRE

    Gripon, Vincent; Rabbat, Michael

    2013-01-01

    Associative memories are structures that store data in such a way that it can later be retrieved given only a part of its content -- a sort-of error/erasure-resilience property. They are used in applications ranging from caches and memory management in CPUs to database engines. In this work we study associative memories built on the maximum likelihood principle. We derive minimum residual error rates when the data stored comes from a uniform binary source. Second, we determine the minimum amo...
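
    The retrieval rule itself is tiny: for a uniform binary source and an erasure query, maximum likelihood retrieval reduces to matching the unerased coordinates, i.e., minimum Hamming distance over the known positions. Sizes below are arbitrary.

      import numpy as np

      rng = np.random.default_rng(9)
      memory = rng.integers(0, 2, size=(1000, 16))    # stored binary messages

      query = memory[123].astype(float)
      query[[2, 5, 11, 12]] = np.nan                  # erase four positions

      known = ~np.isnan(query)
      dist = np.sum(memory[:, known] != query[known], axis=1)
      candidates = np.flatnonzero(dist == dist.min())
      print("ML candidates:", candidates,
            "| true index recovered:", 123 in candidates)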

  20. A Likelihood-Based SLIC Superpixel Algorithm for SAR Images Using Generalized Gamma Distribution

    Directory of Open Access Journals (Sweden)

    Huanxin Zou

    2016-07-01

    Full Text Available The simple linear iterative clustering (SLIC) method is a recently proposed popular superpixel algorithm. However, this method may generate bad superpixels for synthetic aperture radar (SAR) images due to the effects of speckle and the large dynamic range of pixel intensity. In this paper, an improved SLIC algorithm for SAR images is proposed. This algorithm exploits the likelihood information of SAR image pixel clusters. Specifically, a local clustering scheme combining intensity similarity with spatial proximity is proposed. Additionally, for post-processing, a local edge-evolving scheme that combines spatial context and likelihood information is introduced as an alternative to the connected components algorithm. To estimate the likelihood information of SAR image clusters, we incorporated a generalized gamma distribution (GΓD). Finally, the superiority of the proposed algorithm was validated using both simulated and real-world SAR images.
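
    The likelihood-aware distance at the heart of such a scheme can be sketched on a toy one-dimensional "image": pixels are assigned to the cluster minimizing spatial distance plus the negative log-likelihood of their intensity under the cluster's fitted distribution. A plain gamma distribution stands in here for the generalized gamma of the paper, and the weights are arbitrary.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(10)
      # Two regions of speckle-like intensities with different mean levels.
      intensity = np.concatenate([rng.gamma(4.0, 1.0, 100),
                                  rng.gamma(4.0, 3.0, 100)])
      position = np.arange(intensity.size, dtype=float)
      centers = np.array([50.0, 150.0])
      labels = (position > 100).astype(int)           # crude initialisation

      for _ in range(5):
          params = [stats.gamma.fit(intensity[labels == k], floc=0) for k in (0, 1)]
          d = np.stack([np.abs(position - centers[k]) / 100.0
                        - stats.gamma.logpdf(intensity, *params[k])
                        for k in (0, 1)])
          labels = np.argmin(d, axis=0)
          centers = np.array([position[labels == k].mean() for k in (0, 1)])

      print("recovered boundary near index:", int(np.argmax(np.diff(labels) != 0)))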

  1. Dynamical control of quantum systems in the context of mean ergodic theorems

    Science.gov (United States)

    Bernád, J. Z.

    2017-02-01

    Equidistant and non-equidistant single-pulse ‘bang-bang’ dynamical controls are investigated in the context of mean ergodic theorems. We show the requirements under which the infinite-pulse limit of both the equidistant and the non-equidistant dynamical control converges to the same unitary evolution. It is demonstrated that the generator of this evolution can be obtained by projecting the generator of the free evolution onto the commutant of the unitary operator representing the pulse. Inequalities are derived to prove this statement, and in the case of the non-equidistant approach these inequalities are optimised as a function of the time intervals.
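
    The projection statement can be checked numerically: for a pulse U with a degenerate spectrum, the pulsed evolution (U e^{-iHT/n})^n approaches U^n e^{-i Hbar T}, where Hbar keeps only the blocks of H connecting equal eigenvalues of U, i.e., the projection of the free generator onto the commutant of U. The Hamiltonian and pulse below are arbitrary stand-ins.

      import numpy as np
      from scipy.linalg import expm

      rng = np.random.default_rng(12)
      H = rng.standard_normal((4, 4))
      H = (H + H.T) / 2                               # free Hamiltonian
      theta = np.diag([0.0, 0.0, np.pi, np.pi])       # pulse generator
      U = expm(-1j * theta)                           # pulse, degenerate spectrum

      # Projection onto the commutant of U: zero out blocks that connect
      # different eigenvalues of the pulse.
      w = np.diag(theta)
      Hbar = H * np.isclose(w[:, None], w[None, :])

      T, n = 1.0, 2000
      pulsed = np.linalg.matrix_power(U @ expm(-1j * H * T / n), n)
      limit = np.linalg.matrix_power(U, n) @ expm(-1j * Hbar * T)
      print("distance to predicted limit:", np.linalg.norm(pulsed - limit))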

  2. Context-aware adaptation for group communication support applications with dynamic architecture

    CERN Document Server

    Rodriguez, Ismael Bouassida; Chassot, Christophe; Jmaiel, Mohamed

    2008-01-01

    In this paper, we propose a refinement-based adaptation approach for the architecture of distributed group communication support applications. Unlike most previous works, our approach yields implementable, context-aware and dynamically adaptable architectures. To model the context, we simultaneously manage four parameters that influence the QoS provided by the application. These parameters are: the available bandwidth, the priority of the exchanged data, the energy level and the available memory for processing. These parameters make it possible to refine the choice between the various architectural configurations when passing from a given abstraction level to the lower level which implements it. Our approach allows the importance degree associated with each parameter to be adapted dynamically. To implement adaptation, we switch between the various configurations of the same level, and we modify the state of the entities of a given configuration when necessary. We adopt the direct and mediated Producer-...

  3. Measurement of the top quark mass with the dynamical likelihood method using lepton plus jets events with b-tags in p anti-p collisions at s**(1/2) = 1.96 TeV

    Energy Technology Data Exchange (ETDEWEB)

    Abulencia, A.; Acosta, D.; Adelman, Jahred A.; Affolder, Anthony A.; Akimoto, T.; Albrow, M.G.; Ambrose, D.; Amerio, S.; Amidei, D.; Anastassov, A.; Anikeev, K.; /Taiwan,

    2005-12-01

    This report describes a measurement of the top quark mass, M{sub top}, with the dynamical likelihood method (DLM) using the CDF II detector at the Fermilab Tevatron. The Tevatron produces top/anti-top (t{bar t}) pairs in p{bar p} collisions at a center-of-mass energy of 1.96 TeV. The data sample used in this analysis was accumulated from March 2002 through August 2004, which corresponds to an integrated luminosity of 318 pb{sup -1}. They use the t{bar t} candidates in the "lepton+jets" decay channel, requiring at least one jet identified as a b quark by finding a displaced secondary vertex. The DLM defines a likelihood for each event based on the differential cross section as a function of M{sub top} per unit phase space volume of the final partons, multiplied by the transfer functions from jet to parton energies. The method takes into account all possible jet combinations in an event, and the likelihood is multiplied event by event to derive the top quark mass by the maximum likelihood method. Using 63 t{bar t} candidates observed in the data, with 9.2 events expected from background, they measure the top quark mass to be 173.2{sub -2.4}{sup +2.6}(stat.) {+-} 3.2(syst.) GeV/c{sup 2}, or 173.2{sub -4.0}{sup +4.1} GeV/c{sup 2}.

  4. When neutral turns significant: brain dynamics of rapidly formed associations between neutral stimuli and emotional contexts.

    Science.gov (United States)

    Ventura-Bort, Carlos; Löw, Andreas; Wendt, Julia; Dolcos, Florin; Hamm, Alfons O; Weymar, Mathias

    2016-09-01

    The ability to associate neutral stimuli with motivationally relevant outcomes is an important survival strategy. In this study, we used event-related potentials (ERPs) to investigate brain dynamics of associative emotional learning when participants were confronted with multiple heterogeneous pieces of information. Participants viewed 144 different objects in the context of 144 different emotional and neutral background scenes. During each trial, neutral objects were shown in isolation and then paired with the background scene. All pairings were presented twice to compare ERPs in response to neutral objects before and after single association. After single pairing, neutral objects previously encoded in the context of emotional scenes evoked a larger P100 over occipital electrodes compared to objects that were previously paired with neutral scenes. Likewise, larger late positive potentials (LPPs) were observed over parieto-occipital electrodes (450-750 ms) for objects previously associated with emotional relative to neutral contexts. The LPP - but not P100 - enhancement was also related to subjective object/context binding. Taken together, our ERP data provide evidence for fast emotional associative learning, as reflected by heightened perceptual and sustained elaborative processing for neutral information previously encountered in emotional contexts. These findings could assist in understanding binding mechanisms in stress and anxiety, as well as in addiction and eating-related disorders.

  5. Dynamic impact of temporal context of Ca²⁺ signals on inhibitory synaptic plasticity.

    Science.gov (United States)

    Kawaguchi, Shin-Ya; Nagasaki, Nobuhiro; Hirano, Tomoo

    2011-01-01

    Neuronal activity-dependent synaptic plasticity, a basis for learning and memory, is tightly correlated with the pattern of increase in intracellular Ca(2+) concentration ([Ca(2+)](i)). Here, using combined application of electrophysiological experiments and systems biological simulation, we show that such a correlation dynamically changes depending on the context of [Ca(2+)](i) increase. In a cerebellar Purkinje cell, long-term potentiation of inhibitory GABA(A) receptor responsiveness (called rebound potentiation; RP) was induced by [Ca(2+)](i) increase in a temporally integrative manner through sustained activation of Ca(2+)/calmodulin-dependent protein kinase II (CaMKII). However, the RP establishment was canceled by coupling of two patterns of RP-inducing [Ca(2+)](i) increase depending on the temporal sequence. Negative feedback signaling by phospho-Thr305/306 CaMKII detected the [Ca(2+)](i) context, and assisted the feedforward inhibition of CaMKII through PDE1, resulting in the RP impairment. The [Ca(2+)](i) context-dependent dynamic regulation of synaptic plasticity might contribute to the temporal refinement of information flow in neuronal networks.

  6. Indoor Localisation Using a Context-Aware Dynamic Position Tracking Model

    Directory of Open Access Journals (Sweden)

    Montserrat Ros

    2012-01-01

    Full Text Available Indoor wireless localisation is a widely sought feature for use in logistics, health, and social networking applications. Low-powered localisation will become important for the next generation of pervasive media applications that operate on mobile platforms. We present an inexpensive and robust context-aware tracking system that can track the position of users in an indoor environment, using a wireless smart meter network. Our context-aware tracking system combines wireless trilateration with a dynamic position tracking model and a probability density map to estimate indoor positions. The localisation network consisted of power meter nodes placed at known positions in a building. The power meter nodes are tracked by mobile nodes which are carried by users to localise their position. We conducted an extensive trial of the context-aware tracking system and performed a comparison analysis with existing localisation techniques. The context-aware tracking system was able to localise a person's indoor position with an average error of 1.21 m.
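
    The trilateration core is a small least-squares problem: estimate the position whose distances to the anchor nodes best match the measured ranges. The paper combines this with a dynamic motion model and a probability density map, which are omitted in this sketch; coordinates and noise below are made up.

      import numpy as np
      from scipy.optimize import least_squares

      anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
      true_pos = np.array([3.0, 7.0])
      rng = np.random.default_rng(11)
      ranges = (np.linalg.norm(anchors - true_pos, axis=1)
                + 0.5 * rng.standard_normal(len(anchors)))   # noisy range estimates

      residual = lambda p: np.linalg.norm(anchors - p, axis=1) - ranges
      est = least_squares(residual, x0=np.array([5.0, 5.0])).x
      print("estimate:", np.round(est, 2),
            "| error:", round(float(np.linalg.norm(est - true_pos)), 2))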

  7. Task and Context Sensitive Gripper Design Learning Using Dynamic Grasp Simulation

    DEFF Research Database (Denmark)

    Wolniakowski, Adam; Miatliuk, Konstantsin; Gosiewski, Z.

    2017-01-01

    … given a CAD model of an object and a task description. These quality indices are then used to learn task-specific finger designs based on dynamic simulation. We demonstrate our gripper optimization on a parallel finger type gripper described by twelve parameters. We furthermore present a parametrization … for different task contexts. We provide a qualitative evaluation of the obtained results based on existing design guidelines and our engineering experience. In addition, we show that with our method we achieve superior alignment properties compared to a naive approach with a cutout based on the “inverse … of the grasping task and context, which is essential as an input to the computation of gripper performance. We exemplify important aspects of the indices by looking at their performance on subsets of the parameter space by discussing the decoupling of parameters and show optimization results for two use cases...

  8. Augmented Likelihood Image Reconstruction.

    Science.gov (United States)

    Stille, Maik; Kleine, Matthias; Hägele, Julian; Barkhausen, Jörg; Buzug, Thorsten M

    2016-01-01

    The presence of high-density objects remains an open problem in medical CT imaging. Data of projections passing through objects of high density, such as metal implants, are dominated by noise and are highly affected by beam hardening and scatter. Reconstructed images become less diagnostically conclusive because of pronounced artifacts that manifest as dark and bright streaks. A new reconstruction algorithm is proposed with the aim to reduce these artifacts by incorporating information about shape and known attenuation coefficients of a metal implant. Image reconstruction is considered as a variational optimization problem. The afore-mentioned prior knowledge is introduced in terms of equality constraints. An augmented Lagrangian approach is adapted in order to minimize the associated log-likelihood function for transmission CT. During iterations, temporally appearing artifacts are reduced with a bilateral filter and new projection values are calculated, which are used later on for the reconstruction. A detailed evaluation in cooperation with radiologists is performed on software and hardware phantoms, as well as on clinically relevant patient data of subjects with various metal implants. Results show that the proposed reconstruction algorithm is able to outperform contemporary metal artifact reduction methods such as normalized metal artifact reduction.
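
The optimization structure described, a likelihood-style data term minimized under equality constraints via an augmented Lagrangian, can be illustrated on a toy quadratic surrogate. This is a minimal sketch, not the paper's algorithm: a generic linear forward model stands in for the CT projector, the transmission log-likelihood is replaced by least squares, and the bilateral filtering step is omitted; all names are hypothetical.

```python
import numpy as np

def constrained_recon(A, y, C, d, mu=10.0, n_iter=50):
    """Minimise ||A x - y||^2 subject to C x = d via an augmented Lagrangian.

    A    : forward model (a generic matrix standing in for the CT projector)
    C, d : equality constraints pinning known attenuation values, C x = d
    """
    n = A.shape[1]
    x = np.zeros(n)
    lam = np.zeros(C.shape[0])           # Lagrange multipliers
    H = 2 * A.T @ A + mu * C.T @ C       # Hessian of the augmented objective
    for _ in range(n_iter):
        rhs = 2 * A.T @ y - C.T @ lam + mu * C.T @ d
        x = np.linalg.solve(H, rhs)      # exact inner minimisation (quadratic)
        lam += mu * (C @ x - d)          # dual ascent on the multipliers
    return x

# Toy demo: fix one "pixel" to its known value via the constraint
rng = np.random.default_rng(0)
A = rng.normal(size=(40, 10))
x_true = rng.normal(size=10)
y = A @ x_true
C = np.zeros((1, 10)); C[0, 0] = 1.0
d = np.array([x_true[0]])                # "known attenuation" of one pixel
print(np.allclose(constrained_recon(A, y, C, d), x_true, atol=1e-3))
```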

  9. Large-Scale Context-Aware Volume Navigation using Dynamic Insets

    KAUST Repository

    Al-Awami, Ali

    2012-07-01

Latest developments in electron microscopy (EM) technology produce high resolution images that enable neuro-scientists to identify and put together the complex neural connections in a nervous system. However, because of the massive size and underlying complexity of this kind of data, processing, navigation and analysis suffer drastically in terms of time and effort. In this work, we propose the use of state-of-the-art navigation techniques, such as dynamic insets, built on a peta-scale volume visualization framework to provide focus and context-awareness to help neuro-scientists in their mission to analyze, reconstruct, navigate and explore EM neuroscience data.

  10. Socio-Cultural Dynamics of Education in the Context of the Post-Non-Classical Science

    Directory of Open Access Journals (Sweden)

    V. A. Ignatova

    2012-01-01

Full Text Available The paper deals with the interrelations between society, education and culture. Using the comparative analysis of classical approaches to defining the above spheres, the author comes to the conclusion that the nature of socio-cultural processes can be explored and described most consistently by applying comprehensive models of the post-non-classical science and considering civilization, education and culture in the context of the unified dynamic flow of socio-cultural genesis. The research investigates the dialectics of socio-cultural processes in the light of a systematic synergetic approach, the advancing role of education in socio-cultural dynamics being revealed and substantiated. The author emphasizes its inevitably rising priority due to the sustained development of civilization bringing about the new environmentally-oriented meta-culture. The obtained results can be used in pedagogic research methodology, designing and modeling the educational process, its content, technology and organization.

  11. Socio-Cultural Dynamics of Education in the Context of the Post-Non-Classical Science

    Directory of Open Access Journals (Sweden)

    V. A. Ignatova

    2015-02-01

Full Text Available The paper deals with the interrelations between society, education and culture. Using the comparative analysis of classical approaches to defining the above spheres, the author comes to the conclusion that the nature of socio-cultural processes can be explored and described most consistently by applying comprehensive models of the post-non-classical science and considering civilization, education and culture in the context of the unified dynamic flow of socio-cultural genesis. The research investigates the dialectics of socio-cultural processes in the light of a systematic synergetic approach, the advancing role of education in socio-cultural dynamics being revealed and substantiated. The author emphasizes its inevitably rising priority due to the sustained development of civilization bringing about the new environmentally-oriented meta-culture. The obtained results can be used in pedagogic research methodology, designing and modeling the educational process, its content, technology and organization.

  12. Enhancing learners’ emotions in an L2 context through emotionalized dynamic assessment

    Directory of Open Access Journals (Sweden)

    Parisa Abdolrezapour

    2013-10-01

Full Text Available The aim of this study was to gain a more in-depth understanding of students’ emotions in an EFL context by applying dynamic assessment (DA) procedures to the development of learners’ emotional intelligence. The study, with 50 intermediate learners aged 12-15, used three modalities: a control group, which was taught under the institute’s normal procedures; a comparison group, which received DA; and an experimental group, which received emotionalized dynamic assessment (EDA) procedures, in the form of an intervention focusing on emotional characteristics of Goleman's emotional intelligence framework with the express purpose of inducing learners to work with their emotions. The study shows the potential of EDA for increasing one’s emotional intelligence and affords practical guidelines to language teachers as to how to incorporate behaviors relating to emotional intelligence into assessment procedures.

  13. Mechanobiology of cell migration in the context of dynamic two-way cell-matrix interactions.

    Science.gov (United States)

    Kurniawan, Nicholas A; Chaudhuri, Parthiv Kant; Lim, Chwee Teck

    2016-05-24

    Migration of cells is integral in various physiological processes in all facets of life. These range from embryonic development, morphogenesis, and wound healing, to disease pathology such as cancer metastasis. While cell migratory behavior has been traditionally studied using simple assays on culture dishes, in recent years it has been increasingly realized that the physical, mechanical, and chemical aspects of the matrix are key determinants of the migration mechanism. In this paper, we will describe the mechanobiological changes that accompany the dynamic cell-matrix interactions during cell migration. Furthermore, we will review what is to date known about how these changes feed back to the dynamics and biomechanical properties of the cell and the matrix. Elucidating the role of these intimate cell-matrix interactions will provide not only a better multi-scale understanding of cell motility in its physiological context, but also a more holistic perspective for designing approaches to regulate cell behavior.

  14. Nested sampling for Bayesian model comparison in the context of Salmonella disease dynamics.

    Science.gov (United States)

    Dybowski, Richard; McKinley, Trevelyan J; Mastroeni, Pietro; Restif, Olivier

    2013-01-01

    Understanding the mechanisms underlying the observed dynamics of complex biological systems requires the statistical assessment and comparison of multiple alternative models. Although this has traditionally been done using maximum likelihood-based methods such as Akaike's Information Criterion (AIC), Bayesian methods have gained in popularity because they provide more informative output in the form of posterior probability distributions. However, comparison between multiple models in a Bayesian framework is made difficult by the computational cost of numerical integration over large parameter spaces. A new, efficient method for the computation of posterior probabilities has recently been proposed and applied to complex problems from the physical sciences. Here we demonstrate how nested sampling can be used for inference and model comparison in biological sciences. We present a reanalysis of data from experimental infection of mice with Salmonella enterica showing the distribution of bacteria in liver cells. In addition to confirming the main finding of the original analysis, which relied on AIC, our approach provides: (a) integration across the parameter space, (b) estimation of the posterior parameter distributions (with visualisations of parameter correlations), and (c) estimation of the posterior predictive distributions for goodness-of-fit assessments of the models. The goodness-of-fit results suggest that alternative mechanistic models and a relaxation of the quasi-stationary assumption should be considered.
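
For readers unfamiliar with the method, a bare-bones nested sampling loop for estimating the evidence (whose ratio between models gives the Bayes factor) looks roughly like this. The sketch replaces live points by brute-force rejection from the prior, which is viable only for toy problems; production samplers, including those used in studies like this, rely on smarter constrained proposals.

```python
import numpy as np
from scipy.special import logsumexp

def nested_sampling(log_like, sample_prior, n_live=200, n_iter=1000, rng=None):
    """Bare-bones nested sampling estimate of the log evidence log Z.

    log_like     : function theta -> log-likelihood
    sample_prior : function rng -> one draw from the prior
    """
    rng = rng or np.random.default_rng()
    live = [sample_prior(rng) for _ in range(n_live)]
    ll = np.array([log_like(t) for t in live])
    log_z, log_x = -np.inf, 0.0                    # evidence, prior volume
    log_shrink = np.log1p(-np.exp(-1.0 / n_live))  # volume shed per step
    for _ in range(n_iter):
        worst = int(np.argmin(ll))
        log_z = np.logaddexp(log_z, ll[worst] + log_x + log_shrink)
        log_x -= 1.0 / n_live
        while True:  # replace the worst point by a draw above its likelihood
            theta = sample_prior(rng)
            if log_like(theta) > ll[worst]:
                live[worst], ll[worst] = theta, log_like(theta)
                break
    # add the remaining prior mass carried by the final live points
    return np.logaddexp(log_z, logsumexp(ll) - np.log(n_live) + log_x)

# Toy check: N(0,1) likelihood, uniform prior on [-5, 5] => Z ~ 0.1
log_z = nested_sampling(lambda t: -0.5 * t * t - 0.9189,
                        lambda rng: rng.uniform(-5.0, 5.0))
```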

  15. Nested sampling for Bayesian model comparison in the context of Salmonella disease dynamics.

    Directory of Open Access Journals (Sweden)

    Richard Dybowski

Full Text Available Understanding the mechanisms underlying the observed dynamics of complex biological systems requires the statistical assessment and comparison of multiple alternative models. Although this has traditionally been done using maximum likelihood-based methods such as Akaike's Information Criterion (AIC), Bayesian methods have gained in popularity because they provide more informative output in the form of posterior probability distributions. However, comparison between multiple models in a Bayesian framework is made difficult by the computational cost of numerical integration over large parameter spaces. A new, efficient method for the computation of posterior probabilities has recently been proposed and applied to complex problems from the physical sciences. Here we demonstrate how nested sampling can be used for inference and model comparison in biological sciences. We present a reanalysis of data from experimental infection of mice with Salmonella enterica showing the distribution of bacteria in liver cells. In addition to confirming the main finding of the original analysis, which relied on AIC, our approach provides: (a) integration across the parameter space, (b) estimation of the posterior parameter distributions (with visualisations of parameter correlations), and (c) estimation of the posterior predictive distributions for goodness-of-fit assessments of the models. The goodness-of-fit results suggest that alternative mechanistic models and a relaxation of the quasi-stationary assumption should be considered.

  16. Accurate structural correlations from maximum likelihood superpositions.

    Directory of Open Access Journals (Sweden)

    Douglas L Theobald

    2008-02-01

Full Text Available The cores of globular proteins are densely packed, resulting in complicated networks of structural interactions. These interactions in turn give rise to dynamic structural correlations over a wide range of time scales. Accurate analysis of these complex correlations is crucial for understanding biomolecular mechanisms and for relating structure to function. Here we report a highly accurate technique for inferring the major modes of structural correlation in macromolecules using likelihood-based statistical analysis of sets of structures. This method is generally applicable to any ensemble of related molecules, including families of nuclear magnetic resonance (NMR) models, different crystal forms of a protein, and structural alignments of homologous proteins, as well as molecular dynamics trajectories. Dominant modes of structural correlation are determined using principal components analysis (PCA) of the maximum likelihood estimate of the correlation matrix. The correlations we identify are inherently independent of the statistical uncertainty and dynamic heterogeneity associated with the structural coordinates. We additionally present an easily interpretable method ("PCA plots") for displaying these positional correlations by color-coding them onto a macromolecular structure. Maximum likelihood PCA of structural superpositions, and the structural PCA plots that illustrate the results, will facilitate the accurate determination of dynamic structural correlations analyzed in diverse fields of structural biology.
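
The final analysis step, PCA of an estimated atom-atom correlation matrix, can be sketched as below. The hedge matters: the paper's key ingredient is the maximum likelihood superposition and covariance estimate, whereas this illustration assumes the ensemble is already superposed and uses the ordinary sample covariance as a stand-in.

```python
import numpy as np

def structural_pca(ensemble):
    """Principal modes of positional correlation in an ensemble of aligned
    structures of shape (n_structures, n_atoms, 3). Assumes the ensemble is
    already superposed and every coordinate fluctuates (nonzero variance)."""
    X = ensemble.reshape(len(ensemble), -1)      # one row of xyz per structure
    X = X - X.mean(axis=0)
    cov = X.T @ X / (len(ensemble) - 1)          # sample covariance
    sd = np.sqrt(np.diag(cov))
    corr = cov / np.outer(sd, sd)                # correlation matrix
    evals, evecs = np.linalg.eigh(corr)          # ascending eigenvalues
    return evals[::-1], evecs[:, ::-1]           # dominant modes first
```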

  17. In all likelihood statistical modelling and inference using likelihood

    CERN Document Server

    Pawitan, Yudi

    2001-01-01

Based on a course in the theory of statistics, this text concentrates on what can be achieved using the likelihood/Fisherian method of taking account of uncertainty when studying a statistical problem. It takes the concept of the likelihood as providing the best methods for unifying the demands of statistical modelling and the theory of inference. Every likelihood concept is illustrated by realistic examples, which are not compromised by computational problems. Examples range from a simple comparison of two accident rates, to complex studies that require generalised linear or semiparametric models.

  18. ECONOMIC DYNAMICS OF THE REPUBLIC OF MOLDOVA IN THE CONTEXT OF EUROPEAN INTEGRATION

    Directory of Open Access Journals (Sweden)

    Igor PRISAC

    2015-04-01

Full Text Available This article provides an analysis of the economic evolution of the Republic of Moldova by defining three evolution and development periods in the regional and global context. Studying the perspectives of integration into the Commonwealth of Independent States (CIS) economy, on the one hand, and European integration, on the other, is another objective of this chapter. The analysis of the investment climate, the business environment and foreign trade contributes substantially to the understanding of the economic dynamics of the Republic of Moldova. Another research direction of this paper is the analysis of the economic reform efforts undertaken by the government and their impact on the economic system of the Republic of Moldova. The research methodology follows a classical approach, combining comparative, historical-analytical and systemic analysis with quantitative and qualitative methods.

  19. Ultra-Relativistic Magneto-Hydro-Dynamic Jets in the context of Gamma Ray Bursts

    CERN Document Server

    Fendt, C; Fendt, Christian; Ouyed, Rachid

    2004-01-01

We present a detailed numerical study of the dynamics and evolution of ultrarelativistic magnetohydrodynamic jets in the black hole-disk system under extreme magnetization conditions. We find that Lorentz factors of up to 3000 are achieved and derive a modified Michel scaling (Gamma ~ sigma) which allows for a wide variation in the flow Lorentz factor. Pending contamination induced by mass-entrainment, the linear Michel scaling links modulations in the ultrarelativistic wind to variations in mass accretion in the disk for a given magnetization. The jet is asymptotically dominated by the toroidal magnetic field allowing for efficient collimation. We discuss our solutions (jets) in the context of Gamma ray bursts and describe the relevant features such as the high variability in the Lorentz factor and how high collimation angles (~ 0-5 degrees), or cylindrical jets, can be achieved. We isolate a jet instability mechanism we refer to as the "bottle-neck" instability which essentially relies on a high magnetizati...

  20. Task and Context Sensitive Gripper Design Learning Using Dynamic Grasp Simulation

    DEFF Research Database (Denmark)

    Wolniakowski, Adam; Miatliuk, Konstantsin; Gosiewski, Z.

    2017-01-01

    given a CAD model of an object and a task description. These quality indices are then used to learn task-specific finger designs based on dynamic simulation. We demonstrate our gripper optimization on a parallel finger type gripper described by twelve parameters. We furthermore present a parametrization......In this work, we present a generic approach to optimize the design of a parametrized robot gripper including both selected gripper mechanism parameters, and parameters of the finger geometry. We suggest six gripper quality indices that indicate different aspects of the performance of a gripper...... for different task contexts. We provide a qualitative evaluation of the obtained results based on existing design guidelines and our engineering experience. In addition, we show that with our method we achieve superior alignment properties compared to a naive approach with a cutout based on the “inverse...

  1. Likelihood Analysis of Seasonal Cointegration

    DEFF Research Database (Denmark)

    Johansen, Søren; Schaumburg, Ernst

    1999-01-01

The error correction model for seasonal cointegration is analyzed. Conditions are found under which the process is integrated of order 1 and cointegrated at seasonal frequency, and a representation theorem is given. The likelihood function is analyzed and the numerical calculation of the maximum likelihood estimators is discussed. The asymptotic distribution of the likelihood ratio test for cointegrating rank is given. It is shown that the estimated cointegrating vectors are asymptotically mixed Gaussian. The results resemble the results for cointegration at zero frequency when expressed in terms...
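
The abstract notes that the results resemble those for cointegration at zero frequency. As a rough illustration of that zero-frequency baseline (not of the seasonal-frequency analysis itself), the Johansen trace test for cointegrating rank is available in statsmodels; the simulated series below are invented.

```python
import numpy as np
from statsmodels.tsa.vector_ar.vecm import coint_johansen

# Simulated pair sharing one stochastic trend, so the cointegrating rank is 1
rng = np.random.default_rng(1)
trend = np.cumsum(rng.normal(size=500))
y = np.column_stack([trend + rng.normal(size=500),
                     0.5 * trend + rng.normal(size=500)])

res = coint_johansen(y, det_order=0, k_ar_diff=1)
print(res.lr1)   # trace statistics for rank <= 0 and rank <= 1
print(res.cvt)   # corresponding 90/95/99% critical values
```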

  2. Dynamic "inline" images: context-sensitive retrieval and integration of images into Web documents.

    Science.gov (United States)

    Kahn, Charles E

    2008-09-01

    Integrating relevant images into web-based information resources adds value for research and education. This work sought to evaluate the feasibility of using "Web 2.0" technologies to dynamically retrieve and integrate pertinent images into a radiology web site. An online radiology reference of 1,178 textual web documents was selected as the set of target documents. The ARRS GoldMiner image search engine, which incorporated 176,386 images from 228 peer-reviewed journals, retrieved images on demand and integrated them into the documents. At least one image was retrieved in real-time for display as an "inline" image gallery for 87% of the web documents. Each thumbnail image was linked to the full-size image at its original web site. Review of 20 randomly selected Collaborative Hypertext of Radiology documents found that 69 of 72 displayed images (96%) were relevant to the target document. Users could click on the "More" link to search the image collection more comprehensively and, from there, link to the full text of the article. A gallery of relevant radiology images can be inserted easily into web pages on any web server. Indexing by concepts and keywords allows context-aware image retrieval, and searching by document title and subject metadata yields excellent results. These techniques allow web developers to incorporate easily a context-sensitive image gallery into their documents.

3. The likelihood of Latino women to seek help in response to interpersonal victimization: An examination of individual, interpersonal and sociocultural influences

    Directory of Open Access Journals (Sweden)

    Chiara Sabina

    2014-07-01

Full Text Available Help-seeking is a process that is influenced by individual, interpersonal, and sociocultural factors. The current study examined these influences on the likelihood of seeking help (police, pressing charges, medical services, social services, and informal help) for interpersonal violence among a national sample of Latino women. Women living in high-density Latino neighborhoods in the USA were interviewed by phone in their preferred language. Women reported being, on average, between "somewhat likely" and "very likely" to seek help should they experience interpersonal victimization. Sequential linear regression results indicated that individual (age, depression), interpersonal (having children, past victimization), and sociocultural factors (immigrant status, acculturation) were associated with the self-reported likelihood of seeking help for interpersonal violence. Having children was consistently related to a greater likelihood to seek all forms of help. Overall, women appear to respond to violence in ways that reflect their ecological context. Help-seeking is best understood within a multi-layered and dynamic context.

  4. Analytic Methods for Cosmological Likelihoods

    OpenAIRE

    Taylor, A. N.; Kitching, T. D.

    2010-01-01

    We present general, analytic methods for Cosmological likelihood analysis and solve the "many-parameters" problem in Cosmology. Maxima are found by Newton's Method, while marginalization over nuisance parameters, and parameter errors and covariances are estimated by analytic marginalization of an arbitrary likelihood function with flat or Gaussian priors. We show that information about remaining parameters is preserved by marginalization. Marginalizing over all parameters, we find an analytic...
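
The two ingredients named here, Newton's method for locating likelihood maxima and analytic marginalization over nuisance parameters in a Gaussian approximation, have standard generic forms. A minimal sketch (not the paper's derivation), with the log-likelihood gradient and Hessian supplied by the caller:

```python
import numpy as np

def newton_maximize(grad, hess, theta0, tol=1e-10, max_iter=100):
    """Locate a log-likelihood maximum by Newton's method, given callables
    for the gradient and Hessian of the log-likelihood."""
    theta = np.asarray(theta0, dtype=float)
    for _ in range(max_iter):
        step = np.linalg.solve(hess(theta), grad(theta))
        theta = theta - step
        if np.max(np.abs(step)) < tol:
            break
    return theta

def marginalized_errors(hess, theta_hat, keep):
    """1-sigma errors on the kept parameters after marginalizing the rest in
    the Gaussian (Laplace) approximation: take the sub-block of the full
    inverse Hessian, not the inverse of the Hessian sub-block (the latter
    would correspond to fixing, not marginalizing, the nuisances)."""
    cov = np.linalg.inv(-hess(theta_hat))   # -H = curvature at the maximum
    return np.sqrt(np.diag(cov))[np.asarray(keep)]
```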

5. ANALYSIS DYNAMICS VALUES FORMULATION IN THE CONTEXT OF THE BUSINESS ORGANIZATION'S MISSION

    Directory of Open Access Journals (Sweden)

    Marius Costel Esi

    2015-02-01

Full Text Available The goals of economic activity reveal a number of aspects that call for re-evaluating how the dynamics of values can be correlated with the wording of the business mission. Under these conditions, managerial strategies undertaken at the level of the business can be validated insofar as they reflect the purposes and objectives assumed by decision makers (particularly top managers). Moreover, compliance with the eligibility criteria by which management strategies are judged should, in our opinion, aim at improving the decision-making process. Such a process requires economic actors to understand judiciously how the dynamics of values can be analyzed in relation to the formulation of the business organization's mission. In these circumstances, a first objective of this research is to analyze the dynamics of values in the context of formulating the business mission; through this approach, we strive to show the conditions that make the formulation of the business mission possible in relation to organizational culture. A second objective concerns the process of defining and stating the organizational mission, a process linked to the axiological dimension of the business organization's mission statement. In the light of our analysis, this presupposes a business model in which objectives, strategies, and the organization's business mission become materialized insofar as the entrepreneurial context is validated against socio-economic prospects. Therefore, socio-economic phenomena involve a series of connections between the different levels at which the business organization operates, and it is these connections that provide its legitimacy.

  6. Regions of constrained maximum likelihood parameter identifiability

    Science.gov (United States)

    Lee, C.-H.; Herget, C. J.

    1975-01-01

This paper considers the parameter identification problem of general discrete-time, nonlinear, multiple-input/multiple-output dynamic systems with Gaussian-white distributed measurement errors. The system parameterization is assumed to be known. Regions of constrained maximum likelihood (CML) parameter identifiability are established. A computation procedure employing interval arithmetic is proposed for finding explicit regions of parameter identifiability for the case of linear systems. It is shown that if the vector of true parameters is locally CML identifiable, then with probability one, the vector of true parameters is a unique maximal point of the maximum likelihood function in the region of parameter identifiability and the CML estimation sequence will converge to the true parameters.

  7. Database likelihood ratios and familial DNA searching

    CERN Document Server

    Slooten, Klaas

    2012-01-01

    Familial Searching is the process of searching in a DNA database for relatives of a given individual. It is well known that in order to evaluate the genetic evidence in favour of a certain given form of relatedness between two individuals, one needs to calculate the appropriate likelihood ratio, which is in this context called a Kinship Index. Suppose that the database contains, for a given type of relative, at most one related individual. Given prior probabilities of being the relative for all persons in the database, we derive the likelihood ratio for each database member in favour of being that relative. This likelihood ratio takes all the Kinship Indices between target and members of the database into account. We also compute the corresponding posterior probabilities. We then discuss two ways of selecting a subset from the database that contains the relative with a known probability, or at least a useful lower bound thereof. We discuss the relation between these approaches and illustrate them with Familia...
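
A minimal sketch of the posterior computation described: given per-member kinship indices (likelihood ratios) and prior probabilities, with at most one true relative in the database. The loud simplifying assumption is that each member's kinship index can be treated as the likelihood ratio of the data under "member i is the relative" versus "no relative in the database"; the numbers below are invented.

```python
import numpy as np

def familial_posteriors(kinship_indices, priors, prior_none):
    """Posterior probabilities over 'member i is the relative' and
    'no relative in the database', assuming at most one true relative."""
    ki = np.asarray(kinship_indices, dtype=float)
    pr = np.asarray(priors, dtype=float)
    w = np.concatenate(([prior_none], pr * ki))   # unnormalised posterior
    w /= w.sum()
    return w[0], w[1:]   # P(no relative in DB), P(member i is the relative)

# Invented numbers: 3 database members, equal priors of 0.001 each
p_none, p_member = familial_posteriors([120.0, 3.0, 0.4], [0.001] * 3, 0.997)
```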

  8. Dynamics of wages in the region and the problem of measurement of wages in the context of economic instability

    Directory of Open Access Journals (Sweden)

    S. S. Gordeev

    2010-12-01

Full Text Available The paper analyses the current state and basic tendencies in the dynamics of wages. The authors consider the basic contradictions surrounding the establishment of the market institution of wages in the subjects of the Russian Federation. The dynamics of wages is appraised on the basis of the tax accounting of the regions; according to the authors, this approach reflects more objectively the current processes in the sphere of remuneration of labor under economic instability.

  9. Groups, information theory, and Einstein's likelihood principle

    Science.gov (United States)

    Sicuro, Gabriele; Tempesta, Piergiulio

    2016-04-01

    We propose a unifying picture where the notion of generalized entropy is related to information theory by means of a group-theoretical approach. The group structure comes from the requirement that an entropy be well defined with respect to the composition of independent systems, in the context of a recently proposed generalization of the Shannon-Khinchin axioms. We associate to each member of a large class of entropies a generalized information measure, satisfying the additivity property on a set of independent systems as a consequence of the underlying group law. At the same time, we also show that Einstein's likelihood function naturally emerges as a byproduct of our informational interpretation of (generally nonadditive) entropies. These results confirm the adequacy of composable entropies both in physical and social science contexts.

10. Optimized Context Quantizer Design Based on Dynamic Programming Algorithm

    Institute of Scientific and Technical Information of China (English)

    王付艳; 卜春芬; 陈旻

    2015-01-01

A method is proposed for designing optimal context quantizers for multi-symbol information sources. The method considers not only the similarity of the conditional probability distributions before and after quantization, but also uses the correlation between the values of the conditioning symbols as the basis for merging, so that the quantized context model exploits the correlation in the source as fully as possible. A dynamic programming algorithm is then applied to merge similar conditional probability distributions, realizing the context quantization. Finally, the quantizer is applied to wavelet-based image compression coding. Experimental results show that the quantizer achieves compression performance similar to, or better than, other optimized quantizers.
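
A sketch of the dynamic-programming step: contexts are assumed pre-ordered (e.g. by the conditional probability of one symbol), and an optimal partition into K groups minimizes the code length of the pooled conditional distributions. This follows the general pattern of optimal context quantization by DP; the function and cost below are illustrative, not necessarily the paper's exact criterion.

```python
import numpy as np

def quantize_contexts(counts, K):
    """Partition n pre-ordered contexts into K groups by dynamic programming,
    minimising the total code length sum_g N_g * H(pooled distribution of g).

    counts : array (n_contexts, n_symbols) of conditional symbol counts.
    Returns the list of (start, end) groups and the optimal cost in bits.
    """
    n, _ = counts.shape
    prefix = np.vstack([np.zeros(counts.shape[1]), np.cumsum(counts, axis=0)])

    def cost(i, j):                        # pool contexts i .. j-1
        c = prefix[j] - prefix[i]
        tot = c.sum()
        if tot == 0:
            return 0.0
        p = c[c > 0] / tot
        return -tot * np.sum(p * np.log2(p))

    INF = float("inf")
    dp = np.full((n + 1, K + 1), INF)
    back = np.zeros((n + 1, K + 1), dtype=int)
    dp[0, 0] = 0.0
    for j in range(1, n + 1):
        for k in range(1, K + 1):
            for i in range(k - 1, j):
                v = dp[i, k - 1] + cost(i, j)
                if v < dp[j, k]:
                    dp[j, k], back[j, k] = v, i
    groups, j = [], n
    for k in range(K, 0, -1):              # walk the backpointers
        i = back[j, k]
        groups.append((i, j))
        j = i
    return groups[::-1], dp[n, K]
```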

  11. Maximum likelihood estimation for social network dynamics

    NARCIS (Netherlands)

    Snijders, T.A.B.; Koskinen, J.; Schweinberger, M.

    2010-01-01

    A model for network panel data is discussed, based on the assumption that the observed data are discrete observations of a continuous-time Markov process on the space of all directed graphs on a given node set, in which changes in tie variables are independent conditional on the current graph. The m

  12. On the likelihood of forests

    Science.gov (United States)

    Shang, Yilun

    2016-08-01

    How complex a network is crucially impacts its function and performance. In many modern applications, the networks involved have a growth property and sparse structures, which pose challenges to physicists and applied mathematicians. In this paper, we introduce the forest likelihood as a plausible measure to gauge how difficult it is to construct a forest in a non-preferential attachment way. Based on the notions of admittable labeling and path construction, we propose algorithms for computing the forest likelihood of a given forest. Concrete examples as well as the distributions of forest likelihoods for all forests with some fixed numbers of nodes are presented. Moreover, we illustrate the ideas on real-life networks, including a benzenoid tree, a mathematical family tree, and a peer-to-peer network.

  13. Obtaining reliable Likelihood Ratio tests from simulated likelihood functions

    DEFF Research Database (Denmark)

    Andersen, Laura Mørch

    It is standard practice by researchers and the default option in many statistical programs to base test statistics for mixed models on simulations using asymmetric draws (e.g. Halton draws). This paper shows that when the estimated likelihood functions depend on standard deviations of mixed...

  14. Combining point context and dynamic time warping for online gesture recognition

    Science.gov (United States)

    Mao, Xia; Li, Chen

    2017-05-01

    Previous gesture recognition methods usually focused on recognizing gestures after the entire gesture sequences were obtained. However, in many practical applications, a system has to identify gestures before they end to give instant feedback. We present an online gesture recognition approach that can realize early recognition of unfinished gestures with low latency. First, a curvature buffer-based point context (CBPC) descriptor is proposed to extract the shape feature of a gesture trajectory. The CBPC descriptor is a complete descriptor with a simple computation, and thus has its superiority in online scenarios. Then, we introduce an online windowed dynamic time warping algorithm to realize online matching between the ongoing gesture and the template gestures. In the algorithm, computational complexity is effectively decreased by adding a sliding window to the accumulative distance matrix. Lastly, the experiments are conducted on the Australian sign language data set and the Kinect hand gesture (KHG) data set. Results show that the proposed method outperforms other state-of-the-art methods especially when gesture information is incomplete.
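
The incremental matching idea can be sketched as follows: each incoming frame appends one column to the accumulated-cost matrix, so a normalized matching cost is available before the gesture ends. This simplified sketch processes the full template column each step (the paper's sliding window on the accumulative distance matrix, which cuts the per-frame cost, is omitted), and raw frame vectors stand in for the CBPC features.

```python
import numpy as np

class OnlineDTW:
    """Incremental DTW against one template gesture, updated frame by frame,
    enabling early recognition of an unfinished gesture."""

    def __init__(self, template):
        self.template = np.asarray(template, dtype=float)  # (m, dim)
        self.prev = None        # accumulated costs from the previous frame
        self.t = 0              # number of input frames seen so far

    def update(self, frame):
        d = np.linalg.norm(self.template - frame, axis=1)  # local distances
        m = len(d)
        cur = np.empty(m)
        if self.prev is None:
            cur = np.cumsum(d)                  # first input frame
        else:
            cur[0] = self.prev[0] + d[0]
            for i in range(1, m):
                cur[i] = d[i] + min(self.prev[i],      # repeat template frame
                                    self.prev[i - 1],  # diagonal step
                                    cur[i - 1])        # skip template frame
        self.prev = cur
        self.t += 1
        # normalised cost of aligning the input so far to the whole template
        return cur[-1] / (self.t + m)

# Usage: one matcher per template; fire when the score drops below a threshold
template = np.random.default_rng(0).normal(size=(30, 2))
matcher = OnlineDTW(template)
for frame in template + 0.05:       # a noisy copy arriving frame by frame
    score = matcher.update(frame)
```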

  15. Dynamic monitoring and prediction of Dianchi Lake cyanobacteria outbreaks in the context of rapid urbanization.

    Science.gov (United States)

    Luo, Yi; Yang, Kun; Yu, Zhenyu; Chen, Junyi; Xu, Yufei; Zhou, Xiaolu; Yang, Yang

    2017-02-01

Water crises have been among the most serious environmental problems worldwide since the twenty-first century. A water crisis is marked by a severe shortage of water resources and deteriorating water quality. As an important component of water resources, lake water quality has deteriorated rapidly in the context of fast urbanization and climate change. This deterioration has altered the water ecosystem structure and influenced lake functionality. To curb these trends, various strategies and procedures have been used in many urban lakes. Among these procedures, accurate and responsive water environment monitoring is the basis of the forecasting and prevention of large-scale cyanobacteria outbreaks and improvement of water quality. To dynamically monitor and predict the outbreak of cyanobacteria in Dianchi Lake, in this study, wireless sensor networks (WSNs) and the geographic information system (GIS) are used to monitor water quality at the macro-scale and meso-scale. Historical, real-time water quality and weather condition data were collected, and a combination prediction model (adaptive grey model (AGM) and back propagation artificial neural network (BPANN)) was proposed. The correlation coefficient (R) of the simulation experiment reached 0.995. Moreover, we conducted an empirical experiment in Dianchi Lake, Yunnan, China using the proposed method. R was 0.93, and the prediction error was 4.77. The results of the experiment suggest that our model has good performance for water quality prediction and can forecast cyanobacteria outbreaks. This system provides responsive forewarning and data support for lake protection and pollution control.

  16. Trust and community: Exploring the meanings, contexts and dynamics of community renewable energy

    Energy Technology Data Exchange (ETDEWEB)

    Walker, Gordon, E-mail: g.p.walker@lancaster.ac.u [University of Lancaster, Department of Geography, Lancaster LA1 4YN (United Kingdom); Devine-Wright, Patrick [University of Manchester, School of Environment and Development, Humanities Bridgeford Street Building, Oxford Road, Manchester M13 9PL (United Kingdom); Hunter, Sue; High, Helen; Evans, Bob [University of Lancaster, Department of Geography, Lancaster LA1 4YN (United Kingdom); University of Manchester, School of Environment and Development, Humanities Bridgeford Street Building, Oxford Road, Manchester M13 9PL (United Kingdom)

    2010-06-15

Community renewable energy projects have recently been promoted and supported in the UK by government policy. A community approach, it is argued in the rhetoric of both government and grassroots activists, will change the experience and outcomes of sustainable energy technology implementation. In this paper, we consider how interpersonal and social trust is implicated in the different meanings given to community in RE programmes and projects, and in the qualities and outcomes that are implied or assumed by taking a community approach. We examine how these meanings play out in examples of projects on the ground, focusing on two contrasting cases in which the relationships between those involved locally have exhibited different patterns of cohesiveness and fracture. We argue that trust does have a necessary part to play in the contingencies and dynamics of community RE projects and in the outcomes they can achieve. Trust between local people and groups that take projects forward is part of the package of conditions which can help projects work. Whilst trust may therefore be functional for the development of community RE and potentially can be enhanced by the adoption of a community approach, this cannot be either assured or assumed under the wide diversity of contexts, conditions and arrangements under which community RE is being pursued and practiced.

  17. Religious Affiliation and Fertility in a Sub-Saharan Context: Dynamic and Lifetime Perspectives.

    Science.gov (United States)

    Agadjanian, Victor; Yabiku, Scott T

    2014-10-01

    We use uniquely detailed data from a predominantly Christian high-fertility area in Mozambique to examine denominational differentials in fertility from two complementary perspectives-dynamic and cumulative. First, we use event-history analysis to predict yearly risks of birth from denominational affiliation. Then, we employ Poisson regression to model the association between the number of children ever born and share of reproductive life spent in particular denominations or outside organized religion. Both approaches detect a significant increase in fertility associated with membership in a particular type of African-initiated churches which is characterized by strong organizational identity, rigid hierarchy, and insular corporate culture. Membership in the Catholic Church is also associated with elevated completed fertility. We relate these results to extant theoretical perspectives on the relationship between religion and fertility by stressing the interplay between ideological, social, and organizational characteristics of different types of churches and situate our findings within the context of fertility transition and religious demographics in Mozambique and elsewhere in sub-Saharan Africa.
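
The cumulative analysis described, a Poisson regression of children ever born on the share of reproductive life spent in each denomination, has a standard specification. A hedged sketch on entirely invented simulated data, using statsmodels with log years of exposure as an offset:

```python
import numpy as np
import statsmodels.api as sm

# Invented person-level data: children ever born, plus the share of the
# reproductive life span spent in each denomination type (the remainder,
# time outside organized religion, is the implicit reference category).
rng = np.random.default_rng(0)
n = 500
shares = rng.dirichlet([2, 1, 1, 4], size=n)[:, :3]  # African-initiated,
                                                     # Catholic, other
exposure = rng.uniform(10.0, 35.0, size=n)           # reproductive years
rate = np.exp(-1.2 + 0.4 * shares[:, 0] + 0.2 * shares[:, 1])
children = rng.poisson(rate * exposure)

# Poisson regression with log(exposure) as offset, as in the lifetime model
X = sm.add_constant(shares)
fit = sm.GLM(children, X, family=sm.families.Poisson(),
             offset=np.log(exposure)).fit()
print(fit.params)   # recovers roughly (-1.2, 0.4, 0.2, 0.0)
```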

18. Gross Error Detection for Nonlinear Dynamic Chemical Processes Based on Generalized Likelihood Ratios

    Institute of Scientific and Technical Information of China (English)

    王莉; 金思毅; 黄兆杰

    2013-01-01

The generalized likelihood ratio (GLR) method is an effective gross-error detection method for linear steady-state data reconciliation. In this paper, the differential and algebraic constraints of a dynamic data reconciliation model are transformed into matrix form, and the nonlinear constraints are linearized. With these two steps, GLR is successfully applied to a continuous stirred tank reactor (CSTR) nonlinear dynamic system, and its gross-error detection performance in this system is evaluated. Statistical results show that the detection rate depends on the size of the gross error and on the window length: it increases as the gross error grows and as the moving window lengthens.
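
The linear steady-state version of the GLR test, which the paper extends to the nonlinear dynamic CSTR case by linearizing constraints over a moving window, can be sketched as follows. Assumptions: balance constraints A x = 0, known measurement covariance Sigma, and a single suspected measurement bias; the chi-square threshold below ignores the multiple-testing correction.

```python
import numpy as np
from scipy import stats

def glr_gross_error(A, y, Sigma, alpha=0.05):
    """Generalized likelihood ratio test for a single biased measurement in
    linear steady-state data reconciliation with balance equations A x = 0.

    r = A y are the balance residuals; a bias of size b in measurement j
    shifts r by b * (A e_j), and the GLR statistic maximises over j.
    """
    r = A @ y
    Vinv = np.linalg.inv(A @ Sigma @ A.T)   # residual covariance, inverted
    T = np.array([(A[:, j] @ Vinv @ r) ** 2 / (A[:, j] @ Vinv @ A[:, j])
                  for j in range(A.shape[1])])
    crit = stats.chi2.ppf(1 - alpha, df=1)
    j_best = int(np.argmax(T))
    return j_best, T[j_best], bool(T[j_best] > crit)
```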

  19. A dynamic and context-aware semantic mediation service for discovering and fusion of heterogeneous sensor data

    Directory of Open Access Journals (Sweden)

    Mohamed Bakillah

    2013-06-01

    Full Text Available Sensors play an increasingly critical role in capturing and distributing observation of phenomena in our environment. The Semantic Sensor Web enables interoperability to support various applications that use data made available by semantically heterogeneous sensor services. However, several challenges still need to be addressed to achieve this vision. More particularly, mechanisms that can support context-aware semantic mapping that adapts to dynamic metadata of sensors are required. Semantic mapping for Sensor Web is required to support sensor data fusion, sensor data discovery and retrieval, and automatic semantic annotation, to name only a few applications. This paper presents a context-aware ontology-based semantic mediation service for heterogeneous sensor services. The semantic mediation service is context-aware and dynamic because it takes into account the real-time variability of thematic, spatial and temporal features that describe sensor data in different contexts. The semantic mediation service integrates rule-based reasoning to support resolution of semantic heterogeneities. An application scenario is presented showing how the semantic mediation service can improve sensor data interpretation, reuse, and sharing in static and dynamic settings.

  20. Local Contexts

    Directory of Open Access Journals (Sweden)

    Philippe Schlenker

    2009-07-01

Full Text Available The dynamic approach posits that a presupposition must be satisfied in its local context. But how is a local context derived from the global one? Extant dynamic analyses must specify in the lexical entry of any operator what its 'Context Change Potential' is, and for this very reason they fail to be sufficiently explanatory. To circumvent the problem, we revise two assumptions of the dynamic approach: we take the update process to be derivative from a classical, non-dynamic semantics, which obviates the need for dynamic lexical entries; and we deny that a local context encodes what the speech act participants 'take for granted.' Instead, we take the local context of an expression E in a sentence S to be the smallest domain that one may restrict attention to when assessing E without jeopardizing the truth conditions of S. To match the results of dynamic semantics, local contexts must be computed incrementally, using only information about the expressions that precede E. This version of the theory can be shown to be nearly equivalent to the dynamic theory of Heim 1983, but unlike the latter, it is entirely predictive. We also suggest that local contexts can, at some cost, be computed symmetrically, taking into account information about all of S (except E); this leads to gradient predictions, whose assessment is left for future research. doi:10.3765/sp.2.3

  1. Exclusion probabilities and likelihood ratios with applications to kinship problems.

    Science.gov (United States)

    Slooten, Klaas-Jan; Egeland, Thore

    2014-05-01

    In forensic genetics, DNA profiles are compared in order to make inferences, paternity cases being a standard example. The statistical evidence can be summarized and reported in several ways. For example, in a paternity case, the likelihood ratio (LR) and the probability of not excluding a random man as father (RMNE) are two common summary statistics. There has been a long debate on the merits of the two statistics, also in the context of DNA mixture interpretation, and no general consensus has been reached. In this paper, we show that the RMNE is a certain weighted average of inverse likelihood ratios. This is true in any forensic context. We show that the likelihood ratio in favor of the correct hypothesis is, in expectation, bigger than the reciprocal of the RMNE probability. However, with the exception of pathological cases, it is also possible to obtain smaller likelihood ratios. We illustrate this result for paternity cases. Moreover, some theoretical properties of the likelihood ratio for a large class of general pairwise kinship cases, including expected value and variance, are derived. The practical implications of the findings are discussed and exemplified.

  2. Architecture of firm dynamic capabilities across inter-organizational activities: Explaining innovativeness in the context of nanotechnology

    Science.gov (United States)

    Petricevic, Olga

In this dissertation I first develop a theoretical framework that explores different components of dynamic capabilities related to a firm's boundary-spanning linkages across two different types of inter-organizational activities: alliances and networks. I argue that there are four different subsets of dynamic capabilities simultaneously at work: alliance opportunity-sensing, alliance opportunity-seizing, network opportunity-sensing and network opportunity-seizing. Furthermore, I argue that there are significant interaction effects between these distinctive subsets driving the firm's overall effectiveness in sensing and seizing of novel and innovative external opportunities. In order to explore potential interdependencies and draw distinctions among different dynamic capability subsets I integrate concepts from two theoretical perspectives that often neglect the emphasis of the other: the dynamic capability view and the social network perspective. I then test the hypothesized relationships in the context of firms actively patenting in nanotechnology. Nanotechnology innovations are multidisciplinary in nature and require search and discovery across multiple inter-organizational, scientific, geographic, industry, or technological domains by a particular firm. The findings offer support for the conceptualizations of dynamic capabilities as consisting of distinct subsets of capabilities for the sensing and the seizing of external new-knowledge opportunities. The findings suggest that a firm's innovativeness in an interdisciplinary scientific field such as nanotechnology is a function of the vector of multi-dimensional dynamic capabilities that are context-specific. Furthermore, the findings also suggest that there are inherent trade-offs embedded in different dimensions of dynamic capabilities when deployed across a wide range of inter-organizational relationships.

  3. The Likelihood of Experiencing Relative Poverty over the Life Course.

    Science.gov (United States)

    Rank, Mark R; Hirschl, Thomas A

    2015-01-01

Research on poverty in the United States has largely consisted of examining cross-sectional levels of absolute poverty. In this analysis, we focus on understanding relative poverty within a life course context. Specifically, we analyze the likelihood of individuals falling below the 20th percentile and the 10th percentile of the income distribution between the ages of 25 and 60. A series of life tables are constructed using the nationally representative Panel Study of Income Dynamics data set. This includes panel data from 1968 through 2011. Results indicate that the prevalence of relative poverty is quite high. Consequently, between the ages of 25 and 60, 61.8 percent of the population will experience a year below the 20th percentile, and 42.1 percent will experience a year below the 10th percentile. Characteristics associated with experiencing these levels of poverty include those who are younger, nonwhite, female, not married, with 12 years or less of education, or who have a work disability.

  4. The Likelihood of Experiencing Relative Poverty over the Life Course.

    Directory of Open Access Journals (Sweden)

    Mark R Rank

Full Text Available Research on poverty in the United States has largely consisted of examining cross-sectional levels of absolute poverty. In this analysis, we focus on understanding relative poverty within a life course context. Specifically, we analyze the likelihood of individuals falling below the 20th percentile and the 10th percentile of the income distribution between the ages of 25 and 60. A series of life tables are constructed using the nationally representative Panel Study of Income Dynamics data set. This includes panel data from 1968 through 2011. Results indicate that the prevalence of relative poverty is quite high. Consequently, between the ages of 25 and 60, 61.8 percent of the population will experience a year below the 20th percentile, and 42.1 percent will experience a year below the 10th percentile. Characteristics associated with experiencing these levels of poverty include those who are younger, nonwhite, female, not married, with 12 years or less of education, or who have a work disability.

  5. The Likelihood of Experiencing Relative Poverty over the Life Course

    Science.gov (United States)

    Rank, Mark R.; Hirschl, Thomas A.

    2015-01-01

Research on poverty in the United States has largely consisted of examining cross-sectional levels of absolute poverty. In this analysis, we focus on understanding relative poverty within a life course context. Specifically, we analyze the likelihood of individuals falling below the 20th percentile and the 10th percentile of the income distribution between the ages of 25 and 60. A series of life tables are constructed using the nationally representative Panel Study of Income Dynamics data set. This includes panel data from 1968 through 2011. Results indicate that the prevalence of relative poverty is quite high. Consequently, between the ages of 25 and 60, 61.8 percent of the population will experience a year below the 20th percentile, and 42.1 percent will experience a year below the 10th percentile. Characteristics associated with experiencing these levels of poverty include those who are younger, nonwhite, female, not married, with 12 years or less of education, or who have a work disability. PMID:26200781

  6. Binding neutral information to emotional contexts: Brain dynamics of long-term recognition memory.

    Science.gov (United States)

    Ventura-Bort, Carlos; Löw, Andreas; Wendt, Julia; Moltó, Javier; Poy, Rosario; Dolcos, Florin; Hamm, Alfons O; Weymar, Mathias

    2016-04-01

There is abundant evidence in memory research that emotional stimuli are better remembered than neutral stimuli. However, the effect of an emotionally charged context on memory for associated neutral elements is also important, particularly in trauma and stress-related disorders, where strong memories are often activated by neutral cues due to their emotional associations. In the present study, we used event-related potentials (ERPs) to investigate long-term recognition memory (1-week delay) for neutral objects that had been paired with emotionally arousing or neutral scenes during encoding. Context effects were clearly evident in the ERPs: an early frontal ERP old/new difference (300-500 ms) was enhanced for objects encoded in unpleasant compared to pleasant and neutral contexts, and a late central-parietal old/new difference (400-700 ms) was observed for objects paired with both pleasant and unpleasant contexts but not for items paired with neutral backgrounds. Interestingly, objects encoded in emotional contexts (and novel objects) also prompted an enhanced frontal early (180-220 ms) positivity compared to objects paired with neutral scenes, indicating early perceptual significance. The present data suggest that emotional, particularly unpleasant, backgrounds strengthen memory for items encountered within these contexts and engage automatic and explicit recognition processes. These results could help in understanding binding mechanisms involved in the activation of trauma-related memories by neutral cues.

  7. The Sherpa Maximum Likelihood Estimator

    Science.gov (United States)

    Nguyen, D.; Doe, S.; Evans, I.; Hain, R.; Primini, F.

    2011-07-01

    A primary goal for the second release of the Chandra Source Catalog (CSC) is to include X-ray sources with as few as 5 photon counts detected in stacked observations of the same field, while maintaining acceptable detection efficiency and false source rates. Aggressive source detection methods will result in detection of many false positive source candidates. Candidate detections will then be sent to a new tool, the Maximum Likelihood Estimator (MLE), to evaluate the likelihood that a detection is a real source. MLE uses the Sherpa modeling and fitting engine to fit a model of a background and source to multiple overlapping candidate source regions. A background model is calculated by simultaneously fitting the observed photon flux in multiple background regions. This model is used to determine the quality of the fit statistic for a background-only hypothesis in the potential source region. The statistic for a background-plus-source hypothesis is calculated by adding a Gaussian source model convolved with the appropriate Chandra point spread function (PSF) and simultaneously fitting the observed photon flux in each observation in the stack. Since a candidate source may be located anywhere in the field of view of each stacked observation, a different PSF must be used for each observation because of the strong spatial dependence of the Chandra PSF. The likelihood of a valid source being detected is a function of the two statistics (for background alone, and for background-plus-source). The MLE tool is an extensible Python module with potential for use by the general Chandra user.
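
A stripped-down version of the background-only versus background-plus-source comparison for Poisson counts is shown below. This illustrates the logic of the statistic, not the MLE tool itself: it handles a single observation with a fixed background model and a given PSF, whereas the actual tool fits simultaneously across stacked observations with per-observation PSFs.

```python
import numpy as np
from scipy import stats
from scipy.optimize import minimize_scalar

def detection_statistic(counts, background, psf):
    """Likelihood-ratio comparison of 'background only' versus
    'background + point source' for Poisson counts in a candidate region.

    counts     : observed counts per pixel
    background : background-model expected counts per pixel (fit elsewhere)
    psf        : point-spread-function weights per pixel (summing to 1)
    """
    counts = np.asarray(counts, dtype=float)

    def nll(amp):  # negative Poisson log-likelihood, source amplitude amp
        mu = background + amp * psf
        return -np.sum(stats.poisson.logpmf(counts, mu))

    fit = minimize_scalar(nll, bounds=(0.0, counts.sum() + 10.0),
                          method="bounded")
    lr = 2.0 * (nll(0.0) - fit.fun)   # likelihood-ratio (Wilks) statistic
    return fit.x, lr                  # best-fit amplitude and LR statistic
```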

  8. Likelihood estimators for multivariate extremes

    KAUST Repository

    Huser, Raphaël

    2015-11-17

    The main approach to inference for multivariate extremes consists in approximating the joint upper tail of the observations by a parametric family arising in the limit for extreme events. The latter may be expressed in terms of componentwise maxima, high threshold exceedances or point processes, yielding different but related asymptotic characterizations and estimators. The present paper clarifies the connections between the main likelihood estimators, and assesses their practical performance. We investigate their ability to estimate the extremal dependence structure and to predict future extremes, using exact calculations and simulation, in the case of the logistic model.

  9. A Predictive Likelihood Approach to Bayesian Averaging

    Directory of Open Access Journals (Sweden)

    Tomáš Jeřábek

    2015-01-01

Full Text Available Multivariate time series forecasting is applied in a wide range of economic activities related to regional competitiveness and is the basis of almost all macroeconomic analysis. In this paper we combine multivariate density forecasts of GDP growth, inflation and real interest rates from four models: two types of Bayesian vector autoregression (BVAR) models, a New Keynesian dynamic stochastic general equilibrium (DSGE) model of a small open economy, and a DSGE-VAR model. The performance of the models is assessed using historical data covering the domestic economy and the foreign economy, the latter represented by the countries of the Eurozone. Because the forecast accuracies of the models differ, weighting schemes based on the predictive likelihood, the trace of the past MSE matrix, and model ranks are used to combine the models, with the equal-weight scheme as a simple benchmark. The results show that optimally combined densities are comparable to the best individual models.
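
The predictive-likelihood weighting scheme reduces to a softmax over hold-out log scores. A minimal sketch with invented scores for the four model types mentioned:

```python
import numpy as np

def predictive_likelihood_weights(log_pred_likes):
    """Combination weights for competing models from their log predictive
    likelihoods over a hold-out window (larger score => more weight)."""
    x = np.asarray(log_pred_likes, dtype=float)
    w = np.exp(x - x.max())              # subtract max for numerical stability
    return w / w.sum()

# e.g. two BVARs, a DSGE and a DSGE-VAR (the scores here are made up)
w = predictive_likelihood_weights([-512.3, -509.8, -515.1, -508.9])
# combined density forecast = sum_i w[i] * p_i(y)
```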

  10. Section 9: Ground Water - Likelihood of Release

    Science.gov (United States)

HRS training. The ground water pathway likelihood of release factor category reflects the likelihood that there has been, or will be, a release of hazardous substances in any of the aquifers underlying the site.

  11. Enhancement of Wide-Area Service Discovery using Dynamic Context Information

    DEFF Research Database (Denmark)

    Olsen, Rasmus Løvenstein

Existing service discovery solutions today cannot provide information about whether the services found are actually relevant to the user; this thesis addresses how service discovery can be improved through the use of context information. The main parts of the project concern 1) modelling of access to dynamic information in distributed networks, aimed at performance analysis of network traffic, access times and the so-called mismatch probability, 2) system concepts that enable context-sensitive service discovery, and 3) evaluation of an implemented prototype.

  12. A guideline for the validation of likelihood ratio methods used for forensic evidence evaluation

    NARCIS (Netherlands)

    Meuwly, Didier; Ramos, Daniel; Haraksim, Rudolf

    2016-01-01

    This Guideline proposes a protocol for the validation of forensic evaluation methods at the source level, using the Likelihood Ratio framework as defined within the Bayes’ inference model. In the context of the inference of identity of source, the Likelihood Ratio is used to evaluate the strength of

  13. Phylogenetic estimation with partial likelihood tensors

    CERN Document Server

    Sumner, J G

    2008-01-01

    We present an alternative method for calculating likelihoods in molecular phylogenetics. Our method is based on partial likelihood tensors, which are generalizations of partial likelihood vectors, as used in Felsenstein's approach. Exploiting a lexicographic sorting and partial likelihood tensors, it is possible to obtain significant computational savings. We show this on a range of simulated data by enumerating all numerical calculations that are required by our method and the standard approach.
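
For orientation, the partial likelihood vectors that the paper generalizes to tensors are those of Felsenstein's pruning algorithm, sketched here for a toy binary tree under the Jukes-Cantor model (the tensor and lexicographic-sorting machinery of the paper is not reproduced):

```python
import numpy as np

def partial_likelihood(tree):
    """Felsenstein pruning on a binary tree.

    tree is either an observed tip state (int in 0..3) or a tuple
    (left_subtree, P_left, right_subtree, P_right), where P_* is the
    transition matrix along the branch to that child.
    Returns the vector L[s] = P(data below this node | node state s).
    """
    n = 4  # nucleotide states
    if isinstance(tree, int):                 # tip: indicator vector
        v = np.zeros(n)
        v[tree] = 1.0
        return v
    left, P_l, right, P_r = tree
    return (P_l @ partial_likelihood(left)) * (P_r @ partial_likelihood(right))

def jc(t):
    """Jukes-Cantor transition matrix for branch length t."""
    p = 0.25 + 0.75 * np.exp(-4.0 * t / 3.0)  # probability of no net change
    q = 0.25 - 0.25 * np.exp(-4.0 * t / 3.0)  # probability of each change
    return np.where(np.eye(4, dtype=bool), p, q)

# Three tips: state 0 on one branch; a cherry of two tips in state 1
root = (0, jc(0.1), (1, jc(0.05), 1, jc(0.05)), jc(0.2))
site_like = 0.25 * partial_likelihood(root).sum()  # uniform root frequencies
```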

  14. Workshop on Likelihoods for the LHC Searches

    CERN Document Server

    2013-01-01

    The primary goal of this 3‐day workshop is to educate the LHC community about the scientific utility of likelihoods. We shall do so by describing and discussing several real‐world examples of the use of likelihoods, including a one‐day in‐depth examination of likelihoods in the Higgs boson studies by ATLAS and CMS.

  15. Context-sensitive dynamic ordinal regression for intensity estimation of facial action units

    NARCIS (Netherlands)

    Rudovic, Ognjen; Pavlovic, Vladimir; Pantic, Maja

    2015-01-01

    Modeling intensity of facial action units from spontaneously displayed facial expressions is challenging mainly because of high variability in subject-specific facial expressiveness, head-movements, illumination changes, etc. These factors make the target problem highly context-sensitive. However, e

  17. Does Santa Exist? Children's Everyday Narratives as Dynamic Meeting Places in a Day Care Centre Context

    Science.gov (United States)

    Puroila, Anna-Maija; Estola, Eila; Syrjala, Leena

    2012-01-01

    The article attempts to answer the question: What is the nature of children's everyday narratives in a day care centre context? The theoretical framework of this study is based on a narrative approach. The research material was gathered through applying the methodology of narrative ethnography. The article is based on observational material…

  18. Global Innovation Systems—A conceptual framework for innovation dynamics in transnational contexts

    NARCIS (Netherlands)

    Binz, Christian; Truffer, Bernhard

    2017-01-01

    This paper proposes a framework for the analysis of technological innovation processes in transnational contexts. By drawing on existing innovation system concepts and recent elaborations on the globalization of innovation, we develop a multi-scalar conceptualization of innovation systems. Two key

  19. Landscape context affects use of restored grasslands by mammals in a dynamic agroecosystem

    National Research Council Canada - National Science Library

    Berry, Brian; Schooley, Robert L; Ward, Michael P

    2017-01-01

    ... agroecosystem in Illinois from 2014 to 2015. We tested hypotheses about the effects of local habitat conditions and landscape context on use of restored grasslands by four focal species: raccoons (Procyon lotor), eastern cottontails (Sylvilagus floridanus), coyotes (Canis latrans), and white-tailed deer (Odocoileus virginianus). Most species showed s...

  20. Nonlinear dynamic response analysis of localized damaged laminated composite structures in the context of component mode synthesis

    Science.gov (United States)

    Mahmoudi, S.; Trivaudey, F.; Bouhaddi, N.

    2015-07-01

    The aim of this study is the prediction of the dynamic response of damaged laminated composite structures in the context of component mode synthesis. Hence, a method of damage localization for complex structures is proposed. The dynamic behavior of transversely isotropic layers is expressed through elasticity coupled with damage, based on an existing macro model for cracked structures. The damage is located only in some regions of the whole structure, which is decomposed into substructures. The incremental linear dynamic governing equations are obtained by using the classical linear Kirchhoff-Love theory of plates. Then, considering the damage-induced nonlinearity, the obtained nonlinear dynamic equations are solved in the time domain. However, a detailed finite element model of such a structure at the scale of the localized damage would generate very high computational costs. To reduce this cost, the Component Mode Synthesis (CMS) method is used: the damaged fine-scale substructure is modelled as nonlinear and connected to linear dynamic models of the remaining substructures, which can be condensed and need not be updated at each iteration. Numerical results show that the mechanical properties of the structure change considerably when damage is taken into account. Under an impact load, damage increases, reaches its highest value with the maximum of the applied load and then remains unchanged. Besides, the eigenfrequencies of the damaged structure decrease compared with those of an undamaged one. This methodology can be used for monitoring strategies and lifetime estimation of hybrid complex structures, since the damage state is known in space and time.

  1. Some dynamic generalized information measures in the context of weighted models

    Directory of Open Access Journals (Sweden)

    S. S. Maya

    2013-05-01

    Full Text Available In this paper, we study some dynamic generalized information measures between a true distribution and an observed (weighted) distribution, useful in life length studies. Further, some bounds and inequalities related to these measures are also studied.

  2. Some dynamic generalized information measures in the context of weighted models

    OpenAIRE

    S. S. Maya; S. M. Sunoj

    2013-01-01

    In this paper, we study some dynamic generalized information measures between a true distribution and an observed (weighted) distribution, useful in life length studies. Further, some bounds and inequalities related to these measures are also studied.

  3. Empirical likelihood method in survival analysis

    CERN Document Server

    Zhou, Mai

    2015-01-01

    Add the Empirical Likelihood to Your Nonparametric Toolbox. Empirical Likelihood Method in Survival Analysis explains how to use the empirical likelihood method for right censored survival data. The author uses R for calculating empirical likelihood and includes many worked out examples with the associated R code. The datasets and code are available for download on his website and CRAN. The book focuses on all the standard survival analysis topics treated with empirical likelihood, including hazard functions, cumulative distribution functions, analysis of the Cox model, and computation of empiric

  4. Gaussian maximum likelihood and contextual classification algorithms for multicrop classification

    Science.gov (United States)

    Di Zenzo, Silvano; Bernstein, Ralph; Kolsky, Harwood G.; Degloria, Stephen D.

    1987-01-01

    The paper reviews some of the ways in which context has been handled in the remote-sensing literature, and additional possibilities are introduced. The problem of computing exhaustive and normalized class-membership probabilities from the likelihoods provided by the Gaussian maximum likelihood classifier (to be used as initial probability estimates to start relaxation) is discussed. An efficient implementation of probabilistic relaxation is proposed, suiting the needs of actual remote-sensing applications. A modified fuzzy-relaxation algorithm using generalized operations between fuzzy sets is presented. Combined use of the two relaxation algorithms is proposed to exploit context in multispectral classification of remotely sensed data. Results on both an artificially created image and an MSS data set are reported.
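
    The first step the record describes, turning Gaussian class likelihoods into exhaustive, normalized class-membership probabilities, amounts to one application of Bayes' rule. A minimal sketch follows; the two-class means, covariances and priors are invented for illustration.

```python
import numpy as np
from scipy.stats import multivariate_normal

def class_posteriors(x, means, covs, priors):
    # Gaussian class likelihoods, then Bayes' rule to get exhaustive,
    # normalized class-membership probabilities for pixel feature vector x.
    lik = np.array([multivariate_normal(m, c).pdf(x)
                    for m, c in zip(means, covs)])
    post = np.asarray(priors) * lik
    return post / post.sum()

# Two hypothetical spectral classes in a two-band feature space
means = [np.array([30.0, 40.0]), np.array([60.0, 55.0])]
covs = [25.0 * np.eye(2), 36.0 * np.eye(2)]
print(class_posteriors(np.array([45.0, 48.0]), means, covs, [0.5, 0.5]))
```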

  5. Maximum-likelihood estimation prevents unphysical Mueller matrices

    CERN Document Server

    Aiello, A; Voigt, D; Woerdman, J P

    2005-01-01

    We show that the method of maximum-likelihood estimation, recently introduced in the context of quantum process tomography, can be applied to the determination of Mueller matrices characterizing the polarization properties of classical optical systems. Contrary to linear reconstruction algorithms, the proposed method yields physically acceptable Mueller matrices even in the presence of uncontrolled experimental errors. We illustrate the method in the case of an unphysical measured Mueller matrix taken from the literature.
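
    The generic trick behind such constrained estimation is to parameterize the estimate so that physicality holds by construction, e.g. writing a positive-semidefinite matrix as H = T†T and maximizing the likelihood over T. The sketch below applies this idea to a generic 4x4 Hermitian matrix under an assumed Gaussian error model; the mapping between H and an actual Mueller matrix is omitted, and the measured input is invented.

```python
import numpy as np
from scipy.optimize import minimize

def tril_to_psd(params, n=4):
    # Build H = T^H T from a real parameter vector encoding a lower-
    # triangular T, so H is positive semidefinite by construction.
    T = np.zeros((n, n), dtype=complex)
    k = 0
    for i in range(n):
        for j in range(i + 1):
            if i == j:
                T[i, j] = params[k]; k += 1
            else:
                T[i, j] = params[k] + 1j * params[k + 1]; k += 2
    return T.conj().T @ T

def neg_log_lik(params, H_meas, sigma=1.0):
    # Gaussian measurement errors assumed: NLL ~ ||H - H_meas||^2 / (2 sigma^2)
    r = tril_to_psd(params) - H_meas
    return np.real(np.vdot(r, r)) / (2.0 * sigma**2)

H_meas = np.diag([1.0, 0.5, 0.5, -0.1])   # slightly unphysical measurement
res = minimize(neg_log_lik, x0=np.ones(16), args=(H_meas,), method="BFGS")
H_hat = tril_to_psd(res.x)
print(np.linalg.eigvalsh(H_hat).min())     # >= 0 up to numerical noise
```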

  6. Dynamics of a low-density tiger population in Southeast Asia in the context of improved law enforcement.

    Science.gov (United States)

    Duangchantrasiri, Somphot; Umponjan, Mayuree; Simcharoen, Saksit; Pattanavibool, Anak; Chaiwattana, Soontorn; Maneerat, Sompoch; Kumar, N Samba; Jathanna, Devcharan; Srivathsa, Arjun; Karanth, K Ullas

    2016-06-01

    Recovering small populations of threatened species is an important global conservation strategy. Monitoring the anticipated recovery, however, often relies on uncertain abundance indices rather than on rigorous demographic estimates. To counter the severe threat from poaching of wild tigers (Panthera tigris), the Government of Thailand established an intensive patrolling system in 2005 to protect and recover its largest source population in Huai Kha Khaeng Wildlife Sanctuary. Concurrently, we assessed the dynamics of this tiger population over the next 8 years with rigorous photographic capture-recapture methods. From 2006 to 2012, we sampled across 624-1026 km² with 137-200 camera traps. Cameras deployed for 21,359 trap days yielded photographic records of 90 distinct individuals. We used closed model Bayesian spatial capture-recapture methods to estimate tiger abundances annually. Abundance estimates were integrated with likelihood-based open model analyses to estimate annual and overall rates of survival, recruitment, and changes in abundance. Estimates of demographic parameters fluctuated widely: annual density ranged from 1.25 to 2.01 tigers/100 km², abundance from 35 to 58 tigers, survival from 79.6% to 95.5%, and annual recruitment from 0 to 25 tigers. The number of distinct individuals photographed demonstrates the value of photographic capture-recapture methods for assessments of population dynamics in rare and elusive species that are identifiable from natural markings. Possibly because of poaching pressure, overall tiger densities at Huai Kha Khaeng were 82-90% lower than in ecologically comparable sites in India. However, intensified patrolling after 2006 appeared to reduce poaching and was correlated with marginal improvement in tiger survival and recruitment. Our results suggest that population recovery of low-density tiger populations may be slower than anticipated by current global strategies aimed at doubling the number of wild tigers.

  7. Adaptive Parallel Tempering for Stochastic Maximum Likelihood Learning of RBMs

    CERN Document Server

    Desjardins, Guillaume; Bengio, Yoshua

    2010-01-01

    Restricted Boltzmann Machines (RBM) have attracted a lot of attention of late, as one of the principal building blocks of deep networks. Training RBMs remains problematic however, because of the intractability of their partition function. The maximum likelihood gradient requires a very robust sampler which can accurately sample from the model despite the loss of ergodicity often incurred during learning. While using Parallel Tempering in the negative phase of Stochastic Maximum Likelihood (SML-PT) helps address the issue, it imposes a trade-off between computational complexity and high ergodicity, and requires careful hand-tuning of the temperatures. In this paper, we show that this trade-off is unnecessary. The choice of optimal temperatures can be automated by minimizing average return time (a concept first proposed by [Katzgraber et al., 2006]) while chains can be spawned dynamically, as needed, thus minimizing the computational overhead. We show on a synthetic dataset, that this results in better likelihood ...
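
    The core parallel-tempering move that the adaptive scheme tunes is the Metropolis swap between chains at neighbouring temperatures. A minimal sketch of one swap round follows; the energies and inverse temperatures are illustrative, and the RBM-specific Gibbs updates between swaps are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def swap_round(energies, betas):
    # One round of Metropolis swap proposals between chains at neighbouring
    # inverse temperatures; returns the permutation assigning chain states
    # to temperature slots after the accepted swaps.
    order = np.arange(len(betas))
    for i in range(len(betas) - 1):
        delta = (betas[i] - betas[i + 1]) * \
                (energies[order[i]] - energies[order[i + 1]])
        if rng.random() < min(1.0, np.exp(delta)):
            order[i], order[i + 1] = order[i + 1], order[i]
    return order

print(swap_round(np.array([-10.0, -8.0, -5.0]), np.array([1.0, 0.7, 0.4])))
```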

  8. IMPROVING VOICE ACTIVITY DETECTION VIA WEIGHTING LIKELIHOOD AND DIMENSION REDUCTION

    Institute of Scientific and Technical Information of China (English)

    Wang Huanliang; Han Jiqing; Li Haifeng; Zheng Tieran

    2008-01-01

    The performance of traditional Voice Activity Detection (VAD) algorithms declines sharply in lower Signal-to-Noise Ratio (SNR) environments. In this paper, a feature-weighting likelihood method is proposed for noise-robust VAD. Via this method, the contribution of dynamic features to the likelihood score can be increased, which consequently improves the noise robustness of VAD. A divergence-based dimension reduction method is proposed to save computation; it removes the feature dimensions with smaller divergence values at the cost of slightly degrading the performance. Experimental results on the Aurora II database show that the detection performance in noisy environments can be remarkably improved by the proposed method when a model trained on clean data is used to detect speech endpoints. Applying likelihood weighting to the dimension-reduced features obtains comparable, even better, performance than the original full-dimensional features.
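
    A minimal sketch of the decision rule such a method builds on: a per-frame log-likelihood ratio between speech and noise GMMs, with per-dimension weights boosting the dynamic features. The exact weighting formula of the paper is not given in the record, so the form below (weights applied in the exponent of diagonal-covariance Gaussians) is an assumption; all models and values are toy examples.

```python
import numpy as np

def weighted_log_lik(frame, gmm, weights):
    # GMM log-likelihood with per-dimension feature weights applied to the
    # exponent (diagonal covariances), so dynamic features can count more.
    scores = []
    for pi, mu, var in gmm:
        z = weights * (frame - mu) ** 2 / var
        log_norm = -0.5 * np.sum(np.log(2.0 * np.pi * var))
        scores.append(np.log(pi) + log_norm - 0.5 * z.sum())
    return np.logaddexp.reduce(scores)

def is_speech(frame, speech_gmm, noise_gmm, weights, threshold=0.0):
    # Frame-level decision from the weighted log-likelihood ratio.
    return (weighted_log_lik(frame, speech_gmm, weights)
            - weighted_log_lik(frame, noise_gmm, weights)) > threshold

# Toy 3-dim features [static, delta, delta-delta]; dynamics weighted higher
w = np.array([1.0, 1.5, 1.5])
speech = [(1.0, np.zeros(3), np.ones(3))]      # (weight, mean, variance)
noise = [(1.0, np.full(3, -2.0), np.ones(3))]
print(is_speech(np.array([0.1, -0.2, 0.3]), speech, noise, w))
```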

  9. Dynamic Processes of Speech Development by Seven Adult Learners of Japanese in a Domestic Immersion Context

    Science.gov (United States)

    Fukuda, Makiko

    2014-01-01

    The present study revealed the dynamic process of speech development in a domestic immersion program by seven adult beginning learners of Japanese. The speech data were analyzed with fluency, accuracy, and complexity measurements at group, interindividual, and intraindividual levels. The results revealed the complex nature of language development…

  10. A theory and dynamic model of dyadic interaction : Concerns, appraisals, and contagiousness in a developmental context

    NARCIS (Netherlands)

    Steenbeek, Henderien W.; van Geert, Paul L. C.

    A theory of the dynamics of dyadic interaction is presented, based on the concepts of "concern" (i.e., intentions, goals, and interests), "appraisal" and "contagiousness". Differences between children who participate in a specific interaction are linked to differences in social competence and social

  12. Phase field modelling of dynamic thermal fracture in the context of irradiation damage

    CERN Document Server

    Schlüter, Alexander; Müller, Ralf; Tomut, Marilena; Trautmann , Christina; Weick, Helmut; Plate, Carolin

    2015-01-01

    This work presents a continuum mechanics approach to model fracturing processes in brittle materials that are subjected to rapidly applied high-temperature gradients. Such a type of loading typically occurs when a solid is exposed to an intense high-energy particle beam that deposits a large amount of energy into a small sample volume. Given the rapid energy deposition leading to a fast temperature increase, dynamic effects have to be considered. Our existing phase field model for dynamic fracture is thus extended in a way that allows modelling of thermally induced fracture. A finite element scheme is employed to solve the governing partial differential equations numerically. Finally, the functionality of our model is illustrated by two examples.

  13. Infant expressions of aggressiveness in educational context. Interpretation from dynamic psychology and family relationships

    Directory of Open Access Journals (Sweden)

    Laura Victoria Londoño

    2012-01-01

    Full Text Available This article is a product of the research project Interdisciplinary Perspectives of Intervention with Families: the case of Medellín and the Municipality of Rionegro, an understanding from psychology, education and family studies. It describes children's discourses on the phenomenon of aggression as experienced in the school Colegio Bello Oriente in Medellín. Its aim is to detail roles and limits in families with children who behave aggressively in educational settings. The methodological approach was qualitative research. The results offer an understanding of children's aggression from the theoretical perspective of dynamic psychology, and an analysis of roles and limits as dimensions of the family dynamics in which these children live. In conclusion, children can take responsibility for their aggressive behavior and symbolically process this aggressiveness when they find appropriate mechanisms in their families and educational institutions. © Revista Colombiana de Ciencias Sociales.

  14. [Family dynamics and chronic illness: children with diabetes in the context of their families].

    Science.gov (United States)

    Wirlach-Bartosik, S; Schubert, M T; Freilinger, M; Schober, E

    2005-01-01

    The present study is based on the assumption of an interaction between family functioning and chronic illness. Using a systemic approach, the intra-familial situation of families with a diabetes-affected child is examined. 44 families were evaluated using a family diagnostic instrument ("Familienbögen") and compared with 31 control families with a healthy child. Furthermore, the study looked at the influence of the level of family functioning on glycemic control, as measured by HbA1c values, and vice versa. Families with a child affected by diabetes showed significantly more dysfunctional domains and higher discrepancies of the ratings in the family diagnostic instrument. No correlation between family functioning and glycemic control was found. Poor glycemic control therefore did not have any negative effects on the family dynamics; in fact, the opposite was often the case. Also, the relationship between siblings was judged more positively when one of the siblings was chronically ill. While chronic illness may place a burden on familial dynamics, it may, at the same time, offer opportunities for an improvement of family relationships. However, if physiological parameters deteriorate in the child (poor glycemic control), family problems seem to become less important. Success in the treatment of diabetes patients should therefore not only be measured by the quality of glycemic control, but also by considering psychological factors and aspects of family dynamics.

  15. Theoretical aspects of synthetic measurement of the development dynamics in the context of city

    Directory of Open Access Journals (Sweden)

    Zbyszko Pawlak

    2012-12-01

    Full Text Available Background: The paper presents the theoretical basis for a proposal to model the development dynamics of modern cities by means of a properly constructed synthetic indicator. In addition to enabling the quantification of the development of the social and economic systems of cities, its implementation allows the identification of nonlinear processes such as phase transitions, which occur e.g. under the influence of technological and social innovations. The economic and physical approach allows one to learn more about the nature of these processes and to devise new instruments supporting the management of urban areas under conditions of increasing competitiveness. Methods: Mathematical modeling of social and economic processes, and an economic and physical approach to the dynamics of nonlinearly developing systems. Results and conclusions: Based on the simulation research conducted, it can be concluded that a synthetic measure of the development of urban areas can be a good tool for supporting city management by local authorities. The economic and physical approach to the nonlinear dynamics of urban systems marks out new areas for further research; the determination of the minimum conditions required (the necessary level for stimulating a phase transition) and the analysis of factors allowing the negative consequences of a phase transition to be avoided, especially in smaller cities, seem to be the most important ones.

  16. Vestige: Maximum likelihood phylogenetic footprinting

    Directory of Open Access Journals (Sweden)

    Maxwell Peter

    2005-05-01

    Full Text Available Abstract. Background: Phylogenetic footprinting is the identification of functional regions of DNA by their evolutionary conservation. This is achieved by comparing orthologous regions from multiple species and identifying the DNA regions that have diverged less than neutral DNA. Vestige is a phylogenetic footprinting package built on the PyEvolve toolkit that uses probabilistic molecular evolutionary modelling to represent aspects of sequence evolution, including the conventional divergence measure employed by other footprinting approaches. In addition to measuring the divergence, Vestige allows the expansion of the definition of a phylogenetic footprint to include variation in the distribution of any molecular evolutionary processes. This is achieved by displaying the distribution of model parameters that represent partitions of molecular evolutionary substitutions. Examination of the spatial incidence of these effects across regions of the genome can identify DNA segments that differ in the nature of the evolutionary process. Results: Vestige was applied to a reference dataset of the SCL locus from four species and provided clear identification of the known conserved regions in this dataset. To demonstrate the flexibility to use diverse models of molecular evolution and dissect the nature of the evolutionary process, Vestige was used to footprint the Ka/Ks ratio in primate BRCA1 with a codon model of evolution. Two regions of putative adaptive evolution were identified, illustrating the ability of Vestige to represent the spatial distribution of distinct molecular evolutionary processes. Conclusion: Vestige provides a flexible, open platform for phylogenetic footprinting. Underpinned by the PyEvolve toolkit, Vestige provides a framework for visualising the signatures of evolutionary processes across the genome of numerous organisms simultaneously. By exploiting the maximum-likelihood statistical framework, the complex interplay between mutational

  17. The Laplace Likelihood Ratio Test for Heteroscedasticity

    Directory of Open Access Journals (Sweden)

    J. Martin van Zyl

    2011-01-01

    Full Text Available It is shown that the likelihood ratio test for heteroscedasticity, assuming the Laplace distribution, gives good results for Gaussian and fat-tailed data. The likelihood ratio test, assuming normality, is very sensitive to any deviation from normality, especially when the observations are from a distribution with fat tails. Such a likelihood test can also be used as a robust test for a constant variance in residuals or a time series if the data is partitioned into groups.
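
    A minimal sketch of such a test: fit Laplace scale parameters per group (maximum likelihood gives the median as location and the mean absolute deviation as scale) and compare against a common-scale fit via the usual chi-squared calibration of the likelihood ratio. The exact formulation in the paper may differ; the data here are simulated.

```python
import numpy as np
from scipy.stats import chi2

def laplace_loglik(x, mu, b):
    return -len(x) * np.log(2.0 * b) - np.abs(x - mu).sum() / b

def lr_test_equal_scale(groups):
    # H1: each group has its own Laplace scale; H0: one common scale.
    # Laplace MLEs: location = median, scale = mean absolute deviation.
    mus = [np.median(g) for g in groups]
    b1 = [np.mean(np.abs(g - m)) for g, m in zip(groups, mus)]
    ll1 = sum(laplace_loglik(g, m, b) for g, m, b in zip(groups, mus, b1))
    b0 = np.mean(np.abs(np.concatenate([g - m for g, m in zip(groups, mus)])))
    ll0 = sum(laplace_loglik(g, m, b0) for g, m in zip(groups, mus))
    stat = 2.0 * (ll1 - ll0)
    return stat, chi2.sf(stat, df=len(groups) - 1)

rng = np.random.default_rng(1)
groups = [rng.laplace(0.0, 1.0, 200), rng.laplace(0.0, 3.0, 200)]
print(lr_test_equal_scale(groups))   # large statistic, tiny p-value
```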

  18. Corporate governance effect on financial distress likelihood: Evidence from Spain

    Directory of Open Access Journals (Sweden)

    Montserrat Manzaneque

    2016-01-01

    Full Text Available The paper explores some mechanisms of corporate governance (ownership and board characteristics) in Spanish listed companies and their impact on the likelihood of financial distress. An empirical study was conducted between 2007 and 2012 using a matched-pairs research design with 308 observations, half of them classified as distressed and half as non-distressed. Based on the previous study by Pindado, Rodrigues, and De la Torre (2008), a broader concept of bankruptcy is used to define business failure. Employing several conditional logistic models, and in line with other previous studies on bankruptcy, the results confirm that in difficult situations prior to bankruptcy, the impact of board ownership and the proportion of independent directors on business failure likelihood is similar to that exerted in more extreme situations. The results go one step further, showing a negative relationship between board size and the likelihood of financial distress. This result is interpreted as a way of creating diversity and improving access to information and resources, especially in contexts where ownership is highly concentrated and large shareholders have great power to influence the board structure. However, the results confirm that ownership concentration does not have a significant impact on financial distress likelihood in the Spanish context. It is argued that large shareholders are passive as regards enhanced monitoring of management and, alternatively, do not have enough incentives to hold back financial distress. These findings have important implications in the Spanish context, where several changes in the regulatory listing requirements with respect to corporate governance have been carried out, and where there is no empirical evidence in this regard.

  19. Context Aware Adaptive Service based Dynamic Channel Allocation Approach for Providing an Optimal QoS over MANET

    Directory of Open Access Journals (Sweden)

    A. Ayyasamy

    2014-07-01

    Full Text Available Large variations in network Quality of Service (QoS) in terms of bandwidth, latency and jitter may occur during media transfer over mobile ad-hoc networks. Applications need to adapt their functionality to dynamic changes in QoS. This paper proposes an enhanced service-based platform to provide adaptive network management services to higher-level application layer components. The Context Aware Adaptive Service (COAAS) is a middleware architecture for service adaptation based on ad hoc network and service awareness. COAAS is structured in such a way that it can provide QoS awareness to streaming applications as well as manage dynamic ad hoc network resources using an adaptive channel allocation approach. The overall architecture of the COAAS framework includes core components for connection establishment, connection monitoring, connection control and policy management. Adaptive channel allocation, defined as an object-based component, supports dynamic binding at run time; the platform is implemented using JXTA and the J2ME CDC [15] toolkit to demonstrate the performance of a mobile setup as a conference application.

  20. Revisiting the Body-Schema Concept in the Context of Whole-Body Postural-Focal Dynamics

    Science.gov (United States)

    Morasso, Pietro; Casadio, Maura; Mohan, Vishwanathan; Rea, Francesco; Zenzeri, Jacopo

    2015-01-01

    The body-schema concept is revisited in the context of embodied cognition, further developing the theory formulated by Marc Jeannerod that the motor system is part of a simulation network related to action, whose function is not only to shape the motor system for preparing an action (either overt or covert) but also to provide the self with information on the feasibility and the meaning of potential actions. The proposed computational formulation is based on a dynamical systems approach, which is linked to an extension of the equilibrium-point hypothesis, called the Passive Motor Paradigm: this dynamical system generates goal-oriented, spatio-temporal, sensorimotor patterns, integrating a direct and an inverse internal model in a multi-referential framework. The purpose of such a computational model is to operate at the same time as general synergy-formation machinery for planning whole-body actions in humanoid robots and/or for predicting coordinated sensory-motor patterns in human movements. In order to illustrate the computational approach, the integration of simultaneous, even partially conflicting tasks is analyzed in some detail with regard to postural-focal dynamics, defined as the fusion of a focal task, namely reaching a target with the whole body, and a postural task, namely maintaining overall stability. PMID:25741274

  1. Stylized facts in microalgal growth: interpretation in a dynamic energy budget context.

    Science.gov (United States)

    Lorena, António; Marques, Gonçalo M; Kooijman, S A L M; Sousa, Tânia

    2010-11-12

    A dynamic energy budget (DEB) model for microalgae is proposed. This model deviates from the standard DEB model in that it needs more reserves to cope with the variation of assimilation pathways, requiring a different approach to growth based on the synthesizing unit (SU) theory for multiple substrates. It is shown that the model is able to accurately predict experimental data in constant and light-varying conditions with most of the parameter values taken directly from the literature. Also, model simulations are shown to be consistent with stylized facts (SFs) concerning the N:C ratio. These SFs are reinterpreted, and the general conclusion is that all forcing variables (dilution rate, temperature and irradiance) impose changes in the nitrogen or carbon limitation status of the population, and consequently in reserve densities. Model predictions are also evaluated against SFs on chlorophyll concentration. It is proposed that an extra structure, more dependent on the nitrogen reserve, is required to accurately model chlorophyll dynamics. Finally, SFs concerning the production of extracellular polymeric substances (EPSs) by benthic diatoms are collected and interpreted, and a formulation based on product synthesis and the rejection flux is proposed for the EPS production rate.

  2. Rural Medicare Advantage Market Dynamics and Quality: Historical Context and Current Implications.

    Science.gov (United States)

    Kemper, Leah; Barker, Abigail R; Wilber, Lyndsey; McBride, Timothy D; Mueller, Keith

    2016-07-01

    Purpose. In this policy brief, we assess variation in Medicare’s star quality ratings of Medicare Advantage (MA) plans that are available to rural beneficiaries. Evidence from the recent Centers for Medicare & Medicaid Services (CMS) quality demonstration suggests that market dynamics, i.e., firms entering and exiting the MA marketplace, play a role in quality improvement. Therefore, we also discuss how market dynamics may impact the smaller and less wealthy populations that are characteristic of rural places. Key Data Findings. (1) Highly rated MA plans serving rural Medicare beneficiaries are more likely to be health maintenance organizations (HMOs) and local preferred provider organizations (PPOs), as opposed to regional PPOs. HMOs and local PPOs may be better able to improve their quality scores strategically in response to the bonus payment incentive due to existing internal monitoring mechanisms. (2) On average, the rural enrollment rate is lower in plans with higher quality scores (59 percent) than the corresponding urban rate (71 percent). This differential is likely due, in part, to lack of availability of highly rated plans in rural areas: 17.8 percent of rural counties lacked access to a plan with four or more (out of five) stars, while just 3.7 percent of urban counties lacked such access. (3) MA plans with high quality scores have been operating longer, on average, and have a lower percentage of rural counties within their contract service areas than plans with lower quality scores.

  3. Researching One's Own Field. Interaction Dynamics and Methodological Challenges in the Context of Higher Education Research

    Directory of Open Access Journals (Sweden)

    Gerlinde Malli

    2015-01-01

    Full Text Available In contrast to quantitative approaches, where interaction effects are usually regarded as errors or disruptions, we understand interviews as social situations and the interaction dynamics between interviewee and interviewer as constitutive for data collection and interpretation. We conducted interviews with various actors from the academic field for a research project in higher education research. Based on our field experience we assume that interviews also offer opportunities for the respondents to present themselves in a discursive process. In this article we first show that many of our interviewees perceived us as evaluators. We argue that the interviewees' self-presentations and rhetorical strategies were shaped by the evaluative and competitive environment in which they took place, i.e. that of the entrepreneurial university. Furthermore, we sum up various types of interactive effects which can occur when researchers interview actors with a higher status in the academic field. These research-up effects, as well as the interviewees' perception of us as evaluators, influenced both how and what they told us as well as what they kept silent about. We therefore urge researchers to look out more carefully for interaction dynamics when interpreting data, as these might also point to tensions, conflicts or opposing perspectives. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1501111

  4. DYNAMIC MATHEMATICS SOFTWARE: A QUANTITATIVE ANALYSIS IN THE CONTEXT OF THE PREPARATION OF MATH TEACHER

    Directory of Open Access Journals (Sweden)

    Olena V. Semenikhina

    2015-09-01

    Full Text Available The results of a pedagogical experiment to clarify how many dynamic mathematics software (DMS) packages a modern math teacher should know are described. Quantitative results of a questionnaire survey of math teachers concerning the use of DMS in the learning process are given. The teachers prefer the software Gran and GeoGebra; the students prefer GeoGebra and MathKit. When asked how many DMS they should know and be able to use in the future, math teachers name 3-5 DMS, while students (future math teachers) name 5-7. Statistical processing of the results was based on the sign test, which justified the conclusion, at the 0.05 significance level, that the study of 5 DMS is needed.
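
    The sign-test step can be reproduced in a couple of lines; the counts below are invented for illustration, since the survey's raw responses are not given in the record.

```python
from scipy.stats import binomtest

# Sign test sketch: of n respondents (ties dropped), n_pos gave answers on
# one side of the hypothesized value. Counts are illustrative, not the study's.
n_pos, n = 18, 24
result = binomtest(n_pos, n, p=0.5, alternative="greater")
print(result.pvalue, result.pvalue < 0.05)   # significant at the 0.05 level?
```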

  5. [Family dynamics in the caring context of adults on the fourth age].

    Science.gov (United States)

    Polaro, Sandra Helena Isse; Gonçalves, Lucia Hisako Takase; Nassar, Silvia Modesto; Lopes, Márcia Maria Bragança; Ferreira, Viviane Ferraz; Monteiro, Hellen Karinna

    2013-01-01

    This study aimed to determine the pattern of family functioning in the everyday care relationships of adults in the fourth age. It is a diagnostic-evaluative study of care-dependent adults aged 80 years or older and of their relatives as caregivers. The participants were selected among the registered patients of a Family Health Unit in a district in the suburbs of Belém-PA, Brazil. They were evaluated with respect to their family dynamics, health-related quality of life and lifestyle. Most of the elderly rated their families as functioning well. However, the elderly's and caregivers' quality-of-life scores and the caregivers' lifestyle scores only reached the median level, showing some difficulty in the family functioning system. It was concluded that the multiple assessment results indicate practical implications of care for the family unit and confirm the need for a multidimensional assessment of family intervention.

  6. Growth dynamics in the context of pediatric sports injuries and overuse.

    Science.gov (United States)

    Zwick, Ernst B; Kocher, Robert

    2014-11-01

    The onset and timing of the growth of children and adolescents occur with considerable variability in cohorts of the same chronological age. The musculoskeletal system changes in proportion over time, and lever-arm changes, altered individual flexibility, and strength lead to age-specific injury patterns in youth sports. In sports, juniors are commonly grouped according to their chronological age. Early- and late-maturing children and adolescents might therefore not routinely be trained in relation to their biology. This not only represents a risk for overuse and injury but might also limit their development in sports. Obtaining information about the biological age of children is challenging. Numerous methods have been studied and validated; however, the implementation of these methods on a large scale is still to come. This report provides a brief overview of growth dynamics in relation to youth sports injuries and describes a few challenges for the future.

  7. Suppliers Dynamic Approach to Invest in R&D with Sunk Costs in Indian Contexts

    Directory of Open Access Journals (Sweden)

    Manoj KUMAR

    2015-05-01

    Full Text Available In this paper we test for the presence of sunk costs in suppliers' R&D activities by analyzing the persistence of these activities using supplier-level panel data. We develop and estimate a dynamic discrete choice model in which each supplier's current R&D expenditure is a function, among other factors, of its previous experience in performing R&D activities. The data used are panel data on Indian suppliers for the period 2003-2013. We find that prior R&D experience significantly affects the current decision to invest in R&D, and that, although important, the effect of prior R&D experience depreciates fairly quickly over time.

  8. Dynamical Modelling, Stochastic Simulation and Optimization in the Context of Damage Tolerant Design

    Directory of Open Access Journals (Sweden)

    Sergio Butkewitsch

    2006-01-01

    Full Text Available This paper addresses the situation in which some form of damage is induced by cyclic mechanical stresses yielded by the vibratory motion of a system whose dynamical behaviour is, in turn, affected by the evolution of the damage. It is assumed that both phenomena, vibration and damage propagation, can be modeled by means of time-dependent equations of motion whose coupled solution is sought. A brief discussion of the damage-tolerant design philosophy for aircraft structures is presented in the introduction, emphasizing the importance of the accurate definition of inspection intervals and, for this sake, the need for a representative damage propagation model accounting for the actual loading environment in which a structure may operate. For the purpose of illustration, the finite element model of a cantilever beam is formulated, such that the stiffness matrix can be updated as a crack of an assumed initial length spreads in a given location of the beam according to a proper propagation model. This way, it is possible to track how the mechanical vibration, through its varying-amplitude stress field, activates and develops the fatigue failure mechanism. Conversely, it is also possible to address how the fatigue-induced stiffness degradation influences the motion of the beam, closing the loop for the analysis of a coupled vibration-degradation dynamical phenomenon. With this working model, stochastic simulation of the beam behaviour is developed, aiming at the identification of the most influential parameters and at the characterization of the probability distributions of the relevant responses of interest. The knowledge of the parameters and responses allows for the formulation of optimization problems aiming at the improvement of the beam's robustness with respect to fatigue-induced stiffness degradation. The overall results are presented and analyzed, leading to the conclusions and outline of future

  9. Dynamics of animal movement in an ecological context: dragonfly wing damage reduces flight performance and predation success.

    Science.gov (United States)

    Combes, S A; Crall, J D; Mukherjee, S

    2010-06-23

    Much of our understanding of the control and dynamics of animal movement derives from controlled laboratory experiments. While many aspects of animal movement can be probed only in these settings, a more complete understanding of animal locomotion may be gained by linking experiments on relatively simple motions in the laboratory to studies of more complex behaviours in natural settings. To demonstrate the utility of this approach, we examined the effects of wing damage on dragonfly flight performance in both a laboratory drop-escape response and the more natural context of aerial predation. The laboratory experiment shows that hindwing area loss reduces vertical acceleration and average flight velocity, and the predation experiment demonstrates that this type of wing damage results in a significant decline in capture success. Taken together, these results suggest that wing damage may take a serious toll on wild dragonflies, potentially reducing both reproductive success and survival.

  10. Emotional insecurity about the community: A dynamic, within-person mediator of child adjustment in contexts of political violence.

    Science.gov (United States)

    Cummings, E Mark; Merrilees, Christine; Taylor, Laura K; Goeke-Morey, Marcie; Shirlow, Peter

    2017-02-01

    Over 1 billion children worldwide are exposed to political violence and armed conflict. The current conclusions are qualified by limited longitudinal research testing sophisticated process-oriented explanatory models for child adjustment outcomes. In this study, consistent with a developmental psychopathology perspective emphasizing the value of process-oriented longitudinal study of child adjustment in developmental and social-ecological contexts, we tested emotional insecurity about the community as a dynamic, within-person mediating process for relations between sectarian community violence and child adjustment. Specifically, this study explored children's emotional insecurity at a person-oriented level of analysis assessed over 5 consecutive years, with child gender examined as a moderator of indirect effects between sectarian community violence and child adjustment. In the context of a five-wave longitudinal research design, participants included 928 mother-child dyads in Belfast (453 boys, 475 girls) drawn from socially deprived, ethnically homogenous areas that had experienced political violence. Youth ranged in age from 10 to 20 years and were 13.24 (SD = 1.83) years old on average at the initial time point. Greater insecurity about the community measured over multiple time points mediated relations between sectarian community violence and youth's total adjustment problems. The pathway from sectarian community violence to emotional insecurity about the community was moderated by child gender, with relations to emotional insecurity about the community stronger for girls than for boys. The results suggest that ameliorating children's insecurity about community in contexts of political violence is an important goal toward improving adolescents' well-being and adjustment. These results are discussed in terms of their translational research implications, consistent with a developmental psychopathology model for the interface between basic and intervention

  11. Assessing the simple dynamical systems approach in a Mediterranean context: application to the Ardeche catchment (France)

    Science.gov (United States)

    Adamovic, M.; Braud, I.; Branger, F.; Kirchner, J. W.

    2015-05-01

    This study explores how catchment heterogeneity and variability can be summarized in simplified models, representing the dominant hydrological processes. It focuses on Mediterranean catchments, characterized by heterogeneous geology, pedology and land use, as well as steep topography and a rainfall regime in which summer droughts contrast with high-rainfall periods in autumn. The Ardeche catchment (Southeast France), typical of this environment, is chosen to explore the following questions: (1) can such a Mediterranean catchment be adequately characterized by a simple dynamical systems approach and what are the limits of the method under such conditions? (2) what information about dominant predictors of hydrological variability can be retrieved from this analysis in such catchments? In this work we apply the data-driven approach of Kirchner (2009) to estimate discharge sensitivity functions that summarize the behaviour of four sub-catchments of the Ardeche, using low-vegetation periods (November-March) from 9 years of measurements (2000-2008) from operational networks. The relevance of the inferred sensitivity function is assessed through hydrograph simulations, and through estimating precipitation rates from discharge fluctuations. We find that the discharge sensitivity function is downward-curving in double-logarithmic space, thus allowing further simulation of discharge and non-divergence of the model, only during low-vegetation periods. The analysis is complemented by a Monte Carlo sensitivity analysis showing how the parameters summarizing the discharge sensitivity function impact the simulated hydrographs. The resulting discharge simulation results are good for granite catchments, which are likely to be characterized by shallow subsurface flow at the interface between soil and bedrock. The simple dynamical system hypothesis works especially well in wet conditions (peaks and recessions are well modelled). On the other hand, poor model performance is associated
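
    The discharge sensitivity function at the heart of this approach can be estimated directly from recession periods, where Kirchner's governing equation dQ/dt = g(Q)(P - E - Q) reduces to g(Q) = -(dQ/dt)/Q when precipitation and evapotranspiration are negligible. The sketch below fits the customary quadratic in log-log space; the synthetic recession series, time step and units are invented, and the paper's selection of low-vegetation periods is not reproduced.

```python
import numpy as np

def sensitivity_fit(q, dt=1.0):
    # During recessions (P and E negligible) dQ/dt = -g(Q) * Q, so
    # g(Q) = -(dQ/dt)/Q; fit a quadratic for log g(Q) against log Q.
    dqdt = np.diff(q) / dt
    qm = 0.5 * (q[1:] + q[:-1])          # discharge at interval midpoints
    rec = dqdt < 0                        # keep recession time steps only
    logq = np.log(qm[rec])
    logg = np.log(-dqdt[rec] / qm[rec])
    return np.polyfit(logq, logg, deg=2)  # downward-curving if coeff[0] < 0

q = np.exp(-np.linspace(0.0, 5.0, 200)) + 0.05  # synthetic recession curve
print(sensitivity_fit(q))
```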

  12. Likelihood analysis of earthquake focal mechanism distributions

    CERN Document Server

    Kagan, Y Y

    2014-01-01

    In our paper published earlier we discussed forecasts of earthquake focal mechanisms and ways to test forecast efficiency. Several verification methods were proposed, but they were based on ad hoc, empirical assumptions, so their performance is questionable. In this work we apply a conventional likelihood method to measure forecast skill. The advantage of such an approach is that earthquake rate prediction can in principle be adequately combined with a focal mechanism forecast, if both are based on likelihood scores, resulting in a general forecast optimization. To calculate the likelihood score we need to compare actual forecasts or occurrences of predicted events with the null hypothesis that the mechanism's 3-D orientation is random. For double-couple source orientations the random probability distribution function is not uniform, which complicates the calculation of the likelihood value. To better understand the resulting complexities we calculate the information (likelihood) score for two rota...
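
    The likelihood score being compared against the random-orientation null can be sketched as an average per-event probability gain. The snippet below is a generic illustration of such a score, not the paper's exact formulation; all probabilities are invented.

```python
import numpy as np

def information_score(forecast_probs, reference_probs):
    # Mean log2 probability gain per event of the forecast over the
    # reference (random-orientation) model; positive values indicate skill.
    f = np.asarray(forecast_probs)
    r = np.asarray(reference_probs)
    return np.mean(np.log2(f / r))

# Three hypothetical events: forecast vs random-null probabilities
print(information_score([0.04, 0.10, 0.02], [0.01, 0.01, 0.01]))
```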

  13. Toward a dynamic biogeochemical division of the Mediterranean Sea in a context of global climate change

    Science.gov (United States)

    Reygondeau, Gabriel; Olivier Irisson, Jean; Guieu, Cecile; Gasparini, Stephane; Ayata, Sakina; Koubbi, Philippe

    2013-04-01

    In recent decades, it has been found useful to ecoregionalise the pelagic environment, assuming that within each partition environmental conditions are distinguishable and unique. Each proposed partition of the ocean aims to delineate the main oceanographic and ecological patterns, providing a geographical framework of marine ecosystems for ecological studies and management purposes. The aim of the present work is to integrate and process existing data on the pelagic environment of the Mediterranean Sea in order to define biogeochemical regions. Open-access databases including remote sensing observations, oceanographic campaign data and physical modeling simulations are used. These various datasets allow the multidisciplinary view required to understand the interactions between climate and Mediterranean marine ecosystems. The first step of our study consisted of a statistical selection of a set of crucial environmental factors, to propose the most parsimonious biogeographical approach that allows detecting the main oceanographic structures of the Mediterranean Sea. Second, based on the identified set of environmental parameters, both non-hierarchical and hierarchical clustering algorithms were tested. Outputs from each methodology were then inter-compared to propose a robust map of the biotopes (unique ranges of environmental parameters) of the area. Each biotope was then modeled using a non-parametric environmental niche method to infer a dynamic biogeochemical partition. Last, the seasonal, interannual and long-term spatial changes of each biogeochemical region were investigated. The future of this work will be to perform a second partition to subdivide the biogeochemical regions according to biotic features of the Mediterranean Sea (ecoregions). This second level of division will thus be used as a geographical framework to identify ecosystems that have been altered by human activities (i.e. pollution, fishery, invasive species) for the

  14. A high-content image-based method for quantitatively studying context-dependent cell population dynamics.

    Science.gov (United States)

    Garvey, Colleen M; Spiller, Erin; Lindsay, Danika; Chiang, Chun-Te; Choi, Nathan C; Agus, David B; Mallick, Parag; Foo, Jasmine; Mumenthaler, Shannon M

    2016-01-01

    Tumor progression results from a complex interplay between cellular heterogeneity, treatment response, microenvironment and heterocellular interactions. Existing approaches to characterize this interplay suffer from an inability to distinguish between multiple cell types, often lack environmental context, and are unable to perform multiplex phenotypic profiling of cell populations. Here we present a high-throughput platform for characterizing, with single-cell resolution, the dynamic phenotypic responses (i.e. morphology changes, proliferation, apoptosis) of heterogeneous cell populations both during standard growth and in response to multiple, co-occurring selective pressures. The speed of this platform enables a thorough investigation of the impacts of diverse selective pressures including genetic alterations, therapeutic interventions, heterocellular components and microenvironmental factors. The platform has been applied to both 2D and 3D culture systems and readily distinguishes between (1) cytotoxic versus cytostatic cellular responses; and (2) changes in morphological features over time and in response to perturbation. These important features can directly influence tumor evolution and clinical outcome. Our image-based approach provides a deeper insight into the cellular dynamics and heterogeneity of tumors (or other complex systems), with reduced reagents and time, offering advantages over traditional biological assays.

  15. A dinâmica familiar no contexto da crise suicida The family dynamics in the context of suicide crisis

    Directory of Open Access Journals (Sweden)

    Liara Lopes Krüger

    2010-04-01

    Full Text Available Families inserted in the suicide context organize their relations around oppressive histories constructed across generations, which hinder the development of autonomy and continuity. This article aims to think systemically about the family dynamics of the crisis generated by the suicide attempt of one of its members. In this study, six families participated in a brief intervention developed on the basis of systemic theory. The data were analyzed using the Constant Comparison Method, identifying categories and constructing hypotheses regarding the family dynamics in the context of the suicide crisis. The results show that the participants are limited in their capacity to support the development of an autonomous identity, because the family dynamics treats new opportunities of narrating oneself as a threat to the system of loyalties that maintains the continuity of the family, preventing the renegotiation of these codes. Suffering presents itself as an emotion that limits new exchanges, and suicidal behavior emerges as an alternative.

  16. Numerical integration methods and layout improvements in the context of dynamic RNA visualization.

    Science.gov (United States)

    Shabash, Boris; Wiese, Kay C

    2017-05-30

    RNA visualization software tools have traditionally presented a static visualization of RNA molecules, with limited ability for users to interact with the resulting image once it is complete. Only a few tools have allowed for dynamic structures. One such tool is jViz.RNA. Currently, jViz.RNA employs a unique method for the creation of the RNA molecule layout by mapping the RNA nucleotides into vertexes in a graph, which we call the detailed graph, and then utilizes a Newtonian-mechanics-inspired system of forces to calculate a layout for the RNA molecule. The work presented here focuses on improvements to jViz.RNA that allow the drawing of RNA secondary structures according to common drawing conventions, as well as dramatic run-time performance improvements. This is done first by presenting an alternative method for mapping the RNA molecule into a graph, which we call the compressed graph, and then employing advanced numerical integration methods for the compressed graph representation. Comparing the compressed graph and detailed graph implementations, we find that the compressed graph produces results more consistent with RNA drawing conventions. However, we also find that employing the compressed graph method requires a more sophisticated initial layout to produce visualizations that would require minimal user interference. Comparing the two numerical integration methods demonstrates the higher stability of the Backward Euler method, and its resulting ability to handle much larger time steps, a high-priority feature for any software that entails user interaction. The work in this manuscript presents the preferred use of compressed graphs over detailed ones, as well as the advantages of employing the Backward Euler method over the Forward Euler method. These improvements produce more stable as well as visually aesthetic representations of the RNA secondary structures. The results presented demonstrate that both the compressed graph representation, as well as the Backward
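
    The stability difference between the two integrators is easy to see on a stiff linear test problem, x' = -kx, which stands in here for a single spring of a force-directed layout. Forward Euler diverges once the step size exceeds 2/k, while backward Euler is stable for any step size; the constants below are illustrative, and this is not jViz.RNA's actual code.

```python
# Stiff test problem x' = -k*x, one "spring" of a force-directed layout.
k, h, steps = 50.0, 0.1, 20          # h > 2/k, beyond forward Euler's limit
x_fwd = x_bwd = 1.0
for _ in range(steps):
    x_fwd = x_fwd + h * (-k * x_fwd)  # explicit (forward Euler) update
    x_bwd = x_bwd / (1.0 + h * k)     # implicit (backward Euler), solved exactly
print(f"forward: {x_fwd:.3e}  backward: {x_bwd:.3e}")  # diverges vs decays
```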

  17. The great triangular seismic region in eastern Asia: Thoughts on its dynamic context

    Directory of Open Access Journals (Sweden)

    Xianglin Gao

    2011-01-01

    Full Text Available A huge triangle-shaped tectonic region in eastern Asia plays host to numerous major earthquakes. The three boundaries of this region, which contains plateaus, mountains, and intermountain basins, are roughly the Himalayan arc, the Tianshan-Baikal, and longitude line ∼105°E. Within this triangular region, tectonism is intense and major deformation occurs both between crustal blocks and within most of them. Outside of this region, rigid blocks move as a whole with relatively few major earthquakes and relatively weak Cenozoic deformation. On a large tectonic scale, the presence of this broad region of intraplate deformation results from dynamic interactions between the Indian, Philippine Sea-West Pacific, and Eurasian plates, as well as the influence of deep-level mantle flow. The Indian subcontinent, which continues to move northwards at ∼40 mm/a since its collision with Eurasia, has plunged beneath Tibet, resulting in various movements and deformations along the Himalayan arc that diffuse over a long distance into the hinterland of Asia. The northward crustal escape of Asia from the Himalayan collisional zone turns eastwards and southeastwards along 95°–100°E longitude and defines the eastern Himalayan syntaxis. At the western Himalayan syntaxis, the Pamirs continue to move into central Asia, leading to crustal deformation and earthquakes that are largely accommodated by old EW or NW trending faults in the bordering areas between China, Mongolia, and Russia, and are restricted by the stable landmass northwest of the Tianshan-Altai-Baikal region. The subduction of the Philippine and Pacific plates under the Eurasian continent has generated a very long and narrow seismic zone along trenches and island arcs in the marginal seas while imposing only slight horizontal compression on the Asian continent that does not impede the eastward motion of eastern Asia. In the third dimension, there may be southeastward deep mantle flow beneath most of

  18. The Armc10/SVH gene: genome context, regulation of mitochondrial dynamics and protection against Aβ-induced mitochondrial fragmentation

    Science.gov (United States)

    Serrat, R; Mirra, S; Figueiro-Silva, J; Navas-Pérez, E; Quevedo, M; López-Doménech, G; Podlesniy, P; Ulloa, F; Garcia-Fernàndez, J; Trullas, R; Soriano, E

    2014-01-01

    Mitochondrial function and dynamics are essential for neurotransmission, neural function and neuronal viability. Recently, we showed that the eutherian-specific Armcx gene cluster (Armcx1–6 genes), located in the X chromosome, encodes a new family of proteins that localise to mitochondria, regulating mitochondrial trafficking. The Armcx gene cluster evolved by retrotransposition of the Armc10 gene mRNA, which is present in all vertebrates and is considered to be the ancestor gene. Here we investigate the genomic organisation, mitochondrial functions and putative neuroprotective role of the Armc10 ancestor gene. The genomic context of the Armc10 locus shows considerable syntenic conservation among vertebrates, and sequence comparisons and ChIP data suggest the presence of at least three conserved enhancers. We also show that the Armc10 protein localises to mitochondria and that it is highly expressed in the brain. Furthermore, we show that Armc10 levels regulate mitochondrial trafficking in neurons, but not mitochondrial aggregation, by controlling the number of moving mitochondria. We further demonstrate that the Armc10 protein interacts with the KIF5/Miro1-2/Trak2 trafficking complex. Finally, we show that overexpression of Armc10 in neurons prevents Aβ-induced mitochondrial fission and neuronal death. Our data suggest both conserved and differential roles of the Armc10/Armcx gene family in regulating mitochondrial dynamics in neurons, and underscore a protective effect of the Armc10 gene against Aβ-induced toxicity. Overall, our findings support a further degree of regulation of mitochondrial dynamics in the brain of more evolved mammals. PMID:24722288

  19. Dynamics of diversity, distribution patterns and interspecific associations of understory herbs in the city-suburb-exurb context of Wuhan city, China

    Directory of Open Access Journals (Sweden)

    Wu Xiao-Jing

    2013-01-01

    Full Text Available The dynamics of herb diversity, distribution patterns and interspecific associations of dominant herbs in natural forests at three growing stages in a city-suburb-exurb context in Wuhan City were studied using a fixed plot. The results show that the composition, diversity indices, and mean and total richness gradually increased along the city-suburb-exurb gradient. Codominant species across temporal dynamics in Qinglong Mountain were stable; however, there were remarkable changes in the city-suburb context. Qinglong Mountain (exurb context) had the densest codominant herbs, Woodwardia japonica, Spider brake and Parathelypteris glanduligera, throughout the growing season. However, Shizi Mountain (suburb context) and Hongshan Mountain (urban context) had low-density and monodominant herbs at all three stages. The overall strength of associations among herbs increased with the city-suburb-exurb context, and pairs of positive associations and significant associations (r>0.5 or r<-0.5) were more frequent on Qinglong Mountain. Therefore, the exurb-suburb-city landscape context in response to urbanization had a notable effect on the features of the understory herb layer.

  20. Efficient maximum likelihood parameterization of continuous-time Markov processes

    CERN Document Server

    McGibbon, Robert T

    2015-01-01

    Continuous-time Markov processes over finite state-spaces are widely used to model dynamical processes in many fields of natural and social science. Here, we introduce a maximum likelihood estimator for constructing such models from data observed at a finite time interval. This estimator is drastically more efficient than prior approaches, enables the calculation of deterministic confidence intervals in all model parameters, and can easily enforce important physical constraints on the models such as detailed balance. We demonstrate and discuss the advantages of these models over existing discrete-time Markov models for the analysis of molecular dynamics simulations.
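    As a concrete, purely illustrative rendering of this idea, the sketch below fits a generator matrix Q to a state sequence observed every tau time units by maximizing the likelihood of the implied lag-tau transition matrix exp(tau*Q). It is a generic formulation with hypothetical names, not the cited paper's estimator, and it omits constraints such as detailed balance.

        # Sketch: maximum-likelihood fit of a continuous-time Markov generator
        # from states observed at a fixed lag tau (illustrative only).
        import numpy as np
        from scipy.linalg import expm
        from scipy.optimize import minimize

        def _generator(theta, n):
            Q = np.zeros((n, n))
            Q[~np.eye(n, dtype=bool)] = np.exp(theta)  # positive off-diagonal rates
            Q[np.diag_indices(n)] = -Q.sum(axis=1)     # generator rows sum to zero
            return Q

        def fit_ctmc(sequence, tau, n):
            counts = np.zeros((n, n))                  # lag-tau transition counts
            for a, b in zip(sequence[:-1], sequence[1:]):
                counts[a, b] += 1

            def neg_log_lik(theta):
                P = expm(tau * _generator(theta, n))   # transition probabilities
                return -np.sum(counts * np.log(np.clip(P, 1e-300, None)))

            res = minimize(neg_log_lik, np.zeros(n * (n - 1)))
            return _generator(res.x, n)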

  1. Likelihood inference for unions of interacting discs

    DEFF Research Database (Denmark)

    Møller, Jesper; Helisova, K.

    2010-01-01

    This is probably the first paper which discusses likelihood inference for a random set using a germ-grain model, where the individual grains are unobservable, edge effects occur and other complications appear. We consider the case where the grains form a disc process modelled by a marked point process, with the marks specified with respect to a given marked Poisson model (i.e. a Boolean model). We show how edge effects and other complications can be handled by considering a certain conditional likelihood. Our methodology is illustrated by analysing Peter Diggle's heather data set, where we discuss the results of simulation-based maximum likelihood inference and the effect of specifying different reference Poisson models.

  2. A Comparison of Pseudo-Maximum Likelihood and Asymptotically Distribution-Free Dynamic Factor Analysis Parameter Estimation in Fitting Covariance-Structure Models to Block-Toeplitz Matrices Representing Single-Subject Multivariate Time-Series

    NARCIS (Netherlands)

    Molenaar, P.C.M.; Nesselroade, J.R.

    1998-01-01

    The study of intraindividual variability pervades empirical inquiry in virtually all subdisciplines of psychology. The statistical analysis of multivariate time-series data - a central product of intraindividual investigations - requires special modeling techniques. The dynamic factor model (DFM), w

  3. Introductory statistical inference with the likelihood function

    CERN Document Server

    Rohde, Charles A

    2014-01-01

    This textbook covers the fundamentals of statistical inference and statistical theory, including Bayesian and frequentist approaches and methodology, as far as possible without excessive emphasis on the underlying mathematics. This book is about some of the basic principles of statistics that are necessary to understand and evaluate methods for analyzing complex data sets. The likelihood function is used for pure likelihood inference throughout the book. There is also coverage of severity and finite population sampling. The material was developed from an introductory statistical theory course taught by the author at the Johns Hopkins University’s Department of Biostatistics. Students and instructors in public health programs will benefit from the likelihood modeling approach that is used throughout the text. This will also appeal to epidemiologists and psychometricians.  After a brief introduction, there are chapters on estimation, hypothesis testing, and maximum likelihood modeling. The book concludes with secti...

  4. Maximum-likelihood method in quantum estimation

    CERN Document Server

    Paris, M G A; Sacchi, M F

    2001-01-01

    The maximum-likelihood method for quantum estimation is reviewed and applied to the reconstruction of density matrix of spin and radiation as well as to the determination of several parameters of interest in quantum optics.

  5. Maximum-Likelihood Detection Of Noncoherent CPM

    Science.gov (United States)

    Divsalar, Dariush; Simon, Marvin K.

    1993-01-01

    Simplified detectors proposed for use in maximum-likelihood-sequence detection of symbols in alphabet of size M transmitted by uncoded, full-response continuous phase modulation over radio channel with additive white Gaussian noise. Structures of receivers derived from particular interpretation of maximum-likelihood metrics. Receivers include front ends, structures of which depend only on M, analogous to those in receivers of coherent CPM. Parts of receivers following front ends have structures whose complexity depends on N.

  6. Context-dependent Dynamic Processes in Attention Deficit/Hyperactivity Disorder : Differentiating Common and Unique Effects of State Regulation Deficits and Delay Aversion

    NARCIS (Netherlands)

    Sonuga-Barke, Edmund J. S.; Wiersema, Jan R.; van der Meere, Jacob J.; Roeyers, Herbert

    2010-01-01

    The ability to specify differential predictions is a mark of a scientific model's value. State regulation deficits (SRD) and delay aversion (DAv) have both been hypothesized as context-dependent dynamic dysfunctions in ADHD. However, to date there has been no systematic comparison of their common and unique effects.

  7. Driving the Model to Its Limit: Profile Likelihood Based Model Reduction.

    Science.gov (United States)

    Maiwald, Tim; Hass, Helge; Steiert, Bernhard; Vanlier, Joep; Engesser, Raphael; Raue, Andreas; Kipkeew, Friederike; Bock, Hans H; Kaschek, Daniel; Kreutz, Clemens; Timmer, Jens

    2016-01-01

    In systems biology, one of the major tasks is to tailor model complexity to the information content of the data. A useful model should describe the data and produce well-determined parameter estimates and predictions. Too small a model will not be able to describe the data, whereas a model which is too large tends to overfit measurement errors and does not provide precise predictions. Typically, the model is modified and tuned to fit the data, which often results in an oversized model. To restore the balance between model complexity and available measurements, either new data has to be gathered or the model has to be reduced. In this manuscript, we present a data-based method for reducing non-linear models. The profile likelihood is utilised to assess parameter identifiability and designate likely candidates for reduction. Parameter dependencies are analysed along profiles, providing context-dependent suggestions for the type of reduction. We discriminate four distinct scenarios, each associated with a specific model reduction strategy. Iterating the presented procedure eventually results in an identifiable model, which is capable of generating precise and testable predictions. Source code for all toy examples is provided within the freely available, open-source modelling environment Data2Dynamics, based on MATLAB and available at http://www.data2dynamics.org/, as well as in the R packages dMod/cOde, available at https://github.com/dkaschek/. Moreover, the concept is generally applicable and can readily be used with any software capable of calculating the profile likelihood.
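    The central computation is simple to sketch: fix the parameter of interest at each value on a grid and re-optimize all remaining parameters, recording the attained objective. Flat stretches of the resulting profile signal practical non-identifiability and hence candidates for reduction. The sketch below (hypothetical names, Python rather than the MATLAB/R tools named above) shows only this generic step, not the Data2Dynamics workflow.

        # Sketch: profile likelihood of one parameter by re-optimizing the rest.
        import numpy as np
        from scipy.optimize import minimize

        def profile_likelihood(neg_log_lik, theta_hat, index, grid):
            theta_hat = np.asarray(theta_hat, dtype=float)
            free = [i for i in range(theta_hat.size) if i != index]
            profile = []
            for value in grid:
                def restricted(phi):
                    theta = theta_hat.copy()
                    theta[index] = value           # pin the profiled parameter
                    theta[free] = phi              # refit everything else
                    return neg_log_lik(theta)
                res = minimize(restricted, theta_hat[free])
                profile.append(res.fun)
            return np.array(profile)               # flat regions suggest reduction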

  8. Likelihood Principle and Maximum Likelihood Estimator of Location Parameter for Cauchy Distribution.

    Science.gov (United States)

    1986-05-01

    The consistency (or strong consistency) of the maximum likelihood estimator has been studied by many researchers, for example Wald (1949) and Wolfowitz (1953, 1965).

  9. Likelihood inference for unions of interacting discs

    DEFF Research Database (Denmark)

    Møller, Jesper; Helisová, Katarina

    To the best of our knowledge, this is the first paper which discusses likelihood inference for a random set using a germ-grain model, where the individual grains are unobservable, edge effects occur, and other complications appear. We consider the case where the grains form a disc process modelled by a marked point process, with the marks specified with respect to a given marked Poisson model (i.e. a Boolean model). We show how edge effects and other complications can be handled by considering a certain conditional likelihood. Our methodology is illustrated by analyzing Peter Diggle's heather dataset, where we discuss the results of simulation-based maximum likelihood inference and the effect of specifying different reference Poisson models.

  10. Likelihood alarm displays. [for human operator

    Science.gov (United States)

    Sorkin, Robert D.; Kantowitz, Barry H.; Kantowitz, Susan C.

    1988-01-01

    In a likelihood alarm display (LAD) information about event likelihood is computed by an automated monitoring system and encoded into an alerting signal for the human operator. Operator performance within a dual-task paradigm was evaluated with two LADs: a color-coded visual alarm and a linguistically coded synthetic speech alarm. The operator's primary task was one of tracking; the secondary task was to monitor a four-element numerical display and determine whether the data arose from a 'signal' or 'no-signal' condition. A simulated 'intelligent' monitoring system alerted the operator to the likelihood of a signal. The results indicated that (1) automated monitoring systems can improve performance on primary and secondary tasks; (2) LADs can improve the allocation of attention among tasks and provide information integrated into operator decisions; and (3) LADs do not necessarily add to the operator's attentional load.

  11. A quantum framework for likelihood ratios

    CERN Document Server

    Bond, Rachael L; Ormerod, Thomas C

    2015-01-01

    The ability to calculate precise likelihood ratios is fundamental to many STEM areas, such as decision-making theory, biomedical science, and engineering. However, there is no assumption-free statistical methodology to achieve this. For instance, in the absence of data relating to covariate overlap, the widely used Bayes' theorem either defaults to the marginal probability driven "naive Bayes' classifier", or requires the use of compensatory expectation-maximization techniques. Equally, the use of alternative statistical approaches, such as multivariate logistic regression, may be confounded by other axiomatic conditions, e.g., low levels of co-linearity. This article takes an information-theoretic approach in developing a new statistical formula for the calculation of likelihood ratios based on the principles of quantum entanglement. In doing so, it is argued that this quantum approach demonstrates: that the likelihood ratio is a real quality of statistical systems; that the naive Bayes' classifier is a spec...

  12. Water and sediment dynamics in the context of climate change and variability (Cañete river, Peru).

    Science.gov (United States)

    Rosas, Miluska; Vanacker, Veerle; Huggel, Christian; Gutierrez, Ronald R.

    2017-04-01

    Water erosion is one of the main environmental problems in Peru. The elevated rates of soil erosion are related to the rough topography of the Andes, shallow soils, a highly erosive climate and inappropriate land use management. Agricultural activities are directly affected by the elevated soil erosion rates, through reduced crop production and/or damage to irrigation infrastructure. Similarly, the development of water infrastructure and hydropower facilities can be negatively affected by high sedimentation rates. However, critical information about sediment production, transport and deposition is still mostly lacking. This paper focuses on sediment dynamics in the context of land use and climate change in the Peruvian Andes. Within the Peruvian Coastal Range, the catchment of the Cañete River is studied as it plays an important role in the social and economic development of the region, and due to its provision of water and energy to rural and urban areas. The lower part of the basin is an arid desert, the middle sub-humid part sustains subsistence agriculture, and the upper part of the basin is a treeless high-elevation puna landscape. Snow cover and glaciers are present at its headwaters located above 5000 m asl. The retreat of glaciers due to climate change is expected to have an impact on water availability, and the production and mobilization of sediment within the river channels. Likewise, climate variability and land cover changes might trigger an important increase of erosion and sediment transport rates. The methodology applied to address this issue is principally based on the analysis of sediment samples collected in the basin between 1998 and 2001, and the application of a water and sediment routing model. The paper presents new data on the sensitivity of water infrastructure and hydropower facilities to climate-induced changes in sediment mobilization.

  13. CORA: Emission Line Fitting with Maximum Likelihood

    Science.gov (United States)

    Ness, Jan-Uwe; Wichmann, Rainer

    2011-12-01

    CORA analyzes emission line spectra with low count numbers and fits them to a line using the maximum likelihood technique. CORA uses a rigorous application of Poisson statistics. From the assumption of Poissonian noise, the software derives the probability for a model of the emission line spectrum to represent the measured spectrum. The likelihood function is used as a criterion for optimizing the parameters of the theoretical spectrum and a fixed point equation is derived allowing an efficient way to obtain line fluxes. CORA has been applied to an X-ray spectrum with the Low Energy Transmission Grating Spectrometer (LETGS) on board the Chandra observatory.
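    As a rough illustration of the Poisson approach (not CORA's actual fixed-point scheme), one can maximize the Poisson log-likelihood of binned counts under a Gaussian line profile on a known flat background; all names below are invented for the example.

        # Sketch: Poisson maximum-likelihood estimate of a single line flux.
        import numpy as np
        from scipy.optimize import minimize_scalar

        def fit_line_flux(counts, wavelengths, center, sigma, background):
            profile = np.exp(-0.5 * ((wavelengths - center) / sigma) ** 2)
            profile /= profile.sum()               # unit-flux line profile

            def neg_log_lik(flux):
                model = background + flux * profile            # expected counts per bin
                return np.sum(model - counts * np.log(model))  # Poisson -log L (+const)

            res = minimize_scalar(neg_log_lik, bounds=(0.0, 10.0 * counts.sum()),
                                  method="bounded")
            return res.x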

  14. Maximum Likelihood Joint Tracking and Association in Strong Clutter

    Directory of Open Access Journals (Sweden)

    Leonid I. Perlovsky

    2013-01-01

    Full Text Available We have developed a maximum likelihood formulation for a joint detection, tracking and association problem. An efficient non-combinatorial algorithm for this problem is developed in case of strong clutter for radar data. By using an iterative procedure of the dynamic logic process “from vague-to-crisp” explained in the paper, the new tracker overcomes the combinatorial complexity of tracking in highly-cluttered scenarios and results in an orders-of-magnitude improvement in signal-to-clutter ratio.

  15. Likelihood-Based Inference in Nonlinear Error-Correction Models

    DEFF Research Database (Denmark)

    Kristensen, Dennis; Rahbæk, Anders

    We consider a class of vector nonlinear error correction models where the transfer function (or loadings) of the stationary relationships is nonlinear. This includes in particular the smooth transition models. A general representation theorem is given which establishes the dynamic properties of the process in terms of stochastic and deterministic trends as well as stationary components. Despite the fact that no deterministic terms are included, the process will have both stochastic trends and a linear trend in general. Gaussian likelihood-based estimators are considered for the long-run cointegration parameters and the short-run parameters. Asymptotic theory is provided for these, and it is discussed to what extent asymptotic normality and mixed normality can be found. A simulation study...

  17. Composite likelihood estimation of demographic parameters

    Directory of Open Access Journals (Sweden)

    Garrigan Daniel

    2009-11-01

    Full Text Available Abstract Background Most existing likelihood-based methods for fitting historical demographic models to DNA sequence polymorphism data do not scale feasibly up to the level of whole-genome data sets. Computational economies can be achieved by incorporating two forms of pseudo-likelihood: composite and approximate likelihood methods. Composite likelihood enables scaling up to large data sets because it takes the product of marginal likelihoods as an estimator of the likelihood of the complete data set. This approach is especially useful when a large number of genomic regions constitutes the data set. Additionally, approximate likelihood methods can reduce the dimensionality of the data by summarizing the information in the original data by either a sufficient statistic or a set of statistics. Both composite and approximate likelihood methods hold promise for analyzing large data sets or for use in situations where the underlying demographic model is complex and has many parameters. This paper considers a simple demographic model of allopatric divergence between two populations, in which one of the populations is hypothesized to have experienced a founder event, or population bottleneck. A large resequencing data set from human populations is summarized by the joint frequency spectrum, which is a matrix of the genomic frequency spectrum of derived base frequencies in two populations. A Bayesian Metropolis-coupled Markov chain Monte Carlo (MCMCMC) method for parameter estimation is developed that uses both composite and approximate likelihood methods and is applied to the three different pairwise combinations of the human population resequence data. The accuracy of the method is also tested on data sets sampled from a simulated population model with known parameters. Results The Bayesian MCMCMC method also estimates the ratio of effective population size for the X chromosome versus that of the autosomes. The method is shown to estimate, with reasonable
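    The computational economy of the composite approach is visible in a few lines: if genomic regions are treated as independent, the composite log-likelihood is just a sum of per-region log-likelihoods, which can then drive a Metropolis-style sampler. The sketch below is schematic; region_log_lik is a placeholder for whatever per-locus likelihood the demographic model supplies.

        # Sketch: composite log-likelihood plus one Metropolis update.
        import numpy as np

        def composite_log_lik(regions, theta, region_log_lik):
            # Product of marginal likelihoods -> sum of marginal log-likelihoods.
            return sum(region_log_lik(r, theta) for r in regions)

        def metropolis_step(regions, theta, region_log_lik, scale, rng):
            proposal = theta + rng.normal(scale=scale, size=theta.shape)
            log_ratio = (composite_log_lik(regions, proposal, region_log_lik)
                         - composite_log_lik(regions, theta, region_log_lik))
            return proposal if np.log(rng.uniform()) < log_ratio else theta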

  18. Maintaining symmetry of simulated likelihood functions

    DEFF Research Database (Denmark)

    Andersen, Laura Mørch

    This paper suggests solutions to two different types of simulation errors related to Quasi-Monte Carlo integration. Likelihood functions which depend on standard deviations of mixed parameters are symmetric in nature. This paper shows that antithetic draws preserve this symmetry and thereby...

  19. Likelihood analysis of the I(2) model

    DEFF Research Database (Denmark)

    Johansen, Søren

    1997-01-01

    The I(2) model is defined as a submodel of the general vector autoregressive model, by two reduced rank conditions. The model describes stochastic processes with stationary second difference. A parametrization is suggested which makes likelihood inference feasible. Consistency of the maximum like...

  20. Synthesizing Regression Results: A Factored Likelihood Method

    Science.gov (United States)

    Wu, Meng-Jia; Becker, Betsy Jane

    2013-01-01

    Regression methods are widely used by researchers in many fields, yet methods for synthesizing regression results are scarce. This study proposes using a factored likelihood method, originally developed to handle missing data, to appropriately synthesize regression models involving different predictors. This method uses the correlations reported…

  1. Maximum Likelihood Estimation of Search Costs

    NARCIS (Netherlands)

    J.L. Moraga-Gonzalez (José Luis); M.R. Wildenbeest (Matthijs)

    2006-01-01

    textabstractIn a recent paper Hong and Shum (forthcoming) present a structural methodology to estimate search cost distributions. We extend their approach to the case of oligopoly and present a maximum likelihood estimate of the search cost distribution. We apply our method to a data set of online p

  3. Likelihood based testing for no fractional cointegration

    DEFF Research Database (Denmark)

    Lasak, Katarzyna

    We consider two likelihood ratio tests, so-called maximum eigenvalue and trace tests, for the null of no cointegration when fractional cointegration is allowed under the alternative, which is a first step to generalize the so-called Johansen's procedure to the fractional cointegration case. The s...

  4. Maximum likelihood estimation of fractionally cointegrated systems

    DEFF Research Database (Denmark)

    Lasak, Katarzyna

    In this paper we consider a fractionally cointegrated error correction model and investigate asymptotic properties of the maximum likelihood (ML) estimators of the matrix of the cointegration relations, the degree of fractional cointegration, the matrix of the speed of adjustment...

  5. Maximum likelihood estimation for integrated diffusion processes

    DEFF Research Database (Denmark)

    Baltazar-Larios, Fernando; Sørensen, Michael

    EM-algorithm to obtain maximum likelihood estimates of the parameters in the diffusion model. As part of the algorithm, we use a recent simple method for approximate simulation of diffusion bridges. In simulation studies for the Ornstein-Uhlenbeck process and the CIR process the proposed method works...

  6. Likelihood approaches for proportional likelihood ratio model with right-censored data.

    Science.gov (United States)

    Zhu, Hong

    2014-06-30

    Regression methods for survival data with right censoring have been extensively studied under semiparametric transformation models such as the Cox regression model and the proportional odds model. However, their practical application could be limited because of possible violation of model assumption or lack of ready interpretation for the regression coefficients in some cases. As an alternative, in this paper, the proportional likelihood ratio model introduced by Luo and Tsai is extended to flexibly model the relationship between survival outcome and covariates. This model has a natural connection with many important semiparametric models such as generalized linear model and density ratio model and is closely related to biased sampling problems. Compared with the semiparametric transformation model, the proportional likelihood ratio model is appealing and practical in many ways because of its model flexibility and quite direct clinical interpretation. We present two likelihood approaches for the estimation and inference on the target regression parameters under independent and dependent censoring assumptions. Based on a conditional likelihood approach using uncensored failure times, a numerically simple estimation procedure is developed by maximizing a pairwise pseudo-likelihood. We also develop a full likelihood approach, and the most efficient maximum likelihood estimator is obtained by a profile likelihood. Simulation studies are conducted to assess the finite-sample properties of the proposed estimators and compare the efficiency of the two likelihood approaches. An application to survival data for bone marrow transplantation patients of acute leukemia is provided to illustrate the proposed method and other approaches for handling non-proportionality. The relative merits of these methods are discussed in concluding remarks.
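    For intuition, a pairwise pseudo-likelihood of the kind described has a logistic form: for each pair of uncensored subjects, one conditions on the two observed outcomes and asks how likely their observed assignment to the two covariate vectors is. The sketch below uses one standard construction for density-ratio-type models; it is illustrative only, and the paper's exact pseudo-likelihood and its treatment of censoring may differ.

        # Sketch: maximizing a logistic pairwise pseudo-likelihood (illustrative).
        import numpy as np
        from scipy.optimize import minimize

        def pairwise_pseudo_log_lik(beta, X, y):
            total, n = 0.0, len(y)
            for i in range(n):
                for j in range(i + 1, n):
                    s = ((X[i] - X[j]) @ beta) * (y[i] - y[j])
                    total -= np.log1p(np.exp(-s))   # log P(observed pairing)
            return total

        def fit_plr(X, y):
            res = minimize(lambda b: -pairwise_pseudo_log_lik(b, X, y),
                           np.zeros(X.shape[1]))
            return res.x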

  7. Estimation of stochastic frontier models with fixed-effects through Monte Carlo Maximum Likelihood

    NARCIS (Netherlands)

    Emvalomatis, G.; Stefanou, S.E.; Oude Lansink, A.G.J.M.

    2011-01-01

    Estimation of nonlinear fixed-effects models is plagued by the incidental parameters problem. This paper proposes a procedure for choosing appropriate densities for integrating the incidental parameters from the likelihood function in a general context. The densities are based on priors that are

  8. Error-likelihood prediction in the medial frontal cortex: A critical evaluation

    NARCIS (Netherlands)

    Nieuwenhuis, S.; Scheizer, T.S.; Mars, R.B.; Botvinick, M.M.; Hajcal, G.

    2007-01-01

    A recent study has proposed that posterior regions of the medial frontal cortex (pMFC) learn to predict the likelihood of errors occurring in a given task context. A key prediction of the error-likelihood (EL) hypothesis is that the pMFC should exhibit enhanced activity to cues that are predictive of high error likelihood.

  10. Model Selection Through Sparse Maximum Likelihood Estimation

    CERN Document Server

    Banerjee, Onureena; D'Aspremont, Alexandre

    2007-01-01

    We consider the problem of estimating the parameters of a Gaussian or binary distribution in such a way that the resulting undirected graphical model is sparse. Our approach is to solve a maximum likelihood problem with an added l_1-norm penalty term. The problem as formulated is convex but the memory requirements and complexity of existing interior point methods are prohibitive for problems with more than tens of nodes. We present two new algorithms for solving problems with at least a thousand nodes in the Gaussian case. Our first algorithm uses block coordinate descent, and can be interpreted as recursive l_1-norm penalized regression. Our second algorithm, based on Nesterov's first order method, yields a complexity estimate with a better dependence on problem size than existing interior point methods. Using a log determinant relaxation of the log partition function (Wainwright & Jordan (2006)), we show that these same algorithms can be used to solve an approximate sparse maximum likelihood problem for...
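    For readers who want to experiment with the same l1-penalized Gaussian maximum-likelihood objective, scikit-learn ships a graphical-lasso solver; the snippet below runs it on synthetic data and is not the authors' block-coordinate or Nesterov implementation.

        # Sketch: sparse inverse-covariance estimation via graphical lasso.
        import numpy as np
        from sklearn.covariance import GraphicalLasso

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 10))            # 200 samples, 10 variables

        model = GraphicalLasso(alpha=0.1).fit(X)  # alpha = l1 penalty weight
        precision = model.precision_              # sparse inverse covariance
        print((np.abs(precision) > 1e-8).sum(), "nonzero entries (incl. diagonal)")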

  11. Composite likelihood method for inferring local pedigrees

    Science.gov (United States)

    Nielsen, Rasmus

    2017-01-01

    Pedigrees contain information about the genealogical relationships among individuals and are of fundamental importance in many areas of genetic studies. However, pedigrees are often unknown and must be inferred from genetic data. Despite the importance of pedigree inference, existing methods are limited to inferring only close relationships or analyzing a small number of individuals or loci. We present a simulated annealing method for estimating pedigrees in large samples of otherwise seemingly unrelated individuals using genome-wide SNP data. The method supports complex pedigree structures such as polygamous families, multi-generational families, and pedigrees in which many of the member individuals are missing. Computational speed is greatly enhanced by the use of a composite likelihood function which approximates the full likelihood. We validate our method on simulated data and show that it can infer distant relatives more accurately than existing methods. Furthermore, we illustrate the utility of the method on a sample of Greenlandic Inuit. PMID:28827797

  12. Superfast maximum-likelihood reconstruction for quantum tomography

    Science.gov (United States)

    Shang, Jiangwei; Zhang, Zhengyun; Ng, Hui Khoon

    2017-06-01

    Conventional methods for computing maximum-likelihood estimators (MLE) often converge slowly in practical situations, leading to a search for simplifying methods that rely on additional assumptions for their validity. In this work, we provide a fast and reliable algorithm for maximum-likelihood reconstruction that avoids this slow convergence. Our method utilizes the state-of-the-art convex optimization scheme, an accelerated projected-gradient method, that allows one to accommodate the quantum nature of the problem in a different way than in the standard methods. We demonstrate the power of our approach by comparing its performance with other algorithms for n -qubit state tomography. In particular, an eight-qubit situation that purportedly took weeks of computation time in 2005 can now be completed in under a minute for a single set of data, with far higher accuracy than previously possible. This refutes the common claim that MLE reconstruction is slow and reduces the need for alternative methods that often come with difficult-to-verify assumptions. In fact, recent methods assuming Gaussian statistics or relying on compressed sensing ideas are demonstrably inapplicable for the situation under consideration here. Our algorithm can be applied to general optimization problems over the quantum state space; the philosophy of projected gradients can further be utilized for optimization contexts with general constraints.
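    The flavour of a projected-gradient scheme for this problem can be conveyed in a short, unaccelerated sketch: ascend the log-likelihood sum_k f_k log tr(E_k rho) and, after each step, project back onto the set of density matrices by projecting the eigenvalues onto the probability simplex. This is a plain illustration with invented names, not the accelerated algorithm of the paper.

        # Sketch: projected gradient ascent for maximum-likelihood tomography.
        import numpy as np

        def project_to_density_matrix(H):
            # Closest density matrix: project eigenvalues onto the simplex.
            w, V = np.linalg.eigh(H)
            u = np.sort(w)[::-1]
            css = np.cumsum(u) - 1.0
            k = np.arange(1, len(u) + 1)
            idx = np.nonzero(u - css / k > 0)[0].max()
            lam = np.maximum(w - css[idx] / (idx + 1.0), 0.0)
            return (V * lam) @ V.conj().T

        def mle_tomography(povm, freqs, dim, steps=500, lr=0.05):
            rho = np.eye(dim, dtype=complex) / dim
            for _ in range(steps):
                probs = np.array([np.trace(E @ rho).real for E in povm])
                probs = np.clip(probs, 1e-12, None)
                grad = sum(f / p * E for f, p, E in zip(freqs, probs, povm))
                rho = project_to_density_matrix(rho + lr * grad)
            return rho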

  13. Factors Associated with Young Adults’ Pregnancy Likelihood

    Science.gov (United States)

    Kitsantas, Panagiota; Lindley, Lisa L.; Wu, Huichuan

    2014-01-01

    OBJECTIVES: While progress has been made to reduce adolescent pregnancies in the United States, rates of unplanned pregnancy among young adults (18–29 years) remain high. In this study, we assessed factors associated with perceived likelihood of pregnancy (likelihood of getting pregnant/getting partner pregnant in the next year) among sexually experienced young adults who were not trying to get pregnant and had ever used contraceptives. METHODS: We conducted a secondary analysis of 660 young adults, 18–29 years old in the United States, from the cross-sectional National Survey of Reproductive and Contraceptive Knowledge. Logistic regression and classification tree analyses were conducted to generate profiles of young adults most likely to report anticipating a pregnancy in the next year. RESULTS: Nearly one-third (32%) of young adults indicated they believed they had at least some likelihood of becoming pregnant in the next year. Young adults who believed that avoiding pregnancy was not very important were most likely to report pregnancy likelihood (odds ratio [OR], 5.21; 95% CI, 2.80–9.69), as were young adults for whom avoiding a pregnancy was important but who were not satisfied with their current contraceptive method (OR, 3.93; 95% CI, 1.67–9.24), attended religious services frequently (OR, 3.0; 95% CI, 1.52–5.94), were uninsured (OR, 2.63; 95% CI, 1.31–5.26), and were likely to have unprotected sex in the next three months (OR, 1.77; 95% CI, 1.04–3.01). DISCUSSION: These results may help guide future research and the development of pregnancy prevention interventions targeting sexually experienced young adults. PMID:25782849

  14. Lessons about likelihood functions from nuclear physics

    CERN Document Server

    Hanson, Kenneth M

    2007-01-01

    Least-squares data analysis is based on the assumption that the normal (Gaussian) distribution appropriately characterizes the likelihood, that is, the conditional probability of each measurement d, given a measured quantity y, p(d | y). On the other hand, there is ample evidence in nuclear physics of significant disagreements among measurements, which are inconsistent with the normal distribution, given their stated uncertainties. In this study the histories of 99 measurements of the lifetimes of five elementary particles are examined to determine what can be inferred about the distribution of their values relative to their stated uncertainties. Taken as a whole, the variations in the data are somewhat larger than their quoted uncertainties would indicate. These data strongly support using a Student t distribution for the likelihood function instead of a normal. The most probable value for the order of the t distribution is 2.6 +/- 0.9. It is shown that analyses based on long-tailed t-distribution likelihood...
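    A toy version of the recommendation, under stated assumptions: score each measurement with a Student t density scaled by its quoted uncertainty and maximize over the common value. The constant -log(sigma) terms are omitted since they do not affect the maximizer; the numbers are invented for illustration.

        # Sketch: robust combination of discrepant measurements with a t likelihood.
        import numpy as np
        from scipy import stats
        from scipy.optimize import minimize_scalar

        def t_estimate(values, sigmas, nu=2.6):
            def neg_log_lik(mu):
                return -np.sum(stats.t.logpdf((values - mu) / sigmas, df=nu))
            res = minimize_scalar(neg_log_lik, bounds=(values.min(), values.max()),
                                  method="bounded")
            return res.x

        values = np.array([10.2, 10.4, 11.9])   # toy data with one outlier
        sigmas = np.array([0.2, 0.3, 0.2])
        print(t_estimate(values, sigmas))       # pulled less by the outlier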

  15. Maximum likelihood continuity mapping for fraud detection

    Energy Technology Data Exchange (ETDEWEB)

    Hogden, J.

    1997-05-01

    The author describes a novel time-series analysis technique called maximum likelihood continuity mapping (MALCOM), and focuses on one application of MALCOM: detecting fraud in medical insurance claims. Given a training data set composed of typical sequences, MALCOM creates a stochastic model of sequence generation, called a continuity map (CM). A CM maximizes the probability of sequences in the training set given the model constraints; CMs can be used to estimate the likelihood of sequences not found in the training set, enabling anomaly detection and sequence prediction, which are important aspects of data mining. Since MALCOM can be used on sequences of categorical data (e.g., sequences of words) as well as real-valued data, MALCOM is also a potential replacement for database search tools such as N-gram analysis. In a recent experiment, MALCOM was used to evaluate the likelihood of patient medical histories, where "medical history" is used to mean the sequence of medical procedures performed on a patient. Physicians whose patients had anomalous medical histories (according to MALCOM) were evaluated for fraud by an independent agency. Of the small sample (12 physicians) that has been evaluated, 92% have been determined fraudulent or abusive. Despite the small sample, these results are encouraging.

  16. Likelihood methods and classical burster repetition

    CERN Document Server

    Graziani, C; Graziani, Carlo; Lamb, Donald Q

    1995-01-01

    We develop a likelihood methodology which can be used to search for evidence of burst repetition in the BATSE catalog, and to study the properties of the repetition signal. We use a simplified model of burst repetition in which a number N_{\\rm r} of sources which repeat a fixed number of times N_{\\rm rep} are superposed upon a number N_{\\rm nr} of non-repeating sources. The instrument exposure is explicitly taken into account. By computing the likelihood for the data, we construct a probability distribution in parameter space that may be used to infer the probability that a repetition signal is present, and to estimate the values of the repetition parameters. The likelihood function contains contributions from all the bursts, irrespective of the size of their positional errors --- the more uncertain a burst's position is, the less constraining is its contribution. Thus this approach makes maximal use of the data, and avoids the ambiguities of sample selection associated with data cuts on error circle size. We...

  17. Population dynamics throughout the urban context: A case study in sub-Saharan Africa utilizing remotely sensed imagery and GIS

    Science.gov (United States)

    Benza, Magdalena

    The characteristics of places where people live and work play an important role in explaining complex social, political, economic and demographic processes. In sub-Saharan Africa, rapid urban growth combined with rising poverty is creating diverse urban environments inhabited by people with a wide variety of lifestyles. This research examines how spatial patterns of land cover in a southern portion of the West African country of Ghana are associated with particular characteristics of family organization and reproduction decisions. Satellite imagery and landscape metrics are used to create an urban context definition based on landscape patterns using a gradient approach. Census data are used to estimate fertility levels and household structure, and the association between urban context, household composition and fertility levels is modeled through OLS regression, spatial autoregressive models and geographically weighted regression. Results indicate that there are significant differences in fertility levels between different urban contexts, with below average fertility levels found in the most urbanized end of the urban context definition and above average fertility levels found on the opposite end. The spatial patterns identified in the association between urban context and fertility levels indicate that, within the city, areas with lower fertility have significant impacts on the reproductive levels of adjacent neighborhoods. Findings also indicate that there are clear patterns that link urban context to living arrangements and fertility levels. Female- and single-headed households are associated with below average fertility levels, a result that connects dropping fertility levels with the spread of smaller nuclear households in developing countries. At the same time, larger extended family households are linked to below average fertility levels for highly clustered areas, a finding that points to the prevalence of extended family housing in the West African city.

  18. Application of maximum-likelihood estimation in optical coherence tomography for nanometer-class thickness estimation

    Science.gov (United States)

    Huang, Jinxin; Yuan, Qun; Tankam, Patrice; Clarkson, Eric; Kupinski, Matthew; Hindman, Holly B.; Aquavella, James V.; Rolland, Jannick P.

    2015-03-01

    In biophotonics imaging, one important and quantitative task is layer-thickness estimation. In this study, we investigate the approach of combining optical coherence tomography and a maximum-likelihood (ML) estimator for layer thickness estimation in the context of tear film imaging. The motivation of this study is to extend our understanding of tear film dynamics, which is the prerequisite to advance the management of Dry Eye Disease, through the simultaneous estimation of the thickness of the tear film lipid and aqueous layers. The estimator takes into account the different statistical processes associated with the imaging chain. We theoretically investigated the impact of key system parameters, such as the axial point spread functions (PSF) and various sources of noise on measurement uncertainty. Simulations show that an OCT system with a 1 μm axial PSF (FWHM) allows unbiased estimates down to nanometers with nanometer precision. In implementation, we built a customized Fourier domain OCT system that operates in the 600 to 1000 nm spectral window and achieves 0.93 micron axial PSF in corneal epithelium. We then validated the theoretical framework with physical phantoms made of custom optical coatings, with layer thicknesses from tens of nanometers to microns. Results demonstrate unbiased nanometer-class thickness estimates in three different physical phantoms.

  19. Maximum likelihood estimation for Cox's regression model under nested case-control sampling

    DEFF Research Database (Denmark)

    Scheike, Thomas; Juul, Anders

    2004-01-01

    Nested case-control sampling is designed to reduce the costs of large cohort studies. It is important to estimate the parameters of interest as efficiently as possible. We present a new maximum likelihood estimator (MLE) for nested case-control sampling in the context of Cox's proportional hazards model. The MLE is computed by the EM-algorithm, which is easy to implement in the proportional hazards setting. Standard errors are estimated by a numerical profile likelihood approach based on EM-aided differentiation. The work was motivated by a nested case-control study that hypothesized that insulin...

  20. Grain-scale numerical modeling of granular mechanics and fluid dynamics and application in a glacial context

    DEFF Research Database (Denmark)

    Damsgaard, Anders; Egholm, David Lundbek; Beem, Lucas H.

    ... rheology, which limit our ability to predict ice sheet dynamics in a changing climate. In this talk I will present the soft-body Discrete Element Method, which is a Lagrangian method I use in order to simulate the unique and diverse nature of granular dynamics in the subglacial environment. However, the method imposes intense computational requirements on the computational time step. The majority of steps in the granular dynamics algorithm are massively parallel, which makes the DEM an obvious candidate for exploiting the capabilities of modern GPUs. The granular computations are coupled to a fluid...

  1. Dimension-Independent Likelihood-Informed MCMC

    KAUST Repository

    Cui, Tiangang

    2015-01-07

    Many Bayesian inference problems require exploring the posterior distribution of high-dimensional parameters, which in principle can be described as functions. By exploiting low-dimensional structure in the change from prior to posterior [distributions], we introduce a suite of MCMC samplers that can adapt to the complex structure of the posterior distribution, yet are well-defined on function space. Posterior sampling in nonlinear inverse problems arising from various partial differential equations and also a stochastic differential equation are used to demonstrate the efficiency of these dimension-independent likelihood-informed samplers.
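    For orientation, the simplest MCMC move that stays well defined as the discretization dimension grows is preconditioned Crank-Nicolson (pCN): because the proposal preserves the Gaussian prior, the prior cancels and only the likelihood enters the acceptance ratio. The likelihood-informed samplers described above can be seen as enriching this kind of move with posterior-adapted directions; the sketch below is the generic pCN step only.

        # Sketch: one preconditioned Crank-Nicolson step (prior N(0, I)).
        import numpy as np

        def pcn_step(u, log_lik, beta, rng):
            v = np.sqrt(1.0 - beta**2) * u + beta * rng.normal(size=u.shape)
            if np.log(rng.uniform()) < log_lik(v) - log_lik(u):  # prior cancels
                return v
            return u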

  2. Multi-Channel Maximum Likelihood Pitch Estimation

    DEFF Research Database (Denmark)

    Christensen, Mads Græsbøll

    2012-01-01

    In this paper, a method for multi-channel pitch estimation is proposed. The method is a maximum likelihood estimator and is based on a parametric model where the signals in the various channels share the same fundamental frequency but can have different amplitudes, phases, and noise characteristics. This essentially means that the model allows for different conditions in the various channels, like different signal-to-noise ratios, microphone characteristics and reverberation. Moreover, the method does not assume that a certain array structure is used but rather relies on a more general model and is hence...

  3. CMB Power Spectrum Likelihood with ILC

    CERN Document Server

    Dick, Jason; Delabrouille, Jacques

    2012-01-01

    We extend the ILC method in harmonic space to include the error in its CMB estimate. This allows parameter estimation routines to take into account the effect of the foregrounds as well as the errors in their subtraction in conjunction with the ILC method. Our method requires the use of a model of the foregrounds which we do not develop here. The reduction of the foreground level makes this method less sensitive to unaccounted for errors in the foreground model. Simulations are used to validate the calculations and approximations used in generating this likelihood function.

  4. An improved likelihood model for eye tracking

    DEFF Research Database (Denmark)

    Hammoud, Riad I.; Hansen, Dan Witzner

    2007-01-01

    A typical approach in such cases is to abandon the tracking routine and re-initialize eye detection, which may itself be difficult due to the missing-data problem. Accordingly, what is needed is an efficient method of reliably tracking a person's eyes between successively produced video image frames, even when conditions are challenging. The paper proposes a log likelihood-ratio function of foreground and background models in a particle filter-based eye tracking framework. It fuses key information from even and odd infrared fields (dark and bright pupil) and their corresponding subtractive image into one single observation model...

  5. LIKEDM: Likelihood calculator of dark matter detection

    Science.gov (United States)

    Huang, Xiaoyuan; Tsai, Yue-Lin Sming; Yuan, Qiang

    2017-04-01

    With the large progress in searches for dark matter (DM) particles with indirect and direct methods, we develop a numerical tool that enables fast calculations of the likelihoods of specified DM particle models given a number of observational data, such as charged cosmic rays from space-borne experiments (e.g., PAMELA, AMS-02), γ-rays from the Fermi space telescope, and underground direct detection experiments. The purpose of this tool - LIKEDM, likelihood calculator for dark matter detection - is to bridge the gap between a particle model of DM and the observational data. The intermediate steps between these two, including the astrophysical backgrounds, the propagation of charged particles, the analysis of Fermi γ-ray data, as well as the DM velocity distribution and the nuclear form factor, have been dealt with in the code. We release the first version (v1.0) focusing on the constraints from indirect detection of DM with charged cosmic and gamma rays. Direct detection will be implemented in the next version. This manual describes the framework, usage, and related physics of the code.

  6. Multiplicative earthquake likelihood models incorporating strain rates

    Science.gov (United States)

    Rhoades, D. A.; Christophersen, A.; Gerstenberger, M. C.

    2017-01-01

    We examine the potential for strain-rate variables to improve long-term earthquake likelihood models. We derive a set of multiplicative hybrid earthquake likelihood models in which cell rates in a spatially uniform baseline model are scaled using combinations of covariates derived from earthquake catalogue data, fault data, and strain rates for the New Zealand region. Three components of the strain rate estimated from GPS data over the period 1991-2011 are considered: the shear, rotational and dilatational strain rates. The hybrid model parameters are optimised for earthquakes of M 5 and greater over the period 1987-2006 and tested on earthquakes from the period 2012-2015, which is independent of the strain rate estimates. The shear strain rate is overall the most informative individual covariate, as indicated by Molchan error diagrams as well as multiplicative modelling. Most models including strain rates are significantly more informative than the best models excluding strain rates in both the fitting and testing period. A hybrid that combines the shear and dilatational strain rates with a smoothed seismicity covariate is the most informative model in the fitting period, and a simpler model without the dilatational strain rate is the most informative in the testing period. These results have implications for probabilistic seismic hazard analysis and can be used to improve the background model component of medium-term and short-term earthquake forecasting models.
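    One plausible reading of the multiplicative construction (a log-linear scaling of baseline cell rates, fitted by Poisson maximum likelihood) can be sketched as follows; the paper's exact parameterization of the covariate factors may differ.

        # Sketch: fitting multiplicative covariate scalings of a baseline rate.
        import numpy as np
        from scipy.optimize import minimize

        def fit_multiplicative_hybrid(baseline, covariates, counts):
            # baseline: (n_cells,) rates; covariates: (n_cells, n_cov) values
            # such as smoothed strain rates; counts: earthquakes per cell.
            def neg_log_lik(beta):
                rates = baseline * np.exp(covariates @ beta)
                return np.sum(rates - counts * np.log(rates))  # Poisson -log L
            res = minimize(neg_log_lik, np.zeros(covariates.shape[1]))
            return res.x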

  7. CORA - emission line fitting with Maximum Likelihood

    Science.gov (United States)

    Ness, J.-U.; Wichmann, R.

    2002-07-01

    The advent of pipeline-processed data both from space- and ground-based observatories often obviates the need for full-fledged data reduction software with its associated steep learning curve. In many cases, a simple tool doing just one task, and doing it right, is all one wishes. In this spirit we introduce CORA, a line fitting tool based on the maximum likelihood technique, which has been developed for the analysis of emission line spectra with low count numbers and has successfully been used in several publications. CORA uses a rigorous application of Poisson statistics. From the assumption of Poissonian noise we derive the probability for a model of the emission line spectrum to represent the measured spectrum. The likelihood function is used as a criterion for optimizing the parameters of the theoretical spectrum and a fixed point equation is derived allowing an efficient way to obtain line fluxes. As an example we demonstrate the functionality of the program with an X-ray spectrum of Capella obtained with the Low Energy Transmission Grating Spectrometer (LETGS) on board the Chandra observatory and choose the analysis of the Ne IX triplet around 13.5 Å.

  8. Maximum Likelihood Analysis in the PEN Experiment

    Science.gov (United States)

    Lehman, Martin

    2013-10-01

    The experimental determination of the π+ → e+ ν (γ) decay branching ratio currently provides the most accurate test of lepton universality. The PEN experiment at PSI, Switzerland, aims to improve the present world average experimental precision of 3.3×10^-3 to 5×10^-4 using a stopped beam approach. During runs in 2008-10, PEN has acquired over 2×10^7 πe2 events. The experiment includes active beam detectors (degrader, mini TPC, target), central MWPC tracking with plastic scintillator hodoscopes, and a spherical pure CsI electromagnetic shower calorimeter. The final branching ratio will be calculated using a maximum likelihood analysis. This analysis assigns each event a probability for 5 processes (π+ → e+ ν, π+ → μ+ ν, decay-in-flight, pile-up, and hadronic events) using Monte Carlo verified probability distribution functions of our observables (energies, times, etc). A progress report on the PEN maximum likelihood analysis will be presented. Work supported by NSF grant PHY-0970013.

  9. Transfer Entropy as a Log-likelihood Ratio

    CERN Document Server

    Barnett, Lionel

    2012-01-01

    Transfer entropy, an information-theoretic measure of time-directed information transfer between joint processes, has steadily gained popularity in the neurosciences, econometrics and the analysis of complex system dynamics in diverse fields. We show that for a class of parametrised partial Markov models for jointly stochastic processes in discrete time, the log-likelihood ratio test statistic for the null hypothesis of zero transfer entropy is a consistent estimator for the transfer entropy itself. The result generalises the equivalence in the Gaussian case of transfer entropy and Granger causality, a statistical notion of causal influence based on prediction via vector autoregression. In the general case, an asymptotic $\\chi^2$ distribution for the model transfer entropy estimator is established.

  10. Transfer Entropy as a Log-Likelihood Ratio

    Science.gov (United States)

    Barnett, Lionel; Bossomaier, Terry

    2012-09-01

    Transfer entropy, an information-theoretic measure of time-directed information transfer between joint processes, has steadily gained popularity in the analysis of complex stochastic dynamics in diverse fields, including the neurosciences, ecology, climatology, and econometrics. We show that for a broad class of predictive models, the log-likelihood ratio test statistic for the null hypothesis of zero transfer entropy is a consistent estimator for the transfer entropy itself. For finite Markov chains, furthermore, no explicit model is required. In the general case, an asymptotic χ2 distribution is established for the transfer entropy estimator. The result generalizes the equivalence in the Gaussian case of transfer entropy and Granger causality, a statistical notion of causal influence based on prediction via vector autoregression, and establishes a fundamental connection between directed information transfer and causality in the Wiener-Granger sense.
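    In the Gaussian/vector-autoregressive case the equivalence is easy to demonstrate numerically: the transfer entropy from X to Y equals half the log ratio of residual variances of two nested autoregressions, i.e. a log-likelihood ratio. The sketch below uses lag-1 models and synthetic data purely for illustration.

        # Sketch: Gaussian transfer entropy X -> Y as a log-likelihood ratio.
        import numpy as np

        def transfer_entropy_xy(x, y):
            Y, Ylag, Xlag = y[1:], y[:-1], x[:-1]
            A = np.column_stack([np.ones_like(Ylag), Ylag])        # Y past only
            r1 = Y - A @ np.linalg.lstsq(A, Y, rcond=None)[0]
            B = np.column_stack([np.ones_like(Ylag), Ylag, Xlag])  # add X past
            r2 = Y - B @ np.linalg.lstsq(B, Y, rcond=None)[0]
            return 0.5 * np.log(np.var(r1) / np.var(r2))           # in nats

        rng = np.random.default_rng(1)
        x = rng.normal(size=1000)
        y = np.zeros(1000)
        for t in range(1, 1000):          # y is driven by the past of x
            y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + rng.normal()
        print(transfer_entropy_xy(x, y))  # clearly positive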

  11. Likelihood-Based Inference in Nonlinear Error-Correction Models

    DEFF Research Database (Denmark)

    Kristensen, Dennis; Rahbæk, Anders

    We consider a class of vector nonlinear error correction models where the transfer function (or loadings) of the stationary relationships is nonlinear. This includes in particular the smooth transition models. A general representation theorem is given which establishes the dynamic properties of the process in terms of stochastic and deterministic trends as well as stationary components. In particular, the behaviour of the cointegrating relations is described in terms of geometric ergodicity. Despite the fact that no deterministic terms are included, the process will have both stochastic trends and a linear trend in general. Gaussian likelihood-based estimators are considered for the long-run cointegration parameters and the short-run parameters. Asymptotic theory is provided for these, and it is discussed to what extent asymptotic normality and mixed normality can be found. A simulation study...

  12. Maximum likelihood polynomial regression for robust speech recognition

    Institute of Scientific and Technical Information of China (English)

    LU Yong; WU Zhenyang

    2011-01-01

    The linear hypothesis is the main disadvantage of maximum likelihood linear regression (MLLR). This paper applies the polynomial regression method to model adaptation and establishes a nonlinear model adaptation algorithm using maximum likelihood polyno

  13. Composite likelihood and two-stage estimation in family studies

    DEFF Research Database (Denmark)

    Andersen, Elisabeth Anne Wreford

    2002-01-01

    Composite likelihood; Two-stage estimation; Family studies; Copula; Optimal weights; All possible pairs.

  14. Nonparametric likelihood based estimation of linear filters for point processes

    DEFF Research Database (Denmark)

    Hansen, Niels Richard

    2015-01-01

    result is a representation of the gradient of the log-likelihood, which we use to derive computable approximations of the log-likelihood and the gradient by time discretization. These approximations are then used to minimize the approximate penalized log-likelihood. For time and memory efficiency...

  15. FlowMax: A Computational Tool for Maximum Likelihood Deconvolution of CFSE Time Courses.

    Directory of Open Access Journals (Sweden)

    Maxim Nikolaievich Shokhirev

    Full Text Available The immune response is a concerted dynamic multi-cellular process. Upon infection, the dynamics of lymphocyte populations are an aggregate of molecular processes that determine the activation, division, and longevity of individual cells. The timing of these single-cell processes is remarkably widely distributed, with some cells undergoing their third division while others undergo their first. High cell-to-cell variability and technical noise pose challenges for interpreting popular dye-dilution experiments objectively. It remains an unresolved challenge to avoid under- or over-interpretation of such data when phenotyping gene-targeted mouse models or patient samples. Here we develop and characterize a computational methodology to parameterize a cell population model in the context of noisy dye-dilution data. To enable objective interpretation of model fits, our method estimates fit sensitivity and redundancy by stochastically sampling the solution landscape, calculating parameter sensitivities, and clustering to determine the maximum-likelihood solution ranges. Our methodology accounts for both technical and biological variability by using a cell fluorescence model as an adaptor during population model fitting, resulting in improved fit accuracy without the need for ad hoc objective functions. We have incorporated our methodology into an integrated phenotyping tool, FlowMax, and used it to analyze B cells from two NFκB knockout mice with distinct phenotypes; we not only confirm previously published findings at a fraction of the expended effort and cost, but reveal a novel phenotype of nfkb1/p105/p50 in limiting the proliferative capacity of B cells following B-cell receptor stimulation. In addition to complementing experimental work, FlowMax is suitable for high throughput analysis of dye dilution studies within clinical and pharmacological screens with objective and quantitative conclusions.

  16. Integrating Dynamic Data and Sensors with Semantic 3D City Models in the Context of Smart Cities

    Science.gov (United States)

    Chaturvedi, K.; Kolbe, T. H.

    2016-10-01

    Smart cities provide effective integration of human, physical and digital systems operating in the built environment. Advances in city and landscape models, sensor web technologies, and simulation methods play a significant role in city analyses, in improving the quality of life of citizens, and in the governance of cities. Semantic 3D city models can provide substantial benefits and can become a central information backbone for smart city infrastructures. However, the current generation of semantic 3D city models is static in nature and does not support dynamic properties and sensor observations. In this paper, we propose a new concept called Dynamizer, which allows highly dynamic data to be represented and provides a method for injecting dynamic variations of city object properties into the static representation. The approach also provides the capability to model complex patterns based on statistics and general rules, as well as real-time sensor observations. The concept is implemented as an Application Domain Extension for the CityGML standard. However, it could also be applied to other GML-based application schemas, including the European INSPIRE data themes and national standards for topography and cadasters like the British Ordnance Survey Mastermap or the German cadaster standard ALKIS.
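
    A hedged sketch of the Dynamizer idea in Python (class and attribute names are illustrative, not the CityGML ADE schema): a static city object keeps its attribute values, while a Dynamizer overlays time-stamped observations on one of them:

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class CityObject:
    gml_id: str
    attributes: Dict[str, float]        # static attribute values

@dataclass
class Dynamizer:
    target: CityObject
    attribute: str
    timeseries: List[Tuple[str, float]] = field(default_factory=list)

    def value_at(self, timestamp: str) -> float:
        # latest observation at or before the timestamp; static value otherwise
        past = [v for t, v in sorted(self.timeseries) if t <= timestamp]
        return past[-1] if past else self.target.attributes[self.attribute]

building = CityObject("BLDG_0815", {"indoorTemperature": 20.0})
dyn = Dynamizer(building, "indoorTemperature",
                [("2016-10-01T08:00", 19.2), ("2016-10-01T12:00", 22.7)])
print(dyn.value_at("2016-10-01T10:30"))   # 19.2 from the sensor feed
print(dyn.value_at("2016-09-30T10:30"))   # 20.0 static fallback
```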

  17. Three Levels of Push-Pull Dynamics among Chinese International Students' Decision to Study Abroad in the Canadian Context

    Science.gov (United States)

    Chen, Jun Mian

    2017-01-01

    The extant literature on student migration flows generally focuses on the traditional push-pull factors of migration at the individual level. Such a tendency excludes the broader levels affecting international student mobility. This paper proposes a hybrid of three levels of push-pull dynamics (micro-individual decision-making, meso-academic…

  18. Dynamic Geometry Software and Tracing Tangents in the Context of the Mean Value Theorem: Technique and Theory Production

    Science.gov (United States)

    Martínez-Hernández, Cesar; Ulloa-Azpeitia, Ricardo

    2017-01-01

    Based on the theoretical elements of the instrumental approach to tool use known as Task-Technique-Theory (Artigue, 2002), this paper analyses and discusses the performance of graduate students enrolled in a Teacher Training program. The latter performance relates to tracing tangent lines to the curve of a quadratic function in Dynamic Geometry…

  19. Hierarchical Linear Modeling with Maximum Likelihood, Restricted Maximum Likelihood, and Fully Bayesian Estimation

    Science.gov (United States)

    Boedeker, Peter

    2017-01-01

    Hierarchical linear modeling (HLM) is a useful tool when analyzing data collected from groups. There are many decisions to be made when constructing and estimating a model in HLM including which estimation technique to use. Three of the estimation techniques available when analyzing data with HLM are maximum likelihood, restricted maximum…
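
    For the first two estimation techniques named in the record, a minimal sketch using statsmodels (the data are simulated and the column names arbitrary; a fully Bayesian fit would require a separate probabilistic programming library and is omitted):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
groups = np.repeat(np.arange(30), 10)              # 30 groups of 10 observations
x = rng.normal(size=300)
y = 1.0 + 0.5 * x + rng.normal(scale=0.8, size=30)[groups] + rng.normal(size=300)
data = pd.DataFrame({"y": y, "x": x, "school": groups})

model = smf.mixedlm("y ~ x", data, groups=data["school"])
fit_ml = model.fit(reml=False)     # maximum likelihood
fit_reml = model.fit(reml=True)    # restricted maximum likelihood
print("slope (ML vs REML):", round(fit_ml.params["x"], 3),
      round(fit_reml.params["x"], 3))
print("group variance (ML vs REML):",
      round(float(fit_ml.cov_re.iloc[0, 0]), 3),
      round(float(fit_reml.cov_re.iloc[0, 0]), 3))
```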

  20. MLDS: Maximum Likelihood Difference Scaling in R

    Directory of Open Access Journals (Sweden)

    Kenneth Knoblauch

    2008-01-01

    The MLDS package in the R programming language can be used to estimate perceptual scales based on the results of psychophysical experiments using the method of difference scaling. In a difference scaling experiment, observers compare two supra-threshold differences (a,b) and (c,d) on each trial. The approach is based on a stochastic model of how the observer decides which perceptual difference (or interval), (a,b) or (c,d), is greater, and the parameters of the model are estimated using a maximum likelihood criterion. We also propose a method to test the model by evaluating the self-consistency of the estimated scale. The package includes an example in which an observer judges the differences in correlation between scatterplots. The example may be readily adapted to estimate perceptual scales for arbitrary physical continua.
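
    A minimal difference-scaling MLE in the same spirit (a sketch, not the MLDS package itself): scale values are estimated from simulated binary judgements of which interval, (a,b) or (c,d), is perceptually larger, assuming a Gaussian decision-noise model:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(3)
n_levels = 6
true_psi = np.linspace(0.0, 1.0, n_levels) ** 2          # hidden perceptual scale
sigma_true = 0.15

# Simulate 400 quadruple trials: which interval, (a,b) or (c,d), looks larger?
quads = rng.integers(0, n_levels, size=(400, 4))
delta = (true_psi[quads[:, 3]] - true_psi[quads[:, 2]]) \
      - (true_psi[quads[:, 1]] - true_psi[quads[:, 0]])
resp = (delta + rng.normal(scale=sigma_true, size=400) > 0).astype(int)

def nll(params):
    # anchor psi[0] = 0 and psi[-1] = 1; free interior values plus noise sigma
    psi = np.concatenate(([0.0], params[:-1], [1.0]))
    d = (psi[quads[:, 3]] - psi[quads[:, 2]]) - (psi[quads[:, 1]] - psi[quads[:, 0]])
    p = np.clip(norm.cdf(d / params[-1]), 1e-9, 1 - 1e-9)
    return -np.sum(resp * np.log(p) + (1 - resp) * np.log(1 - p))

x0 = np.concatenate((np.linspace(0.1, 0.9, n_levels - 2), [0.2]))
bounds = [(0.0, 1.0)] * (n_levels - 2) + [(1e-3, 2.0)]
fit = minimize(nll, x0, method="L-BFGS-B", bounds=bounds)
print("estimated interior scale values:", fit.x[:-1].round(2))
```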

  1. Parameter likelihood of intrinsic ellipticity correlations

    CERN Document Server

    Capranico, Federica; Schaefer, Bjoern Malte

    2012-01-01

    Subject of this paper are the statistical properties of ellipticity alignments between galaxies evoked by their coupled angular momenta. Starting from physical angular momentum models, we bridge the gap towards ellipticity correlations, ellipticity spectra and derived quantities such as aperture moments, comparing the intrinsic signals with those generated by gravitational lensing, with the projected galaxy sample of EUCLID in mind. We investigate the dependence of intrinsic ellipticity correlations on cosmological parameters and show that intrinsic ellipticity correlations give rise to non-Gaussian likelihoods as a result of nonlinear functional dependencies. Comparing intrinsic ellipticity spectra to weak lensing spectra we quantify the magnitude of their contaminating effect on the estimation of cosmological parameters and find that biases on dark energy parameters are very small in an angular-momentum based model in contrast to the linear alignment model commonly used. Finally, we quantify whether intrins...

  2. Dishonestly increasing the likelihood of winning

    Directory of Open Access Journals (Sweden)

    Shaul Shalvi

    2012-05-01

    People not only seek to avoid losses or secure gains; they also attempt to create opportunities for obtaining positive outcomes. When distributing money between gambles with equal probabilities, people often invest in turning negative gambles into positive ones, even at a cost of reduced expected value. Results of an experiment revealed that (1) the preference to turn a negative outcome into a positive outcome exists when people's ability to do so depends on their performance levels (rather than merely on their choice), (2) this preference is amplified when the likelihood to turn negative into positive is high rather than low, and (3) this preference is attenuated when people can lie about their performance levels, allowing them to turn negative into positive not by performing better but rather by lying about how well they performed.

  3. Carrier Tracking Loop in a High-Dynamic Environment Aided by Fast Maximum Likelihood Estimation of the Doppler Frequency Rate-of-Change

    Institute of Scientific and Technical Information of China (English)

    郇浩; 陶选如; 陶然; 程小康; 董朝; 李鹏飞

    2014-01-01

    In a high-dynamic environment, the received signal contains a large Doppler frequency and Doppler frequency rate-of-change, and traditional carrier tracking loops struggle to reach a good compromise between dynamic stress tolerance and tracking accuracy. To address this problem, a fast maximum likelihood estimation method for the Doppler frequency rate-of-change is proposed in this paper, and the estimate is used to aid the carrier tracking loop. First, it is pointed out that the maximum likelihood estimation of the Doppler frequency and its rate-of-change is equivalent to a Fractional Fourier Transform (FrFT). Second, to avoid the large computational load of a two-dimensional search over the Doppler frequency and its rate-of-change, an estimation method combining instantaneous self-correlation with a segmental Discrete Fourier Transform (DFT) is proposed, and the resulting coarse estimate is used to narrow the search range. Finally, the estimate is applied in the carrier tracking loop to reduce dynamic stress and improve tracking accuracy. Theoretical analysis and computer simulation show that the search load falls to 5.25% of the original at a Signal-to-Noise Ratio (SNR) of -30 dB, the Root Mean Square Error (RMSE) of the tracked frequency rate is only 8.46 Hz/s, and the tracking sensitivity improves by more than 3 dB compared with traditional carrier tracking methods.
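
    A sketch of the coarse-search idea, assuming the common dechirp-and-FFT formulation (a discretized equivalent of the FrFT search over chirp rate); all signal parameters below are illustrative:

```python
import numpy as np

fs, n = 8000.0, 4096
t = np.arange(n) / fs
f0, fdot = 750.0, 300.0                       # true Doppler (Hz) and rate (Hz/s)
rng = np.random.default_rng(4)
x = np.exp(2j * np.pi * (f0 * t + 0.5 * fdot * t**2))
x = x + 2.0 * (rng.normal(size=n) + 1j * rng.normal(size=n))   # noisy chirp

best_peak, best_est = -np.inf, None
for rate in np.linspace(-500.0, 500.0, 101):  # coarse grid over Hz/s
    # dechirp at the candidate rate, then a single FFT searches frequency
    spec = np.abs(np.fft.fft(x * np.exp(-1j * np.pi * rate * t**2)))
    if spec.max() > best_peak:
        k = int(spec.argmax())
        freq = k * fs / n if k < n // 2 else (k - n) * fs / n
        best_peak, best_est = spec.max(), (rate, freq)
print("estimated (rate Hz/s, frequency Hz):", best_est)
```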

  4. Class-Based Context Quality Optimization For Context Management Frameworks

    DEFF Research Database (Denmark)

    Shawky, Ahmed; Olsen, Rasmus Løvenstein; Pedersen, Jens Myrup

    2012-01-01

    Context-awareness is a key requirement in many of today's networks, services and applications. Context management systems are in this respect used to provide access to distributed, dynamic context information. The reliability of remotely accessed dynamic context information is challenged by network delay, packet drop probability, information dynamics and the access strategies taken. QoS classification and system configuration of context management traffic is in this respect important in order to efficiently balance generated access traffic between network delay, packet loss probability, information...... dynamics and reliability requirements to the information from the applications. In this paper we develop a QoS control network concept for context management systems. The concept includes a soft real-time algorithm for model-based context access configuration and QoS class assignment, and allows to put...
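
    As a back-of-the-envelope illustration only (an assumption for this sketch, not the paper's model): if a context element changes as a Poisson process and a remote read takes a fixed delay, the probability that the delivered value is already outdated is one minus the probability of no change in flight:

```python
import math

def mismatch_probability(change_rate_hz: float, delay_s: float) -> float:
    # P(at least one change while the read is in flight), Poisson changes
    return 1.0 - math.exp(-change_rate_hz * delay_s)

for delay in (0.01, 0.05, 0.2):
    print(f"delay {delay * 1000:4.0f} ms -> "
          f"P(mismatch) = {mismatch_probability(2.0, delay):.3f}")  # 2 changes/s
```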

  5. The Application of Dynamic Capabilities in E-commerce Innovation Context : The Implications for Chinese E-commerce companies

    OpenAIRE

    Chen, YongJia; LIANG, WEIMIN

    2007-01-01

    This study mainly investigated how Chinese E-commerce companies should cope with E-commerce innovation with specific dynamic capabilities. E-commerce (Electronic Commerce) innovation includes three phases of innovation based on technology and time. They are web-based commerce, mobile commerce (M-commerce) and ubiquitous commerce (U-commerce). They caused not only technological changes but also organizational changes. To cope with E-commerce innovation, a prerequisite is to understand the impa...

  7. Computational aspects of maximum likelihood estimation and reduction in sensitivity function calculations

    Science.gov (United States)

    Gupta, N. K.; Mehra, R. K.

    1974-01-01

    This paper discusses numerical aspects of computing maximum likelihood estimates for linear dynamical systems in state-vector form. Different gradient-based nonlinear programming methods are discussed in a unified framework and their applicability to maximum likelihood estimation is examined. The problems due to singular Hessian or singular information matrix that are common in practice are discussed in detail and methods for their solution are proposed. New results on the calculation of state sensitivity functions via reduced order models are given. Several methods for speeding convergence and reducing computation time are also discussed.
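
    A minimal sketch of the approach for a scalar state-space model: the Kalman filter evaluates the exact Gaussian log-likelihood, which a numerical optimizer then maximizes (the model and values are illustrative, and the paper's gradient-based methods are replaced here by a derivative-free search):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(6)
a_true, q_true, r_true, n = 0.8, 0.5, 1.0, 500
x = np.zeros(n)
for t in range(1, n):
    x[t] = a_true * x[t - 1] + rng.normal(scale=np.sqrt(q_true))
y = x + rng.normal(scale=np.sqrt(r_true), size=n)   # noisy observations

def neg_log_like(params):
    a, log_q, log_r = params
    q, r = np.exp(log_q), np.exp(log_r)
    m, p, ll = 0.0, 1.0, 0.0
    for obs in y:
        m, p = a * m, a * a * p + q                 # time update (predict)
        s = p + r                                   # innovation variance
        ll -= 0.5 * (np.log(2 * np.pi * s) + (obs - m) ** 2 / s)
        k = p / s                                   # Kalman gain
        m, p = m + k * (obs - m), (1 - k) * p       # measurement update
    return -ll

est = minimize(neg_log_like, x0=[0.5, 0.0, 0.0], method="Nelder-Mead").x
print("a=%.3f q=%.3f r=%.3f" % (est[0], np.exp(est[1]), np.exp(est[2])))
```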

  8. Sociocognitive self-regulatory mechanisms governing judgments of the acceptability and likelihood of sport cheating.

    Science.gov (United States)

    d'Arripe-Longueville, Fabienne; Corrion, Karine; Scoffier, Stéphanie; Roussel, Peggy; Chalabaev, Aïna

    2010-10-01

    This study extends previous psychosocial literature (Bandura et al., 2001, 2003) by examining a structural model of the self-regulatory mechanisms governing the acceptability and likelihood of cheating in a sport context. Male and female adolescents (N = 804), aged 15-20 years, took part in this study. Negative affective self-regulatory efficacy influenced the acceptability and likelihood of cheating through the mediating role of moral disengagement, in females and males. Affective efficacy positively influenced prosocial behavior through moral disengagement or through resistive self-regulatory efficacy and social efficacy, in both groups. The direct effects of affective efficacy on beliefs about cheating were only evident in females. These results extend the findings of Bandura et al. (2001, 2003) to the sport context and suggest that affective and resistive self-regulatory efficacy operate in concert in governing adolescents' moral disengagement and transgressive behaviors in sport.

  9. Likelihood analysis of the minimal AMSB model

    Energy Technology Data Exchange (ETDEWEB)

    Bagnaschi, E.; Weiglein, G. [DESY, Hamburg (Germany); Borsato, M.; Chobanova, V.; Lucio, M.; Santos, D.M. [Universidade de Santiago de Compostela, Santiago de Compostela (Spain); Sakurai, K. [Institute for Particle Physics Phenomenology, University of Durham, Science Laboratories, Department of Physics, Durham (United Kingdom); University of Warsaw, Faculty of Physics, Institute of Theoretical Physics, Warsaw (Poland); Buchmueller, O.; Citron, M.; Costa, J.C.; Richards, A. [Imperial College, High Energy Physics Group, Blackett Laboratory, London (United Kingdom); Cavanaugh, R. [Fermi National Accelerator Laboratory, Batavia, IL (United States); University of Illinois at Chicago, Physics Department, Chicago, IL (United States); De Roeck, A. [Experimental Physics Department, CERN, Geneva (Switzerland); Antwerp University, Wilrijk (Belgium); Dolan, M.J. [School of Physics, University of Melbourne, ARC Centre of Excellence for Particle Physics at the Terascale, Melbourne (Australia); Ellis, J.R. [King' s College London, Theoretical Particle Physics and Cosmology Group, Department of Physics, London (United Kingdom); CERN, Theoretical Physics Department, Geneva (Switzerland); Flaecher, H. [University of Bristol, H.H. Wills Physics Laboratory, Bristol (United Kingdom); Heinemeyer, S. [Campus of International Excellence UAM+CSIC, Madrid (Spain); Instituto de Fisica Teorica UAM-CSIC, Madrid (Spain); Instituto de Fisica de Cantabria (CSIC-UC), Cantabria (Spain); Isidori, G. [Physik-Institut, Universitaet Zuerich, Zurich (Switzerland); Luo, F. [Kavli IPMU (WPI), UTIAS, The University of Tokyo, Kashiwa, Chiba (Japan); Olive, K.A. [School of Physics and Astronomy, University of Minnesota, William I. Fine Theoretical Physics Institute, Minneapolis, MN (United States)

    2017-04-15

    We perform a likelihood analysis of the minimal anomaly-mediated supersymmetry-breaking (mAMSB) model using constraints from cosmology and accelerator experiments. We find that either a wino-like or a Higgsino-like neutralino LSP, $\chi^0_1$, may provide the cold dark matter (DM), both with similar likelihoods. The upper limit on the DM density from Planck and other experiments enforces an upper limit on $m_{\chi^0_1}$ […] but the scalar mass $m_0$ is poorly constrained. In the wino-LSP case, $m_{3/2}$ is constrained to about 900 TeV and $m_{\chi^0_1}$ to $2.9 \pm 0.1$ TeV, whereas in the Higgsino-LSP case $m_{3/2}$ has just a lower limit $\gtrsim 650$ TeV ($\gtrsim 480$ TeV) and $m_{\chi^0_1}$ is constrained to $1.12\,(1.13) \pm 0.02$ TeV in the $\mu > 0$ ($\mu < 0$) scenario. In neither case can the anomalous magnetic moment of the muon, $(g-2)_\mu$, be improved significantly relative to its Standard Model (SM) value, nor do flavour measurements constrain the model significantly, and there are poor prospects for discovering supersymmetric particles at the LHC, though there are some prospects for direct DM detection. On the other hand, if the $\chi^0_1$ contributes only a fraction of the cold DM density, future LHC missing-$E_T$-based searches for gluinos, squarks and heavier chargino and neutralino states as well as disappearing track searches in the wino-like LSP region will be relevant, and interference effects enable BR($B_{s,d} \to \mu^+\mu^-$) to agree with the data better than in the SM in the case of wino-like DM with $\mu > 0$. (orig.)

  10. Dynamic Analysis of Historic Railway Bridges in Poland in the Context of Adjusting Them to Pendolino Trains

    Directory of Open Access Journals (Sweden)

    Grębowski K.

    2015-05-01

    The article presents a dynamic analysis of the historic railway bridge in Tczew as an example of assessing the usefulness of this type of bridge for high-speed trains. The model of the bridge and the simulation of rolling stock passage were performed in the SOFISTIK program. The scope of work includes experimental studies and the correct formulation of the dynamic model, which takes into account the dependencies between the bridge, track and rolling stock (RBT). The model was verified by comparing the results obtained on site during the passage of an ET-22 locomotive with twenty (20) open goods wagons against the results obtained in the program for the identical type of rolling stock. Then, after the verification, the simulation of a Pendolino high-speed train passage was performed, with speeds varying from 150 km/h up to the maximum of 250 km/h that the Pendolino train approved for the simulation may reach. On the basis of the obtained results it was possible to define the conditions for adjusting the historic bridge to high-speed train passage.

  11. Analyzing variations in life-history traits of Pacific salmon in the context of Dynamic Energy Budget (DEB) theory

    Science.gov (United States)

    Pecquerie, Laure; Johnson, Leah R.; Kooijman, Sebastiaan A. L. M.; Nisbet, Roger M.

    2011-11-01

    To determine the response of Pacific salmon (Oncorhynchus spp.) populations to environmental change, we need to understand impacts on all life stages. However, an integrative and mechanistic approach is particularly challenging for Pacific salmon as they use multiple habitats (river, estuarine and marine) during their life cycle. Here we develop a bioenergetic model that predicts development, growth and reproduction of a Pacific salmon in a dynamic environment, from an egg to a reproducing female, and that links female state to egg traits. This model uses Dynamic Energy Budget (DEB) theory to predict how life-history traits vary among five species of Pacific salmon: Pink, Sockeye, Coho, Chum and Chinook. Supplemented with a limited number of assumptions on anadromy and semelparity and with external signals for migrations, the model reproduces the qualitative patterns in egg size, fry size and fecundity at both the inter- and intra-species levels. Our results highlight how modeling all life stages within a single framework enables us to better understand complex life-history patterns. Additionally, we show that body size scaling relationships implied by DEB theory provide a simple way to transfer model parameters among Pacific salmon species, thus providing a generic approach to study the impact of environmental conditions on the life cycle of Pacific salmon.
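
    For illustration only: under standard DEB assumptions with constant food, length growth reduces to the von Bertalanffy form dL/dt = rB (Linf - L). The parameters below are invented, and the sketch ignores the habitat switches and reserve dynamics of the full model:

```python
import numpy as np

def length_at(t_days, L0=2.0, Linf=60.0, rB=0.004):
    # dL/dt = rB * (Linf - L)  ->  L(t) = Linf - (Linf - L0) * exp(-rB * t)
    return Linf - (Linf - L0) * np.exp(-rB * np.asarray(t_days, dtype=float))

print("length (cm) after 1, 2, 3 years:",
      np.round(length_at([365, 730, 1095]), 1))
```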

  12. Likelihood Analysis of Supersymmetric SU(5) GUTs

    CERN Document Server

    Bagnaschi, E.

    2017-01-01

    We perform a likelihood analysis of the constraints from accelerator experiments and astrophysical observations on supersymmetric (SUSY) models with SU(5) boundary conditions on soft SUSY-breaking parameters at the GUT scale. The parameter space of the models studied has 7 parameters: a universal gaugino mass $m_{1/2}$, distinct masses for the scalar partners of matter fermions in five- and ten-dimensional representations of SU(5), $m_5$ and $m_{10}$, and for the $\mathbf{5}$ and $\mathbf{\bar 5}$ Higgs representations $m_{H_u}$ and $m_{H_d}$, a universal trilinear soft SUSY-breaking parameter $A_0$, and the ratio of Higgs vevs $\tan \beta$. In addition to previous constraints from direct sparticle searches, low-energy and flavour observables, we incorporate constraints based on preliminary results from 13 TeV LHC searches for jets + MET events and long-lived particles, as well as the latest PandaX-II and LUX searches for direct Dark Matter detection. In addition to previously-identified mechanisms for bringi...

  13. Dimension-independent likelihood-informed MCMC

    KAUST Repository

    Cui, Tiangang

    2015-10-08

    Many Bayesian inference problems require exploring the posterior distribution of high-dimensional parameters that represent the discretization of an underlying function. This work introduces a family of Markov chain Monte Carlo (MCMC) samplers that can adapt to the particular structure of a posterior distribution over functions. Two distinct lines of research intersect in the methods developed here. First, we introduce a general class of operator-weighted proposal distributions that are well defined on function space, such that the performance of the resulting MCMC samplers is independent of the discretization of the function. Second, by exploiting local Hessian information and any associated low-dimensional structure in the change from prior to posterior distributions, we develop an inhomogeneous discretization scheme for the Langevin stochastic differential equation that yields operator-weighted proposals adapted to the non-Gaussian structure of the posterior. The resulting dimension-independent and likelihood-informed (DILI) MCMC samplers may be useful for a large class of high-dimensional problems where the target probability measure has a density with respect to a Gaussian reference measure. Two nonlinear inverse problems are used to demonstrate the efficiency of these DILI samplers: an elliptic PDE coefficient inverse problem and path reconstruction in a conditioned diffusion.
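
    DILI generalizes function-space samplers such as preconditioned Crank-Nicolson (pCN); the pCN kernel below is the discretization-robust baseline it builds on (the Gaussian prior and quadratic negative log-likelihood are toy choices for this sketch):

```python
import numpy as np

def phi(u):
    # toy negative log-likelihood: the data inform only the first 2 directions
    return 0.5 * np.sum((u[:2] - 1.0) ** 2) / 0.1

def pcn(d=100, beta=0.2, n_steps=5000, seed=7):
    rng = np.random.default_rng(seed)
    u, acc = np.zeros(d), 0
    for _ in range(n_steps):
        # prior-preserving proposal: acceptance depends only on the likelihood
        v = np.sqrt(1 - beta**2) * u + beta * rng.normal(size=d)
        if np.log(rng.uniform()) < phi(u) - phi(v):
            u, acc = v, acc + 1
    return u, acc / n_steps

u, rate = pcn()
print("posterior draw, informed coords:", u[:2].round(2), "accept rate:", rate)
```

    The acceptance rate of this kernel stays stable as d grows, which is the dimension-independence property the paper's samplers inherit and then sharpen with likelihood-informed proposals.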

  14. Likelihood Analysis of Supersymmetric SU(5) GUTs

    CERN Document Server

    Bagnaschi, E.; Sakurai, K.; Borsato, M.; Buchmueller, O.; Cavanaugh, R.; Chobanova, V.; Citron, M.; De Roeck, A.; Dolan, M.J.; Ellis, J.R.; Flächer, H.; Heinemeyer, S.; Isidori, G.; Lucio, M.; Martínez Santos, D.; Olive, K.A.; Richards, A.; de Vries, K.J.; Weiglein, G.

    2016-01-01

    We perform a likelihood analysis of the constraints from accelerator experiments and astrophysical observations on supersymmetric (SUSY) models with SU(5) boundary conditions on soft SUSY-breaking parameters at the GUT scale. The parameter space of the models studied has 7 parameters: a universal gaugino mass $m_{1/2}$, distinct masses for the scalar partners of matter fermions in five- and ten-dimensional representations of SU(5), $m_5$ and $m_{10}$, and for the $\mathbf{5}$ and $\mathbf{\bar 5}$ Higgs representations $m_{H_u}$ and $m_{H_d}$, a universal trilinear soft SUSY-breaking parameter $A_0$, and the ratio of Higgs vevs $\tan \beta$. In addition to previous constraints from direct sparticle searches, low-energy and flavour observables, we incorporate constraints based on preliminary results from 13 TeV LHC searches for jets + MET events and long-lived particles, as well as the latest PandaX-II and LUX searches for direct Dark Matter detection. In addition to previously-identified mechanisms for bringi...

  15. REDUCING THE LIKELIHOOD OF LONG TENNIS MATCHES

    Directory of Open Access Journals (Sweden)

    Tristan Barnett

    2006-12-01

    Long matches can cause problems for tournaments. For example, the starting times of subsequent matches can be substantially delayed, causing inconvenience to players, spectators, officials and television scheduling. They can even be seen as unfair in the tournament setting when the winner of a very long match, who may have negative aftereffects from such a match, plays the winner of an average or shorter length match in the next round. Long matches can also lead to injuries to the participating players. One factor that can lead to long matches is the use of the advantage set as the fifth set, as in the Australian Open, the French Open and Wimbledon. Another factor is long rallies and a greater than average number of points per game. This tends to occur more frequently on the slower surfaces, such as at the French Open. The mathematical method of generating functions is used to show that the likelihood of long matches can be substantially reduced by using the tiebreak game in the fifth set, or more effectively by using a new type of game, the 50-40 game, throughout the match.

  16. Reducing the likelihood of long tennis matches.

    Science.gov (United States)

    Barnett, Tristan; Alan, Brown; Pollard, Graham

    2006-01-01

    Long matches can cause problems for tournaments. For example, the starting times of subsequent matches can be substantially delayed, causing inconvenience to players, spectators, officials and television scheduling. They can even be seen as unfair in the tournament setting when the winner of a very long match, who may have negative aftereffects from such a match, plays the winner of an average or shorter length match in the next round. Long matches can also lead to injuries to the participating players. One factor that can lead to long matches is the use of the advantage set as the fifth set, as in the Australian Open, the French Open and Wimbledon. Another factor is long rallies and a greater than average number of points per game. This tends to occur more frequently on the slower surfaces, such as at the French Open. The mathematical method of generating functions is used to show that the likelihood of long matches can be substantially reduced by using the tiebreak game in the fifth set, or more effectively by using a new type of game, the 50-40 game, throughout the match.
    Key points:
    - The cumulant generating function has nice properties for calculating the parameters of distributions in a tennis match.
    - A final tiebreaker set reduces the length of matches, as currently used in the US Open.
    - A new 50-40 game reduces the length of matches whilst maintaining comparable probabilities for the better player to win the match.
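
    A sketch of the win-probability calculation behind these claims, using simple recursions rather than generating functions, and assuming (as the 50-40 game is usually described) that the server needs 4 points and the receiver 3, with no advantage play:

```python
from functools import lru_cache

def standard_game(p):
    q = 1 - p
    deuce_win = p * p / (p * p + q * q)        # closed form for the deuce cycle
    @lru_cache(None)
    def win(a, b):
        if a == 3 and b == 3:
            return deuce_win
        if a == 4:
            return 1.0
        if b == 4:
            return 0.0
        return p * win(a + 1, b) + q * win(a, b + 1)
    return win(0, 0)

def game_50_40(p):
    q = 1 - p
    @lru_cache(None)
    def win(a, b):
        if a == 4:                              # server reaches 50
            return 1.0
        if b == 3:                              # receiver reaches 40
            return 0.0
        return p * win(a + 1, b) + q * win(a, b + 1)
    return win(0, 0)

for p in (0.6, 0.7):
    print(f"P(server point) = {p}: standard game {standard_game(p):.3f}, "
          f"50-40 game {game_50_40(p):.3f}")
```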

  17. Likelihood Analysis of Supersymmetric SU(5) GUTs

    Energy Technology Data Exchange (ETDEWEB)

    Bagnaschi, E. [DESY; Costa, J. C. [Imperial Coll., London; Sakurai, K. [Warsaw U.; Borsato, M. [Santiago de Compostela U.; Buchmueller, O. [Imperial Coll., London; Cavanaugh, R. [Illinois U., Chicago; Chobanova, V. [Santiago de Compostela U.; Citron, M. [Imperial Coll., London; De Roeck, A. [Antwerp U.; Dolan, M. J. [Melbourne U.; Ellis, J. R. [King' s Coll. London; Flächer, H. [Bristol U.; Heinemeyer, S. [Madrid, IFT; Isidori, G. [Zurich U.; Lucio, M. [Santiago de Compostela U.; Martínez Santos, D. [Santiago de Compostela U.; Olive, K. A. [Minnesota U., Theor. Phys. Inst.; Richards, A. [Imperial Coll., London; de Vries, K. J. [Imperial Coll., London; Weiglein, G. [DESY

    2016-10-31

    We perform a likelihood analysis of the constraints from accelerator experiments and astrophysical observations on supersymmetric (SUSY) models with SU(5) boundary conditions on soft SUSY-breaking parameters at the GUT scale. The parameter space of the models studied has 7 parameters: a universal gaugino mass $m_{1/2}$, distinct masses for the scalar partners of matter fermions in five- and ten-dimensional representations of SU(5), $m_5$ and $m_{10}$, and for the $\mathbf{5}$ and $\mathbf{\bar 5}$ Higgs representations $m_{H_u}$ and $m_{H_d}$, a universal trilinear soft SUSY-breaking parameter $A_0$, and the ratio of Higgs vevs $\tan \beta$. In addition to previous constraints from direct sparticle searches, low-energy and flavour observables, we incorporate constraints based on preliminary results from 13 TeV LHC searches for jets + MET events and long-lived particles, as well as the latest PandaX-II and LUX searches for direct Dark Matter detection. In addition to previously-identified mechanisms for bringing the supersymmetric relic density into the range allowed by cosmology, we identify a novel ${\tilde u_R}/{\tilde c_R} - \tilde{\chi}^0_1$ coannihilation mechanism that appears in the supersymmetric SU(5) GUT model and discuss the role of $\tilde{\nu}_\tau$ coannihilation. We find complementarity between the prospects for direct Dark Matter detection and SUSY searches at the LHC.

  18. Maximum likelihood estimates of pairwise rearrangement distances.

    Science.gov (United States)

    Serdoz, Stuart; Egri-Nagy, Attila; Sumner, Jeremy; Holland, Barbara R; Jarvis, Peter D; Tanaka, Mark M; Francis, Andrew R

    2017-06-21

    Accurate estimation of evolutionary distances between taxa is important for many phylogenetic reconstruction methods. Distances can be estimated using a range of different evolutionary models, from single nucleotide polymorphisms to large-scale genome rearrangements. Corresponding corrections for genome rearrangement distances fall into three categories: empirical computational studies, Bayesian/MCMC approaches, and combinatorial approaches. Here, we introduce a maximum likelihood estimator for the inversion distance between a pair of genomes, using a group-theoretic approach to modelling inversions introduced recently. This MLE functions as a corrected distance: in particular, we show that because of the way sequences of inversions interact with each other, it is quite possible for the minimal distance and the MLE distance to order the distances of two genomes from a third differently. The second aspect tackles the problem of accounting for the symmetries of circular arrangements. While a frame of reference is generally locked and all computation made accordingly, this work incorporates the action of the dihedral group so that distance estimates are free from any a priori frame of reference. The philosophy of accounting for symmetries can be applied to any existing correction method, for which examples are offered.

  19. THE DEVELOPMENT IN DYNAMICS AND STRUCTURE OF THE ROMANIAN TOURISM IN THE CONTEXT OF THE GLOBAL CRISIS

    Directory of Open Access Journals (Sweden)

    Gabriela STANCIULESCU

    2009-06-01

    The purpose of this study is to give a short presentation of Romanian tourism in the context of the global economic crisis, highlighting the main data on inbound and outbound tourism. Faced with the current global crisis, Romanian tourism has two main options: to mature or to decline. The empirical results show the connection between Romanian tourism and the global economic crisis and state precisely the decrease or increase of the indicators between 2000 and 2007. In 2007, the year when Romania joined the European Union and when all provisions on free travel to other Community countries came into force, Romanian tourists increased their interest in travelling abroad, which also led to an increased number of persons visiting other countries. The economic crisis might be a chance for Romanian tourism to raise its game, as in such periods tourists usually look for nearby destinations, strengthening the incoming indicator, which was very low during the last few years. Romania's international tourist flows are characterized by an evolution reflecting various political, economic and social changes and transformations. The conclusions drawn show that global warming and the global financial crisis are taking place at the same time, and thus action must be taken in order to improve Romanian tourism.

  20. The role of normally hyperbolic invariant manifolds (NHIMS) in the context of the phase space setting for chemical reaction dynamics

    Science.gov (United States)

    Wiggins, Stephen

    2016-11-01

    In this paper we give an introduction to the notion of a normally hyperbolic invariant manifold (NHIM) and its role in chemical reaction dynamics. We do this by considering simple examples for one-, two-, and three-degree-of-freedom systems, where explicit calculations can be carried out for all of the relevant geometrical structures and their properties can be explicitly understood. We specifically emphasize the notion of a NHIM as a "phase space concept". In particular, we make the observation that the (phase space) NHIM plays the role of "carrying" the (configuration space) properties of a saddle point of the potential energy surface into phase space. We also consider an explicit example of a 2-degree-of-freedom system where a "global" dividing surface can be constructed using two index one saddles and one index two saddle. Such a dividing surface has arisen in several recent applications and, therefore, such a construction may be of wider interest.

  1. Plasma Instabilities in the Context of Current Helium Sedimentation Models: Dynamical Implications for the ICM in Galaxy Clusters

    CERN Document Server

    Berlok, Thomas

    2015-01-01

    Understanding whether Helium can sediment to the core of galaxy clusters is important for a number of problems in cosmology and astrophysics. All current models addressing this question are one-dimensional and do not account for the fact that magnetic fields can effectively channel ions and electrons, leading to anisotropic transport of momentum, heat, and particle diffusion in the weakly collisional intracluster medium (ICM). This anisotropy can lead to a wide variety of instabilities, which could be relevant for understanding the dynamics of heterogeneous media. In this paper, we consider the radial temperature and composition profiles as obtained from a state-of-the-art Helium sedimentation model and analyze its stability properties. We find that the associated radial profiles are unstable to different kinds of instabilities, depending on the magnetic field orientation, at all radii. The fastest growing modes are usually related to generalizations of the Magnetothermal Instability (MTI) and the Heat-flux-d...

  2. On the shape and likelihood of oceanic rogue waves.

    Science.gov (United States)

    Benetazzo, Alvise; Ardhuin, Fabrice; Bergamasco, Filippo; Cavaleri, Luigi; Guimarães, Pedro Veras; Schwendeman, Michael; Sclavo, Mauro; Thomson, Jim; Torsello, Andrea

    2017-08-15

    We consider the observation and analysis of oceanic rogue waves collected within spatio-temporal (ST) records of 3D wave fields. This class of records, allowing a sea surface region to be retrieved, is appropriate for the observation of rogue waves, which appear as a random phenomenon that can occur at any time and location on the sea surface. To verify this aspect, we used three stereo wave imaging systems to gather ST records of the sea surface elevation, which were collected in different sea conditions. The wave with the ST maximum elevation (happening to be larger than the rogue threshold 1.25H_s) was then isolated within each record, along with its temporal profile. The rogue waves show similar profiles, in agreement with the theory of extreme wave groups. We analyze the rogue wave probability of occurrence, also in the context of ST extreme value distributions, and we conclude that rogue waves are more likely than previously reported; the key point is coming across them, in space as well as in time. The dependence of the rogue wave profile and likelihood on the sea state conditions is also investigated. Results may prove useful in predicting extreme wave occurrence probability and strength during oceanic storms.

  3. Maximum likelihood estimation for cytogenetic dose-response curves

    Energy Technology Data Exchange (ETDEWEB)

    Frome, E.L.; DuFrain, R.J.

    1983-10-01

    In vitro dose-response curves are used to describe the relation between the yield of dicentric chromosome aberrations and radiation dose for human lymphocytes. The dicentric yields follow the Poisson distribution, and the expected yield depends on both the magnitude and the temporal distribution of the dose for low LET radiation. A general dose-response model that describes this relation has been obtained by Kellerer and Rossi using the theory of dual radiation action. The yield of elementary lesions is $\kappa(\gamma d + g(t, \tau) d^2)$, where $t$ is the time and $d$ is dose. The coefficient of the $d^2$ term is determined by the recovery function and the temporal mode of irradiation. Two special cases of practical interest are split-dose and continuous exposure experiments, and the resulting models are intrinsically nonlinear in the parameters. A general purpose maximum likelihood estimation procedure is described and illustrated with numerical examples from both experimental designs. Poisson regression analysis is used for estimation, hypothesis testing, and regression diagnostics. Results are discussed in the context of exposure assessment procedures for both acute and chronic human radiation exposure.
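
    A minimal sketch of the Poisson maximum likelihood fit for an acute exposure, where the dose-rate factor is constant and the expected yield per cell is linear-quadratic in dose; the doses, cell counts and dicentric yields below are invented:

```python
import numpy as np
from scipy.optimize import minimize

# Invented example data: dose (Gy), cells scored, dicentrics observed
dose = np.array([0.0, 0.25, 0.5, 1.0, 2.0, 3.0, 4.0])
cells = np.array([5000, 5000, 4000, 2000, 1000, 800, 500])
dicentrics = np.array([4, 10, 18, 42, 110, 170, 200])

def neg_log_like(params):
    c, alpha, beta = params                      # background + linear-quadratic yield
    mu = np.clip(cells * (c + alpha * dose + beta * dose**2), 1e-9, None)
    return np.sum(mu - dicentrics * np.log(mu))  # Poisson NLL up to a constant

fit = minimize(neg_log_like, x0=[1e-3, 1e-2, 1e-2], method="Nelder-Mead")
print("background, alpha, beta =", np.round(fit.x, 4))
```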

  4. Likelihood analysis of supersymmetric SU(5) GUTs

    Energy Technology Data Exchange (ETDEWEB)

    Bagnaschi, E. [DESY, Hamburg (Germany); Costa, J.C. [Imperial College, London (United Kingdom). Blackett Lab.; Sakurai, K. [Durham Univ. (United Kingdom). Inst. for Particle Physics Phenomonology; Warsaw Univ. (Poland). Inst. of Theoretical Physics; Collaboration: MasterCode Collaboration; and others

    2016-10-15

    We perform a likelihood analysis of the constraints from accelerator experiments and astrophysical observations on supersymmetric (SUSY) models with SU(5) boundary conditions on soft SUSY-breaking parameters at the GUT scale. The parameter space of the models studied has 7 parameters: a universal gaugino mass $m_{1/2}$, distinct masses for the scalar partners of matter fermions in five- and ten-dimensional representations of SU(5), $m_5$ and $m_{10}$, and for the $\mathbf{5}$ and $\mathbf{\bar 5}$ Higgs representations $m_{H_u}$ and $m_{H_d}$, a universal trilinear soft SUSY-breaking parameter $A_0$, and the ratio of Higgs vevs $\tan \beta$. In addition to previous constraints from direct sparticle searches, low-energy and flavour observables, we incorporate constraints based on preliminary results from 13 TeV LHC searches for jets + $E_T$ events and long-lived particles, as well as the latest PandaX-II and LUX searches for direct Dark Matter detection. In addition to previously-identified mechanisms for bringing the supersymmetric relic density into the range allowed by cosmology, we identify a novel $\tilde u_R/\tilde c_R - \tilde{\chi}^0_1$ coannihilation mechanism that appears in the supersymmetric SU(5) GUT model and discuss the role of $\tilde{\nu}_\tau$ coannihilation. We find complementarity between the prospects for direct Dark Matter detection and SUSY searches at the LHC.

  5. Likelihood analysis of supersymmetric SU(5) GUTs

    Energy Technology Data Exchange (ETDEWEB)

    Bagnaschi, E.; Weiglein, G. [DESY, Hamburg (Germany); Costa, J.C.; Buchmueller, O.; Citron, M.; Richards, A.; De Vries, K.J. [Imperial College, High Energy Physics Group, Blackett Laboratory, London (United Kingdom); Sakurai, K. [University of Durham, Science Laboratories, Department of Physics, Institute for Particle Physics Phenomenology, Durham (United Kingdom); University of Warsaw, Faculty of Physics, Institute of Theoretical Physics, Warsaw (Poland); Borsato, M.; Chobanova, V.; Lucio, M.; Martinez Santos, D. [Universidade de Santiago de Compostela, Santiago de Compostela (Spain); Cavanaugh, R. [Fermi National Accelerator Laboratory, Batavia, IL (United States); University of Illinois at Chicago, Physics Department, Chicago, IL (United States); Roeck, A. de [CERN, Experimental Physics Department, Geneva (Switzerland); Antwerp University, Wilrijk (Belgium); Dolan, M.J. [University of Melbourne, ARC Centre of Excellence for Particle Physics at the Terascale, School of Physics, Parkville (Australia); Ellis, J.R. [King' s College London, Theoretical Particle Physics and Cosmology Group, Department of Physics, London (United Kingdom); Theoretical Physics Department, CERN, Geneva 23 (Switzerland); Flaecher, H. [University of Bristol, H.H. Wills Physics Laboratory, Bristol (United Kingdom); Heinemeyer, S. [Campus of International Excellence UAM+CSIC, Cantoblanco, Madrid (Spain); Instituto de Fisica Teorica UAM-CSIC, Madrid (Spain); Instituto de Fisica de Cantabria (CSIC-UC), Santander (Spain); Isidori, G. [Universitaet Zuerich, Physik-Institut, Zurich (Switzerland); Olive, K.A. [University of Minnesota, William I. Fine Theoretical Physics Institute, School of Physics and Astronomy, Minneapolis, MN (United States)

    2017-02-15

    We perform a likelihood analysis of the constraints from accelerator experiments and astrophysical observations on supersymmetric (SUSY) models with SU(5) boundary conditions on soft SUSY-breaking parameters at the GUT scale. The parameter space of the models studied has seven parameters: a universal gaugino mass $m_{1/2}$, distinct masses for the scalar partners of matter fermions in five- and ten-dimensional representations of SU(5), $m_5$ and $m_{10}$, and for the $\mathbf{5}$ and $\mathbf{\bar 5}$ Higgs representations $m_{H_u}$ and $m_{H_d}$, a universal trilinear soft SUSY-breaking parameter $A_0$, and the ratio of Higgs vevs $\tan \beta$. In addition to previous constraints from direct sparticle searches, low-energy and flavour observables, we incorporate constraints based on preliminary results from 13 TeV LHC searches for jets + $E_T$ events and long-lived particles, as well as the latest PandaX-II and LUX searches for direct Dark Matter detection. In addition to previously identified mechanisms for bringing the supersymmetric relic density into the range allowed by cosmology, we identify a novel $\tilde u_R/\tilde c_R - \tilde{\chi}^0_1$ coannihilation mechanism that appears in the supersymmetric SU(5) GUT model and discuss the role of $\tilde{\nu}_\tau$ coannihilation. We find complementarity between the prospects for direct Dark Matter detection and SUSY searches at the LHC. (orig.)

  6. Maximum likelihood molecular clock comb: analytic solutions.

    Science.gov (United States)

    Chor, Benny; Khetan, Amit; Snir, Sagi

    2006-04-01

    Maximum likelihood (ML) is increasingly used as an optimality criterion for selecting evolutionary trees, but finding the global optimum is a hard computational task. Because no general analytic solution is known, numeric techniques such as hill climbing or expectation maximization (EM) are used in order to find optimal parameters for a given tree. So far, analytic solutions were derived only for the simplest model: three taxa, two-state characters, under a molecular clock. Rooted four-taxon trees have two topologies: the fork (two subtrees with two leaves each) and the comb (one subtree with three leaves, the other with a single leaf). In a previous work, we devised a closed-form analytic solution for the ML molecular clock fork. In this work, we extend the state of the art in analytic solutions for ML trees to the family of all four-taxon trees under the molecular clock assumption. The change from the fork topology to the comb incurs a major increase in the complexity of the underlying algebraic system and requires novel techniques and approaches. We combine the ultrametric properties of molecular clock trees with the Hadamard conjugation to derive a number of topology-dependent identities. Employing these identities, we substantially simplify the system of polynomial equations. We finally use tools from algebraic geometry (e.g., Gröbner bases, ideal saturation, resultants) and employ symbolic algebra software to obtain analytic solutions for the comb. We show that in contrast to the fork, the comb has no closed-form solutions (expressed by radicals in the input data). In general, four-taxon trees can have multiple ML points. In contrast, we can now prove that under the molecular clock assumption, the comb has a unique (local and global) ML point. (Such uniqueness was previously shown for the fork.)
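
    A toy illustration of the algebraic step with sympy: putting a pair of stand-in score equations (hypothetical polynomials, not the actual comb system) into a lexicographic Groebner basis triangularizes them so they can be solved variable by variable:

```python
from sympy import symbols, groebner

x, y = symbols("x y")
score_eqs = [x**2 + y**2 - 1, x * y - y + x - 1]    # stand-ins for dL/dx = dL/dy = 0
basis = groebner(score_eqs, x, y, order="lex")      # triangularizes the system
for poly in basis.exprs:
    print(poly)   # last polynomial is univariate; back-substitute to solve
```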

  7. Likelihood inference for a nonstationary fractional autoregressive model

    DEFF Research Database (Denmark)

    Johansen, Søren; Ørregård Nielsen, Morten

    2010-01-01

    the conditional Gaussian likelihood, and for the probability analysis we also condition on initial values but assume that the errors in the autoregressive model are i.i.d. with suitable moment conditions. We analyze the conditional likelihood and its derivatives as stochastic processes in the parameters, including...... d and b, and prove that they converge in distribution. We use the results to prove consistency of the maximum likelihood estimator for (d, b) in a large compact subset of {1/2...

  8. Likelihood ratios: Clinical application in day-to-day practice

    Directory of Open Access Journals (Sweden)

    Parikh Rajul

    2009-01-01

    In this article we provide an introduction to the use of likelihood ratios in clinical ophthalmology. Likelihood ratios permit the best use of clinical test results to establish diagnoses for the individual patient. Examples and step-by-step calculations demonstrate the estimation of pretest probability and pretest odds, and the calculation of posttest odds and posttest probability using likelihood ratios. The benefits and limitations of this approach are discussed.
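
    The core calculation is small enough to state in code; a sketch with an assumed pretest probability of 30% and a positive likelihood ratio of 8.5:

```python
def posttest_probability(pretest_prob: float, likelihood_ratio: float) -> float:
    """Convert pretest probability to posttest probability via odds."""
    pretest_odds = pretest_prob / (1.0 - pretest_prob)
    posttest_odds = pretest_odds * likelihood_ratio
    return posttest_odds / (1.0 + posttest_odds)

# Example: 30% pretest probability, a positive test with LR+ = 8.5
print(round(posttest_probability(0.30, 8.5), 2))   # -> 0.78
```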

  9. Dynamic effects of self-relevance and task on the neural processing of emotional words in context

    Directory of Open Access Journals (Sweden)

    Eric C. Fields

    2016-01-01

    We used event-related potentials (ERPs) to examine the interactions between task, emotion, and contextual self-relevance on processing words in social vignettes. Participants read scenarios that were in either third person (other-relevant) or second person (self-relevant), and we recorded ERPs to a neutral, pleasant, or unpleasant critical word. In a previously reported study (Fields & Kuperberg, 2012) with these stimuli, participants were tasked with producing a third sentence continuing the scenario. We observed a larger LPC to emotional words than neutral words in both the self-relevant and other-relevant scenarios, but this effect was smaller in the self-relevant scenarios because the LPC was larger on the neutral words (i.e., a larger LPC to self-relevant than other-relevant neutral words). In the present work, participants simply answered comprehension questions that did not refer to the emotional aspects of the scenario. Here we observed quite a different pattern of interaction between self-relevance and emotion: the LPC was larger to emotional versus neutral words in the self-relevant scenarios only, and there was no effect of self-relevance on neutral words. Taken together, these findings suggest that the LPC reflects a dynamic interaction between specific task demands, the emotional properties of a stimulus, and contextual self-relevance. We conclude by discussing implications and future directions for a functional theory of the emotional LPC.

  10. Keep it simple - A case study of model development in the context of the Dynamic Stocks and Flows (DSF) task

    Science.gov (United States)

    Halbrügge, Marc

    2010-12-01

    This paper describes the creation of a cognitive model submitted to the ‘Dynamic Stocks and Flows’ (DSF) modeling challenge. This challenge aims at comparing computational cognitive models of human behavior during an open-ended control task. Participants in the modeling competition were provided with a simulation environment and training data for benchmarking their models, while the actual specification of the competition task was withheld. To meet this challenge, the cognitive model described here was designed and optimized for generalizability. Only two simple assumptions about human problem solving were used to explain the empirical findings of the training data. In-depth analysis of the data set prior to the development of the model led to the dismissal of correlations or other parametric statistics as goodness-of-fit indicators. A new statistical measurement based on rank orders and sequence matching techniques is proposed instead. When applied to the human sample, this measurement also identifies clusters of subjects that use different strategies for the task. The acceptability of the fits achieved by the model is verified using permutation tests.
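
    A sketch of a sequence-matching fit measure in this spirit (difflib's ratio is a convenient stand-in for the matching technique described; the action strings are hypothetical discretized control decisions):

```python
from difflib import SequenceMatcher

def sequence_fit(model_actions: str, human_actions: str) -> float:
    # similarity of two discretized decision sequences, in [0, 1]
    return SequenceMatcher(None, model_actions, human_actions).ratio()

human = "AABACCBA"        # hypothetical discretized control decisions
model_good = "AABCCCBA"
model_bad = "CCCCAAAA"
print(sequence_fit(model_good, human), sequence_fit(model_bad, human))
```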

  11. Dynamic Effects of Self-Relevance and Task on the Neural Processing of Emotional Words in Context.

    Science.gov (United States)

    Fields, Eric C; Kuperberg, Gina R

    2015-01-01

    We used event-related potentials (ERPs) to examine the interactions between task, emotion, and contextual self-relevance on processing words in social vignettes. Participants read scenarios that were in either third person (other-relevant) or second person (self-relevant) and we recorded ERPs to a neutral, pleasant, or unpleasant critical word. In a previously reported study (Fields and Kuperberg, 2012) with these stimuli, participants were tasked with producing a third sentence continuing the scenario. We observed a larger LPC to emotional words than neutral words in both the self-relevant and other-relevant scenarios, but this effect was smaller in the self-relevant scenarios because the LPC was larger on the neutral words (i.e., a larger LPC to self-relevant than other-relevant neutral words). In the present work, participants simply answered comprehension questions that did not refer to the emotional aspects of the scenario. Here we observed quite a different pattern of interaction between self-relevance and emotion: the LPC was larger to emotional vs. neutral words in the self-relevant scenarios only, and there was no effect of self-relevance on neutral words. Taken together, these findings suggest that the LPC reflects a dynamic interaction between specific task demands, the emotional properties of a stimulus, and contextual self-relevance. We conclude by discussing implications and future directions for a functional theory of the emotional LPC.

  12. Hydrological dynamics of a Mediterranean catchment in a global change context. (Romanyac catchment, Cap de Creus, Girona, Spain)

    Science.gov (United States)

    Latron, J.; Pardini, G.; Gispert, M.; Llorens, P.

    2009-04-01

    Mediterranean regions are characterized by unevenly distributed water resources, and consequently a more precise knowledge of the main hydrological processes and of their variability and changes is crucial for a better management of water resources. However, the lack of hydrological information and data in most areas of the Mediterranean basin greatly hinders the analysis of changes in water resources at relevant scales. In this context, the Soil Science Unit GRCT48 of the University of Girona is conducting an integrated study of hydrological response, soil erosion and soil degradation processes in fragile Mediterranean areas undergoing changes in use and management. The study area is located in the Cap de Creus Peninsula (NE Spain), where land abandonment has been the outstanding characteristic over the last decades. The area is covered by terraced soils, most of them abandoned, and represents a typical Mediterranean environment. Current land cover is a mosaic of areas with different shrubs according to wildfire occurrence. Residual patches of cork and pine trees are also present, as well as small extensions of pastures. Finally, some localized areas of vineyards and olive trees are still cultivated. The approach is based on the complementary use of the plot and catchment scales to assess the effect of land cover and land use change on physical, chemical and biological parameters of soil quality and on rainfall-runoff-erosion relationships. Along the study period, the observed rainfall-runoff response at the plot scale was highly variable among sites, but also for a given environment, depending on antecedent wetness conditions and rainfall characteristics. Overall, surface runoff responses were low in all environments. Soil loss associated with rainfall-runoff events showed very large variations among sites, and also for a given site between the different rainfall events. At the catchment scale, preliminary results obtained from the monitoring of three catchments of…

  13. Integration based profile likelihood calculation for PDE constrained parameter estimation problems

    Science.gov (United States)

    Boiger, R.; Hasenauer, J.; Hroß, S.; Kaltenbacher, B.

    2016-12-01

    Partial differential equation (PDE) models are widely used in engineering and the natural sciences to describe spatio-temporal processes. The parameters of the considered processes are often unknown and have to be estimated from experimental data. Due to partial observations and measurement noise, these parameter estimates are subject to uncertainty. This uncertainty can be assessed using profile likelihoods, a reliable but computationally intensive approach. In this paper, we present the integration-based approach for profile likelihood calculation developed by Chen and Jennrich (2002, J. Comput. Graph. Stat. 11 714-32) and adapt it to inverse problems with PDE constraints. While existing methods for profile likelihood calculation in parameter estimation problems with PDE constraints rely on repeated optimization, the proposed approach exploits a dynamical system evolving along the likelihood profile. We derive the dynamical system for the unreduced estimation problem, prove convergence and study the properties of the integration-based approach for the PDE case. To evaluate the proposed method, we compare it with state-of-the-art algorithms for a simple reaction-diffusion model of a cellular patterning process. We observe good accuracy of the method as well as a significant speed-up compared with established methods. Integration-based profile calculation facilitates rigorous uncertainty analysis for computationally demanding parameter estimation problems with PDE constraints.
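
    For contrast with the integration-based method, the classical repeated-optimization profile re-fits all nuisance parameters at each fixed value of the parameter of interest; a toy two-parameter objective stands in here for the PDE-constrained problem:

```python
import numpy as np
from scipy.optimize import minimize

def nll(theta):
    # toy negative log-likelihood with correlated parameters
    t1, t2 = theta
    return 0.5 * (t1 - 1) ** 2 + 0.5 * (t2 - 2) ** 2 + 0.8 * (t1 - 1) * (t2 - 2)

profile = []
for t1 in np.linspace(-1.0, 3.0, 21):
    res = minimize(lambda t2: nll((t1, t2[0])), x0=[0.0])   # nuisance re-fit
    profile.append((t1, res.fun))
best = min(profile, key=lambda point: point[1])
print("profile minimum near theta1 =", round(best[0], 2))
```

    The integration-based method replaces this loop of optimizations with the numerical solution of a single ODE along the profile, which is where the reported speed-up comes from.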

  14. Seasonal species interactions minimize the impact of species turnover on the likelihood of community persistence.

    Science.gov (United States)

    Saavedra, Serguei; Rohr, Rudolf P; Fortuna, Miguel A; Selva, Nuria; Bascompte, Jordi

    2016-04-01

    Many of the observed species interactions embedded in ecological communities are not permanent, but are characterized by temporal changes that are observed along with abiotic and biotic variations. While work has been done describing and quantifying these changes, little is known about their consequences for species coexistence. Here, we investigate the extent to which changes of species composition impact the likelihood of persistence of the predator-prey community in the highly seasonal Białowieża Primeval Forest (northeast Poland), and the extent to which seasonal changes of species interactions (predator diet) modulate the expected impact. This likelihood is estimated extending recent developments on the study of structural stability in ecological communities. We find that the observed species turnover strongly varies the likelihood of community persistence between summer and winter. Importantly, we demonstrate that the observed seasonal interaction changes minimize the variation in the likelihood of persistence associated with species turnover across the year. We find that these community dynamics can be explained as the coupling of individual species to their environment by minimizing both the variation in persistence conditions and the interaction changes between seasons. Our results provide a homeostatic explanation for seasonal species interactions and suggest that monitoring the association of interaction changes with the level of variation in community dynamics can provide a good indicator of the response of species to environmental pressures.

  15. Empirical likelihood estimation of discretely sampled processes of OU type

    Institute of Scientific and Technical Information of China (English)

    SUN ShuGuang; ZHANG XinSheng

    2009-01-01

    This paper presents an empirical likelihood estimation procedure for the parameters of a discretely sampled process of Ornstein-Uhlenbeck type. The proposed procedure is based on the conditional characteristic function, and the maximum empirical likelihood estimator is proved to be consistent and asymptotically normal. Moreover, this estimator is shown to be asymptotically efficient under some conditions. […] the intensity parameter can be exactly recovered, and we study the maximum empirical likelihood estimator with the plug-in estimated intensity parameter. Testing procedures based on the empirical likelihood ratio statistic are developed for parameters and for estimating equations, respectively. Finally, Monte Carlo simulations are conducted to demonstrate the performance of the proposed estimators.
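
    As a minimal instance of the empirical likelihood machinery the paper builds on, an Owen-style empirical likelihood ratio for a mean (the paper's OU setting replaces x_i - mu by estimating equations derived from the conditional characteristic function):

```python
import numpy as np
from scipy.optimize import brentq

def neg2_log_elr(x, mu):
    """-2 log empirical likelihood ratio for the mean constraint E[x] = mu."""
    z = x - mu
    if z.min() >= 0 or z.max() <= 0:
        return np.inf                        # mu outside the convex hull
    g = lambda lam: np.mean(z / (1.0 + lam * z))
    lo = -1.0 / z.max() + 1e-8               # keep all weights positive
    hi = -1.0 / z.min() - 1e-8
    lam = brentq(g, lo, hi)                  # solve the Lagrange condition
    return 2.0 * np.sum(np.log(1.0 + lam * z))

x = np.random.default_rng(8).exponential(scale=2.0, size=200)
print("-2 log ELR at the true mean 2.0:", round(neg2_log_elr(x, 2.0), 2))
```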

  16. Estimating the Effect of Competition on Trait Evolution Using Maximum Likelihood Inference.

    Science.gov (United States)

    Drury, Jonathan; Clavel, Julien; Manceau, Marc; Morlon, Hélène

    2016-07-01

    Many classical ecological and evolutionary theoretical frameworks posit that competition between species is an important selective force. For example, in adaptive radiations, resource competition between evolving lineages plays a role in driving phenotypic diversification and exploration of novel ecological space. Nevertheless, current models of trait evolution fit to phylogenies and comparative data sets are not designed to incorporate the effect of competition. The most advanced models in this direction are diversity-dependent models where evolutionary rates depend on lineage diversity. However, these models still treat changes in traits in one branch as independent of the value of traits on other branches, thus ignoring the effect of species similarity on trait evolution. Here, we consider a model where the evolutionary dynamics of traits involved in interspecific interactions are influenced by species similarity in trait values and where we can specify which lineages are in sympatry. We develop a maximum likelihood based approach to fit this model to combined phylogenetic and phenotypic data. Using simulations, we demonstrate that the approach accurately estimates the simulated parameter values across a broad range of parameter space. Additionally, we develop tools for specifying the biogeographic context in which trait evolution occurs. In order to compare models, we also apply these biogeographic methods to specify which lineages interact sympatrically for two diversity-dependent models. Finally, we fit these various models to morphological data from a classical adaptive radiation (Greater Antillean Anolis lizards). We show that models that account for competition and geography perform better than other models. The matching competition model is an important new tool for studying the influence of interspecific interactions, in particular competition, on phenotypic evolution. More generally, it constitutes a step toward a better integration of interspecific

  17. Cosmic shear measurement with maximum likelihood and maximum a posteriori inference

    CERN Document Server

    Hall, Alex

    2016-01-01

    We investigate the problem of noise bias in maximum likelihood and maximum a posteriori estimators for cosmic shear. We derive the leading and next-to-leading order biases and compute them in the context of galaxy ellipticity measurements, extending previous work on maximum likelihood inference for weak lensing. We show that a large part of the bias on these point estimators can be removed using information already contained in the likelihood when a galaxy model is specified, without the need for external calibration. We test these bias-corrected estimators on simulated galaxy images similar to those expected from planned space-based weak lensing surveys, with very promising results. We find that the introduction of an intrinsic shape prior mitigates noise bias, such that the maximum a posteriori estimate can be made less biased than the maximum likelihood estimate. Second-order terms offer a check on the convergence of the estimators, but are largely sub-dominant. We show how biases propagate to shear estima...

  18. Analysis and prediction of pest dynamics in an agroforestry context using Tiko'n, a generic tool to develop food web models

    Science.gov (United States)

    Rojas, Marcela; Malard, Julien; Adamowski, Jan; Carrera, Jaime Luis; Maas, Raúl

    2017-04-01

    While it is known that climate change will impact future plant-pest population dynamics, potentially affecting crop damage, agroforestry with its enhanced biodiversity is said to reduce the outbreaks of pest insects by providing natural enemies for the control of pest populations. This premise is known in the literature as the natural enemy hypothesis and has been widely studied qualitatively. However, disagreement still exists on whether biodiversity enhancement reduces pest outbreaks, showing the need to quantitatively understand the mechanisms behind the interactions between pests and natural enemies, also known as trophic interactions. Crop pest models that study insect population dynamics in agroforestry contexts are very rare, and pest models that take trophic interactions into account are even rarer. This may be due to the difficulty of representing complex food webs in a quantifiable model. There is therefore a need for validated food web models that allow users to predict the response of these webs to changes in climate in agroforestry systems. In this study we present Tiko'n, a Python-based software whose API allows users to rapidly build and validate trophic web models; the program uses a Bayesian inference approach to calibrate the models according to field data, allowing for the reuse of literature data from various sources and reducing the need for extensive field data collection. Tiko'n was run using coffee leaf miner (Leucoptera coffeella) and associated parasitoid data from a shaded coffee plantation, showing the mechanisms of insect population dynamics within a tri-trophic food web in an agroforestry system.

  19. ABC of SV: Limited Information Likelihood Inference in Stochastic Volatility Jump-Diffusion Models

    DEFF Research Database (Denmark)

    Creel, Michael; Kristensen, Dennis

    We develop novel methods for estimation and filtering of continuous-time models with stochastic volatility and jumps using so-called Approximate Bayesian Computation, which builds likelihoods based on limited information. The proposed estimators and filters are computationally attractive relative...... to standard likelihood-based versions since they rely on low-dimensional auxiliary statistics and so avoid computation of high-dimensional integrals. Despite their computational simplicity, we find that estimators and filters perform well in practice and lead to precise estimates of model parameters...... stochastic volatility model for the dynamics of the S&P 500 equity index. We find evidence of the presence of a dynamic jump rate and in favor of a structural break in parameters at the time of the recent financial crisis. We find evidence that possible measurement error in log price is small and has little...

  20. Context updates are hierarchical

    Directory of Open Access Journals (Sweden)

    Anton Karl Ingason

    2016-10-01

    Full Text Available This squib studies the order in which elements are added to the shared context of interlocutors in a conversation. It focuses on context updates within one hierarchical structure and argues that structurally higher elements are entered into the context before lower elements, even if the structurally higher elements are pronounced after the lower elements. The crucial data are drawn from a comparison of relative clauses in two head-initial languages, English and Icelandic, and two head-final languages, Korean and Japanese. The findings have consequences for any theory of a dynamic semantics.

  1. ON THE LIKELIHOOD OF PLANET FORMATION IN CLOSE BINARIES

    Energy Technology Data Exchange (ETDEWEB)

    Jang-Condell, Hannah, E-mail: hjangcon@uwyo.edu [Department of Physics and Astronomy, University of Wyoming, 1000 East University, Department 3905, Laramie, WY 82071 (United States)

    2015-02-01

    To date, several exoplanets have been discovered orbiting stars with close binary companions (a ≲ 30 AU). The fact that planets can form in these dynamically challenging environments implies that planet formation must be a robust process. The initial protoplanetary disks in these systems from which planets must form should be tidally truncated to radii of a few AU, which indicates that the efficiency of planet formation must be high. Here, we examine the truncation of circumstellar protoplanetary disks in close binary systems, studying how the likelihood of planet formation is affected over a range of disk parameters. If the semimajor axis of the binary is too small or its eccentricity is too high, the disk will have too little mass for planet formation to occur. However, we find that the stars in the binary systems known to have planets should have once hosted circumstellar disks that were capable of supporting planet formation despite their truncation. We present a way to characterize the feasibility of planet formation based on binary orbital parameters such as stellar mass, companion mass, eccentricity, and semimajor axis. Using this measure, we can quantify the robustness of planet formation in close binaries and better understand the overall efficiency of planet formation in general.

  2. Approximate Maximum Likelihood Commercial Bank Loan Management Model

    Directory of Open Access Journals (Sweden)

    Godwin N.O. Asemota

    2009-01-01

    Full Text Available Problem statement: Loan management is a very complex and yet vitally important aspect of any commercial bank's operations. The balance sheet position shows the main sources of funds as deposits and shareholders' contributions. Approach: In order to operate profitably, remain solvent and consequently grow, a commercial bank needs to properly manage its excess cash to yield returns in the form of loans. Results: The above are achieved if the bank can honor depositors' withdrawals at all times and also grant loans to credible borrowers. This is so because loans are the main portfolios of a commercial bank that yield the highest rate of returns. Commercial banks and the environment in which they operate are dynamic. So, any attempt to model their behavior without including some elements of uncertainty would be less than desirable. The inclusion of an uncertainty factor is now possible with the advent of stochastic optimal control theories. Thus, an approximate maximum likelihood algorithm with variable forgetting factor was used to model the loan management behavior of a commercial bank in this study. Conclusion: The results showed that the uncertainty factor employed in the stochastic modeling enabled us to adaptively control loan demand as well as fluctuating cash balances in the bank. However, this loan model can also visually aid commercial bank managers' planning decisions by allowing them to competently determine excess cash and invest this excess cash as loans to earn more assets without jeopardizing public confidence.
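
    The study's variable-forgetting-factor algorithm is not reproduced in the abstract, but the flavor of recursive estimation with discounting of old data can be sketched with ordinary recursive least squares and a fixed forgetting factor. The first-order loan-demand model and all numbers below are illustrative assumptions:

    ```python
    import numpy as np

    def rls(phi_seq, y_seq, n_params, lam=0.98):
        """Recursive least squares with exponential forgetting factor lam."""
        theta = np.zeros(n_params)                 # parameter estimates
        P = np.eye(n_params) * 1e3                 # covariance of the estimates
        for phi, y in zip(phi_seq, y_seq):
            k = P @ phi / (lam + phi @ P @ phi)    # gain vector
            theta = theta + k * (y - phi @ theta)  # correct with prediction error
            P = (P - np.outer(k, phi @ P)) / lam   # discount old information
        return theta

    rng = np.random.default_rng(0)
    u = rng.standard_normal(500)                   # e.g. deposit inflows (toy)
    y = np.zeros(500)                              # e.g. loan demand (toy)
    for t in range(1, 500):
        y[t] = 0.8 * y[t - 1] + 0.5 * u[t] + 0.05 * rng.standard_normal()
    phi_seq = [np.array([y[t - 1], u[t]]) for t in range(1, 500)]
    print(rls(phi_seq, y[1:], n_params=2))         # approx [0.8, 0.5]
    ```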

  3. Planck 2013 results. XV. CMB power spectra and likelihood

    DEFF Research Database (Denmark)

    Tauber, Jan; Bartlett, J.G.; Bucher, M.

    2014-01-01

    This paper presents the Planck 2013 likelihood, a complete statistical description of the two-point correlation function of the CMB temperature fluctuations that accounts for all known relevant uncertainties, both instrumental and astrophysical in nature. We use this likelihood to derive our best...

  4. EMPIRICAL LIKELIHOOD FOR LINEAR MODELS UNDER m-DEPENDENT ERRORS

    Institute of Scientific and Technical Information of China (English)

    Qin Yongsong; Jiang Bo; Li Yufang

    2005-01-01

    In this paper, the empirical likelihood confidence regions for the regression coefficient in a linear model are constructed under m-dependent errors. It is shown that the blockwise empirical likelihood is a good way to deal with dependent samples.

  5. Empirical likelihood inference for diffusion processes with jumps

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    In this paper, we consider the empirical likelihood inference for the jump-diffusion model. We construct the confidence intervals based on the empirical likelihood for the infinitesimal moments in the jump-diffusion models. They are better than the confidence intervals which are based on the asymptotic normality of point estimates.

  6. TURBO DECODER USING LOCAL SUBSIDIARY MAXIMUM LIKELIHOOD DECODING IN PRIOR ESTIMATION OF THE EXTRINSIC INFORMATION

    Institute of Scientific and Technical Information of China (English)

    Yang Fengfan

    2004-01-01

    A new technique for turbo decoders is proposed by using a local subsidiary maximum likelihood decoding and a family of probability distributions for the extrinsic information. The optimal distribution of the extrinsic information is dynamically specified for each component decoder. The simulation results show that the iterative decoder with the new technique outperforms the decoder with the traditional Gaussian approach for the extrinsic information under the same conditions.

  7. INTERACTING MULTIPLE MODEL ALGORITHM BASED ON JOINT LIKELIHOOD ESTIMATION

    Institute of Scientific and Technical Information of China (English)

    Sun Jie; Jiang Chaoshu; Chen Zhuming; Zhang Wei

    2011-01-01

    A novel approach is proposed for the estimation of likelihood in the Interacting Multiple-Model (IMM) filter. In this approach, the actual innovation, based on a mismatched model, can be formulated as the sum of the theoretical innovation based on a matched model and the distance between the matched and mismatched models, whose probability distributions are known. The joint likelihood of the innovation sequence can be estimated by convolution of the two known probability density functions. The likelihood of the tracking models can then be calculated by the conditional probability formula. Compared with the conventional likelihood estimation method, the proposed method improves the estimation accuracy of the likelihood and the robustness of IMM, especially when maneuvers occur.
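
    The construction described above can be illustrated numerically: if the actual innovation is the sum of the matched-model innovation and the model distance, each with a known density, then its density is their convolution. All distributions and the observed value below are illustrative assumptions, not those of the paper:

    ```python
    import numpy as np

    x = np.linspace(-10.0, 10.0, 2001)
    dx = x[1] - x[0]

    def gauss(x, mu, sigma):
        return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

    innovation_pdf = gauss(x, 0.0, 1.0)  # theoretical innovation, matched model
    distance_pdf = gauss(x, 2.0, 0.5)    # distance between matched/mismatched models
    # Density of the actual (mismatched-model) innovation: the convolution.
    mismatched_pdf = np.convolve(innovation_pdf, distance_pdf, mode="same") * dx

    nu = 1.7                             # observed innovation (illustrative)
    likelihood = np.interp(nu, x, mismatched_pdf)
    print(likelihood)
    ```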

  8. Silence that can be dangerous: a vignette study to assess healthcare professionals' likelihood of speaking up about safety concerns.

    Science.gov (United States)

    Schwappach, David L B; Gehring, Katrin

    2014-01-01

    To investigate the likelihood of speaking up about patient safety in oncology and to clarify the effect of clinical and situational context factors on the likelihood of voicing concerns. 1013 nurses and doctors in oncology rated four clinical vignettes describing coworkers' errors and rule violations in a self-administered factorial survey (65% response rate). Multiple regression analysis was used to model the likelihood of speaking up as outcome of vignette attributes, responders' evaluations of the situation and personal characteristics. Respondents reported a high likelihood of speaking up about patient safety but the variation between and within types of errors and rule violations was substantial. Staff without managerial function reported significantly higher levels of decision difficulty and discomfort with speaking up. Based on the information presented in the vignettes, 74%-96% would speak up towards a supervisor failing to check a prescription, 45%-81% would point a coworker to a missed hand disinfection, 82%-94% would speak up towards nurses who violate a safety rule in medication preparation, and 59%-92% would question a doctor violating a safety rule in lumbar puncture. Several vignette attributes predicted the likelihood of speaking up. Perceived potential harm, anticipated discomfort, and decision difficulty were significant predictors of the likelihood of speaking up. Clinicians' willingness to speak up about patient safety is considerably affected by contextual factors. Physicians and nurses without managerial function report substantial discomfort with speaking up. Oncology departments should provide staff with clear guidance and trainings on when and how to voice safety concerns.

  9. Probability calculus for quantitative HREM. Part II: entropy and likelihood concepts.

    Science.gov (United States)

    Möbus, G

    2000-12-01

    The technique of extracting atomic coordinates from HREM images by R-factor refinement via iterative simulation and global optimisation is described in the context of probability density estimations for unknown parameters. In the second part of this two-part paper we compare maximum likelihood and maximum entropy techniques with respect to their suitability for application within HREM. We outline practical difficulties of likelihood estimation and present a synthesis of two point-cloud techniques as a recommendable solution. This R-factor refinement with independent Monte-Carlo error calibration is a highly versatile method which allows adaptation to the special needs of HREM. Unlike simple text-book estimation methods, there is no requirement here on the noise being additive, uncorrelated, or Gaussian. It also becomes possible to account for a subset of systematic errors.

  10. Recursive Pathways to Marginal Likelihood Estimation with Prior-Sensitivity Analysis

    CERN Document Server

    Cameron, Ewan

    2013-01-01

    We investigate the utility to contemporary Bayesian studies of recursive, Gauss-Seidel-type pathways to marginal likelihood estimation characterized by reverse logistic regression and the density of states. Through a pair of illustrative, numerical examples (including mixture modeling of the well-known 'galaxy dataset') we highlight both the remarkable diversity of bridging schemes amenable to recursive normalization and the notable efficiency of the resulting pseudo-mixture densities for gauging prior-sensitivity in the model selection context. Our key theoretical contributions show the connection between the nested sampling identity and the density of states. Further, we introduce a novel heuristic ('thermodynamic integration via importance sampling') for qualifying the role of the bridging sequence in marginal likelihood estimation. An efficient pseudo-mixture density scheme for harnessing the information content of otherwise discarded draws in ellipse-based nested sampling is also introduced.

  11. Wine authenticity verification as a forensic problem: an application of likelihood ratio test to label verification.

    Science.gov (United States)

    Martyna, Agnieszka; Zadora, Grzegorz; Stanimirova, Ivana; Ramos, Daniel

    2014-05-01

    The aim of the study was to investigate the applicability of the likelihood ratio (LR) approach for verifying the authenticity of 178 samples of 3 Italian wine brands: Barolo, Barbera, and Grignolino, described by 27 parameters characterizing their chemical composition. Since the problem of product authenticity may be of forensic interest, the likelihood ratio approach, expressing the role of the forensic expert, was proposed for determining the true origin of wines. It allows the evidence to be analysed in the context of two hypotheses: that the object belongs to one wine brand or to the other. Various LR models were the subject of the research and their accuracy was evaluated by the empirical cross-entropy (ECE) approach. The rates of correct classifications for the proposed models were higher than 90% and their performance evaluated by ECE was satisfactory.
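
    A minimal sketch of such a two-hypothesis likelihood ratio, assuming (unlike the paper's more elaborate LR models) a simple multivariate Gaussian model per brand; the two-feature training data are illustrative placeholders:

    ```python
    import numpy as np
    from scipy.stats import multivariate_normal

    def likelihood_ratio(x, train_h1, train_h2):
        """LR = p(x | H1: brand 1) / p(x | H2: brand 2), Gaussian models."""
        f1 = multivariate_normal(train_h1.mean(axis=0),
                                 np.cov(train_h1, rowvar=False))
        f2 = multivariate_normal(train_h2.mean(axis=0),
                                 np.cov(train_h2, rowvar=False))
        return f1.pdf(x) / f2.pdf(x)

    rng = np.random.default_rng(1)
    barolo = rng.normal([13.5, 2.8], 0.3, size=(60, 2))   # toy (alcohol, acidity)
    barbera = rng.normal([12.8, 3.3], 0.3, size=(60, 2))
    lr = likelihood_ratio(np.array([13.4, 2.9]), barolo, barbera)
    # lr >> 1 supports the Barolo hypothesis; lr << 1 supports Barbera.
    print(lr)
    ```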

  12. Maximum-likelihood estimation of haplotype frequencies in nuclear families.

    Science.gov (United States)

    Becker, Tim; Knapp, Michael

    2004-07-01

    The importance of haplotype analysis in the context of association fine mapping of disease genes has grown steadily over the last years. Since experimental methods to determine haplotypes on a large scale are not available, phase has to be inferred statistically. For individual genotype data, several reconstruction techniques and many implementations of the expectation-maximization (EM) algorithm for haplotype frequency estimation exist. Recent research work has shown that incorporating available genotype information of related individuals largely increases the precision of haplotype frequency estimates. We, therefore, implemented a highly flexible program written in C, called FAMHAP, which calculates maximum likelihood estimates (MLEs) of haplotype frequencies from general nuclear families with an arbitrary number of children via the EM-algorithm for up to 20 SNPs. For more loci, we have implemented a locus-iterative mode of the EM-algorithm, which gives reliable approximations of the MLEs for up to 63 SNP loci, or less when multi-allelic markers are incorporated into the analysis. Missing genotypes can be handled as well. The program is able to distinguish cases (haplotypes transmitted to the first affected child of a family) from pseudo-controls (non-transmitted haplotypes with respect to the child). We tested the performance of FAMHAP and the accuracy of the obtained haplotype frequencies on a variety of simulated data sets. The implementation proved to work well when many markers were considered and no significant differences between the estimates obtained with the usual EM-algorithm and those obtained in its locus-iterative mode were observed. We conclude from the simulations that the accuracy of haplotype frequency estimation and reconstruction in nuclear families is very reliable in general and robust against missing genotypes.
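
    The EM iteration underlying haplotype frequency estimation can be sketched in its textbook form, reduced to two biallelic loci and unrelated individuals; FAMHAP's family-based and multi-locus machinery is not reproduced, and the genotype counts below are illustrative:

    ```python
    import numpy as np

    # Haplotype indices over loci (A/a, B/b): 0=AB, 1=Ab, 2=aB, 3=ab.
    def em_haplotypes(known_hap_counts, n_double_het, tol=1e-9):
        freqs = np.full(4, 0.25)                     # start from uniform
        while True:
            # E-step: a double heterozygote is AB/ab with probability w,
            # Ab/aB with probability 1 - w, given current frequencies.
            w = freqs[0] * freqs[3] / (freqs[0] * freqs[3] + freqs[1] * freqs[2])
            expected = known_hap_counts.astype(float).copy()
            expected[[0, 3]] += n_double_het * w
            expected[[1, 2]] += n_double_het * (1.0 - w)
            # M-step: frequencies are renormalized expected counts.
            new_freqs = expected / expected.sum()
            if np.abs(new_freqs - freqs).max() < tol:
                return new_freqs
            freqs = new_freqs

    # Illustrative data: haplotype counts resolvable without phase ambiguity,
    # plus 30 double-heterozygous individuals of unknown phase.
    print(em_haplotypes(np.array([120, 40, 30, 50]), n_double_het=30))
    ```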

  13. Operation Context

    DEFF Research Database (Denmark)

    Stüben, Henning; Tietjen, Anne

    2006-01-01

    Abstract: This paper seeks to challenge the notion of context from an operational perspective. Can we grasp the forces that shape the complex conditions for an architectural or urban design within the notion of context? By shifting the gaze towards the agency of architecture, contextual analysis...

  14. Maximum likelihood estimation for semiparametric density ratio model.

    Science.gov (United States)

    Diao, Guoqing; Ning, Jing; Qin, Jing

    2012-06-27

    In the statistical literature, the conditional density model specification is commonly used to study regression effects. One attractive model is the semiparametric density ratio model, under which the conditional density function is the product of an unknown baseline density function and a known parametric function containing the covariate information. This model has a natural connection with generalized linear models and is closely related to biased sampling problems. Despite the attractive features and importance of this model, most existing methods are too restrictive since they are based on multi-sample data or conditional likelihood functions. The conditional likelihood approach can eliminate the unknown baseline density but cannot estimate it. We propose efficient estimation procedures based on the nonparametric likelihood. The nonparametric likelihood approach allows for general forms of covariates and estimates the regression parameters and the baseline density simultaneously. Therefore, the nonparametric likelihood approach is more versatile than the conditional likelihood approach especially when estimation of the conditional mean or other quantities of the outcome is of interest. We show that the nonparametric maximum likelihood estimators are consistent, asymptotically normal, and asymptotically efficient. Simulation studies demonstrate that the proposed methods perform well in practical settings. A real example is used for illustration.

  15. The Context of Creating Space: Assessing the Likelihood of College LGBT Center Presence

    Science.gov (United States)

    Fine, Leigh E.

    2012-01-01

    LGBT (lesbian, gay, bisexual, and transgender) resource centers are campus spaces dedicated to the success of sexual minority students. However, only a small handful of American colleges and universities have such spaces. Political opportunity and resource mobilization theory can provide a useful framework for understanding what contextual factors…

  17. Marijuana consequences in a motivational context: Goal congruence reduces likelihood of taking steps toward change.

    Science.gov (United States)

    Simons, Jeffrey S; Joseph Clarke, C; Simons, Raluca M; Spelman, Philip J

    2016-01-01

    This study tested a model of marijuana use, problems, and motivation and barriers to change among a sample of 422 undergraduate students ages 18-25 (M=19.68, SD=1.60) who used marijuana at least once in the past 6 months. We tested a structural equation model (SEM) with use motives (i.e., coping, enhancement, and expansion), perceived use utility, and gender as exogenous variables predicting marijuana use behavior (i.e., use and problems), motivation to change (i.e., problem recognition and perceived costs and benefits of change), and the ultimate outcome, taking steps to reduce marijuana use. Controlling for level of use and problems, expansion motives had a direct effect on increased perceived costs of change and enhancement motives had direct inverse effects on problem recognition and perceived benefits of change. However, the total effect of expansion motives on taking steps was not significant. The perceived role of marijuana in achieving personal strivings (i.e., use utility) was inversely associated with problem recognition, perceived benefits of change, and taking steps toward change. In contrast, coping motives, despite being associated with greater perceived costs of change, were positively associated with taking steps. Problem recognition was positively associated with both increased perceived costs and benefits of reducing marijuana use, reflecting individuals' ambivalence about change. As expected, perceived benefits and costs of reducing use were positively and negatively associated with taking steps toward changing marijuana use, respectively. The results identify individual difference factors that contribute to motivation for change and are consistent with motivational models of change readiness. These results highlight the extent to which integration of marijuana use with personal goal achievement may interfere with taking steps to change use patterns despite associated negative consequences.

  18. Context modification in action

    NARCIS (Netherlands)

    Visser, A.

    2008-01-01

    In this paper we develop the positive fragment of Context Modification Logic. This logical system is a variant of Dynamic Predicate Logic that employs multiple-access variables and that treats argument places in the same way as Latin. The positive fragment is purely incremental.

  19. Las dinámicas interactivas en el ámbito universitario: el clima de aula / The interactive dynamics in the university context: the classroom environment

    Directory of Open Access Journals (Sweden)

    Zulay Pereira Pérez

    2010-10-01

    Full Text Available Received 28 July 2009 • Accepted 02 December 2009 • Corrected 20 March 2010. Resumen (translated). Interpersonal relationships in the human sphere take on vital meaning from birth and are shaped through processes of interaction and socialization. The educational context is not exempt from communicative and interactive processes. It is in the classroom where, more specifically, these processes can be identified. The classroom can be understood as the physical-human space in which dynamics unfold from the interactions between the teaching staff and the student group, the contents, the learning strategies and the classroom climate generated from them; all aspects which, as part of the teaching and learning processes, are present in the classroom setting. It is interesting to analyze the classroom climate and the interactive dynamics that develop within it, regardless of the students' age, whether infants, adolescents or adults. In this particular case, the classroom climate in the university context is analyzed, on the understanding that the interactive dynamics that develop in the classroom determine an environment that is conducive, or not, to the teaching and learning process, something that must be considered if one opts for comprehensive, quality education. Abstract. Interpersonal relationships in human communities have gained great value since the beginning of mankind; these relationships are constructed through interaction and socialization. The educational context is not exempt from these interactive and communicative processes, and it is specifically in the classroom where they can be found. The classroom can be identified as a physical and humane space, in which dynamics are developed from the interactions between teachers and students, learning content, learning strategies and the class environment. All of these aspects are presented in the classroom as part

  20. Maximum Likelihood Estimation of the Identification Parameters and Its Correction

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    By taking a subsequence out of the input-output sequence of a system polluted by white noise, an independent observation sequence and its probability density are obtained, and then a maximum likelihood estimate of the identification parameters is given. In order to decrease the asymptotic error, a corrector of maximum likelihood (CML) estimation with its recursive algorithm is given. It has been proved that the corrector has smaller asymptotic error than the least squares methods. A simulation example shows that the corrector of maximum likelihood estimation approximates the true parameters with higher precision than the least squares methods.

  1. MAXIMUM LIKELIHOOD ESTIMATION IN GENERALIZED GAMMA TYPE MODEL

    Directory of Open Access Journals (Sweden)

    Vinod Kumar

    2010-01-01

    Full Text Available In the present paper, the maximum likelihood estimates of the two parameters of a generalized gamma type model have been obtained directly by solving the likelihood equations as well as by reparametrizing the model first and then solving the likelihood equations (as done by Prentice, 1974) for fixed values of the third parameter. It is found that reparametrization neither reduces the bulk nor the complexity of the calculations, as claimed by Prentice (1974). The procedure has been illustrated with the help of an example. The distribution of the MLE of q along with its properties has also been obtained.

  2. Context in a wider context

    Directory of Open Access Journals (Sweden)

    John Traxler

    2011-07-01

    Full Text Available This paper attempts to review and reconsider the role of context in mobile learning and starts by outlining definitions of context-aware mobile learning as the technologies have become more mature, more robust and more widely available and as the notion of context has become progressively richer. The future role of context-aware mobile learning is considered within the context of the future of mobile learning as it moves from the challenges and opportunities of pedagogy and technology to the challenges and opportunities of policy, scale, sustainability, equity and engagement with augmented reality, «blended learning», «learner devices», «user-generated contexts» and the «internet of things». This is essentially a perspective on mobile learning, and other forms of technology-enhanced learning (TEL), where educators and their institutions set the agenda and manage change. There are, however, other perspectives on context. The increasing availability and use of smart-phones and other personal mobile devices with similar powerful functionality means that the experience of context for many people, in the form of personalized or location-based services, is an increasingly social and informal experience, rather than a specialist or educational experience. This is part of the transformative impact of mobility and connectedness on our societies brought about by these universal, ubiquitous and pervasive technologies. This paper contributes a revised understanding of context in the wider context (sic) of the transformations taking place in our societies. These are subtle but pervasive transformations of jobs, work and the economy, of our sense of time, space and place, of knowing and learning, and of community and identity. This leads to a radical reconsideration of context as the notions of ‹self› and ‹other› are transformed.

  3. Context Awareness

    DEFF Research Database (Denmark)

    Brønsted, Jeppe

    ...understanding. Traditionally, this kind of information has not been present in information systems, but with the wave of Pervasive Computing and Communication things are slowly beginning to change. One of the goals of context aware computing is to exploit this extra resource of information, otherwise...... reserved for humans, to improve the interaction with and use of IT systems. Context information from different sources is combined, and the system suggests relevant actions or performs them automatically. This document describes the concepts of context and context awareness and why they are important......

  5. Maximum Likelihood Factor Structure of the Family Environment Scale.

    Science.gov (United States)

    Fowler, Patrick C.

    1981-01-01

    Presents the maximum likelihood factor structure of the Family Environment Scale. The first bipolar dimension, "cohesion v conflict," measures relationship-centered concerns, while the second unipolar dimension is an index of "organizational and control" activities. (Author)

  6. Likelihood Inference for a Fractionally Cointegrated Vector Autoregressive Model

    DEFF Research Database (Denmark)

    Johansen, Søren; Nielsen, Morten Ørregaard

    2012-01-01

    We consider model based inference in a fractionally cointegrated (or cofractional) vector autoregressive model with a restricted constant term, based on the Gaussian likelihood conditional on initial values. The model nests the I(d) VAR model. We give conditions on the parameters......likelihood estimators. To this end we prove weak convergence of the conditional likelihood as a continuous stochastic...... process in the parameters when errors are i.i.d. with suitable moment conditions and initial values are bounded. When the limit is deterministic this implies uniform convergence in probability of the conditional likelihood function. If the true value b0>1/2, we prove that the limit distribution of (ß...

  7. Likelihood Inference for a Nonstationary Fractional Autoregressive Model

    DEFF Research Database (Denmark)

    Johansen, Søren; Nielsen, Morten Ørregaard

    values X0-n, n = 0, 1, ..., under the assumption that the errors are i.i.d. Gaussian. We consider the likelihood and its derivatives as stochastic processes in the parameters, and prove that they converge in distribution when the errors are i.i.d. with suitable moment conditions and the initial values......This paper discusses model based inference in an autoregressive model for fractional processes based on the Gaussian likelihood. The model allows for the process to be fractional of order d or d - b; where d ≥ b > 1/2 are parameters to be estimated. We model the data X1, ..., XT given the initial...... are bounded. We use this to prove existence and consistency of the local likelihood estimator, and to find the asymptotic distribution of the estimators and the likelihood ratio test of the associated fractional unit root hypothesis, which contains the fractional Brownian motion of type II...

  8. Likelihood inference for a nonstationary fractional autoregressive model

    DEFF Research Database (Denmark)

    Johansen, Søren; Nielsen, Morten Ørregaard

    values X0-n, n = 0, 1,...,under the assumption that the errors are i.i.d. Gaussian. We consider the likelihood and its derivatives as stochastic processes in the parameters, and prove that they converge in distribution when the errors are i.i.d. with suitable moment conditions and the initial values......This paper discusses model based inference in an autoregressive model for fractional processes based on the Gaussian likelihood. The model allows for the process to be fractional of order d or d-b; where d ≥ b > 1/2 are parameters to be estimated. We model the data X1,...,XT given the initial...... are bounded. We use this to prove existence and consistency of the local likelihood estimator, and to find the asymptotic distribution of the estimators and the likelihood ratio test of the associated fractional unit root hypothesis, which contains the fractional Brownian motion of type II....

  9. Empirical likelihood estimation of discretely sampled processes of OU type

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    This paper presents an empirical likelihood estimation procedure for parameters of the discretely sampled process of Ornstein-Uhlenbeck type. The proposed procedure is based on the conditional characteristic function, and the maximum empirical likelihood estimator is proved to be consistent and asymptotically normal. Moreover, this estimator is shown to be asymptotically efficient under some mild conditions. When the background driving Lévy process is of type A or B, we show that the intensity parameter can be exactly recovered, and we study the maximum empirical likelihood estimator with the plug-in estimated intensity parameter. Testing procedures based on the empirical likelihood ratio statistic are developed for parameters and for estimating equations, respectively. Finally, Monte Carlo simulations are conducted to demonstrate the performance of proposed estimators.
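
    For intuition, likelihood-based estimation for a discretely sampled Ornstein-Uhlenbeck process can be sketched with the exact Gaussian transition density standing in for the paper's empirical likelihood built from the conditional characteristic function; all parameter values and the data are illustrative:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def ou_negloglik(params, x, dt):
        theta, mu, sigma = params
        if theta <= 0 or sigma <= 0:
            return np.inf
        a = np.exp(-theta * dt)
        m = mu + (x[:-1] - mu) * a                       # conditional mean
        v = sigma ** 2 * (1.0 - a ** 2) / (2.0 * theta)  # conditional variance
        return 0.5 * np.sum(np.log(2 * np.pi * v) + (x[1:] - m) ** 2 / v)

    rng = np.random.default_rng(2)
    theta0, mu0, sigma0, dt, n = 0.5, 1.0, 0.2, 0.1, 5000
    a0 = np.exp(-theta0 * dt)
    sd0 = np.sqrt(sigma0 ** 2 * (1 - a0 ** 2) / (2 * theta0))
    x = np.empty(n); x[0] = mu0
    for t in range(n - 1):                               # exact OU transition
        x[t + 1] = mu0 + (x[t] - mu0) * a0 + sd0 * rng.standard_normal()

    fit = minimize(ou_negloglik, x0=[1.0, 0.0, 1.0], args=(x, dt),
                   method="Nelder-Mead")
    print(fit.x)                                         # approx [0.5, 1.0, 0.2]
    ```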

  10. Practical likelihood analysis for spatial generalized linear mixed models

    DEFF Research Database (Denmark)

    Bonat, W. H.; Ribeiro, Paulo Justiniano

    2016-01-01

    We investigate an algorithm for maximum likelihood estimation of spatial generalized linear mixed models based on the Laplace approximation. We compare our algorithm with a set of alternative approaches for two datasets from the literature. The Rhizoctonia root rot and the Rongelap are, respectiv...... of Laplace approximation include the computation of the maximized log-likelihood value, which can be used for model selection and tests, and the possibility to obtain realistic confidence intervals for model parameters based on profile likelihoods. The Laplace approximation also avoids the tuning...

  11. Young adult consumers' media usage and online purchase likelihood

    African Journals Online (AJOL)

    Young adult consumers' media usage and online purchase likelihood. ... in new media applications such as the internet, email, blogging, twitter and social networks. ... Convenience sampling resulted in 1 298 completed questionnaires.

  12. Posterior distributions for likelihood ratios in forensic science.

    Science.gov (United States)

    van den Hout, Ardo; Alberink, Ivo

    2016-09-01

    Evaluation of evidence in forensic science is discussed using posterior distributions for likelihood ratios. Instead of eliminating the uncertainty by integrating (Bayes factor) or by conditioning on parameter values, uncertainty in the likelihood ratio is retained by parameter uncertainty derived from posterior distributions. A posterior distribution for a likelihood ratio can be summarised by the median and credible intervals. Using the posterior mean of the distribution is not recommended. An analysis of forensic data for body height estimation is undertaken. The posterior likelihood approach has been criticised both theoretically and with respect to applicability. This paper addresses the latter and illustrates an interesting application area. Copyright © 2016 The Chartered Society of Forensic Sciences. Published by Elsevier Ireland Ltd. All rights reserved.
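
    The posterior-distribution view of the likelihood ratio can be sketched as follows: draw the unknown model parameters from their posteriors and summarize the induced LR distribution by its median and a credible interval, as the paper recommends (rather than by the posterior mean). The Beta-binomial setup and counts below are illustrative assumptions:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    draws = 100_000
    # Posterior draws for the probability of the observed feature under the
    # two hypotheses (conjugate Beta posteriors from illustrative counts):
    p_h1 = rng.beta(1 + 45, 1 + 5, draws)   # 45 of 50 "successes" under H1
    p_h2 = rng.beta(1 + 8, 1 + 42, draws)   # 8 of 50 under H2
    lr = p_h1 / p_h2                        # posterior draws of the LR
    print(np.median(lr), np.percentile(lr, [2.5, 97.5]))
    ```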

  13. Empirical Likelihood Ratio Confidence Interval for Positively Associated Series

    Institute of Scientific and Technical Information of China (English)

    Jun-jian Zhang

    2007-01-01

    Empirical likelihood is discussed by using the blockwise technique for strongly stationary, positively associated random variables. Our results show that the statistic is asymptotically chi-square distributed and the corresponding confidence interval can be constructed.

  14. Maximum likelihood estimation of finite mixture model for economic data

    Science.gov (United States)

    Phoong, Seuk-Yen; Ismail, Mohd Tahir

    2014-06-01

    A finite mixture model is a mixture model with a finite number of components. These models provide a natural representation of heterogeneity across a finite number of latent classes, and are also known as latent class models or unsupervised learning models. Recently, maximum likelihood estimation of finite mixture models has drawn considerable attention from statisticians, mainly because maximum likelihood estimation is a powerful statistical method that provides consistent estimates as the sample size increases to infinity. Maximum likelihood estimation is therefore used in the present paper to fit a finite mixture model in order to explore the relationship between nonlinear economic data. A two-component normal mixture model is fitted by maximum likelihood estimation to investigate the relationship between stock market prices and rubber prices for the sampled countries. The results show a negative relationship between rubber price and stock market price for Malaysia, Thailand, the Philippines and Indonesia.
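
    A short EM sketch for maximum likelihood fitting of a two-component normal mixture, the model class used above; the simulated data stand in for the stock market and rubber price series:

    ```python
    import numpy as np
    from scipy.stats import norm

    def em_mixture(x, n_iter=300):
        w = 0.5
        mu = np.array([x.min(), x.max()])
        sd = np.array([x.std(), x.std()])
        for _ in range(n_iter):
            # E-step: responsibility of component 2 for each observation
            d1 = (1 - w) * norm.pdf(x, mu[0], sd[0])
            d2 = w * norm.pdf(x, mu[1], sd[1])
            r = d2 / (d1 + d2)
            # M-step: weighted maximum likelihood updates
            w = r.mean()
            mu = np.array([np.average(x, weights=1 - r), np.average(x, weights=r)])
            sd = np.sqrt([np.average((x - mu[0]) ** 2, weights=1 - r),
                          np.average((x - mu[1]) ** 2, weights=r)])
        return w, mu, sd

    rng = np.random.default_rng(4)
    x = np.concatenate([rng.normal(-0.02, 0.01, 400), rng.normal(0.03, 0.02, 600)])
    print(em_mixture(x))
    ```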

  15. Context matters!

    DEFF Research Database (Denmark)

    Bojesen, Anders

    2004-01-01

    This paper explores how the context of competencies affects the way we see and value competence and how it thereby forms communication and strategies of action. The paper puts forward the view that the context of competence is often spoken of in incomprehensible terms and generally taken...... for granted and unproblematic, although it is agreed to be of great importance. By crystallising three different modes of contextualised competence thinking (prescriptive, descriptive and analytical) the paper shows that the underlying assumptions about context - the interaction between the individual...... and the social - have major consequences for the specific enactment of competence. The paper argues in favour of a second order observation strategy for the context of competence. But in doing so it also shows that prevailing second-order competence theories so far, in criticising (counter) positions (and...

  16. Determination of lift and drag characteristics of Space Shuttle Orbiter using maximum likelihood estimation technique

    Science.gov (United States)

    Trujillo, B. M.

    1986-01-01

    This paper presents the technique and results of maximum likelihood estimation used to determine lift and drag characteristics of the Space Shuttle Orbiter. Maximum likelihood estimation uses measurable parameters to estimate nonmeasurable parameters. The nonmeasurable parameters for this case are elements of a nonlinear, dynamic model of the orbiter. The estimated parameters are used to evaluate a cost function that computes the differences between the measured and estimated longitudinal parameters. The case presented is a dynamic analysis. This places less restriction on pitching motion and can provide additional information about the orbiter such as lift and drag characteristics at conditions other than trim, instrument biases, and pitching moment characteristics. In addition, an output of the analysis is an estimate of the values for the individual components of lift and drag that contribute to the total lift and drag. The results show that maximum likelihood estimation is a useful tool for analysis of Space Shuttle Orbiter performance and is also applicable to parameter analysis of other types of aircraft.

  17. Research of W5 Model Based on Dynamic Context Awareness (基于动态情境感知的W5模型研究)

    Institute of Scientific and Technical Information of China (English)

    王峰; 李石君

    2016-01-01

    Social network apps such as Twitter and Sina Micro-blog provide massive context information associated with user IDs (who), check-in times (when), GPS coordinates (where), topics (what) and incentives (why) of tweets (5W for short) for location based services. The availability of such data offers a good opportunity to study users' behavior and preferences. In this paper, we propose a W5 probabilistic model that exploits such context data through a joint probability distribution to discover users' dynamic behaviors from temporal, spatial and activity aspects. Our work is applied to user and location prediction. Experimental results on two real-world datasets, Geo-Text (GT) and Sina-Tweets (ST), show that the W5 model is effective in spatial-temporal prediction and outperforms state-of-the-art baselines such as W4 in accuracy for user prediction (UP: GT 3.75%, ST 6.54%) and location prediction (LP: GT 8.7%, ST 20.6%). The W5 model also achieves good performance in terms of temporal error and spatial distance error.

  18. Conditional likelihood inference in generalized linear mixed models.

    OpenAIRE

    Sartori, Nicola; Severini, T.A.

    2002-01-01

    Consider a generalized linear model with a canonical link function, containing both fixed and random effects. In this paper, we consider inference about the fixed effects based on a conditional likelihood function. It is shown that this conditional likelihood function is valid for any distribution of the random effects and, hence, the resulting inferences about the fixed effects are insensitive to misspecification of the random effects distribution. Inferences based on the conditional likelih...

  19. Sieve likelihood ratio inference on general parameter space

    Institute of Scientific and Technical Information of China (English)

    SHEN Xiaotong; SHI Jian

    2005-01-01

    In this paper, a theory of sieve likelihood ratio inference on general parameter spaces (including infinite dimensional ones) is studied. Under fairly general regularity conditions, the sieve log-likelihood ratio statistic is proved to be asymptotically χ2 distributed, which can be viewed as a generalization of the well-known Wilks' theorem. As an example, a semiparametric partial linear model is investigated.

  20. A notion of graph likelihood and an infinite monkey theorem

    CERN Document Server

    Banerji, Christopher R S; Severini, Simone

    2013-01-01

    We play with a graph-theoretic analogue of the folklore infinite monkey theorem. We define a notion of graph likelihood as the probability that a given graph is constructed by a monkey in a number of time steps equal to the number of vertices. We present an algorithm to compute this graph invariant and closed formulas for some infinite classes. We have to leave the computational complexity of the likelihood as an open problem.

  1. A notion of graph likelihood and an infinite monkey theorem

    Science.gov (United States)

    Banerji, Christopher R. S.; Mansour, Toufik; Severini, Simone

    2014-01-01

    We play with a graph-theoretic analogue of the folklore infinite monkey theorem. We define a notion of graph likelihood as the probability that a given graph is constructed by a monkey in a number of time steps equal to the number of vertices. We present an algorithm to compute this graph invariant and closed formulas for some infinite classes. We have to leave the computational complexity of the likelihood as an open problem.

  2. On the likelihood function of Gaussian max-stable processes

    KAUST Repository

    Genton, M. G.

    2011-05-24

    We derive a closed form expression for the likelihood function of a Gaussian max-stable process indexed by ℝ^d at p ≤ d+1 sites, d ≥ 1. We demonstrate the gain in efficiency in the maximum composite likelihood estimators of the covariance matrix from p=2 to p=3 sites in ℝ^2 by means of a Monte Carlo simulation study. © 2011 Biometrika Trust.

  3. Hybrid TOA/AOA Approximate Maximum Likelihood Mobile Localization

    OpenAIRE

    Mohamed Zhaounia; Mohamed Adnan Landolsi; Ridha Bouallegue

    2010-01-01

    This letter deals with a hybrid time-of-arrival/angle-of-arrival (TOA/AOA) approximate maximum likelihood (AML) wireless location algorithm. Thanks to the use of both TOA/AOA measurements, the proposed technique can rely on two base stations (BS) only and achieves better performance compared to the original approximate maximum likelihood (AML) method. The use of two BSs is an important advantage in wireless cellular communication systems because it avoids hearability problems and reduces netw...

  4. Tapered composite likelihood for spatial max-stable models

    KAUST Repository

    Sang, Huiyan

    2014-05-01

    Spatial extreme value analysis is useful to environmental studies, in which extreme value phenomena are of interest and meaningful spatial patterns can be discerned. Max-stable process models are able to describe such phenomena. This class of models is asymptotically justified to characterize the spatial dependence among extremes. However, likelihood inference is challenging for such models because their corresponding joint likelihood is unavailable and only bivariate or trivariate distributions are known. In this paper, we propose a tapered composite likelihood approach by utilizing lower dimensional marginal likelihoods for inference on parameters of various max-stable process models. We consider a weighting strategy based on a "taper range" to exclude distant pairs or triples. The "optimal taper range" is selected to maximize various measures of the Godambe information associated with the tapered composite likelihood function. This method substantially reduces the computational cost and improves the efficiency over equally weighted composite likelihood estimators. We illustrate its utility with simulation experiments and an analysis of rainfall data in Switzerland.
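
    The weighting strategy can be sketched schematically: pairs of sites farther apart than the taper range receive weight zero in the pairwise composite log-likelihood. For brevity, a bivariate Gaussian density with exponential covariance stands in below for the bivariate max-stable densities used in the paper, and the taper range is taken as given rather than selected by maximizing Godambe information measures:

    ```python
    import numpy as np
    from scipy.stats import multivariate_normal

    def tapered_pairwise_cl(params, y, coords, taper_range):
        sill, corr_range = params
        total = 0.0
        for i in range(len(y)):
            for j in range(i + 1, len(y)):
                d = np.linalg.norm(coords[i] - coords[j])
                if d > taper_range:          # taper: exclude distant pairs
                    continue
                c = sill * np.exp(-d / corr_range)
                cov = np.array([[sill, c], [c, sill]])
                total += multivariate_normal([0.0, 0.0], cov).logpdf([y[i], y[j]])
        return total

    rng = np.random.default_rng(7)
    coords = rng.uniform(0, 10, (25, 2))     # illustrative site locations
    y = rng.standard_normal(25)              # illustrative observations
    print(tapered_pairwise_cl((1.0, 2.0), y, coords, taper_range=3.0))
    ```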

  5. Improving and Evaluating Nested Sampling Algorithm for Marginal Likelihood Estimation

    Science.gov (United States)

    Ye, M.; Zeng, X.; Wu, J.; Wang, D.; Liu, J.

    2016-12-01

    With the growing impacts of climate change and human activities on the water cycle, an increasing amount of research focuses on the quantification of modeling uncertainty. Bayesian model averaging (BMA) provides a popular framework for quantifying conceptual model and parameter uncertainty. The ensemble prediction is generated by combining each plausible model's prediction, and each model carries a weight determined by the model's prior weight and marginal likelihood. Thus, the estimation of a model's marginal likelihood is crucial for reliable and accurate BMA prediction. The nested sampling estimator (NSE) is a newly proposed method for marginal likelihood estimation. NSE proceeds by gradually searching the parameter space from low-likelihood to high-likelihood regions, and this evolution is carried out iteratively via a local sampling procedure. Thus, the efficiency of NSE is dominated by the strength of the local sampling procedure. Currently, the Metropolis-Hastings (M-H) algorithm is often used for local sampling. However, M-H is not an efficient sampling algorithm for high-dimensional or complicated parameter spaces. To improve the efficiency of NSE, it is natural to incorporate the robust and efficient sampling algorithm DREAMzs into the local sampling step of NSE. The comparison results demonstrate that the improved NSE can improve the efficiency of marginal likelihood estimation significantly. However, both the improved and original NSEs suffer from heavy instability. In addition, the heavy computational cost of the huge number of model executions is overcome by using adaptive sparse grid surrogates.
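
    A toy version of the nested sampling estimator is sketched below, with plain rejection sampling from the prior standing in for the M-H or DREAMzs local moves discussed above; the Gaussian likelihood and uniform prior are illustrative, chosen so the true marginal likelihood, log(2π/100), is known:

    ```python
    import numpy as np
    from scipy.special import logsumexp

    rng = np.random.default_rng(5)

    def loglik(theta):
        return -0.5 * float(np.sum(theta ** 2))   # unnormalized Gaussian toy

    n_live, n_iter, dim = 400, 2500, 2
    live = rng.uniform(-5, 5, (n_live, dim))      # uniform prior on [-5, 5]^2
    ll = np.array([loglik(t) for t in live])
    log_z, log_x = -np.inf, 0.0
    for i in range(n_iter):
        worst = int(np.argmin(ll))
        log_x_new = -(i + 1) / n_live             # expected log prior-mass shrink
        log_w = ll[worst] + np.log(np.exp(log_x) - np.exp(log_x_new))
        log_z = np.logaddexp(log_z, log_w)        # accumulate evidence
        log_x = log_x_new
        while True:                               # local step: replace worst point
            cand = rng.uniform(-5, 5, dim)        # rejection sampling from prior
            if loglik(cand) > ll[worst]:
                live[worst], ll[worst] = cand, loglik(cand)
                break
    # Remaining mass carried by the final live points.
    log_z = np.logaddexp(log_z, logsumexp(ll) - np.log(n_live) + log_x)
    print(log_z, np.log(2 * np.pi / 100))         # estimate vs analytic value
    ```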

  6. Employee Likelihood of Purchasing Health Insurance using Fuzzy Inference System

    Directory of Open Access Journals (Sweden)

    Lazim Abdullah

    2012-01-01

    Full Text Available Many believe that employees' health and economic factors play an important role in their likelihood of purchasing health insurance. However, the decision to purchase health insurance is not a trivial matter, as many risk factors influence it. This paper presents a decision model using a fuzzy inference system to identify the likelihood of purchasing health insurance based on selected risk factors. To build the likelihoods, data from one hundred and twenty eight employees at five organizations under the purview of Kota Star Municipality, Malaysia were collected as input data. Three risk factors were considered as inputs to the system: age, salary and risk of having illness. The likelihood of purchasing health insurance was the output of the system, defined in three linguistic terms: Low, Medium and High. Input and output data were governed by the Mamdani inference rules of the system to decide the best linguistic term. The linguistic terms that describe the likelihood of purchasing health insurance were identified by the system based on the three risk factors. It was found that twenty seven employees were likely to purchase health insurance at a Low level and fifty six employees showed likelihoods at a High level. The use of a fuzzy inference system offers a possible new approach to identifying prospective health insurance purchasers.
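
    A hand-rolled sketch of a tiny Mamdani-style system for this task is given below; the membership functions, rule base and universes are illustrative assumptions, not those of the cited study:

    ```python
    import numpy as np

    def tri(x, a, b, c):
        """Triangular membership function with support [a, c] and peak at b."""
        return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

    def likelihood_of_purchase(age, salary, illness_risk):
        out = np.linspace(0, 1, 101)          # output universe: likelihood
        low = tri(out, -0.4, 0.0, 0.4)
        med = tri(out, 0.1, 0.5, 0.9)
        high = tri(out, 0.6, 1.0, 1.4)
        # Rule activations (min for AND); illustrative rule base:
        r_high = min(tri(age, 30, 45, 60), tri(illness_risk, 0.5, 1.0, 1.5))
        r_med = tri(salary, 2000, 4000, 6000)
        r_low = min(tri(age, 10, 20, 30), tri(illness_risk, -0.5, 0.0, 0.5))
        # Mamdani aggregation (max of clipped consequents), centroid defuzzify.
        agg = np.maximum.reduce([np.minimum(low, r_low),
                                 np.minimum(med, r_med),
                                 np.minimum(high, r_high)])
        return np.sum(agg * out) / np.sum(agg) if agg.sum() > 0 else 0.5

    print(likelihood_of_purchase(age=50, salary=4500, illness_risk=0.8))
    ```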

  7. Parametric likelihood inference for interval censored competing risks data.

    Science.gov (United States)

    Hudgens, Michael G; Li, Chenxi; Fine, Jason P

    2014-03-01

    Parametric estimation of the cumulative incidence function (CIF) is considered for competing risks data subject to interval censoring. Existing parametric models of the CIF for right censored competing risks data are adapted to the general case of interval censoring. Maximum likelihood estimators for the CIF are considered under the assumed models, extending earlier work on nonparametric estimation. A simple naive likelihood estimator is also considered that utilizes only part of the observed data. The naive estimator enables separate estimation of models for each cause, unlike full maximum likelihood in which all models are fit simultaneously. The naive likelihood is shown to be valid under mixed case interval censoring, but not under an independent inspection process model, in contrast with full maximum likelihood which is valid under both interval censoring models. In simulations, the naive estimator is shown to perform well and yield comparable efficiency to the full likelihood estimator in some settings. The methods are applied to data from a large, recent randomized clinical trial for the prevention of mother-to-child transmission of HIV.

  8. Optimized Large-Scale CMB Likelihood And Quadratic Maximum Likelihood Power Spectrum Estimation

    CERN Document Server

    Gjerløw, E; Eriksen, H K; Górski, K M; Gruppuso, A; Jewell, J B; Plaszczynski, S; Wehus, I K

    2015-01-01

    We revisit the problem of exact CMB likelihood and power spectrum estimation with the goal of minimizing computational cost through linear compression. This idea was originally proposed for CMB purposes by Tegmark et al. (1997), and here we develop it into a fully working computational framework for large-scale polarization analysis, adopting WMAP as a worked example. We compare five different linear bases (pixel space, harmonic space, noise covariance eigenvectors, signal-to-noise covariance eigenvectors and signal-plus-noise covariance eigenvectors) in terms of compression efficiency, and find that the computationally most efficient basis is the signal-to-noise eigenvector basis, which is closely related to the Karhunen-Loeve and Principal Component transforms, in agreement with previous suggestions. For this basis, the information in 6836 unmasked WMAP sky map pixels can be compressed into a smaller set of 3102 modes, with a maximum error increase of any single multipole of 3.8% at ℓ ≤ 32, and a...
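
    Compression into the signal-to-noise eigenvector basis amounts to solving a generalized eigenproblem and keeping the highest signal-to-noise modes. In the sketch below, random matrices stand in for the CMB signal and noise covariances, and the sizes are illustrative, not WMAP's:

    ```python
    import numpy as np
    from scipy.linalg import eigh

    rng = np.random.default_rng(6)
    npix, nkeep = 300, 60
    A = rng.standard_normal((npix, npix))
    S = A @ A.T / npix                        # toy signal covariance
    N = np.diag(rng.uniform(0.5, 1.5, npix))  # toy (diagonal) noise covariance

    # Generalized eigenproblem S b = lambda N b; eigh sorts ascending,
    # so the last columns are the highest signal-to-noise modes.
    eigvals, eigvecs = eigh(S, N)
    modes = eigvecs[:, -nkeep:]

    def compress(sky_map):
        """Project a map onto the retained S/N modes."""
        return modes.T @ sky_map

    compressed = compress(rng.standard_normal(npix))
    print(compressed.shape)                   # (60,) instead of (300,)
    ```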

  9. On learning context-free and context-sensitive languages.

    Science.gov (United States)

    Boden, M; Wiles, J

    2002-01-01

    The long short-term memory (LSTM) is not the only neural network which learns a context-sensitive language. Second-order sequential cascaded networks (SCNs) are able to induce, from a finite fragment of a context-sensitive language, means for processing strings outside the training set. The dynamical behavior of the SCN is qualitatively distinct from that observed in LSTM networks. Differences in performance and dynamics are discussed.

  10. Diagnostics of the Enterprise Economic Security and the Role of Information and Communication in the Context of Sustainability of Dynamical Equilibrium, Operation and Development

    Directory of Open Access Journals (Sweden)

    Skrynkovskyy Ruslan M.

    2015-03-01

    Full Text Available In this scientific article a system for diagnostics of enterprise economic security is developed. It has been determined that the main business indicators for diagnostics of enterprise economic security are: the level of the enterprise's financial stability (covering the indicator of the enterprise's provision with its own funds, the rate of independence, the financial stability indicator, the current assets to equity ratio, the liquid ratio, the absolute liquidity ratio, and the current liquidity ratio); the level of the enterprise's production activity (calculated from the output-capital ratio, capital-labor ratio, index of workforce productivity, quality indicators of fixed assets, production potential indicator, production profitability ratio, and input-output coefficient); the level of organizational and administrative activities of the enterprise (taking into account the ratio of administrative expenses to the rate of increase in production volume, the rate of saving of the managerial apparatus, and the rate of information processing); the level of employee loyalty to the enterprise (calculated from the rate of personnel turnover, rate of personnel continuity, indicator of employee satisfaction, personnel development indicator, and education level of employees); the level of scientific-technical and innovative activity of the enterprise (including the index of profitability of innovations and the profitability of expenditures on research and development works); the level of investment activity of the enterprise (including the index of investment profitability, rate of investment activity, rate of return on investments, and rate of investment in production); and the level of market reliability (calculated from the index of return on sales, index of return on net assets, index of marketability, and level of market research). It has been identified that an important role in the context of sustainability of dynamical equilibrium, operation and development of enterprises is played by information and communication.

  11. Rate of strong consistency of the maximum quasi-likelihood estimator in quasi-likelihood nonlinear models

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Quasi-likelihood nonlinear models (QLNM) include generalized linear models as a special case. Under some regularity conditions, the rate of strong consistency of the maximum quasi-likelihood estimator (MQLE) is obtained in QLNM. In an important case, this rate is $O(n^{-1/2}(\log\log n)^{1/2})$, which is exactly the rate given by the law of the iterated logarithm (LIL) for partial sums of i.i.d. variables, and thus cannot be improved.

  12. Context trees

    OpenAIRE

    Ganzinger, Harald; Nieuwenhuis, Robert; Nivela, Pilar

    2001-01-01

    Indexing data structures are well-known to be crucial for the efficiency of the current state-of-the-art theorem provers. Examples are \\emph{discrimination trees}, which are like tries where terms are seen as strings and common prefixes are shared, and \\emph{substitution trees}, where terms keep their tree structure and all common \\emph{contexts} can be shared. Here we describe a new indexing data structure, \\emph{context trees}, where, by means of a limited kind of conte...

  13. Model-driven Development for User-centric Well-being Support: From Dynamic Well-being Domain Models to Context-aware Applications

    NARCIS (Netherlands)

    Bosems, S.; van Sinderen, Marten J.

    Applications that can use information obtained through device sensors to alter their behavior are called context-aware. Design and development of such applications is currently done by modeling the application's context or by using novel requirements engineering methods. If the application is to

  14. Model-driven development for user-centric well-being support: from dynamic well-being domain models to context-aware applications

    NARCIS (Netherlands)

    Bosems, Steven; Sinderen, van Marten

    2015-01-01

    Applications that can use information obtained through device sensors to alter their behavior are called context-aware. Design and development of such applications is currently done by modeling the application's context or by using novel requirements engineering methods. If the application is to sup

  15. Maximum likelihood for genome phylogeny on gene content.

    Science.gov (United States)

    Zhang, Hongmei; Gu, Xun

    2004-01-01

    With the rapid growth of whole-genome data, reconstructing the phylogenetic relationships among different genomes has become a hot topic in comparative genomics. The maximum likelihood approach is one of several approaches and has been very successful. However, there were no reported applications to genome tree-making, mainly due to the lack of an analytical form for the probability model and/or the complicated calculation burden. In this paper we study the mathematical structure of the stochastic model of genome evolution and then develop a simplified likelihood function for observing a specific phylogenetic pattern in the four-genome situation using gene content information. We use the maximum likelihood approach to identify phylogenetic trees. Simulation results indicate that the proposed method works well and can identify trees at a high rate of correctness. A real data application provides satisfactory results. The approach developed in this paper can serve as the basis for reconstructing phylogenies of more than four genomes.

  16. Factors Influencing the Intended Likelihood of Exposing Sexual Infidelity.

    Science.gov (United States)

    Kruger, Daniel J; Fisher, Maryanne L; Fitzgerald, Carey J

    2015-08-01

    There is a considerable body of literature on infidelity within romantic relationships. However, there is a gap in the scientific literature on factors influencing the likelihood of uninvolved individuals exposing sexual infidelity. Therefore, we devised an exploratory study examining a wide range of potentially relevant factors. Based in part on evolutionary theory, we anticipated nine potential domains or types of influences on the likelihoods of exposing or protecting cheaters, including kinship, strong social alliances, financial support, previous relationship behaviors (including infidelity and abuse), potential relationship transitions, stronger sexual and emotional aspects of the extra-pair relationship, and disease risk. The pattern of results supported these predictions (N = 159 men, 328 women). In addition, there appeared to be a small positive bias for participants to report infidelity when provided with any additional information about the situation. Overall, this study contributes a broad initial description of factors influencing the predicted likelihood of exposing sexual infidelity and encourages further studies in this area.

  17. Joint analysis of prevalence and incidence data using conditional likelihood.

    Science.gov (United States)

    Saarela, Olli; Kulathinal, Sangita; Karvanen, Juha

    2009-07-01

    Disease prevalence is the combined result of duration, disease incidence, case fatality, and other mortality. If information is available on all these factors, and on fixed covariates such as genotypes, prevalence information can be utilized in the estimation of the effects of the covariates on disease incidence. Study cohorts that are recruited as cross-sectional samples and subsequently followed up for disease events of interest produce both prevalence and incidence information. In this paper, we make use of both types of information using a likelihood, which is conditioned on survival until the cross section. In a simulation study making use of real cohort data, we compare the proposed conditional likelihood method to a standard analysis where prevalent cases are omitted and the likelihood expression is conditioned on healthy status at the cross section.

  18. Penalized maximum likelihood estimation and variable selection in geostatistics

    CERN Document Server

    Chu, Tingjin; Wang, Haonan; 10.1214/11-AOS919

    2012-01-01

    We consider the problem of selecting covariates in spatial linear models with Gaussian process errors. Penalized maximum likelihood estimation (PMLE) that enables simultaneous variable selection and parameter estimation is developed and, for ease of computation, PMLE is approximated by one-step sparse estimation (OSE). To further improve computational efficiency, particularly with large sample sizes, we propose penalized maximum covariance-tapered likelihood estimation (PMLE$_{\\mathrm{T}}$) and its one-step sparse estimation (OSE$_{\\mathrm{T}}$). General forms of penalty functions with an emphasis on smoothly clipped absolute deviation are used for penalized maximum likelihood. Theoretical properties of PMLE and OSE, as well as their approximations PMLE$_{\\mathrm{T}}$ and OSE$_{\\mathrm{T}}$ using covariance tapering, are derived, including consistency, sparsity, asymptotic normality and the oracle properties. For covariance tapering, a by-product of our theoretical results is consistency and asymptotic normal...

  19. Generalized empirical likelihood methods for analyzing longitudinal data

    KAUST Repository

    Wang, S.

    2010-02-16

    Efficient estimation of parameters is a major objective in analyzing longitudinal data. We propose two generalized empirical likelihood based methods that take into consideration within-subject correlations. A nonparametric version of the Wilks theorem for the limiting distributions of the empirical likelihood ratios is derived. It is shown that one of the proposed methods is locally efficient among a class of within-subject variance-covariance matrices. A simulation study is conducted to investigate the finite sample properties of the proposed methods and compare them with the block empirical likelihood method by You et al. (2006) and the normal approximation with a correctly estimated variance-covariance. The results suggest that the proposed methods are generally more efficient than existing methods which ignore the correlation structure, and better in coverage compared to the normal approximation with correctly specified within-subject correlation. An application illustrating our methods and supporting the simulation study results is also presented.

  20. Penalized maximum likelihood estimation for generalized linear point processes

    DEFF Research Database (Denmark)

    Hansen, Niels Richard

    2010-01-01

    A generalized linear point process is specified in terms of an intensity that depends upon a linear predictor process through a fixed non-linear function. We present a framework where the linear predictor is parametrized by a Banach space and give results on Gateaux differentiability of the log-likelihood....... Of particular interest is when the intensity is expressed in terms of a linear filter parametrized by a Sobolev space. Using that the Sobolev spaces are reproducing kernel Hilbert spaces we derive results on the representation of the penalized maximum likelihood estimator in a special case and the gradient...... of the negative log-likelihood in general. The latter is used to develop a descent algorithm in the Sobolev space. We conclude the paper by extensions to multivariate and additive model specifications. The methods are implemented in the R-package ppstat....

  1. How to Maximize the Likelihood Function for a DSGE Model

    DEFF Research Database (Denmark)

    Andreasen, Martin Møller

    This paper extends two optimization routines to deal with objective functions for DSGE models. The optimization routines are i) a version of Simulated Annealing developed by Corana, Marchesi & Ridella (1987), and ii) the evolutionary algorithm CMA-ES developed by Hansen, Müller & Koumoutsakos (2003......). Following these extensions, we examine the ability of the two routines to maximize the likelihood function for a sequence of test economies. Our results show that the CMA- ES routine clearly outperforms Simulated Annealing in its ability to find the global optimum and in efficiency. With 10 unknown...... structural parameters in the likelihood function, the CMA-ES routine finds the global optimum in 95% of our test economies compared to 89% for Simulated Annealing. When the number of unknown structural parameters in the likelihood function increases to 20 and 35, then the CMA-ES routine finds the global...
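
    For a rough sense of the kind of comparison reported above, the sketch below runs CMA-ES (via the third-party pycma package, assumed installed as `cma`) and a bare-bones simulated annealing loop on a multimodal test function standing in for a DSGE negative log-likelihood. These are generic implementations, not the extended routines developed in the paper.

```python
import numpy as np
import cma  # pycma package (assumed installed: pip install cma)

def neg_log_lik(x):
    """Multimodal stand-in for a DSGE negative log-likelihood (Rastrigin function)."""
    x = np.asarray(x)
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

# --- CMA-ES ---
xbest, es = cma.fmin2(neg_log_lik, x0=np.full(10, 3.0), sigma0=1.0,
                      options={"verbose": -9})
print("CMA-ES best value:", neg_log_lik(xbest))

# --- bare-bones simulated annealing for comparison ---
rng = np.random.default_rng(1)
x = np.full(10, 3.0)
fx = neg_log_lik(x)
for k in range(20000):
    T = 10.0 * 0.9995**k                      # geometric cooling schedule
    cand = x + rng.normal(0.0, 0.3, size=x.size)
    fc = neg_log_lik(cand)
    # Accept improvements always, worse moves with Boltzmann probability
    if fc < fx or rng.random() < np.exp(-(fc - fx) / T):
        x, fx = cand, fc
print("SA best value:", fx)
```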

  2. Sediment dynamics and palaeo-environmental context at key stages in the Challenger cold-water coral mound formation: Clues from sediment deposits at the mound base

    Science.gov (United States)

    Huvenne, Veerle Ann Ida; Van Rooij, David; De Mol, Ben; Thierens, Mieke; O'Donnell, Rory; Foubert, Anneleen

    2009-12-01

    IODP Expedition 307, targeting the 160 m high Challenger Mound and its surroundings in the Porcupine Seabight, NE Atlantic, was the first occasion of scientific drilling of a cold-water coral carbonate mound. Such mound structures are found at several locations along the continental margin but are especially numerous off Ireland. All rooted on a common unconformity (RD1) and embedded in drift sediments, the mounds in the Porcupine Seabight remain enigmatic structures, and their initial trigger and formation mechanisms are still not entirely clear. This paper discusses the sedimentary environment during the initial stages of Challenger Mound, and at the start-up of the embedding sediment drift. The results are interpreted within the regional palaeo-environmental context. Based on detailed grain-size analyses and planktonic foraminifera assemblage counts, a 14-m interval overlying the regional base-of-mound unconformity RD1 is characterised at IODP Sites U1317 (on mound), U1316 (off mound), and U1318 (background site). Several sedimentary facies are identified and interpreted in relation to regional current dynamics. Using the foraminifera counts, existing age models for the initial stages of on-mound and off-mound sedimentation are refined. Sedimentation within the initial mound was characterised by a two-mode system, with the observed cyclicities related to glacial/interglacial stages. However, the contrast in environmental conditions between the stages was less extreme than observed in the most recent glacial/interglacial cycles, allowing continuous cold-water coral growth. This sustained presence of coral framework was the key factor for fast mound build-up, baffling sediments at periods of slack currents, and protecting them from renewed erosion during high-current events. The off-mound and background sedimentation consisted mainly of a succession of contourite beds, ranging from sandy contourites in the initial stages to muddy contourites higher up in the

  3. Empirical likelihood-based evaluations of Value at Risk models

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    Value at Risk (VaR) is a basic and very useful tool for measuring market risk. Numerous VaR models have been proposed in the literature. It is therefore of great interest to evaluate the efficiency of these models and to select the most appropriate one. In this paper, we propose to use the empirical likelihood approach to evaluate these models. Simulation results and real-life examples show that the empirical likelihood method is more powerful and more robust than some of the asymptotic methods available in the literature.

  4. LIKELIHOOD ESTIMATION OF PARAMETERS USING SIMULTANEOUSLY MONITORED PROCESSES

    DEFF Research Database (Denmark)

    Friis-Hansen, Peter; Ditlevsen, Ove Dalager

    2004-01-01

    The topic is maximum likelihood inference from several simultaneously monitored response processes of a structure, used to obtain knowledge about the parameters of other important but unmonitored response processes when the structure is subject to some Gaussian load field in space and time. The considered example is a ship sailing with a given speed through a Gaussian wave field.

  5. Unbinned likelihood maximisation framework for neutrino clustering in Python

    Energy Technology Data Exchange (ETDEWEB)

    Coenders, Stefan [Technische Universitaet Muenchen, Boltzmannstr. 2, 85748 Garching (Germany)

    2016-07-01

    Although an astrophysical neutrino flux has been detected with IceCube, the sources of astrophysical neutrinos remain hidden up to now. A detection of a neutrino point source would be a smoking gun for hadronic processes and the acceleration of cosmic rays. The search for neutrino sources has many degrees of freedom, for example steady versus transient, point-like versus extended sources, et cetera. Here, we introduce a Python framework designed for unbinned likelihood maximisations as used in searches for neutrino point sources by IceCube. By implementing source scenarios in a modular way, likelihood searches of various kinds can be implemented in a user-friendly way, without sacrificing speed or memory management.
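
    Unbinned point-source likelihoods of the kind this framework maximises typically take the per-event mixture form L(n_s) = Π_i [(n_s/N) S_i + (1 − n_s/N) B_i], with signal PDF values S_i, background PDF values B_i, and the number of signal events n_s as the free parameter. The sketch below maximises this one-parameter likelihood on synthetic event weights; it is a generic illustration, not the interface of the IceCube framework.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(42)
N = 5000
# Per-event signal and background PDF values (synthetic stand-ins)
S = rng.lognormal(mean=0.0, sigma=1.0, size=N)
B = np.ones(N)                     # flat background for simplicity

def neg_log_lik(ns):
    f = ns / N                     # signal fraction
    return -np.sum(np.log(f * S + (1.0 - f) * B))

res = minimize_scalar(neg_log_lik, bounds=(0.0, N), method="bounded")
ts = 2.0 * (neg_log_lik(0.0) - res.fun)   # likelihood-ratio test statistic
print(f"fitted n_s = {res.x:.1f}, TS = {ts:.2f}")
```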

  6. Semiparametric maximum likelihood for nonlinear regression with measurement errors.

    Science.gov (United States)

    Suh, Eun-Young; Schafer, Daniel W

    2002-06-01

    This article demonstrates semiparametric maximum likelihood estimation of a nonlinear growth model for fish lengths using imprecisely measured ages. Data on the species corvina reina, found in the Gulf of Nicoya, Costa Rica, consist of lengths and imprecise ages for 168 fish and precise ages for a subset of 16 fish. The statistical problem may therefore be classified as nonlinear errors-in-variables regression with internal validation data. Inferential techniques are based on ideas extracted from several previous works on semiparametric maximum likelihood for errors-in-variables problems. The example clarifies practical aspects of the associated computational, inferential, and data analytic techniques.

  7. Nearly Efficient Likelihood Ratio Tests of the Unit Root Hypothesis

    DEFF Research Database (Denmark)

    Jansson, Michael; Nielsen, Morten Ørregaard

    Seemingly absent from the arsenal of currently available "nearly efficient" testing procedures for the unit root hypothesis, i.e. tests whose local asymptotic power functions are indistinguishable from the Gaussian power envelope, is a test admitting a (quasi-)likelihood ratio interpretation. We...... show that the likelihood ratio unit root test derived in a Gaussian AR(1) model with standard normal innovations is nearly efficient in that model. Moreover, these desirable properties carry over to more complicated models allowing for serially correlated and/or non-Gaussian innovations....

  8. Modified maximum likelihood registration based on information fusion

    Institute of Scientific and Technical Information of China (English)

    Yongqing Qi; Zhongliang Jing; Shiqiang Hu

    2007-01-01

    The bias estimation of passive sensors is considered based on information fusion in a multi-platform multi-sensor tracking system. The unobservable problem of bearing-only tracking in the blind spot is analyzed. A modified maximum likelihood method, which uses the redundant information of the multi-sensor system to calculate the target position, is investigated to estimate the biases. Monte Carlo simulation results show that the modified method eliminates the effect of the unobservable problem in the blind spot and can estimate the biases more rapidly and accurately than the maximum likelihood method. It is statistically efficient since the standard deviation of the bias estimation errors meets the theoretical lower bounds.

  9. Likelihood-based inference for clustered line transect data

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus; Schweder, Tore

    2006-01-01

    The uncertainty in estimation of spatial animal density from line transect surveys depends on the degree of spatial clustering in the animal population. To quantify the clustering we model line transect data as independent thinnings of spatial shot-noise Cox processes. Likelihood-based inference...... is implemented using Markov chain Monte Carlo (MCMC) methods to obtain efficient estimates of spatial clustering parameters. Uncertainty is addressed using parametric bootstrap or by consideration of posterior distributions in a Bayesian setting. Maximum likelihood estimation and Bayesian inference are compared...

  10. Maximum likelihood estimation for life distributions with competing failure modes

    Science.gov (United States)

    Sidik, S. M.

    1979-01-01

    A general model for competing failure modes is presented, assuming that the location parameters for each mode are expressible as linear functions of the stress variables and that the failure modes act independently. The general form of the likelihood function and the likelihood equations are derived for the extreme value distributions, and solving these equations using nonlinear least squares techniques provides an estimate of the asymptotic covariance matrix of the estimators. Monte Carlo results indicate that, under appropriate conditions, the location parameters are nearly unbiased, the scale parameter is slightly biased, and the asymptotic covariances are rapidly approached.

  11. Approximated maximum likelihood estimation in multifractal random walks

    CERN Document Server

    Løvsletten, Ola

    2011-01-01

    We present an approximated maximum likelihood method for the multifractal random walk processes of [E. Bacry et al., Phys. Rev. E 64, 026103 (2001)]. The likelihood is computed using a Laplace approximation and a truncation in the dependency structure for the latent volatility. The procedure is implemented as a package in the R computer language. Its performance is tested on synthetic data and compared to an inference approach based on the generalized method of moments. The method is applied to estimate parameters for various financial stock indices.

  12. Parameter estimation in X-ray astronomy using maximum likelihood

    Science.gov (United States)

    Wachter, K.; Leach, R.; Kellogg, E.

    1979-01-01

    Methods of estimation of parameter values and confidence regions by maximum likelihood and Fisher efficient scores starting from Poisson probabilities are developed for the nonlinear spectral functions commonly encountered in X-ray astronomy. It is argued that these methods offer significant advantages over the commonly used alternatives called minimum chi-squared because they rely on less pervasive statistical approximations and so may be expected to remain valid for data of poorer quality. Extensive numerical simulations of the maximum likelihood method are reported which verify that the best-fit parameter value and confidence region calculations are correct over a wide range of input spectra.
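
    Starting from Poisson probabilities as above leads to minimising the Cash statistic, C = 2 Σ_i (m_i − d_i ln m_i), in place of chi-squared. A minimal sketch for a toy power-law spectral model follows; the model, grid, and data are synthetic stand-ins, not an X-ray instrument response.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(7)
E = np.linspace(1.0, 10.0, 100)                    # toy energy grid

def model(E, log_norm, index):
    """Toy power-law spectrum; log-parameterized normalization keeps counts positive."""
    return np.exp(log_norm) * E ** (-index)

data = rng.poisson(model(E, np.log(200.0), 1.7))   # synthetic Poisson counts

def cash(theta):
    m = model(E, *theta)
    # Cash statistic: -2 log Poisson likelihood, up to data-only terms
    return 2.0 * np.sum(m - data * np.log(m))

fit = minimize(cash, x0=np.array([np.log(100.0), 1.0]), method="Nelder-Mead")
print("ML estimates (log_norm, index):", fit.x.round(3))
```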

  13. A maximum likelihood estimation framework for delay logistic differential equation model

    Science.gov (United States)

    Mahmoud, Ahmed Adly; Dass, Sarat Chandra; Muthuvalu, Mohana S.

    2016-11-01

    This paper introduces the maximum likelihood method of estimation for a delay differential equation model governed by an unknown delay and other parameters of interest, combined with a numerical solver approach. As an example we consider the delayed logistic differential equation. A grid-based estimation framework is proposed. Our methodology correctly estimates the delay parameter as well as the initial starting value of the dynamical system based on simulation data. The computations have been carried out with the help of the mathematical software MATLAB® 8.0 R2012b.
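
    A compact version of the grid-based idea, in plain Python rather than MATLAB: simulate the delayed logistic equation x'(t) = r x(t)(1 − x(t − τ)/K) with an Euler scheme and a history buffer, then scan a grid of candidate delays for the one minimising the Gaussian negative log-likelihood (equivalently, the residual sum of squares). All constants are illustrative assumptions.

```python
import numpy as np

def simulate(tau, r=0.8, K=1.0, x0=0.1, T=30.0, dt=0.01):
    """Euler scheme for x'(t) = r x(t) (1 - x(t - tau)/K) with constant history x0."""
    n, lag = int(T / dt), int(round(tau / dt))
    x = np.empty(n + 1)
    x[0] = x0
    for i in range(n):
        x_del = x0 if i < lag else x[i - lag]   # delayed state from history buffer
        x[i + 1] = x[i] + dt * r * x[i] * (1.0 - x_del / K)
    return x

# Synthetic observations from a "true" delay, sampled every 0.5 time units
rng = np.random.default_rng(3)
step = 50
truth = simulate(tau=2.0)[::step]
obs = truth + rng.normal(0.0, 0.02, size=truth.size)

# Grid search: Gaussian ML in the delay reduces to least squares here
grid = np.arange(0.5, 4.01, 0.05)
sse = [np.sum((simulate(tau=t)[::step] - obs) ** 2) for t in grid]
print("estimated delay:", grid[int(np.argmin(sse))])
```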

  14. Likelihood-based inference for cointegration with nonlinear error-correction

    DEFF Research Database (Denmark)

    Kristensen, Dennis; Rahbek, Anders Christian

    2010-01-01

    We consider a class of nonlinear vector error correction models where the transfer function (or loadings) of the stationary relationships is nonlinear. This includes in particular the smooth transition models. A general representation theorem is given which establishes the dynamic properties...... and a linear trend in general. Gaussian likelihood-based estimators are considered for the long-run cointegration parameters and the short-run parameters. Asymptotic theory is provided for these, and it is discussed to what extent asymptotic normality and mixed normality can be found. A simulation study

  15. Predicting crash likelihood and severity on freeways with real-time loop detector data.

    Science.gov (United States)

    Xu, Chengcheng; Tarko, Andrew P; Wang, Wei; Liu, Pan

    2013-08-01

    Real-time crash risk prediction using traffic data collected from loop detector stations is useful in dynamic safety management systems aimed at improving traffic safety through the application of proactive safety countermeasures. The major drawback of most existing studies is that they focus on crash risk without consideration of crash severity. This paper presents an effort to develop a model that predicts crash likelihood at different levels of severity, with a particular focus on severe crashes. The crash data and traffic data used in this study were collected on the I-880 freeway in California, United States. This study considers three levels of crash severity: fatal/incapacitating injury crashes (KA), non-incapacitating/possible injury crashes (BC), and property-damage-only crashes (PDO). A sequential logit model was used to link the likelihood of crash occurrence at different severity levels to various traffic flow characteristics derived from detector data. An elasticity analysis was conducted to evaluate the effect of the traffic flow variables on the likelihood of crash and its severity. The results show that the traffic flow characteristics contributing to crash likelihood were quite different at different levels of severity. The PDO crashes were more likely to occur under congested traffic flow conditions with highly variable speed and frequent lane changes, while the KA and BC crashes were more likely to occur under less congested traffic flow conditions. High speed, coupled with a large speed difference between adjacent lanes under uncongested traffic conditions, was found to increase the likelihood of severe crashes (KA). This study applied the 20-fold cross-validation method to estimate the prediction performance of the developed models. The validation results show that the model's crash prediction performance at each severity level was satisfactory. The findings of this study can be used to predict the probabilities of crash at
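
    A sequential (continuation-ratio) logit factors the severity outcome into nested binary choices, e.g. P(KA) and then P(BC | not KA), so each stage can be fit as an ordinary binary logit on the subsample that reaches it. The sketch below does this with synthetic covariates standing in for the traffic flow variables; it is not the authors' specification or data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
X = rng.standard_normal((n, 3))           # e.g. speed, speed variance, occupancy
lin = X @ np.array([0.8, -0.5, 0.3])

# Generate outcomes sequentially: KA first, then BC among the non-KA crashes
is_ka = rng.random(n) < 1 / (1 + np.exp(-(lin - 2.5)))
is_bc = ~is_ka & (rng.random(n) < 1 / (1 + np.exp(-(lin - 0.5))))
severity = np.where(is_ka, 0, np.where(is_bc, 1, 2))   # 0=KA, 1=BC, 2=PDO

# Stage 1: most severe (KA) vs all less severe outcomes
stage1 = LogisticRegression().fit(X, severity == 0)
# Stage 2: BC vs PDO, fit only on the crashes that were not KA
mask = severity != 0
stage2 = LogisticRegression().fit(X[mask], severity[mask] == 1)

def predict_probs(Xnew):
    p_ka = stage1.predict_proba(Xnew)[:, 1]
    p_bc = (1 - p_ka) * stage2.predict_proba(Xnew)[:, 1]
    return np.column_stack([p_ka, p_bc, 1 - p_ka - p_bc])   # KA, BC, PDO

print(predict_probs(X[:3]).round(3))
```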

  16. Generative Contexts

    Science.gov (United States)

    Lyles, Dan Allen

    Educational research has identified how science, technology, engineering, and mathematics (STEM) practice and education have underperforming metrics in racial and gender diversity, despite decades of intervention. These disparities are part of the construction of a culture of science that is alienating to these populations. Recent studies in a social science framework described as "Generative Justice" have suggested that the context of social and scientific practice might be modified to bring about more just and equitable relations among the disenfranchised by circulating the value they and their non-human allies create back to them in unalienated forms. What is not known are the underlying principles of social and material space that make a system more or less generative. I employ an autoethnographic method at four sites: a high school science class; a farm committed to "Black and Brown liberation"; a summer program geared towards youth environmental mapping; and a summer workshop for Harlem middle school students. My findings suggest that by identifying instances where material affinity, participatory voice, and creative solidarity are mutually reinforcing, it is possible to create educational contexts that generate unalienated value and circulate it back to the producers themselves. This cycle of generation may help explain how to create systems of justice that strengthen and grow themselves through successive iterations. The problem of the lack of diversity in STEM may be addressed not merely by recruiting the best and the brightest from underrepresented populations, but by changing the context of STEM education to provide tools for its own systematic restructuring.

  17. Estimating parameters of generalized integrate-and-fire neurons from the maximum likelihood of spike trains.

    Science.gov (United States)

    Dong, Yi; Mihalas, Stefan; Russell, Alexander; Etienne-Cummings, Ralph; Niebur, Ernst

    2011-11-01

    When a neuronal spike train is observed, what can we deduce from it about the properties of the neuron that generated it? A natural way to answer this question is to make an assumption about the type of neuron, select an appropriate model for this type, and then choose the model parameters as those that are most likely to generate the observed spike train. This is the maximum likelihood method. If the neuron obeys simple integrate-and-fire dynamics, Paninski, Pillow, and Simoncelli (2004) showed that its negative log-likelihood function is convex and that, at least in principle, its unique global minimum can thus be found by gradient descent techniques. Many biological neurons are, however, known to generate a richer repertoire of spiking behaviors than can be explained in a simple integrate-and-fire model. For instance, such a model retains only an implicit (through spike-induced currents), not an explicit, memory of its input; an example of a physiological situation that cannot be explained is the absence of firing if the input current is increased very slowly. Therefore, we use an expanded model (Mihalas & Niebur, 2009), which is capable of generating a large number of complex firing patterns while still being linear. Linearity is important because it maintains the distribution of the random variables and still allows maximum likelihood methods to be used. In this study, we show that although convexity of the negative log-likelihood function is not guaranteed for this model, the minimum of this function yields a good estimate for the model parameters, in particular if the noise level is treated as a free parameter. Furthermore, we show that a nonlinear function minimization method (r-algorithm with space dilation) usually reaches the global minimum.
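
    In discrete time, the spike-train log-likelihood referred to above takes the Bernoulli/GLM form ℓ(θ) = Σ_t [y_t log p_t + (1 − y_t) log(1 − p_t)] with p_t = σ(θᵀx_t + b). The sketch below maximises it by plain gradient ascent for a linear-filter neuron, a deliberately reduced stand-in for the Mihalas-Niebur model discussed in the record.

```python
import numpy as np

rng = np.random.default_rng(5)
T, d = 20000, 5
X = rng.standard_normal((T, d))           # input features (e.g. stimulus lags)
theta_true = np.array([1.0, -0.5, 0.3, 0.0, 0.8])
p = 1 / (1 + np.exp(-(X @ theta_true - 2.0)))
y = rng.binomial(1, p)                    # simulated spike train (0/1 per bin)

def grad_log_lik(theta, b):
    z = 1 / (1 + np.exp(-(X @ theta + b)))
    return X.T @ (y - z), np.sum(y - z)   # gradients w.r.t. theta and the bias

theta, b = np.zeros(d), 0.0
for _ in range(500):                      # gradient ascent on the log-likelihood
    g_theta, g_b = grad_log_lik(theta, b)
    theta += 1e-4 * g_theta
    b += 1e-4 * g_b
print("estimate:", theta.round(2), "bias:", round(b, 2))
```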

  18. Identification of Sparse Neural Functional Connectivity using Penalized Likelihood Estimation and Basis Functions

    Science.gov (United States)

    Song, Dong; Wang, Haonan; Tu, Catherine Y.; Marmarelis, Vasilis Z.; Hampson, Robert E.; Deadwyler, Sam A.; Berger, Theodore W.

    2013-01-01

    One key problem in computational neuroscience and neural engineering is the identification and modeling of functional connectivity in the brain using spike train data. To reduce model complexity, alleviate overfitting, and thus facilitate model interpretation, sparse representation and estimation of functional connectivity is needed. Sparsities include global sparsity, which captures the sparse connectivities between neurons, and local sparsity, which reflects the active temporal ranges of the input-output dynamical interactions. In this paper, we formulate a generalized functional additive model (GFAM) and develop the associated penalized likelihood estimation methods for such a modeling problem. A GFAM consists of a set of basis functions convolving the input signals, and a link function generating the firing probability of the output neuron from the summation of the convolutions weighted by the sought model coefficients. Model sparsities are achieved by using various penalized likelihood estimations and basis functions. Specifically, we introduce two variations of the GFAM using a global basis (e.g., Laguerre basis) and group LASSO estimation, and a local basis (e.g., B-spline basis) and group bridge estimation, respectively. We further develop an optimization method based on quadratic approximation of the likelihood function for the estimation of these models. Simulation and experimental results show that both group-LASSO-Laguerre and group-bridge-B-spline can capture faithfully the global sparsities, while the latter can replicate accurately and simultaneously both global and local sparsities. The sparse models outperform the full models estimated with the standard maximum likelihood method in out-of-sample predictions. PMID:23674048

  19. A Unified Maximum Likelihood Approach to Document Retrieval.

    Science.gov (United States)

    Bodoff, David; Enache, Daniel; Kambil, Ajit; Simon, Gary; Yukhimets, Alex

    2001-01-01

    Addresses the query- versus document-oriented dichotomy in information retrieval. Introduces a maximum likelihood approach to utilizing feedback data that can be used to construct a concrete object function that estimates both document and query parameters in accordance with all available feedback data. (AEF)

  20. Profile likelihood maps of a 15-dimensional MSSM

    NARCIS (Netherlands)

    Strege, C.; Bertone, G.; Besjes, G.J.; Caron, S.; Ruiz de Austri, R.; Strubig, A.; Trotta, R.

    2014-01-01

    We present statistically convergent profile likelihood maps obtained via global fits of a phenomenological Minimal Supersymmetric Standard Model with 15 free parameters (the MSSM-15), based on over 250M points. We derive constraints on the model parameters from direct detection limits on dark matter

  1. MAXIMUM-LIKELIHOOD-ESTIMATION OF THE ENTROPY OF AN ATTRACTOR

    NARCIS (Netherlands)

    SCHOUTEN, JC; TAKENS, F; VANDENBLEEK, CM

    1994-01-01

    In this paper, a maximum-likelihood estimate of the (Kolmogorov) entropy of an attractor is proposed that can be obtained directly from a time series. Also, the relative standard deviation of the entropy estimate is derived; it is dependent on the entropy and on the number of samples used in the est

  2. GPU Accelerated Likelihoods for Stereo-Based Articulated Tracking

    DEFF Research Database (Denmark)

    Friborg, Rune Møllegaard; Hauberg, Søren; Erleben, Kenny

    For many years articulated tracking has been an active research topic in the computer vision community. While working solutions have been suggested, computational time is still problematic. We present a GPU implementation of a ray-casting based likelihood model that is orders of magnitude faster...

  3. A KULLBACK-LEIBLER EMPIRICAL LIKELIHOOD INFERENCE FOR CENSORED DATA

    Institute of Scientific and Technical Information of China (English)

    SHI Jian; Tai-Shing Lau

    2002-01-01

    In this paper, two kinds of Kullback-Leibler criteria with appropriate constraints are proposed to construct empirical likelihood confidence intervals for the mean of right censored data. It is shown that one of the criteria is equivalent to Adimari's (1997) procedure, and the other shares the same asymptotic behavior.

  4. GPU accelerated likelihoods for stereo-based articulated tracking

    DEFF Research Database (Denmark)

    Friborg, Rune Møllegaard; Hauberg, Søren; Erleben, Kenny

    2010-01-01

    For many years articulated tracking has been an active research topic in the computer vision community. While working solutions have been suggested, computational time is still problematic. We present a GPU implementation of a ray-casting based likelihood model that is orders of magnitude faster...

  5. A KULLBACK—LEIBLER EMPIRICAL LIKELIHOOD INFERENCE FOR CENSORED DATA

    Institute of Scientific and Technical Information of China (English)

    SHI Jian; Tai-Shing Lau

    2002-01-01

    In this paper, two kinds of Kullback-Leibler criteria with appropriate constraints are proposed to construct empirical likelihood confidence intervals for the mean of right censored data. It is shown that one of the criteria is equivalent to Adimari's (1997) procedure, and the other shares the same asymptotic behavior.

  6. Community Level Disadvantage and the Likelihood of First Ischemic Stroke

    Directory of Open Access Journals (Sweden)

    Bernadette Boden-Albala

    2012-01-01

    Full Text Available Background and Purpose. Residing in "disadvantaged" communities may increase morbidity and mortality independent of individual social resources and biological factors. This study evaluates the impact of population-level disadvantage on incident ischemic stroke likelihood in a multiethnic urban population. Methods. A population-based case-control study was conducted in an ethnically diverse community of New York. First ischemic stroke cases and community controls were enrolled and a stroke risk assessment performed. Data regarding population-level economic indicators for each census tract were assembled using geocoding. Census variables were also grouped together to define a broader measure of collective disadvantage. We evaluated the likelihood of stroke for population-level variables, controlling for individual social factors (education, social isolation, and insurance) and vascular risk factors. Results. We age-, sex-, and race-ethnicity-matched 687 incident ischemic stroke cases to 1153 community controls. The mean age was 69 years: 60% women; 22% white, 28% black, and 50% Hispanic. After adjustment, the index of community-level disadvantage (OR 2.0, 95% CI 1.7–2.1) was associated with increased stroke likelihood overall and among all three race-ethnic groups. Conclusion. Social inequalities measured by census tract data, including indices of community disadvantage, confer a significant likelihood of ischemic stroke independent of conventional risk factors.

  7. Heteroscedastic one-factor models and marginal maximum likelihood estimation

    NARCIS (Netherlands)

    Hessen, D.J.; Dolan, C.V.

    2009-01-01

    In the present paper, a general class of heteroscedastic one-factor models is considered. In these models, the residual variances of the observed scores are explicitly modelled as parametric functions of the one-dimensional factor score. A marginal maximum likelihood procedure for parameter estimati

  8. Bias Correction for Alternating Iterative Maximum Likelihood Estimators

    Institute of Scientific and Technical Information of China (English)

    Gang YU; Wei GAO; Ningzhong SHI

    2013-01-01

    In this paper, we give a definition of the alternating iterative maximum likelihood estimator (AIMLE), which is a biased estimator. Furthermore, we adjust the AIMLE to obtain asymptotically unbiased and consistent estimators by using a bootstrap iterative bias correction method as in Kuk (1995). Two examples and reported simulation results illustrate the performance of the bias correction for the AIMLE.

  9. Maximum likelihood Jukes-Cantor triplets: analytic solutions.

    Science.gov (United States)

    Chor, Benny; Hendy, Michael D; Snir, Sagi

    2006-03-01

    Maximum likelihood (ML) is a popular method for inferring a phylogenetic tree of the evolutionary relationship of a set of taxa from observed homologous aligned genetic sequences of the taxa. Generally, the computation of the ML tree is based on numerical methods which, in a few cases, are known to converge to a local maximum on a tree, which is suboptimal. The extent of this problem is unknown; one approach is to attempt to derive algebraic equations for the likelihood equation and find the maximum points analytically. This approach has so far only been successful in the very simplest cases of three or four taxa under the Neyman model of evolution of two-state characters. In this paper we extend this approach, for the first time, to four-state characters, the Jukes-Cantor model under a molecular clock, on a tree T on three taxa, a rooted triple. We employ spectral methods (Hadamard conjugation) to express the likelihood function parameterized by the path-length spectrum. Taking partial derivatives, we derive a set of polynomial equations whose simultaneous solution contains all critical points of the likelihood function. Using tools of algebraic geometry (the resultant of two polynomials) in a computer algebra package (Maple), we are able to find all turning points analytically. We then employ this method on real sequence data and obtain realistic results on the primate-rodent divergence time.
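
    For reference, the per-site pattern probabilities entering such a likelihood are built from the Jukes-Cantor transition probabilities along each branch. With branch length t measured in expected substitutions per site, these are:

```latex
% Jukes-Cantor (JC69) transition probabilities over a branch of length t,
% with t measured in expected substitutions per site
P(\text{same base})      = \tfrac{1}{4} + \tfrac{3}{4}\,e^{-4t/3}, \qquad
P(\text{different base}) = \tfrac{1}{4} - \tfrac{1}{4}\,e^{-4t/3}
```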

  10. A Monte Carlo Evaluation of Maximum Likelihood Multidimensional Scaling Methods

    NARCIS (Netherlands)

    Bijmolt, T.H.A.; Wedel, M.

    1996-01-01

    We compare three alternative Maximum Likelihood Multidimensional Scaling methods for pairwise dissimilarity ratings, namely MULTISCALE, MAXSCAL, and PROSCAL, in a Monte Carlo study. The three MLMDS methods recover the true configurations very well. The recovery of the true dimensionality depends on the

  11. Maximum likelihood estimation of phase-type distributions

    DEFF Research Database (Denmark)

    Esparza, Luz Judith R

    This work is concerned with the statistical inference of phase-type distributions and the analysis of distributions with rational Laplace transform, known as matrix-exponential distributions. The thesis is focused on the estimation of the maximum likelihood parameters of phase-type distributions ...

  12. Likelihood Inference for a Nonstationary Fractional Autoregressive Model

    DEFF Research Database (Denmark)

    Johansen, Søren; Nielsen, Morten Ørregaard

    This paper discusses model-based inference in an autoregressive model for fractional processes based on the Gaussian likelihood. The model allows for the process to be fractional of order d or d − b, where d ≥ b > 1/2 are parameters to be estimated. We model the data X_1, ..., X_T given the initial

  13. Composite likelihood and two-stage estimation in family studies

    DEFF Research Database (Denmark)

    Andersen, Elisabeth Anne Wreford

    2004-01-01

    In this paper, register-based family studies provide the motivation for linking a two-stage estimation procedure in copula models for multivariate failure time data with a composite likelihood approach. The asymptotic properties of the estimators in both parametric and semi-parametric models are d...

  14. Likelihood Inference for a Fractionally Cointegrated Vector Autoregressive Model

    DEFF Research Database (Denmark)

    Johansen, Søren; Nielsen, Morten Ørregaard

    We consider model-based inference in a fractionally cointegrated (or cofractional) vector autoregressive model based on the conditional Gaussian likelihood. The model allows the process X(t) to be fractional of order d and cofractional of order d−b; that is, there exist vectors β for which β

  15. Trimmed Likelihood-based Estimation in Binary Regression Models

    NARCIS (Netherlands)

    Cizek, P.

    2005-01-01

    Binary-choice regression models such as probit and logit are typically estimated by the maximum likelihood method. To improve its robustness, various M-estimation based procedures were proposed, which however require bias corrections to achieve consistency, and their resistance to outliers is rela

  16. Planck 2013 results. XV. CMB power spectra and likelihood

    CERN Document Server

    Ade, P.A.R.; Armitage-Caplan, C.; Arnaud, M.; Ashdown, M.; Atrio-Barandela, F.; Aumont, J.; Baccigalupi, C.; Banday, A.J.; Barreiro, R.B.; Bartlett, J.G.; Battaner, E.; Benabed, K.; Benoit, A.; Benoit-Levy, A.; Bernard, J.P.; Bersanelli, M.; Bielewicz, P.; Bobin, J.; Bock, J.J.; Bonaldi, A.; Bonavera, L.; Bond, J.R.; Borrill, J.; Bouchet, F.R.; Boulanger, F.; Bridges, M.; Bucher, M.; Burigana, C.; Butler, R.C.; Calabrese, E.; Cardoso, J.F.; Catalano, A.; Challinor, A.; Chamballu, A.; Chiang, L.Y.; Chiang, H.C.; Christensen, P.R.; Church, S.; Clements, D.L.; Colombi, S.; Colombo, L.P.L.; Combet, C.; Couchot, F.; Coulais, A.; Crill, B.P.; Curto, A.; Cuttaia, F.; Danese, L.; Davies, R.D.; Davis, R.J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Delouis, J.M.; Desert, F.X.; Dickinson, C.; Diego, J.M.; Dole, H.; Donzelli, S.; Dore, O.; Douspis, M.; Dunkley, J.; Dupac, X.; Efstathiou, G.; Elsner, F.; Ensslin, T.A.; Eriksen, H.K.; Finelli, F.; Forni, O.; Frailis, M.; Fraisse, A.A.; Franceschi, E.; Gaier, T.C.; Galeotta, S.; Galli, S.; Ganga, K.; Giard, M.; Giardino, G.; Giraud-Heraud, Y.; Gjerlow, E.; Gonzalez-Nuevo, J.; Gorski, K.M.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Gudmundsson, J.E.; Hansen, F.K.; Hanson, D.; Harrison, D.; Helou, G.; Henrot-Versille, S.; Hernandez-Monteagudo, C.; Herranz, D.; Hildebrandt, S.R.; Hivon, E.; Hobson, M.; Holmes, W.A.; Hornstrup, A.; Hovest, W.; Huffenberger, K.M.; Hurier, G.; Jaffe, T.R.; Jaffe, A.H.; Jewell, J.; Jones, W.C.; Juvela, M.; Keihanen, E.; Keskitalo, R.; Kiiveri, K.; Kisner, T.S.; Kneissl, R.; Knoche, J.; Knox, L.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lahteenmaki, A.; Lamarre, J.M.; Lasenby, A.; Lattanzi, M.; Laureijs, R.J.; Lawrence, C.R.; Le Jeune, M.; Leach, S.; Leahy, J.P.; Leonardi, R.; Leon-Tavares, J.; Lesgourgues, J.; Liguori, M.; Lilje, P.B.; Lindholm, V.; Linden-Vornle, M.; Lopez-Caniego, M.; Lubin, P.M.; Macias-Perez, J.F.; Maffei, B.; Maino, D.; Mandolesi, N.; Marinucci, D.; Maris, M.; Marshall, D.J.; Martin, P.G.; Martinez-Gonzalez, E.; Masi, S.; Matarrese, S.; Matthai, F.; Mazzotta, P.; Meinhold, P.R.; Melchiorri, A.; Mendes, L.; Menegoni, E.; Mennella, A.; Migliaccio, M.; Millea, M.; Mitra, S.; Miville-Deschenes, M.A.; Molinari, D.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Moss, A.; Munshi, D.; Naselsky, P.; Nati, F.; Natoli, P.; Netterfield, C.B.; Norgaard-Nielsen, H.U.; Noviello, F.; Novikov, D.; Novikov, I.; O'Dwyer, I.J.; Orieux, F.; Osborne, S.; Oxborrow, C.A.; Paci, F.; Pagano, L.; Pajot, F.; Paladini, R.; Paoletti, D.; Partridge, B.; Pasian, F.; Patanchon, G.; Paykari, P.; Perdereau, O.; Perotto, L.; Perrotta, F.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Ponthieu, N.; Popa, L.; Poutanen, T.; Pratt, G.W.; Prezeau, G.; Prunet, S.; Puget, J.L.; Rachen, J.P.; Rahlin, A.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Ricciardi, S.; Riller, T.; Ringeval, C.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Roudier, G.; Rowan-Robinson, M.; Rubino-Martin, J.A.; Rusholme, B.; Sandri, M.; Sanselme, L.; Santos, D.; Savini, G.; Scott, D.; Seiffert, M.D.; Shellard, E.P.S.; Spencer, L.D.; Starck, J.L.; Stolyarov, V.; Stompor, R.; Sudiwala, R.; Sureau, F.; Sutton, D.; Suur-Uski, A.S.; Sygnet, J.F.; Tauber, J.A.; Tavagnacco, D.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Tucci, M.; Tuovinen, J.; Turler, M.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Varis, J.; Vielva, P.; Villa, F.; Vittorio, N.; Wade, L.A.; Wandelt, B.D.; 
Wehus, I.K.; White, M.; White, S.D.M.; Yvon, D.; Zacchei, A.; Zonca, A.

    2014-01-01

    We present the Planck likelihood, a complete statistical description of the two-point correlation function of the CMB temperature fluctuations. We use this likelihood to derive the Planck CMB power spectrum over three decades in l, covering 2 <= l <= 2500. At l >= 50, we employ a correlated Gaussian likelihood approximation based on angular cross-spectra derived from the 100, 143 and 217 GHz channels. We validate our likelihood through an extensive suite of consistency tests, and assess the impact of residual foreground and instrumental uncertainties on cosmological parameters. We find good internal agreement among the high-l cross-spectra with residuals of a few uK^2 at l <= 1000. We compare our results with foreground-cleaned CMB maps, and with cross-spectra derived from the 70 GHz Planck map, and find broad agreement in terms of spectrum residuals and cosmological parameters. The best-fit LCDM cosmology is in excellent agreement with preliminary Planck polarisation spectra. The standard LCDM cosmology is well constrained b...

  17. Maximum Likelihood Estimation of Nonlinear Structural Equation Models.

    Science.gov (United States)

    Lee, Sik-Yum; Zhu, Hong-Tu

    2002-01-01

    Developed an EM type algorithm for maximum likelihood estimation of a general nonlinear structural equation model in which the E-step is completed by a Metropolis-Hastings algorithm. Illustrated the methodology with results from a simulation study and two real examples using data from previous studies. (SLD)

  18. Reconceptualizing Social Influence in Counseling: The Elaboration Likelihood Model.

    Science.gov (United States)

    McNeill, Brian W.; Stoltenberg, Cal D.

    1989-01-01

    Presents Elaboration Likelihood Model (ELM) of persuasion (a reconceptualization of the social influence process) as alternative model of attitude change. Contends ELM unifies conflicting social psychology results and can potentially account for inconsistent research findings in counseling psychology. Provides guidelines on integrating…

  19. Maximum likelihood estimation of the attenuated ultrasound pulse

    DEFF Research Database (Denmark)

    Rasmussen, Klaus Bolding

    1994-01-01

    The attenuated ultrasound pulse is divided into two parts: a stationary basic pulse and a nonstationary attenuation pulse. A standard ARMA model is used for the basic pulse, and a nonstandard ARMA model is derived for the attenuation pulse. The maximum likelihood estimator of the attenuated...

  20. Silence that can be dangerous: a vignette study to assess healthcare professionals' likelihood of speaking up about safety concerns.

    Directory of Open Access Journals (Sweden)

    David L B Schwappach

    Full Text Available To investigate the likelihood of speaking up about patient safety in oncology and to clarify the effect of clinical and situational context factors on the likelihood of voicing concerns. 1013 nurses and doctors in oncology rated four clinical vignettes describing coworkers' errors and rule violations in a self-administered factorial survey (65% response rate). Multiple regression analysis was used to model the likelihood of speaking up as an outcome of vignette attributes, responders' evaluations of the situation, and personal characteristics. Respondents reported a high likelihood of speaking up about patient safety, but the variation between and within types of errors and rule violations was substantial. Staff without managerial function reported significantly higher levels of decision difficulty and discomfort with speaking up. Based on the information presented in the vignettes, 74%-96% would speak up to a supervisor failing to check a prescription, 45%-81% would point a coworker to a missed hand disinfection, 82%-94% would speak up to nurses who violate a safety rule in medication preparation, and 59%-92% would question a doctor violating a safety rule in lumbar puncture. Several vignette attributes predicted the likelihood of speaking up. Perceived potential harm, anticipated discomfort, and decision difficulty were significant predictors of the likelihood of speaking up. Clinicians' willingness to speak up about patient safety is considerably affected by contextual factors. Physicians and nurses without managerial function report substantial discomfort with speaking up. Oncology departments should provide staff with clear guidance and training on when and how to voice safety concerns.

  1. Silence That Can Be Dangerous: A Vignette Study to Assess Healthcare Professionals’ Likelihood of Speaking up about Safety Concerns

    Science.gov (United States)

    Schwappach, David L. B.; Gehring, Katrin

    2014-01-01

    Purpose To investigate the likelihood of speaking up about patient safety in oncology and to clarify the effect of clinical and situational context factors on the likelihood of voicing concerns. Patients and Methods 1013 nurses and doctors in oncology rated four clinical vignettes describing coworkers’ errors and rule violations in a self-administered factorial survey (65% response rate). Multiple regression analysis was used to model the likelihood of speaking up as outcome of vignette attributes, responder’s evaluations of the situation and personal characteristics. Results Respondents reported a high likelihood of speaking up about patient safety but the variation between and within types of errors and rule violations was substantial. Staff without managerial function provided significantly higher levels of decision difficulty and discomfort to speak up. Based on the information presented in the vignettes, 74%−96% would speak up towards a supervisor failing to check a prescription, 45%−81% would point a coworker to a missed hand disinfection, 82%−94% would speak up towards nurses who violate a safety rule in medication preparation, and 59%−92% would question a doctor violating a safety rule in lumbar puncture. Several vignette attributes predicted the likelihood of speaking up. Perceived potential harm, anticipated discomfort, and decision difficulty were significant predictors of the likelihood of speaking up. Conclusions Clinicians’ willingness to speak up about patient safety is considerably affected by contextual factors. Physicians and nurses without managerial function report substantial discomfort with speaking up. Oncology departments should provide staff with clear guidance and trainings on when and how to voice safety concerns. PMID:25116338

  2. Exclusion probabilities and likelihood ratios with applications to mixtures.

    Science.gov (United States)

    Slooten, Klaas-Jan; Egeland, Thore

    2016-01-01

    The statistical evidence obtained from mixed DNA profiles can be summarised in several ways in forensic casework including the likelihood ratio (LR) and the Random Man Not Excluded (RMNE) probability. The literature has seen a discussion of the advantages and disadvantages of likelihood ratios and exclusion probabilities, and part of our aim is to bring some clarification to this debate. In a previous paper, we proved that there is a general mathematical relationship between these statistics: RMNE can be expressed as a certain average of the LR, implying that the expected value of the LR, when applied to an actual contributor to the mixture, is at least equal to the inverse of the RMNE. While the mentioned paper presented applications for kinship problems, the current paper demonstrates the relevance for mixture cases, and for this purpose, we prove some new general properties. We also demonstrate how to use the distribution of the likelihood ratio for donors of a mixture, to obtain estimates for exceedance probabilities of the LR for non-donors, of which the RMNE is a special case corresponding to LR > 0. In order to derive these results, we need to view the likelihood ratio as a random variable. In this paper, we describe how such a randomization can be achieved. The RMNE is usually invoked only for mixtures without dropout. In mixtures, artefacts like dropout and drop-in are commonly encountered and we address this situation too, illustrating our results with a basic but widely implemented model, a so-called binary model. The precise definitions, modelling and interpretation of the required concepts of dropout and drop-in are not entirely obvious, and we attempt to clarify them here in a general likelihood framework for a binary model.

  3. National context

    DEFF Research Database (Denmark)

    Ravn, Tine

    2011-01-01

    This document forms part of the tasks of Work Package 3 for the European project PLACES (Platform of Local Authorities and Communicators Engaged in Science), whose main goal is to offer a wide and diverse community of actors a common platform to structure their science communication activities ... at a city/regional level (www.openplaces.eu). The main objective of this document is to provide an overview of the different contexts of scientific culture present around Europe, with a particular focus on the local dimension of initiatives and policies in science communication.

  4. Parallel Likelihood Function Evaluation on Heterogeneous Many-core Systems

    CERN Document Server

    Jarp, Sverre; Leduc, Julien; Nowak, Andrzej; Sneen Lindal, Yngve

    2011-01-01

    This paper describes a parallel implementation that allows the evaluations of the likelihood function for data analysis methods to run cooperatively on heterogeneous computational devices (i.e. CPU and GPU) belonging to a single computational node. The implementation is able to split and balance the workload needed for the evaluation of the likelihood function in corresponding sub-workloads to be executed in parallel on each computational device. The CPU parallelization is implemented using OpenMP, while the GPU implementation is based on OpenCL. The comparison of the performance of these implementations for different configurations and different hardware systems are reported. Tests are based on a real data analysis carried out in the high energy physics community.
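
    As a rough illustration of the workload split (not the paper's implementation, which uses OpenMP on the CPU and OpenCL on the GPU), the following CPU-only Python sketch partitions the per-event negative log-likelihood sum into sub-workloads evaluated by a pool of workers and then reduces the partial sums:

    ```python
    # CPU-only analogue of splitting a likelihood evaluation into
    # sub-workloads; model and parameters are hypothetical.
    import numpy as np
    from multiprocessing import Pool

    MU, SIGMA = 0.5, 1.2  # hypothetical model parameters

    def chunk_nll(events):
        """Gaussian negative log-likelihood of one chunk of events."""
        return np.sum(0.5 * ((events - MU) / SIGMA) ** 2
                      + np.log(SIGMA * np.sqrt(2 * np.pi)))

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        data = rng.normal(MU, SIGMA, size=1_000_000)
        chunks = np.array_split(data, 8)            # one sub-workload each
        with Pool(processes=8) as pool:
            nll = sum(pool.map(chunk_nll, chunks))  # reduce partial sums
        print(f"total NLL = {nll:.2f}")
    ```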

  5. Maximum-likelihood fits to histograms for improved parameter estimation

    CERN Document Server

    Fowler, Joseph W

    2013-01-01

    Straightforward methods for adapting the familiar chi^2 statistic to histograms of discrete events and other Poisson distributed data generally yield biased estimates of the parameters of a model. The bias can be important even when the total number of events is large. For the case of estimating a microcalorimeter's energy resolution at 6 keV from the observed shape of the Mn K-alpha fluorescence spectrum, a poor choice of chi^2 can lead to biases of at least 10% in the estimated resolution when up to thousands of photons are observed. The best remedy is a Poisson maximum-likelihood fit, through a simple modification of the standard Levenberg-Marquardt algorithm for chi^2 minimization. Where the modification is not possible, another approach allows iterative approximation of the maximum-likelihood fit.
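
    A minimal sketch of such a Poisson maximum-likelihood histogram fit is given below; it uses a generic simplex optimizer rather than the modified Levenberg-Marquardt step described in the paper, and a hypothetical Gaussian-peak-plus-flat-background model in place of the Mn K-alpha line shape:

    ```python
    # Minimal sketch: Poisson ML fit to binned counts with a hypothetical
    # Gaussian-peak-plus-background model.
    import numpy as np
    from scipy.optimize import minimize

    def expected_counts(params, centers, width):
        amp, mu, sigma, bkg = params
        return width * (amp * np.exp(-0.5 * ((centers - mu) / sigma) ** 2) + bkg)

    def poisson_nll(params, centers, counts, width):
        lam = expected_counts(params, centers, width)
        if np.any(lam <= 0):
            return np.inf
        # Poisson NLL up to a parameter-independent constant (the log n! term)
        return np.sum(lam - counts * np.log(lam))

    rng = np.random.default_rng(1)
    edges = np.linspace(-5, 5, 101)
    centers = 0.5 * (edges[:-1] + edges[1:])
    width = edges[1] - edges[0]
    true = (50.0, 0.3, 0.8, 2.0)
    counts = rng.poisson(expected_counts(true, centers, width))

    fit = minimize(poisson_nll, x0=(40, 0, 1, 1),
                   args=(centers, counts, width), method="Nelder-Mead")
    print("ML estimates:", fit.x)
    ```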

  6. Measures of family resemblance for binary traits: likelihood based inference.

    Science.gov (United States)

    Shoukri, Mohamed M; ElDali, Abdelmoneim; Donner, Allan

    2012-07-24

    Detection and estimation of measures of familial aggregation is considered the first step in establishing whether a certain disease has a genetic component. Such measures are usually estimated from observational studies on siblings, parent-offspring pairs, extended pedigrees or twins. When the trait of interest is quantitative (e.g. blood pressure, body mass index, blood glucose level, etc.), efficient likelihood estimation of such measures is feasible under the assumption of multivariate normality of the distributions of the traits. In this case the intra-class and inter-class correlations are used to assess the similarities among family members. When the trait is measured on a binary scale, we establish full likelihood inference for such measures among siblings, parents, and parent-offspring pairs. We illustrate the methodology on nuclear family data where the trait is the presence or absence of hypertension.

  7. Applications of the Likelihood Theory in Finance: Modelling and Pricing

    CERN Document Server

    Janssen, Arnold

    2012-01-01

    This paper discusses the connection between mathematical finance and statistical modelling, which turns out to be more than a formal mathematical correspondence. We aim to show how common results and notions in statistics, and their meaning, can be translated to the world of mathematical finance and vice versa. A lot of similarities can be expressed in terms of LeCam's theory for statistical experiments, which is the theory of the behaviour of likelihood processes. For positive prices, the arbitrage-free financial assets fit into filtered experiments. It is shown that they are given by filtered likelihood ratio processes. From the statistical point of view, martingale measures, completeness and pricing formulas are revisited. The pricing formulas for various options are connected with the power functions of tests. For instance, the Black-Scholes price of a European option has an interpretation as the Bayes risk of a Neyman-Pearson test. Under contiguity the convergence of financial experiments and option prices ...

  8. A composite likelihood approach for spatially correlated survival data.

    Science.gov (United States)

    Paik, Jane; Ying, Zhiliang

    2013-01-01

    The aim of this paper is to provide a composite likelihood approach to handle spatially correlated survival data using pairwise joint distributions. With e-commerce data, a recent question of interest in marketing research has been to describe spatially clustered purchasing behavior and to assess whether geographic distance is the appropriate metric to describe purchasing dependence. We present a model for the dependence structure of time-to-event data subject to spatial dependence to characterize purchasing behavior from the motivating example from e-commerce data. We assume the Farlie-Gumbel-Morgenstern (FGM) distribution and then model the dependence parameter as a function of geographic and demographic pairwise distances. For estimation of the dependence parameters, we present pairwise composite likelihood equations. We prove that the resulting estimators exhibit key properties of consistency and asymptotic normality under certain regularity conditions in the increasing-domain framework of spatial asymptotic theory.
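
    The pairwise construction can be sketched concretely. The FGM copula density is c(u,v;θ) = 1 + θ(1−2u)(1−2v) for |θ| ≤ 1; the exponential margins and the constant dependence parameter below are simplifying assumptions, whereas the paper models θ as a function of pairwise geographic and demographic distances:

    ```python
    # Minimal sketch: pairwise composite log-likelihood under the FGM
    # copula with hypothetical exponential margins and constant theta.
    import numpy as np

    def fgm_pair_loglik(theta, t1, t2, beta=1.0):
        """log f(t1,t2) = log c(F(t1),F(t2); theta) + log f(t1) + log f(t2)."""
        u, v = 1 - np.exp(-t1 / beta), 1 - np.exp(-t2 / beta)
        copula_dens = 1 + theta * (1 - 2 * u) * (1 - 2 * v)
        marg = -t1 / beta - np.log(beta) - t2 / beta - np.log(beta)
        return np.log(copula_dens) + marg

    def composite_loglik(theta, pairs):
        return sum(fgm_pair_loglik(theta, t1, t2) for t1, t2 in pairs)

    # Toy data: a few (t_i, t_j) event-time pairs from nearby units.
    pairs = [(0.5, 0.7), (1.2, 0.9), (0.3, 0.4)]
    grid = np.linspace(-1, 1, 201)
    cl = [composite_loglik(th, pairs) for th in grid]
    print("composite-ML theta ~", grid[int(np.argmax(cl))])
    ```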

  9. GENERALIZATION OF RAYLEIGH MAXIMUM LIKELIHOOD DESPECKLING FILTER USING QUADRILATERAL KERNELS

    Directory of Open Access Journals (Sweden)

    S. Sridevi

    2013-02-01

    Full Text Available Speckle noise is the most prevalent noise in clinical ultrasound images. It appears as light and dark spots and degrades the pixel intensities. In fetal ultrasound images, edges and local fine details are particularly important for obstetricians and gynaecologists carrying out prenatal diagnosis of congenital heart disease. A robust despeckling filter therefore has to be devised that proficiently suppresses speckle noise while preserving these features. The proposed filter generalizes the Rayleigh maximum likelihood filter by exploiting statistical tools as tuning parameters and using differently shaped quadrilateral kernels to estimate the noise-free pixel from its neighborhood. The performance of several filters, namely the median, Kuwahara, Frost, homogeneous mask and Rayleigh maximum likelihood filters, is compared with the proposed filter in terms of PSNR and image profile. The proposed filter surpasses the conventional filters.

  10. Smoothed log-concave maximum likelihood estimation with applications

    CERN Document Server

    Chen, Yining

    2011-01-01

    We study the smoothed log-concave maximum likelihood estimator of a probability distribution on $\\mathbb{R}^d$. This is a fully automatic nonparametric density estimator, obtained as a canonical smoothing of the log-concave maximum likelihood estimator. We demonstrate its attractive features both through an analysis of its theoretical properties and a simulation study. Moreover, we show how the estimator can be used as an intermediate stage of more involved procedures, such as constructing a classifier or estimating a functional of the density. Here again, the use of the estimator can be justified both on theoretical grounds and through its finite sample performance, and we illustrate its use in a breast cancer diagnosis (classification) problem.

  11. Secondary Analysis under Cohort Sampling Designs Using Conditional Likelihood

    Directory of Open Access Journals (Sweden)

    Olli Saarela

    2012-01-01

    Full Text Available Under cohort sampling designs, additional covariate data are collected on cases of a specific type and a randomly selected subset of noncases, primarily for the purpose of studying associations with a time-to-event response of interest. With such data available, an interest may arise to reuse them for studying associations between the additional covariate data and a secondary non-time-to-event response variable, usually collected for the whole study cohort at the outset of the study. Following earlier literature, we refer to such a situation as secondary analysis. We outline a general conditional likelihood approach for secondary analysis under cohort sampling designs and discuss the specific situations of case-cohort and nested case-control designs. We also review alternative methods based on full likelihood and inverse probability weighting. We compare the alternative methods for secondary analysis in two simulated settings and apply them in a real-data example.

  12. $\\ell_0$-penalized maximum likelihood for sparse directed acyclic graphs

    CERN Document Server

    van de Geer, Sara

    2012-01-01

    We consider the problem of regularized maximum likelihood estimation for the structure and parameters of a high-dimensional, sparse directed acyclic graphical (DAG) model with Gaussian distribution, or equivalently, of a Gaussian structural equation model. We show that the $\\ell_0$-penalized maximum likelihood estimator of a DAG has about the same number of edges as the minimal-edge I-MAP (a DAG with minimal number of edges representing the distribution), and that it converges in Frobenius norm. We allow the number of nodes $p$ to be much larger than sample size $n$ but assume a sparsity condition and that any representation of the true DAG has at least a fixed proportion of its non-zero edge weights above the noise level. Our results do not rely on the restrictive strong faithfulness condition which is required for methods based on conditional independence testing such as the PC-algorithm.

  13. A Weighted Likelihood Ratio of Two Related Negative Hypergeomeric Distributions

    Institute of Scientific and Technical Information of China (English)

    Titi Obilade

    2004-01-01

    In this paper we consider some related negative hypergeometric distributions arising from the problem of sampling without replacement from an urn containing balls of different colours in different proportions, but stopping only after some specified numbers of balls of the different colours have been obtained. With the aid of some simple recurrence relations and identities we obtain, in the case of two colours, the moments for the maximum negative hypergeometric distribution, the minimum negative hypergeometric distribution, the likelihood ratio negative hypergeometric distribution and, consequently, the likelihood proportional negative hypergeometric distribution. To the extent that the sampling scheme is applicable to modelling data, as illustrated with a biological example, and indeed to many situations of estimating Bernoulli parameters for binary traits within a finite population, these are important first-step results.

  14. A model independent safeguard for unbinned Profile Likelihood

    CERN Document Server

    Priel, Nadav; Landsman, Hagar; Manfredini, Alessandro; Budnik, Ranny

    2016-01-01

    We present a general method to include residual un-modeled background shape uncertainties in profile likelihood based statistical tests for high energy physics and astroparticle physics counting experiments. This approach provides a simple and natural protection against undercoverage, thus lowering the chances of a false discovery or of an over constrained confidence interval, and allows a natural transition to unbinned space. Unbinned likelihood enhances the sensitivity and allows optimal usage of information for the data and the models. We show that the asymptotic behavior of the test statistic can be regained in cases where the model fails to describe the true background behavior, and present 1D and 2D case studies for model-driven and data-driven background models. The resulting penalty on sensitivities follows the actual discrepancy between the data and the models, and is asymptotically reduced to zero with increasing knowledge.

  15. Semidefinite Programming for Approximate Maximum Likelihood Sinusoidal Parameter Estimation

    Directory of Open Access Journals (Sweden)

    Kenneth W. K. Lui

    2009-01-01

    Full Text Available We study the convex optimization approach for parameter estimation of several sinusoidal models, namely, single complex/real tone, multiple complex sinusoids, and single two-dimensional complex tone, in the presence of additive Gaussian noise. The major difficulty for optimally determining the parameters is that the corresponding maximum likelihood (ML estimators involve finding the global minimum or maximum of multimodal cost functions because the frequencies are nonlinear in the observed signals. By relaxing the nonconvex ML formulations using semidefinite programs, high-fidelity approximate solutions are obtained in a globally optimum fashion. Computer simulations are included to contrast the estimation performance of the proposed semi-definite relaxation methods with the iterative quadratic maximum likelihood technique as well as Cramér-Rao lower bound.

  16. Bayesian experimental design for models with intractable likelihoods.

    Science.gov (United States)

    Drovandi, Christopher C; Pettitt, Anthony N

    2013-12-01

    In this paper we present a methodology for designing experiments for efficiently estimating the parameters of models with computationally intractable likelihoods. The approach combines a commonly used methodology for robust experimental design, based on Markov chain Monte Carlo sampling, with approximate Bayesian computation (ABC) to ensure that no likelihood evaluations are required. The utility function considered for precise parameter estimation is based upon the precision of the ABC posterior distribution, which we form efficiently via the ABC rejection algorithm based on pre-computed model simulations. Our focus is on stochastic models and, in particular, we investigate the methodology for Markov process models of epidemics and macroparasite population evolution. The macroparasite example involves a multivariate process and we assess the loss of information from not observing all variables.
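
    The likelihood-free ingredient can be illustrated with a bare-bones ABC rejection step. The Poisson toy model, uniform prior, and tolerance below are hypothetical stand-ins for the epidemic and macroparasite models considered in the paper:

    ```python
    # Minimal sketch of ABC rejection: simulate instead of evaluating a
    # likelihood, keep parameter draws whose summary matches the data.
    import numpy as np

    rng = np.random.default_rng(2)
    observed = rng.poisson(4.0, size=50)      # pretend-intractable model
    s_obs = observed.mean()                   # summary statistic

    accepted = []
    for _ in range(100_000):
        lam = rng.uniform(0, 10)              # draw from the prior
        sim = rng.poisson(lam, size=50)       # forward simulation
        if abs(sim.mean() - s_obs) < 0.1:     # rejection step
            accepted.append(lam)

    post = np.array(accepted)
    print(f"ABC posterior mean {post.mean():.2f}, kept {post.size} draws")
    ```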

  17. Semidefinite Programming for Approximate Maximum Likelihood Sinusoidal Parameter Estimation

    Science.gov (United States)

    Lui, Kenneth W. K.; So, H. C.

    2009-12-01

    We study the convex optimization approach for parameter estimation of several sinusoidal models, namely, single complex/real tone, multiple complex sinusoids, and single two-dimensional complex tone, in the presence of additive Gaussian noise. The major difficulty for optimally determining the parameters is that the corresponding maximum likelihood (ML) estimators involve finding the global minimum or maximum of multimodal cost functions because the frequencies are nonlinear in the observed signals. By relaxing the nonconvex ML formulations using semidefinite programs, high-fidelity approximate solutions are obtained in a globally optimum fashion. Computer simulations are included to contrast the estimation performance of the proposed semi-definite relaxation methods with the iterative quadratic maximum likelihood technique as well as Cramér-Rao lower bound.

  18. A composite likelihood approach for spatially correlated survival data

    Science.gov (United States)

    Paik, Jane; Ying, Zhiliang

    2013-01-01

    The aim of this paper is to provide a composite likelihood approach to handle spatially correlated survival data using pairwise joint distributions. With e-commerce data, a recent question of interest in marketing research has been to describe spatially clustered purchasing behavior and to assess whether geographic distance is the appropriate metric to describe purchasing dependence. We present a model for the dependence structure of time-to-event data subject to spatial dependence to characterize purchasing behavior from the motivating example from e-commerce data. We assume the Farlie-Gumbel-Morgenstern (FGM) distribution and then model the dependence parameter as a function of geographic and demographic pairwise distances. For estimation of the dependence parameters, we present pairwise composite likelihood equations. We prove that the resulting estimators exhibit key properties of consistency and asymptotic normality under certain regularity conditions in the increasing-domain framework of spatial asymptotic theory. PMID:24223450

  19. Maximum likelihood method and Fisher's information in physics and econophysics

    CERN Document Server

    Syska, Jacek

    2012-01-01

    Three steps in the development of the maximum likelihood (ML) method are presented. First, the application of the ML method and the notion of Fisher information in model selection analysis is described (Chapter 1). The fundamentals of differential geometry in the construction of the statistical space are introduced, illustrated by examples of the estimation of exponential models. Second, the notions of relative entropy and information channel capacity are introduced (Chapter 2). The observed and expected structural information principle (IP) and the variational IP of the modified extremal physical information (EPI) method of Frieden and Soffer are presented and discussed (Chapter 3). The derivation of the structural IP, based on the analyticity of the logarithm of the likelihood function and on the metricity of the statistical space of the system, is given. Third, the use of the EPI method is developed (Chapters 4-5). The information channel capacity is used for the field theory models cl...

  20. Failed refutations: further comments on parsimony and likelihood methods and their relationship to Popper's degree of corroboration.

    Science.gov (United States)

    de Queiroz, Kevin; Poe, Steven

    2003-06-01

    Kluge's (2001, Syst. Biol. 50:322-330) continued arguments that phylogenetic methods based on the statistical principle of likelihood are incompatible with the philosophy of science described by Karl Popper are based on false premises related to Kluge's misrepresentations of Popper's philosophy. Contrary to Kluge's conjectures, likelihood methods are not inherently verificationist; they do not treat every instance of a hypothesis as confirmation of that hypothesis. The historical nature of phylogeny does not preclude phylogenetic hypotheses from being evaluated using the probability of evidence. The low absolute probabilities of hypotheses are irrelevant to the correct interpretation of Popper's concept termed degree of corroboration, which is defined entirely in terms of relative probabilities. Popper did not advocate minimizing background knowledge; in any case, the background knowledge of both parsimony and likelihood methods consists of the general assumption of descent with modification and additional assumptions that are deterministic, concerning which tree is considered most highly corroborated. Although parsimony methods do not assume (in the sense of entailing) that homoplasy is rare, they do assume (in the sense of requiring to obtain a correct phylogenetic inference) certain things about patterns of homoplasy. Both parsimony and likelihood methods assume (in the sense of implying by the manner in which they operate) various things about evolutionary processes, although violation of those assumptions does not always cause the methods to yield incorrect phylogenetic inferences. Test severity is increased by sampling additional relevant characters rather than by character reanalysis, although either interpretation is compatible with the use of phylogenetic likelihood methods. Neither parsimony nor likelihood methods assess test severity (critical evidence) when used to identify a most highly corroborated tree(s) based on a single method or model and a

  1. A likelihood approach to estimate the number of co-infections.

    Directory of Open Access Journals (Sweden)

    Kristan A Schneider

    Full Text Available The number of co-infections of a pathogen (multiplicity of infection or MOI) is a relevant parameter in epidemiology as it relates to transmission intensity. Notably, such quantities can be built into a metric in the context of disease control and prevention. Having applications to malaria in mind, we develop here a maximum-likelihood (ML) framework to estimate the quantities of interest at low computational cost and with no additional costs to study designs or data collection. We show how the ML estimate for the quantities of interest and corresponding confidence regions are obtained from multiple genetic loci. Assuming specifically that infections are rare and independent events, the number of infections per host follows a conditional Poisson distribution. Under this assumption, we show that a unique ML estimate for the parameter λ describing MOI exists, which is found by a simple recursion. Moreover, we provide explicit formulas for asymptotic confidence intervals, and show that profile-likelihood-based confidence intervals exist, which are found by a simple two-dimensional recursion. Based on the confidence intervals we provide alternative statistical tests for the MOI parameter. Finally, we illustrate the methods on three malaria data sets. The statistical framework, however, is not limited to malaria.
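
    A minimal sketch of the zero-truncated-Poisson building block: the MLE of λ solves mean = λ/(1 − e^(−λ)) and can be found by the kind of simple recursion the abstract refers to. (The paper's full framework combines information across multiple genetic loci; the MOI counts below are hypothetical.)

    ```python
    # Minimal sketch: ML estimate of lambda under a zero-truncated
    # (conditional) Poisson, via fixed-point recursion.
    import numpy as np

    def fit_lambda(moi_counts, tol=1e-10, max_iter=200):
        """moi_counts: observed MOI per host, each >= 1."""
        m = np.mean(moi_counts)   # sample mean of truncated counts
        lam = m                   # starting value
        for _ in range(max_iter):
            lam_new = m * (1 - np.exp(-lam))  # E[N | N>=1] = lam/(1-e^-lam)
            if abs(lam_new - lam) < tol:
                break
            lam = lam_new
        return lam

    print(fit_lambda([1, 1, 2, 1, 3, 2, 1, 1, 2, 4]))
    ```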

  2. Maximum Likelihood Under Response Biased Sampling

    OpenAIRE

    Chambers, Raymond; Dorfman, Alan; Wang, Suojin

    2003-01-01

    Informative sampling occurs when the probability of inclusion in the sample depends on the value of the survey response variable. Response or size biased sampling is a particular case of informative sampling where the inclusion probability is proportional to the value of this variable. In this paper we describe a general model for response biased sampling, which we call array sampling, and develop maximum likelihood and estimating equation theory appropriate to this situation. The ...

  3. Likelihood-based inference for clustered line transect data

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus Plenge; Schweder, Tore

    The uncertainty in estimation of spatial animal density from line transect surveys depends on the degree of spatial clustering in the animal population. To quantify the clustering we model line transect data as independent thinnings of spatial shot-noise Cox processes. Likelihood-based inference...... in an example concerning minke whales in the North Atlantic. Our modelling and computational approach is flexible but demanding in terms of computing time....

  4. Forecasting New Product Sales from Likelihood of Purchase Ratings

    OpenAIRE

    William J. Infosino

    1986-01-01

    This paper compares consumer likelihood of purchase ratings for a proposed new product to their actual purchase behavior after the product was introduced. The ratings were obtained from a mail survey a few weeks before the product was introduced. The analysis leads to a model for forecasting new product sales. The model is supported by both empirical evidence and a reasonable theoretical foundation. In addition to calibrating the relationship between questionnaire ratings and actual purchases...

  5. Improved Likelihood Function in Particle-based IR Eye Tracking

    DEFF Research Database (Denmark)

    Satria, R.; Sorensen, J.; Hammoud, R.

    2005-01-01

    In this paper we propose a log likelihood-ratio function of foreground and background models used in a particle filter to track the eye region in dark-bright pupil image sequences. This model fuses information from both dark and bright pupil images and their difference image into one model. Our...... performance in challenging sequences with test subjects showing large head movements and under significant light conditions....

  6. Australian food life style segments and elaboration likelihood differences

    DEFF Research Database (Denmark)

    Brunsø, Karen; Reid, Mike

    As the global food marketing environment becomes more competitive, the international and comparative perspective of consumers' attitudes and behaviours becomes more important for both practitioners and academics. This research employs the Food-Related Life Style (FRL) instrument in Australia...... insights into cross-cultural similarities and differences, into elaboration likelihood differences among consumer segments, and show how the involvement construct may be used as basis for communication development....

  7. Maximizing Friend-Making Likelihood for Social Activity Organization

    Science.gov (United States)

    2015-05-22

    the interplay of the group size, the constraint on existing friendships, and the objective function on the likelihood of friend making. We prove that... social networks (OSNs), e.g., Facebook, Meetup, and Skout, more and more people initiate friend gatherings or group activities via these OSNs. For example, more than 16 million events are created on Facebook each month to organize various kinds of activities, and more than 500 thousand face

  8. Penalized maximum likelihood estimation for generalized linear point processes

    OpenAIRE

    2010-01-01

    A generalized linear point process is specified in terms of an intensity that depends upon a linear predictor process through a fixed non-linear function. We present a framework where the linear predictor is parametrized by a Banach space and give results on Gateaux differentiability of the log-likelihood. Of particular interest is when the intensity is expressed in terms of a linear filter parametrized by a Sobolev space. Using that the Sobolev spaces are reproducing kernel Hilbert spaces we...

  9. Semidefinite Programming for Approximate Maximum Likelihood Sinusoidal Parameter Estimation

    OpenAIRE

    2009-01-01

    We study the convex optimization approach for parameter estimation of several sinusoidal models, namely, single complex/real tone, multiple complex sinusoids, and single two-dimensional complex tone, in the presence of additive Gaussian noise. The major difficulty for optimally determining the parameters is that the corresponding maximum likelihood (ML) estimators involve finding the global minimum or maximum of multimodal cost functions because the frequencies are nonlinear in the observed s...

  10. Maximum Likelihood Sequence Detection Receivers for Nonlinear Optical Channels

    OpenAIRE

    2015-01-01

    The space-time whitened matched filter (ST-WMF) maximum likelihood sequence detection (MLSD) architecture has been recently proposed (Maggio et al., 2014). Its objective is reducing implementation complexity in transmissions over nonlinear dispersive channels. The ST-WMF-MLSD receiver (i) drastically reduces the number of states of the Viterbi decoder (VD) and (ii) offers a smooth trade-off between performance and complexity. In this work the ST-WMF-MLSD receiver is investigated in detail. We...

  11. Influence functions of trimmed likelihood estimators for lifetime experiments

    OpenAIRE

    2015-01-01

    We provide a general approach for deriving the influence function for trimmed likelihood estimators using the implicit function theorem. The approach is applied to lifetime models with exponential or lognormal distributions possessing a linear or nonlinear link function. A side result is that the functional form of the trimmed estimator for location and linear regression used by Bednarski and Clarke (1993, 2002) and Bednarski et al. (2010) is not generally always the correct fu...

  12. Fertilization response likelihood for the interpretation of leaf analyses

    Directory of Open Access Journals (Sweden)

    Celsemy Eleutério Maia

    2012-04-01

    Full Text Available Leaf analysis is the chemical evaluation of the nutritional status, where the nutrient concentrations found in the tissue reflect the nutritional status of the plants. A correct interpretation of the results of leaf analysis is therefore fundamental for an effective use of this tool. The purpose of this study was to propose the method of Fertilization Response Likelihood (FRL) for the interpretation of leaf analysis and to compare it with the Diagnosis and Recommendation Integrated System (DRIS). The database consisted of 157 analyses of the N, P, K, Ca, Mg, S, Cu, Fe, Mn, Zn, and B concentrations in coffee leaves, which were divided into two groups: low yield (< 30 bags ha-1) and high yield (> 30 bags ha-1). The DRIS indices were calculated using the method proposed by Jones (1981). The fertilization response likelihood was computed based on the normal approximation. It was found that the FRL allowed an evaluation of the nutritional status of coffee trees, coinciding with the DRIS-based diagnoses for 84.96% of the crops.

  13. CMB likelihood approximation by a Gaussianized Blackwell-Rao estimator

    CERN Document Server

    Rudjord, Ø; Eriksen, H K; Huey, Greg; Górski, K M; Jewell, J B

    2008-01-01

    We introduce a new CMB temperature likelihood approximation called the Gaussianized Blackwell-Rao (GBR) estimator. This estimator is derived by transforming the observed marginal power spectrum distributions obtained by the CMB Gibbs sampler into standard univariate Gaussians, and then approximating their joint transformed distribution by a multivariate Gaussian. The method is exact for full-sky coverage and uniform noise, and an excellent approximation for sky cuts and scanning patterns relevant for modern satellite experiments such as WMAP and Planck. A single evaluation of this estimator between $\ell=2$ and 200 takes ~0.2 CPU milliseconds, while for comparison, a single pixel-space likelihood evaluation between $\ell=2$ and 30 for a map with ~2500 pixels requires ~20 seconds. We apply this tool to the 5-year WMAP temperature data, and re-estimate the angular temperature power spectrum, $C_{\ell}$, and likelihood, $L(C_{\ell})$, for $\ell \le 200$, and derive new cosmological parameters for the standard six-parameter LambdaCDM mo...

  14. Likelihood inference for a fractionally cointegrated vector autoregressive model

    DEFF Research Database (Denmark)

    Johansen, Søren; Nielsen, Morten Ørregaard

    We consider model based inference in a fractionally cointegrated (or cofractional) vector autoregressive model based on the conditional Gaussian likelihood. The model allows the process X_{t} to be fractional of order d and cofractional of order d-b; that is, there exist vectors β for which β......′X_{t} is fractional of order d-b. The parameters d and b satisfy either d≥b≥1/2, d=b≥1/2, or d=d_{0}≥b≥1/2. Our main technical contribution is the proof of consistency of the maximum likelihood estimators on the set 1/2≤b≤d≤d_{1} for any d_{1}≥d_{0}. To this end, we consider the conditional likelihood as a stochastic...... process in the parameters, and prove that it converges in distribution when errors are i.i.d. with suitable moment conditions and initial values are bounded. We then prove that the estimator of β is asymptotically mixed Gaussian and estimators of the remaining parameters are asymptotically Gaussian. We...

  15. Likelihood Inference for a Fractionally Cointegrated Vector Autoregressive Model

    DEFF Research Database (Denmark)

    Johansen, Søren; Nielsen, Morten Ørregaard

    We consider model based inference in a fractionally cointegrated (or cofractional) vector autoregressive model based on the conditional Gaussian likelihood. The model allows the process X(t) to be fractional of order d and cofractional of order d-b; that is, there exist vectors β for which β′X(t) is fractional of order d-b. The parameters d and b satisfy either d≥b≥1/2, d=b≥1/2, or d=d0≥b≥1/2. Our main technical contribution is the proof of consistency of the maximum likelihood estimators on the set 1/2≤b≤d≤d1 for any d1≥d0. To this end, we consider the conditional likelihood as a stochastic process in the parameters, and prove that it converges in distribution when errors are i.i.d. with suitable moment conditions and initial values are bounded. We then prove that the estimator of β is asymptotically mixed Gaussian and estimators of the remaining parameters are asymptotically Gaussian. We also find...

  16. Moment Conditions Selection Based on Adaptive Penalized Empirical Likelihood

    Directory of Open Access Journals (Sweden)

    Yunquan Song

    2014-01-01

    Full Text Available Empirical likelihood is a very popular method and has been widely used in the fields of artificial intelligence (AI) and data mining, as tablets, mobile applications, and social media dominate the technology landscape. This paper proposes an empirical likelihood shrinkage method to efficiently estimate unknown parameters and select correct moment conditions simultaneously, when the model is defined by moment restrictions of which some are possibly misspecified. We show that our method enjoys oracle-like properties; that is, it consistently selects the correct moment conditions and at the same time its estimator is as efficient as the empirical likelihood estimator obtained using all correct moment conditions. Moreover, unlike the GMM, our proposed method allows us to construct confidence regions for the parameters included in the model without estimating the covariances of the estimators. For empirical implementation, we provide some data-driven procedures for selecting the tuning parameter of the penalty function. The simulation results show that the method works remarkably well in terms of correct moment selection and the finite sample properties of the estimators. Also, a real-life example is carried out to illustrate the new methodology.

  17. Local solutions of Maximum Likelihood Estimation in Quantum State Tomography

    CERN Document Server

    Gonçalves, Douglas S; Lavor, Carlile; Farías, Osvaldo Jiménez; Ribeiro, P H Souto

    2011-01-01

    Maximum likelihood estimation is one of the most used methods in quantum state tomography, where the aim is to find the best density matrix for the description of a physical system. Results of measurements on the system should match the expected values produced by the density matrix. In some cases, however, if the matrix is parameterized to ensure positivity and unit trace, the negative log-likelihood function may have several local minima. In several papers in the field, authors attribute a source of errors to the possibility that most of these local minima are not global, so that optimization methods can be trapped in the wrong minimum, leading to a wrong density matrix. Here we show that, for convex negative log-likelihood functions, all local minima are global. We also show that a practical source of errors is in fact the use of optimization methods that do not have the global convergence property or that present numerical instabilities. The clarification of this point has important repercussions on quantum informat...

  18. Accurate determination of phase arrival times using autoregressive likelihood estimation

    Directory of Open Access Journals (Sweden)

    G. Kvaerna

    1994-06-01

    Full Text Available We have investigated the potential automatic use of an onset picker based on autoregressive likelihood estimation. Both a single component version and a three component version of this method have been tested on data from events located in the Khibiny Massif of the Kola peninsula, recorded at the Apatity array, the Apatity three component station and the ARCESS array. Using this method, we have been able to estimate onset times to an accuracy (standard deviation) of about 0.05 s for P phases and 0.15-0.20 s for S phases. These accuracies are as good as for analyst picks, and are considerably better than the accuracies of the current onset procedure used for processing of regional array data at NORSAR. In another application, we have developed a generic procedure to reestimate the onsets of all types of first arriving P phases. By again applying the autoregressive likelihood technique, we have obtained automatic onset times of a quality such that 70% of the automatic picks are within 0.1 s of the best manual pick. For the onset time procedure currently used at NORSAR, the corresponding number is 28%. Clearly, automatic reestimation of first arriving P onsets using the autoregressive likelihood technique has the potential of significantly reducing the retiming efforts of the analyst.
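
    A much-simplified picker in the same spirit can be sketched as follows; it scores each candidate onset with raw window variances rather than full autoregressive likelihoods, which is a common simplification of such onset estimators:

    ```python
    # Simplified two-window AIC onset picker (variance-based stand-in for
    # the autoregressive likelihood version); data are synthetic.
    import numpy as np

    def aic_onset(x, margin=10):
        """Return the sample index minimizing the two-window AIC."""
        n = len(x)
        aic = np.full(n, np.inf)
        for k in range(margin, n - margin):
            v1, v2 = np.var(x[:k]), np.var(x[k:])
            aic[k] = k * np.log(v1) + (n - k) * np.log(v2)
        return int(np.argmin(aic))

    rng = np.random.default_rng(3)
    noise = rng.normal(0, 1.0, 400)
    signal = np.concatenate([np.zeros(250), rng.normal(0, 5.0, 150)])
    print("picked onset:", aic_onset(noise + signal))  # near sample 250
    ```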

  19. Maximum likelihood tuning of a vehicle motion filter

    Science.gov (United States)

    Trankle, Thomas L.; Rabin, Uri H.

    1990-01-01

    This paper describes the use of maximum likelihood estimation of unknown parameters appearing in a nonlinear vehicle motion filter. The filter uses the kinematic equations of motion of a rigid body in motion over a spherical earth. The nine states of the filter represent vehicle velocity, attitude, and position. The inputs to the filter are three components of translational acceleration and three components of angular rate. Measurements used to update states include air data, altitude, position, and attitude. Expressions are derived for the elements of filter matrices needed to use air data in a body-fixed frame with filter states expressed in a geographic frame. An expression for the likelihood function of the data is given, along with accurate approximations for the function's gradient and Hessian with respect to unknown parameters. These are used by a numerical quasi-Newton algorithm for maximizing the likelihood function of the data in order to estimate the unknown parameters. The parameter estimation algorithm is useful for processing data from aircraft flight tests or for tuning inertial navigation systems.

  20. Empirical likelihood method for non-ignorable missing data problems.

    Science.gov (United States)

    Guan, Zhong; Qin, Jing

    2017-01-01

    The missing response problem is ubiquitous in survey sampling, medical, social science and epidemiology studies. It is well known that non-ignorable missingness is the most difficult missing data problem, in which the missingness of a response depends on its own value. In the statistical literature, unlike the ignorable missing data problem, not many papers on non-ignorable missing data are available except for fully parametric model based approaches. In this paper we study a semiparametric model for non-ignorable missing data in which the missing probability is known up to some parameters, but the underlying distributions are not specified. By employing the empirical likelihood method of Owen (1988) we can obtain the constrained maximum empirical likelihood estimators of the parameters in the missing probability and the mean response, which are shown to be asymptotically normal. Moreover, the likelihood ratio statistic can be used to test whether the missingness of the responses is non-ignorable or completely at random. The theoretical results are confirmed by a simulation study. As an illustration, the analysis of a real AIDS trial data set shows that the missingness of CD4 counts around two years is non-ignorable and the sample mean based on observed data only is biased.
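
    The empirical likelihood machinery underlying this approach can be sketched in its simplest, fully observed form: testing a hypothesized mean à la Owen. The paper's method adds constraints that encode the parametric non-ignorable missingness probability, which this sketch omits:

    ```python
    # Minimal sketch of Owen-style empirical likelihood for a mean.
    import numpy as np
    from scipy.optimize import brentq

    def el_log_ratio(x, mu):
        """-2 log empirical likelihood ratio for H0: E[X] = mu."""
        d = x - mu
        if d.min() >= 0 or d.max() <= 0:  # mu outside the convex hull
            return np.inf
        eps = 1e-8
        lo, hi = -1 / d.max() + eps, -1 / d.min() - eps

        def g(t):                          # estimating equation in t
            return np.sum(d / (1 + t * d))

        t = brentq(g, lo, hi)              # Lagrange multiplier
        return 2 * np.sum(np.log1p(t * d))  # ~ chi-square(1) under H0

    rng = np.random.default_rng(4)
    x = rng.exponential(2.0, size=100)
    print(f"-2 log ELR at mu=2.0: {el_log_ratio(x, 2.0):.3f}")
    ```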

  1. Theory and context / Theory in context

    DEFF Research Database (Denmark)

    Glaveanu, Vlad Petre

    2014-01-01

    It is debatable whether the psychology of creativity is a field in crisis or not. There are clear signs of increased fragmentation and a scarcity of integrative efforts, but is this necessarily bad? Do we need more comprehensive theories of creativity and a return to old epistemological...... questions? This depends on how one understands theory. Against a view of theoretical work as aiming towards generality, universality, uniformity, completeness, and singularity, I advocate for a dynamic perspective in which theory is plural, multifaceted, and contextual. Far from ‘waiting for the Messiah......’, theoretical work in the psychology of creativity can be integrative without having the ambition to explain or, even more, predict, creative expression across all people, at all times, and in all domains. To avoid such ambition, the psychology of creativity requires a theory of context that doesn...

  2. How are important life events disclosed on facebook? Relationships with likelihood of sharing and privacy.

    Science.gov (United States)

    Bevan, Jennifer L; Cummings, Megan B; Kubiniec, Ashley; Mogannam, Megan; Price, Madison; Todd, Rachel

    2015-01-01

    This study examined an aspect of Facebook disclosure that has as yet gone unexplored: whether a user prefers to share information directly, for example, through status updates, or indirectly, via photos with no caption or relationship status changes without context or explanation. The focus was on the sharing of important positive and negative life events related to romantic relationships, health, and work/school in relation to likelihood of sharing this type of information on Facebook and general attitudes toward privacy. An online survey of 599 adult Facebook users found that when positive life events were shared, users preferred to do so indirectly, whereas negative life events were more likely to be disclosed directly. Privacy shared little association with how information was shared. Implications for understanding the finer nuances of how news is shared on Facebook are discussed.

  3. Comparisons of likelihood and machine learning methods of individual classification

    Science.gov (United States)

    Guinand, B.; Topchy, A.; Page, K.S.; Burnham-Curtis, M. K.; Punch, W.F.; Scribner, K.T.

    2002-01-01

    Classification methods used in machine learning (e.g., artificial neural networks, decision trees, and k-nearest neighbor clustering) are rarely used with population genetic data. We compare different nonparametric machine learning techniques with parametric likelihood estimations commonly employed in population genetics for purposes of assigning individuals to their population of origin (“assignment tests”). Classifier accuracy was compared across simulated data sets representing different levels of population differentiation (low and high FST), number of loci surveyed (5 and 10), and allelic diversity (average of three or eight alleles per locus). Empirical data for the lake trout (Salvelinus namaycush) exhibiting levels of population differentiation comparable to those used in simulations were examined to further evaluate and compare classification methods. Classification error rates associated with artificial neural networks and likelihood estimators were lower for simulated data sets compared to k-nearest neighbor and decision tree classifiers over the entire range of parameters considered. Artificial neural networks only marginally outperformed the likelihood method for simulated data (0–2.8% lower error rates). The relative performance of each machine learning classifier improved relative to likelihood estimators for empirical data sets, suggesting an ability to “learn” and utilize properties of empirical genotypic arrays intrinsic to each population. Likelihood-based estimation methods provide a more accessible option for reliable assignment of individuals to the population of origin due to the intricacies in development and evaluation of artificial neural networks. In recent years, characterization of highly polymorphic molecular markers such as mini- and microsatellites and development of novel methods of analysis have enabled researchers to extend investigations of ecological and evolutionary processes below the population level to the level of
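
    A minimal sketch of the likelihood-based assignment test: assuming Hardy-Weinberg proportions within each candidate population and independent loci (allele frequencies below are hypothetical), an individual is assigned to the population maximizing the genotype log-likelihood:

    ```python
    # Minimal sketch: likelihood assignment of one diploid individual.
    import numpy as np

    # freqs[pop][locus][allele] -> allele frequency (hypothetical values)
    freqs = {
        "pop1": [{"A": 0.7, "B": 0.3}, {"C": 0.6, "D": 0.4}],
        "pop2": [{"A": 0.2, "B": 0.8}, {"C": 0.1, "D": 0.9}],
    }

    def genotype_loglik(genotype, pop_freqs):
        ll = 0.0
        for (a1, a2), locus in zip(genotype, pop_freqs):
            p, q = locus[a1], locus[a2]
            ll += np.log(p * q if a1 == a2 else 2 * p * q)  # HW genotype prob
        return ll

    individual = [("A", "B"), ("C", "C")]
    scores = {pop: genotype_loglik(individual, f) for pop, f in freqs.items()}
    print(max(scores, key=scores.get), scores)  # assign to argmax population
    ```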

  4. Communicating likelihoods and probabilities in forecasts of volcanic eruptions

    Science.gov (United States)

    Doyle, Emma E. H.; McClure, John; Johnston, David M.; Paton, Douglas

    2014-02-01

    The issuing of forecasts and warnings of natural hazard events, such as volcanic eruptions, earthquake aftershock sequences and extreme weather often involves the use of probabilistic terms, particularly when communicated by scientific advisory groups to key decision-makers, who can differ greatly in relative expertise and function in the decision making process. Recipients may also differ in their perception of relative importance of political and economic influences on interpretation. Consequently, the interpretation of these probabilistic terms can vary greatly due to the framing of the statements, and whether verbal or numerical terms are used. We present a review from the psychology literature on how the framing of information influences communication of these probability terms. It is also unclear as to how people rate their perception of an event's likelihood throughout a time frame when a forecast time window is stated. Previous research has identified that, when presented with a 10-year time window forecast, participants viewed the likelihood of an event occurring ‘today’ as being of less than that in year 10. Here we show that this skew in perception also occurs for short-term time windows (under one week) that are of most relevance for emergency warnings. In addition, unlike the long-time window statements, the use of the phrasing “within the next…” instead of “in the next…” does not mitigate this skew, nor do we observe significant differences between the perceived likelihoods of scientists and non-scientists. This finding suggests that effects occurring due to the shorter time window may be ‘masking’ any differences in perception due to wording or career background observed for long-time window forecasts. These results have implications for scientific advice, warning forecasts, emergency management decision-making, and public information as any skew in perceived event likelihood towards the end of a forecast time window may result in

  5. Laurinda Abreu; Patrice Bourdelais; Teresa Ortiz-Gómez; Guillermo Palacios, eds. Dynamics of health and welfare: texts and contexts. Dinámicas de salud y bienestar: textos y contextos [book review]

    OpenAIRE

    Arrizabalaga, Jon

    2009-01-01

    Dynamics of health and welfare: texts and contexts is an annotated anthology of past and present texts on public health and social welfare, and on their intersections with gender and with migration and urbanization processes, centred on Europe and Latin America. The material is grouped into three broad thematic fields, each in the care of a specific editorial team: "Health and welfare" (Laurinda Abreu and Patrice Bourdelais, p. 13-100), ...

  6. Dynamics

    CERN Document Server

    Goodman, Lawrence E

    2001-01-01

    Beginning text presents complete theoretical treatment of mechanical model systems and deals with technological applications. Topics include introduction to calculus of vectors, particle motion, dynamics of particle systems and plane rigid bodies, technical applications in plane motions, theory of mechanical vibrations, and more. Exercises and answers appear in each chapter.

  7. Spatiotemporal Context Modelling in Pervasive Context-Aware Computing Environment: A Logic Perspective

    Directory of Open Access Journals (Sweden)

    Darine Ameyed

    2016-04-01

    Full Text Available Pervasive context-aware computing is one of the topics that has received particular attention from researchers. The context itself is an important notion explored in many works discussing its acquisition, definition, modelling, reasoning, and more. Given the permanent evolution of context-aware systems, context modelling is still a complex task, due to the lack of an adequate, dynamic, formal and relevant context representation. This paper discusses various context modelling approaches and previous logic-based works. It also proposes a preliminary formal spatiotemporal context model based on first-order logic, derived from the structure of natural languages.

  8. Early Course in Obstetrics Increases Likelihood of Practice Including Obstetrics.

    Science.gov (United States)

    Pearson, Jennifer; Westra, Ruth

    2016-10-01

    The Department of Family Medicine and Community Health Duluth has offered the Obstetrical Longitudinal Course (OBLC) as an elective for first-year medical students since 1999. The objective of the OBLC Impact Survey was to assess the effectiveness of the course over the past 15 years. A Qualtrics survey was emailed to participants enrolled in the course from 1999-2014. Data was compiled for the respondent group as a whole as well as four cohorts based on current level of training/practice. Cross-tabulations with Fisher's exact test were applied and odds ratios calculated for factors affecting likelihood of eventual practice including obstetrics. Participation in the OBLC was successful in increasing exposure, awareness, and comfort in caring for obstetrical patients and feeling more prepared for the OB-GYN Clerkship. A total of 50.5% of course participants felt the OBLC influenced their choice of specialty. For participants who are currently physicians, 51% are practicing family medicine with obstetrics or OB-GYN. Of the cohort of family physicians, 65.2% made the decision whether to include obstetrics in practice during medical school. Odds ratios show the likelihood of practicing obstetrics is higher when participants have completed the OBLC and also are practicing in a rural community. Early exposure to obstetrics, as provided by the OBLC, appears to increase the likelihood of including obstetrics in practice, especially if eventual practice is in a rural community. This course may be a tool to help create a pipeline for future rural family physicians providing obstetrical care.

  9. CUSUM control charts based on likelihood ratio for preliminary analysis

    Institute of Scientific and Technical Information of China (English)

    Yi DAI; Zhao-jun WANG; Chang-liang ZOU

    2007-01-01

    To detect and estimate a shift in either the mean or the deviation, or both, in a preliminary analysis, the statistical process control (SPC) tool of choice is the control chart based on the likelihood ratio test (LRT). Sullivan and Woodall pointed out that the test statistic lrt(n1, n2) is approximately distributed as χ2(2) when the sample sizes n, n1 and n2 are very large, for n1 = 2, 3, ..., n − 2 and n2 = n − n1. So it is inevitable that n1 or n2 is not large. In this paper the limit distribution of lrt(n1, n2) for fixed n1 or n2 is derived, and exact analytic formulae for evaluating the expectation and the variance of the limit distribution are also obtained. In addition, the properties of the standardized likelihood ratio statistic slr(n1, n) are discussed. Although slr(n1, n) contains the most important information, slr(i, n) (i ≠ n1) also contains a lot of information, which the cumulative sum (CUSUM) control chart can exploit. We therefore propose two CUSUM control charts based on the likelihood ratio statistics for the preliminary analysis of individual observations. One focuses on detecting shifts in location in the historical data, and the other is more general, detecting a shift in either the location or the scale, or both. Moreover, the simulated results show that the two proposed control charts are, respectively, superior to their competitors not only in the detection of sustained shifts but also in the detection of some other out-of-control situations considered in this paper.
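
    As a concrete special case, a one-sided CUSUM for a known upward mean shift δ in unit-variance Gaussian data accumulates the log-likelihood-ratio increments δx − δ²/2; the charts proposed in the paper are more general, targeting unknown shifts in location and/or scale:

    ```python
    # Minimal sketch: one-sided CUSUM with log-likelihood-ratio increments
    # for N(delta,1) against N(0,1); threshold h is hypothetical.
    import numpy as np

    def cusum(x, delta=1.0, h=5.0):
        """Return the alarm index, or -1 if the statistic never exceeds h."""
        s = 0.0
        for i, xi in enumerate(x):
            s = max(0.0, s + delta * xi - delta ** 2 / 2)  # LR increment
            if s > h:
                return i
        return -1

    rng = np.random.default_rng(5)
    x = np.concatenate([rng.normal(0, 1, 100), rng.normal(1, 1, 100)])
    print("alarm at sample:", cusum(x))  # expected shortly after sample 100
    ```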

  10. CUSUM control charts based on likelihood ratio for preliminary analysis

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    To detect and estimate a shift in either the mean or the deviation, or both, in a preliminary analysis, the statistical process control (SPC) tool of choice is the control chart based on the likelihood ratio test (LRT). Sullivan and Woodall pointed out that the test statistic lrt(n1, n2) is approximately distributed as χ2(2) when the sample sizes n, n1 and n2 are very large, for n1 = 2, 3, ..., n − 2 and n2 = n − n1. So it is inevitable that n1 or n2 is not large. In this paper the limit distribution of lrt(n1, n2) for fixed n1 or n2 is derived, and exact analytic formulae for evaluating the expectation and the variance of the limit distribution are also obtained. In addition, the properties of the standardized likelihood ratio statistic slr(n1, n) are discussed. Although slr(n1, n) contains the most important information, slr(i, n) (i ≠ n1) also contains a lot of information, which the cumulative sum (CUSUM) control chart can exploit. We therefore propose two CUSUM control charts based on the likelihood ratio statistics for the preliminary analysis of individual observations. One focuses on detecting shifts in location in the historical data, and the other is more general, detecting a shift in either the location or the scale, or both. Moreover, the simulated results show that the two proposed control charts are, respectively, superior to their competitors not only in the detection of sustained shifts but also in the detection of some other out-of-control situations considered in this paper.

  11. A Non-standard Empirical Likelihood for Time Series

    DEFF Research Database (Denmark)

    Nordman, Daniel J.; Bunzel, Helle; Lahiri, Soumendra N.

    Standard blockwise empirical likelihood (BEL) for stationary, weakly dependent time series requires specifying a fixed block length as a tuning parameter for setting confidence regions. This aspect can be difficult and impacts coverage accuracy. As an alternative, this paper proposes a new version......-standard asymptotics and requires a significantly different development compared to standard BEL. We establish the large-sample distribution of log-ratio statistics from the new BEL method for calibrating confidence regions for mean or smooth function parameters of time series. This limit law is not the usual chi...

  12. Bayesian and maximum likelihood estimation of genetic maps

    DEFF Research Database (Denmark)

    York, Thomas L.; Durrett, Richard T.; Tanksley, Steven;

    2005-01-01

    There has recently been increased interest in the use of Markov Chain Monte Carlo (MCMC)-based Bayesian methods for estimating genetic maps. The advantage of these methods is that they can deal accurately with missing data and genotyping errors. Here we present an extension of the previous methods...... that makes the Bayesian method applicable to large data sets. We present an extensive simulation study examining the statistical properties of the method and comparing it with the likelihood method implemented in Mapmaker. We show that the Maximum A Posteriori (MAP) estimator of the genetic distances...

  13. Maximum Likelihood Localization of Radiation Sources with unknown Source Intensity

    CERN Document Server

    Baidoo-Williams, Henry E

    2016-01-01

    In this paper, we consider a novel and robust maximum likelihood approach to localizing radiation sources with unknown statistics of the source signal strength. The result utilizes the smallest number of sensors required theoretically to localize the source. It is shown that, should the source lie in the open convex hull of the sensors, precisely $N+1$ are required in $\mathbb{R}^N, ~N \in \{1,\cdots,3\}$. It is further shown that the region of interest, the open convex hull of the sensors, is entirely devoid of false stationary points. An augmented gradient ascent algorithm with random projections, should an estimate escape the convex hull, is presented.

  14. Similar tests and the standardized log likelihood ratio statistic

    DEFF Research Database (Denmark)

    Jensen, Jens Ledet

    1986-01-01

    When testing an affine hypothesis in an exponential family, the 'ideal' procedure is to calculate the exact similar test, or an approximation to this, based on the conditional distribution given the minimal sufficient statistic under the null hypothesis. By contrast to this there is a 'primitive......' approach in which the marginal distribution of a test statistic is considered and any nuisance parameter appearing in the test statistic is replaced by an estimate. We show here that when using standardized likelihood ratio statistics the 'primitive' procedure is in fact an 'ideal' procedure to order O(n -3...

  15. MAXIMUM LIKELIHOOD ESTIMATION FOR PERIODIC AUTOREGRESSIVE MOVING AVERAGE MODELS.

    Science.gov (United States)

    Vecchia, A.V.

    1985-01-01

    A useful class of models for seasonal time series that cannot be filtered or standardized to achieve second-order stationarity is that of periodic autoregressive moving average (PARMA) models, which are extensions of ARMA models that allow periodic (seasonal) parameters. An approximation to the exact likelihood for Gaussian PARMA processes is developed, and a straightforward algorithm for its maximization is presented. The algorithm is tested on several periodic ARMA(1, 1) models through simulation studies and is compared to moment estimation via the seasonal Yule-Walker equations. Applicability of the technique is demonstrated through an analysis of a seasonal stream-flow series from the Rio Caroni River in Venezuela.

  16. AN EFFICIENT APPROXIMATE MAXIMUM LIKELIHOOD SIGNAL DETECTION FOR MIMO SYSTEMS

    Institute of Scientific and Technical Information of China (English)

    Cao Xuehong

    2007-01-01

    This paper proposes an efficient approximate Maximum Likelihood (ML) detection method for Multiple-Input Multiple-Output (MIMO) systems, which searches a local area instead of performing an exhaustive search and selects valid search points in each transmit antenna's signal constellation instead of the whole hyperplane, so that both the selection and search complexity are reduced significantly. The method trades off computational complexity against system performance by adjusting the neighborhood size used to select the valid search points. Simulation results show that the performance is comparable to that of ML detection while the complexity is only a small fraction of that of ML.
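
    A hedged sketch of the general idea, not the paper's exact selection rule: restrict the ML metric to the K nearest constellation points around the zero-forcing solution at each antenna, so a 2x2 QPSK search shrinks from 16 candidates to K^2.

```python
# Neighborhood-restricted ML detection for a toy 2x2 MIMO system with QPSK.
import itertools
import numpy as np

rng = np.random.default_rng(2)
qpsk = np.array([1+1j, 1-1j, -1+1j, -1-1j]) / np.sqrt(2)

H = (rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))) / np.sqrt(2)
s = qpsk[rng.integers(0, 4, size=2)]
y = H @ s + 0.1 * (rng.normal(size=2) + 1j * rng.normal(size=2))

# Per-antenna neighborhoods around the zero-forcing estimate (K = 2 of 4 points).
s_zf = np.linalg.solve(H, y)
K = 2
neigh = [qpsk[np.argsort(np.abs(qpsk - z))[:K]] for z in s_zf]

# ML metric restricted to the K^2 candidates instead of all 4^2.
best = min(itertools.product(*neigh),
           key=lambda c: np.linalg.norm(y - H @ np.array(c)))
print("detected:", np.array(best), " true:", s)
```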

  17. Maximum likelihood characterization of rotationally symmetric distributions on the sphere

    OpenAIRE

    Duerinckx, Mitia; Ley, Christophe

    2012-01-01

    A classical characterization result, which can be traced back to Gauss, states that the maximum likelihood estimator (MLE) of the location parameter equals the sample mean for all univariate samples of all sample sizes n if and only if the samples are drawn from a Gaussian population. A similar result, in the two-dimensional case, is given in von Mises (1918) for the Fisher-von Mises-Langevin (FVML) distribution, the equivalent of the Gaussian law on the unit circle. Half a century...

  18. Maximum-likelihood analysis of the COBE angular correlation function

    Science.gov (United States)

    Seljak, Uros; Bertschinger, Edmund

    1993-01-01

    We have used maximum-likelihood estimation to determine the quadrupole amplitude Q(sub rms-PS) and the spectral index n of the density fluctuation power spectrum at recombination from the COBE DMR data. We find a strong correlation between the two parameters of the form Q(sub rms-PS) = (15.7 +/- 2.6) exp (0.46(1 - n)) microK for fixed n. Our result is slightly smaller than and has a smaller statistical uncertainty than the 1992 estimate of Smoot et al.

  19. Adaptive quasi-likelihood estimate in generalized linear models

    Institute of Scientific and Technical Information of China (English)

    CHEN Xia; CHEN Xiru

    2005-01-01

    This paper gives a thorough theoretical treatment of the adaptive quasi-likelihood estimate of the parameters in generalized linear models. The unknown covariance matrix of the response variable is estimated from the sample. It is shown that the adaptive estimator defined in this paper is asymptotically most efficient in the sense that it is asymptotically normal and the covariance matrix of the limit distribution coincides with that of the quasi-likelihood estimator for the case in which the covariance matrix of the response variable is completely known.

  20. Planck intermediate results: XVI. Profile likelihoods for cosmological parameters

    DEFF Research Database (Denmark)

    Bartlett, J.G.; Cardoso, J.-F.; Delabrouille, J.

    2014-01-01

    mass distribution. By applying the Feldman-Cousins prescription, we again obtain results very similar to those of the Bayesian methodology. However, the profile-likelihood analysis of the cosmic microwave background (CMB) combination (Planck+WP+highL) reveals a minimum well within the unphysical...... negative-mass region. We show that inclusion of the Planck CMB-lensing information regularizes this issue, and provide a robust frequentist upper limit ∑mν ≤ 0.26 eV (95% confidence) from the CMB+lensing+BAO data combination. © ESO 2014....

  1. Maximum likelihood characterization of rotationally symmetric distributions on the sphere

    OpenAIRE

    Duerinckx, Mitia; Ley, Christophe

    2012-01-01

    A classical characterization result, which can be traced back to Gauss, states that the maximum likelihood estimator (MLE) of the location parameter equals the sample mean for all univariate samples of all sample sizes n if and only if the samples are drawn from a Gaussian population. A similar result, in the two-dimensional case, is given in von Mises (1918) for the Fisher-von Mises-Langevin (FVML) distribution, the equivalent of the Gaussian law on the unit circle. Half a century...

  2. Estimating epidemiological parameters for bovine tuberculosis in British cattle using a Bayesian partial-likelihood approach.

    Science.gov (United States)

    O'Hare, A; Orton, R J; Bessell, P R; Kao, R R

    2014-05-22

    Fitting models with Bayesian likelihood-based parameter inference is becoming increasingly important in infectious disease epidemiology. Detailed datasets present the opportunity to identify subsets of these data that capture important characteristics of the underlying epidemiology. One such dataset describes the epidemic of bovine tuberculosis (bTB) in British cattle, which is also an important exemplar of a disease with a wildlife reservoir (the Eurasian badger). Here, we evaluate a set of nested dynamic models of bTB transmission, including individual- and herd-level transmission heterogeneity and assuming minimal prior knowledge of the transmission and diagnostic test parameters. We performed a likelihood-based bootstrapping operation on the model to infer parameters based only on the recorded numbers of cattle testing positive for bTB at the start of each herd outbreak considering high- and low-risk areas separately. Models without herd heterogeneity are preferred in both areas though there is some evidence for super-spreading cattle. Similar to previous studies, we found low test sensitivities and high within-herd basic reproduction numbers (R0), suggesting that there may be many unobserved infections in cattle, even though the current testing regime is sufficient to control within-herd epidemics in most cases. Compared with other, more data-heavy approaches, the summary data used in our approach are easily collected, making our approach attractive for other systems.

  3. A probabilistic assessment of the likelihood of vegetation drought under varying climate conditions across China.

    Science.gov (United States)

    Liu, Zhiyong; Li, Chao; Zhou, Ping; Chen, Xiuzhi

    2016-10-07

    Climate change significantly impacts the vegetation growth and terrestrial ecosystems. Using satellite remote sensing observations, here we focus on investigating vegetation dynamics and the likelihood of vegetation-related drought under varying climate conditions across China. We first compare temporal trends of Normalized Difference Vegetation Index (NDVI) and climatic variables over China. We find that in fact there is no significant change in vegetation over the cold regions where warming is significant. Then, we propose a joint probability model to estimate the likelihood of vegetation-related drought conditioned on different precipitation/temperature scenarios in the growing season across China. To the best of our knowledge, this study is the first to examine the vegetation-related drought risk over China from a perspective based on joint probability. Our results demonstrate risk patterns of vegetation-related drought under both low and high precipitation/temperature conditions. We further identify the variations in vegetation-related drought risk under different climate conditions and the sensitivity of drought risk to climate variability. These findings provide insights for decision makers to evaluate drought risk and develop vegetation-related drought mitigation strategies over China in a warming world. The proposed methodology also has great potential to be applied for vegetation-related drought risk assessment in other regions worldwide.

  4. A probabilistic assessment of the likelihood of vegetation drought under varying climate conditions across China

    Science.gov (United States)

    Liu, Zhiyong; Li, Chao; Zhou, Ping; Chen, Xiuzhi

    2016-10-01

    Climate change significantly impacts the vegetation growth and terrestrial ecosystems. Using satellite remote sensing observations, here we focus on investigating vegetation dynamics and the likelihood of vegetation-related drought under varying climate conditions across China. We first compare temporal trends of Normalized Difference Vegetation Index (NDVI) and climatic variables over China. We find that in fact there is no significant change in vegetation over the cold regions where warming is significant. Then, we propose a joint probability model to estimate the likelihood of vegetation-related drought conditioned on different precipitation/temperature scenarios in the growing season across China. To the best of our knowledge, this study is the first to examine the vegetation-related drought risk over China from a perspective based on joint probability. Our results demonstrate risk patterns of vegetation-related drought under both low and high precipitation/temperature conditions. We further identify the variations in vegetation-related drought risk under different climate conditions and the sensitivity of drought risk to climate variability. These findings provide insights for decision makers to evaluate drought risk and develop vegetation-related drought mitigation strategies over China in a warming world. The proposed methodology also has great potential to be applied for vegetation-related drought risk assessment in other regions worldwide.
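
    A minimal empirical analogue of the conditional-probability quantity involved (the paper fits a parametric joint probability model; the synthetic data and thresholds here are our own assumptions):

```python
# Estimate P(NDVI below a drought threshold | precipitation scenario)
# by counting co-occurrences in synthetic data.
import numpy as np

rng = np.random.default_rng(3)
n = 5000
precip = rng.gamma(4.0, 25.0, size=n)                    # growing-season precipitation
ndvi = 0.3 + 0.002 * precip + rng.normal(0.0, 0.05, n)   # vegetation responds to rain

drought = ndvi < np.quantile(ndvi, 0.2)                  # "vegetation drought" events
low_p = precip < np.quantile(precip, 0.3)                # low-precipitation scenario
high_p = precip > np.quantile(precip, 0.7)               # high-precipitation scenario

print("P(drought | low precip)  =", drought[low_p].mean())
print("P(drought | high precip) =", drought[high_p].mean())
```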

  5. LikeDM: likelihood calculator of dark matter detection

    CERN Document Server

    Huang, Xiaoyuan; Yuan, Qiang

    2016-01-01

    With the large progress in searches for dark matter (DM) particles with indirect and direct methods, we develop a numerical tool that enables fast calculation of the likelihood of specified DM particle models given a number of observational data, such as charged cosmic rays from space-borne experiments (e.g., PAMELA, AMS-02), $\gamma$-rays from the Fermi space telescope, and the underground direct detection experiments. The purpose of this tool, \likedm\ --- likelihood calculator of dark matter detection, is to bridge the particle model of DM and the observational data. The intermediate steps between these two, including the astrophysical backgrounds, the propagation of charged particles, the analysis of Fermi $\gamma$-ray data, as well as the DM velocity distribution and the nuclear form factor, have been dealt with in the code. We release the first version (v1.0) focusing on the constraints from charged cosmic and gamma rays; the direct detection part will be implemented in the next version. This manual de...

  6. Likelihood Analysis of the Minimal AMSB Model arXiv

    CERN Document Server

    Bagnaschi, E.; Sakurai, K.; Buchmueller, O.; Cavanaugh, R.; Chobanova, V.; Citron, M.; Costa, J.C.; De Roeck, A.; Dolan, M.J.; Ellis, J.R.; Flächer, H.; Heinemeyer, S.; Isidori, G.; Lucio, M.; Luo, F.; Martínez Santos, D.; Olive, K.A.; Richards, A.; Weiglein, G.

    We perform a likelihood analysis of the minimal Anomaly-Mediated Supersymmetry Breaking (mAMSB) model using constraints from cosmology and accelerator experiments. We find that a wino-like or a Higgsino-like neutralino LSP with mass $m_{\tilde \chi^0_{1}}$ may provide the cold dark matter (DM) with similar likelihood. The upper limit on the DM density from Planck and other experiments enforces $m_{\tilde \chi^0_{1}} \lesssim 3~TeV$ after the inclusion of Sommerfeld enhancement in its annihilations. If most of the cold DM density is provided by the $\tilde \chi_0^1$, the measured value of the Higgs mass favours a limited range of $\tan \beta \sim 5$ (or, for $\mu > 0$, $\tan \beta \sim 45$) but the scalar mass $m_0$ is poorly constrained. In the wino-LSP case, $m_{3/2}$ is constrained to about $900~TeV$ and ${m_{\tilde \chi^0_{1}}}$ to $2.9\pm0.1~TeV$, whereas in the Higgsino-LSP case $m_{3/2}$ has just a lower limit $\gtrsim 650~TeV$ ($\gtrsim 480~TeV$) and $m_{\tilde \chi^0_{1}}$ is constrained to $1.12 ~(1.13) \pm0.02...

  7. The Multi-Mission Maximum Likelihood framework (3ML)

    CERN Document Server

    Vianello, Giacomo; Younk, Patrick; Tibaldo, Luigi; Burgess, James M; Ayala, Hugo; Harding, Patrick; Hui, Michelle; Omodei, Nicola; Zhou, Hao

    2015-01-01

    Astrophysical sources are now observed by many different instruments at different wavelengths, from radio to high-energy gamma-rays, with an unprecedented quality. Putting all these data together to form a coherent view, however, is a very difficult task. Each instrument has its own data format, software and analysis procedure, which are difficult to combine. It is for example very challenging to perform a broadband fit of the energy spectrum of the source. The Multi-Mission Maximum Likelihood framework (3ML) aims to solve this issue, providing a common framework which allows for a coherent modeling of sources using all the available data, independent of their origin. At the same time, thanks to its architecture based on plug-ins, 3ML uses the existing official software of each instrument for the corresponding data in a way which is transparent to the user. 3ML is based on the likelihood formalism, in which a model summarizing our knowledge about a particular region of the sky is convolved with the instrument...

  8. Hybrid pairwise likelihood analysis of animal behavior experiments.

    Science.gov (United States)

    Cattelan, Manuela; Varin, Cristiano

    2013-12-01

    The study of the determinants of fights between animals is an important issue in understanding animal behavior. For this purpose, tournament experiments among a set of animals are often used by zoologists. The results of these tournament experiments are naturally analyzed by paired comparison models. Proper statistical analysis of these models is complicated by the presence of dependence between the outcomes of fights because the same animal is involved in different contests. This paper discusses two different model specifications to account for between-fights dependence. Models are fitted through the hybrid pairwise likelihood method that iterates between optimal estimating equations for the regression parameters and pairwise likelihood inference for the association parameters. This approach requires the specification of means and covariances only. For this reason, the method can be applied also when the computation of the joint distribution is difficult or inconvenient. The proposed methodology is investigated by simulation studies and applied to real data about adult male Cape Dwarf Chameleons. © 2013, The International Biometric Society.
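
    For orientation, the independence-based core of paired-comparison modeling can be fit in a few lines. The sketch below runs Hunter's MM algorithm for the classical Bradley-Terry model on invented tournament counts; it omits the between-fight dependence that motivates the paper's hybrid pairwise likelihood method.

```python
# Bradley-Terry abilities via Hunter's MM algorithm on toy tournament data.
import numpy as np

wins = np.array([[0, 3, 2],     # wins[i, j] = fights animal i won against animal j
                 [1, 0, 4],
                 [2, 1, 0]])
n_fights = wins + wins.T        # total fights between each pair

p = np.ones(3)                  # initial abilities
for _ in range(200):
    # MM update: p_i <- W_i / sum_j n_ij / (p_i + p_j)
    denom = n_fights / (p[:, None] + p[None, :])
    np.fill_diagonal(denom, 0.0)
    p = wins.sum(axis=1) / denom.sum(axis=1)
    p /= p.sum()                # fix the scale, which is not identified

print("estimated abilities:", np.round(p, 3))
```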

  9. Menyoal Elaboration Likelihood Model (ELM dan Teori Retorika

    Directory of Open Access Journals (Sweden)

    Yudi Perbawaningsih

    2012-06-01

    Full Text Available Abstract: Persuasion is a communication process to establish or change attitudes, which can be understood through the theory of Rhetoric and the theory of the Elaboration Likelihood Model (ELM). This study elaborates these theories in a Public Lecture series used to persuade students in choosing their concentration of study. The result shows that, in terms of persuasion effectiveness, it is not quite relevant to separate the message from its source. The quality of the source is determined by the quality of the message, and vice versa. Separating the two routes of the persuasion process as described in ELM theory would not be relevant. Abstract (translated from Indonesian): Persuasion is a communication process to form or change attitudes, which can be understood through Rhetoric theory and the Elaboration Likelihood Model (ELM). This study elaborates these theories in a Public Lecture used as a means of persuading students to choose a concentration of study, based on information processing. Using a survey method, the study finds that it is not quite relevant to separate the message from its source when judging the effectiveness of persuasion. The two are fused, meaning that the quality of the source is determined by the quality of the message it delivers, and vice versa. Separating the persuasion process into two routes, as described in ELM theory, becomes irrelevant.

  10. Likelihood free inference for Markov processes: a comparison.

    Science.gov (United States)

    Owen, Jamie; Wilkinson, Darren J; Gillespie, Colin S

    2015-04-01

    Approaches to Bayesian inference for problems with intractable likelihoods have become increasingly important in recent years. Approximate Bayesian computation (ABC) and "likelihood free" Markov chain Monte Carlo techniques are popular methods for tackling inference in these scenarios but such techniques are computationally expensive. In this paper we compare the two approaches to inference, with a particular focus on parameter inference for stochastic kinetic models, widely used in systems biology. Discrete time transition kernels for models of this type are intractable for all but the most trivial systems yet forward simulation is usually straightforward. We discuss the relative merits and drawbacks of each approach whilst considering the computational cost implications and efficiency of these techniques. In order to explore the properties of each approach we examine a range of observation regimes using two example models. We use a Lotka-Volterra predator-prey model to explore the impact of full or partial species observations using various time course observations under the assumption of known and unknown measurement error. Further investigation into the impact of observation error is then made using a Schlögl system, a test case which exhibits bi-modal state stability in some regions of parameter space.
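
    The "likelihood free" logic is easiest to see in a toy ABC rejection sampler. Here a Poisson-rate problem stands in for the stochastic kinetic models in the paper, whose simulator would be a Gillespie algorithm; the prior, summary statistic, and tolerance are our illustrative choices.

```python
# Minimal ABC rejection sampler: no likelihood evaluation, only simulation.
import numpy as np

rng = np.random.default_rng(4)
data = rng.poisson(3.0, size=50)        # observed data, true rate 3.0
s_obs = data.mean()                     # summary statistic of the observations

accepted = []
while len(accepted) < 1000:
    theta = rng.uniform(0.0, 10.0)      # draw a candidate from the prior
    sim = rng.poisson(theta, size=50)   # forward-simulate; no likelihood needed
    if abs(sim.mean() - s_obs) < 0.1:   # keep draws whose summaries match
        accepted.append(theta)

print("ABC posterior mean:", np.mean(accepted))
```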

  11. On the Likelihood of Supernova Enrichment of Protoplanetary Disks

    Science.gov (United States)

    Williams, Jonathan P.; Gaidos, Eric

    2007-07-01

    We estimate the likelihood of direct injection of supernova ejecta into protoplanetary disks using a model in which the number of stars with disks decreases linearly with time, and clusters expand linearly with time such that their surface density is independent of stellar number. The similarity of disk dissipation and main-sequence lifetimes implies that the typical supernova progenitor is very massive, ~75-100 Msolar. Such massive stars are found only in clusters with >~10^4 members. Moreover, there is only a small region around a supernova within which disks can survive the blast yet be enriched to the level observed in the solar system. These two factors limit the overall likelihood of supernova enrichment of a protoplanetary disk to <1%. If the presence of short-lived radionuclides in meteorites is to be explained in this way, however, the solar system most likely formed in one of the largest clusters in the Galaxy, more than 2 orders of magnitude greater than Orion, where multiple supernovae impacted many disks in a short period of time.

  12. On the likelihood of supernova enrichment of protoplanetary disks

    CERN Document Server

    Williams, Jonathan P

    2007-01-01

    We estimate the likelihood of direct injection of supernova ejecta into protoplanetary disks using a model in which the number of stars with disks decreases linearly with time, and clusters expand linearly with time such that their surface density is independent of stellar number. The similarity of disk dissipation and main-sequence lifetimes implies that the typical supernova progenitor is very massive, ~ 75-100 Msun. Such massive stars are found only in clusters with > 10^4 members. Moreover, there is only a small region around a supernova within which disks can survive the blast yet be enriched to the level observed in the Solar System. These two factors limit the overall likelihood of supernova enrichment of a protoplanetary disk to < 1%. If the presence of short-lived radionuclides in meteorites is to be explained in this way, however, the Solar System most likely formed in one of the largest clusters in the Galaxy, more than two orders of magnitude greater than Orion, where multiple supernovae impac...

  13. Likelihood-based CT reconstruction of objects containing known components

    Energy Technology Data Exchange (ETDEWEB)

    Stayman, J. Webster [Johns Hopkins Univ., Baltimore, MD (United States). Dept. of Biomedical Engineering; Otake, Yoshito; Uneri, Ali; Prince, Jerry L.; Siewerdsen, Jeffrey H.

    2011-07-01

    There are many situations in medical imaging where there are known components within the imaging volume. Such is the case in diagnostic X-ray CT imaging of patients with implants, in intraoperative CT imaging where there may be surgical tools in the field, or in situations where the patient support (table or frame) or other devices are outside the (truncated) reconstruction FOV. In such scenarios it is often of great interest to image the relation between the known component and the surrounding anatomy, or to provide high-quality images at the boundary of these objects, or simply to minimize artifacts arising from such components. We propose a framework for simultaneously estimating the position and orientation of a known component and the surrounding volume. Toward this end, we adopt a likelihood-based objective function with an image volume jointly parameterized by a known object, or objects, with unknown registration parameters and an unknown background attenuation volume. The objective is solved iteratively using an alternating minimization approach between the two parameter types. Because this model integrates a substantial amount of prior knowledge about the overall volume, we expect a number of advantages including the reduction of metal artifacts, potential for more sparse data acquisition (decreased time and dose), and/or improved image quality. We illustrate this approach using simulated spine CT data that contains pedicle screws placed in a vertebra, and demonstrate improved performance over traditional filtered-backprojection and penalized-likelihood reconstruction techniques. (orig.)
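
    A 1D caricature of the alternating scheme: jointly estimate the (integer) shift of a known component and a smooth unknown background from their noisy sum. The grid search over shifts and the moving-average "regularizer" are our simplifications of the 3D registration and tomographic update.

```python
# Toy alternating minimization between registration and background estimation.
import numpy as np

rng = np.random.default_rng(5)
n = 200
template = np.exp(-0.5 * ((np.arange(40) - 20) / 4.0) ** 2)  # known "implant"

def place(shift):
    comp = np.zeros(n)
    comp[shift:shift + len(template)] = template
    return comp

background = 0.2 + 0.1 * np.sin(np.arange(n) / 15.0)
y = background + place(80) + rng.normal(0.0, 0.02, n)

bg = np.zeros(n)
for _ in range(5):
    # (1) registration step: best shift given the current background estimate
    shift = min(range(n - len(template)),
                key=lambda s: ((y - bg - place(s)) ** 2).sum())
    # (2) background step: residual after removing the known component,
    #     smoothed as a crude stand-in for a regularized reconstruction
    resid = y - place(shift)
    bg = np.convolve(resid, np.ones(9) / 9.0, mode="same")

print("estimated shift:", shift)  # should recover 80
```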

  14. Empirical likelihood ratio tests for multivariate regression models

    Institute of Scientific and Technical Information of China (English)

    WU Jianhong; ZHU Lixing

    2007-01-01

    This paper proposes some diagnostic tools for checking the adequacy of multivariate regression models, including classical regression and time series autoregression. In statistical inference, the empirical likelihood ratio method is well known to be a powerful tool for constructing tests and confidence regions. For model checking, however, the naive empirical likelihood (EL) based tests do not enjoy the Wilks phenomenon. Hence, we make use of bias correction to construct EL-based score tests and derive a nonparametric version of Wilks' theorem. Moreover, by the advantages of both the EL and the score test method, the EL-based score tests share many desirable features: they are self-scale invariant and can detect alternatives that converge to the null at the rate n^{-1/2}, the possibly fastest rate for lack-of-fit testing; they involve weight functions, which provides the flexibility to choose scores for improving power performance, especially under directional alternatives. Furthermore, when the alternatives are not directional, we construct asymptotically distribution-free maximin tests for a large class of possible alternatives. A simulation study is carried out and an application to a real dataset is analyzed.

  15. Gauging the likelihood of stable cavitation from ultrasound contrast agents.

    Science.gov (United States)

    Bader, Kenneth B; Holland, Christy K

    2013-01-07

    The mechanical index (MI) was formulated to gauge the likelihood of adverse bioeffects from inertial cavitation. However, the MI formulation did not consider bubble activity from stable cavitation. This type of bubble activity can be readily nucleated from ultrasound contrast agents (UCAs) and has the potential to promote beneficial bioeffects. Here, the presence of stable cavitation is determined numerically by tracking the onset of subharmonic oscillations within a population of bubbles for frequencies up to 7 MHz and peak rarefactional pressures up to 3 MPa. In addition, the acoustic pressure rupture threshold of a UCA population was determined using the Marmottant model. The threshold for subharmonic emissions of optimally sized bubbles was found to be lower than the inertial cavitation threshold for all frequencies studied. The rupture thresholds of optimally sized UCAs were found to be lower than the threshold for subharmonic emissions for either single-cycle or steady-state acoustic excitations. Because the thresholds of both subharmonic emissions and UCA rupture are linearly dependent on frequency, an index of the form I_CAV = p_r/f (where p_r is the peak rarefactional pressure in MPa and f is the frequency in MHz) was derived to gauge the likelihood of subharmonic emissions due to stable cavitation activity nucleated from UCAs.
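
    The proposed index is a one-liner; the helper names below are ours, and the mechanical index is included only for contrast.

```python
# The stable-cavitation index I_CAV = p_r / f versus the mechanical index.
import math

def cavitation_index(p_r_mpa: float, f_mhz: float) -> float:
    """Gauge the likelihood of UCA-nucleated stable cavitation (I_CAV = p_r / f)."""
    return p_r_mpa / f_mhz

def mechanical_index(p_r_mpa: float, f_mhz: float) -> float:
    """Classical index for inertial-cavitation bioeffects (MI = p_r / sqrt(f))."""
    return p_r_mpa / math.sqrt(f_mhz)

print(cavitation_index(1.5, 3.0), mechanical_index(1.5, 3.0))
```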

  16. tmle : An R Package for Targeted Maximum Likelihood Estimation

    Directory of Open Access Journals (Sweden)

    Susan Gruber

    2012-11-01

    Full Text Available Targeted maximum likelihood estimation (TMLE) is a general approach for constructing an efficient double-robust semi-parametric substitution estimator of a causal effect parameter or statistical association measure. tmle is a recently developed R package that implements TMLE of the effect of a binary treatment at a single point in time on an outcome of interest, controlling for user-supplied covariates; it estimates an additive treatment effect, relative risk, odds ratio, and the controlled direct effect of a binary treatment controlling for a binary intermediate variable on the pathway from treatment to the outcome. Estimation of the parameters of a marginal structural model is also available. The package allows outcome data with missingness, and experimental units that contribute repeated records of the point-treatment data structure, thereby allowing the analysis of longitudinal data structures. Relevant factors of the likelihood may be modeled or fit data-adaptively according to user specifications, or passed in from an external estimation procedure. Effect estimates, variances, p values, and 95% confidence intervals are provided by the software.

  17. Improved anomaly detection using multi-scale PLS and generalized likelihood ratio test

    KAUST Repository

    Madakyaru, Muddu

    2017-02-16

    Process monitoring has a central role in the process industry to enhance productivity, efficiency, and safety, and to avoid expensive maintenance. In this paper, a statistical approach that exploits the advantages of multiscale PLS models (MSPLS) and those of a generalized likelihood ratio (GLR) test to better detect anomalies is proposed. Specifically, to account for the multivariate and multi-scale nature of process dynamics, an MSPLS algorithm combining PLS and wavelet analysis is used as the modeling framework. Then, GLR hypothesis testing is applied to the uncorrelated residuals obtained from the MSPLS model to improve the anomaly detection abilities of these latent-variable-based fault detection methods even further. Applications to simulated distillation column data are used to evaluate the proposed MSPLS-GLR algorithm.
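
    A sketch of the GLR stage in isolation, under the simplifying assumption that the model residuals are i.i.d. standard normal in a healthy window: testing for a mean shift with known unit variance, twice the log likelihood ratio reduces to n times the squared sample mean.

```python
# GLR test for a mean shift in (assumed) standard-normal residuals.
import numpy as np
from scipy.stats import chi2

def glr_mean_shift(residuals: np.ndarray, alpha: float = 0.01) -> bool:
    n = len(residuals)
    stat = n * residuals.mean() ** 2               # 2 * log likelihood ratio
    return stat > chi2.ppf(1.0 - alpha, df=1)      # True -> flag an anomaly

rng = np.random.default_rng(6)
print(glr_mean_shift(rng.normal(0.0, 1.0, 100)))   # healthy window
print(glr_mean_shift(rng.normal(0.8, 1.0, 100)))   # window with a fault
```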

  18. Modified likelihood ratio test for homogeneity in bivariate normal mixtures with presence of a structural parameter

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    This paper investigates the asymptotic properties of the modified likelihood ratio statistic for testing homogeneity in bivariate normal mixture models with an unknown structural parameter. It is shown that the modified likelihood ratio statistic has a χ2(2) null limiting distribution.

  19. Modified likelihood ratio test for homogeneity in normal mixtures with two samples

    Institute of Scientific and Technical Information of China (English)

    QIN Yong-song; LEI Qing-zhu

    2008-01-01

    This paper investigates the modified likelihood ratio test (LRT) for homogeneity in normal mixtures of two samples with mixing proportions unknown. It is proved that the limiting distribution of the modified likelihood ratio test is χ2(1).

  20. Estimation for Non-Gaussian Locally Stationary Processes with Empirical Likelihood Method

    Directory of Open Access Journals (Sweden)

    Hiroaki Ogata

    2012-01-01

    Full Text Available An application of the empirical likelihood method to non-Gaussian locally stationary processes is presented. Based on the central limit theorem for locally stationary processes, we give the asymptotic distributions of the maximum empirical likelihood estimator and the empirical likelihood ratio statistics, respectively. It is shown that the empirical likelihood method enables us to make inferences on various important indices in a time series analysis. Furthermore, we give a numerical study and investigate a finite sample property.

  1. Is there a hierarchy of social inferences? The likelihood and speed of inferring intentionality, mind, and personality.

    Science.gov (United States)

    Malle, Bertram F; Holbrook, Jess

    2012-04-01

    People interpret behavior by making inferences about agents' intentionality, mind, and personality. Past research studied such inferences one at a time; in real life, people make these inferences simultaneously. The present studies therefore examined whether four major inferences (intentionality, desire, belief, and personality), elicited simultaneously in response to an observed behavior, might be ordered in a hierarchy of likelihood and speed. To achieve generalizability, the studies included a wide range of stimulus behaviors, presented them verbally and as dynamic videos, and assessed inferences both in a retrieval paradigm (measuring the likelihood and speed of accessing inferences immediately after they were made) and in an online processing paradigm (measuring the speed of forming inferences during behavior observation). Five studies provide evidence for a hierarchy of social inferences, from intentionality and desire to belief to personality, that is stable across verbal and visual presentations and that parallels the order found in developmental and primate research.

  2. Peptaibol antiamoebin I: spatial structure, backbone dynamics, interaction with bicelles and lipid-protein nanodiscs, and pore formation in context of barrel-stave model.

    Science.gov (United States)

    Shenkarev, Zakhar O; Paramonov, Alexander S; Lyukmanova, Ekaterina N; Gizatullina, Albina K; Zhuravleva, Anastasia V; Tagaev, Andrey A; Yakimenko, Zoya A; Telezhinskaya, Irina N; Kirpichnikov, Mikhail P; Ovchinnikova, Tatiana V; Arseniev, Alexander S

    2013-05-01

    Antiamoebin I (Aam-I) is a membrane-active peptaibol antibiotic isolated from fungal species belonging to the genera Cephalosporium, Emericellopsis, Gliocladium, and Stilbella. In comparison with other 16-amino acid-residue peptaibols, e.g., zervamicin IIB (Zrv-IIB), Aam-I possesses relatively weak biological and channel-forming activities. In MeOH solution, Aam-I demonstrates fast cooperative transitions between right-handed and left-handed helical conformation of the N-terminal (1-8) region. We studied Aam-I spatial structure and backbone dynamics in a membrane-mimicking environment (DMPC/DHPC bicelles) by heteronuclear 1H, 13C, 15N NMR spectroscopy. Interaction with the bicelles stabilizes the Aam-I right-handed helical conformation while retaining significant intramolecular mobility on the ms-μs time scale. Extensive ms-μs dynamics were also detected in the DPC and DHPC micelles and DOPG nanodiscs. In contrast, Zrv-IIB in the DPC micelles demonstrates appreciably lesser mobility on the μs-ms time scale. Titration with Mn2+ and 16-doxylstearate paramagnetic probes revealed Aam-I binding to the bicelle surface with the N-terminus slightly immersed into the hydrocarbon region. Fluctuations of the Aam-I helix between surface-bound and transmembrane (TM) states were observed in the nanodisc membranes formed from short-chain (di-C12:0) DLPC/DLPG lipids. All the obtained experimental data are in agreement with the barrel-stave model of TM pore formation, similarly to the mechanism proposed for Zrv-IIB and other peptaibols. The observed extensive intramolecular dynamics explains the relatively low activity of Aam-I.

  3. Using a dynamical advection to reconstruct a part of the SSH evolution in the context of SWOT, application to the Mediterranean Sea

    Science.gov (United States)

    Rogé, Marine; Morrow, Rosemary; Ubelmann, Clément; Dibarboure, Gérald

    2017-08-01

    The main oceanographic objective of the future SWOT mission is to better characterize the ocean mesoscale and sub-mesoscale circulation, by observing a finer range of ocean topography dynamics down to 20 km wavelength. Despite the very high spatial resolution of the future satellite, it will not capture the time evolution of the shorter mesoscale signals, such as the formation and evolution of small eddies. SWOT will have an exact repeat cycle of 21 days, with near repeats around 5-10 days, depending on the latitude. Here, we investigate a technique to reconstruct the missing 2D SSH signal in the time between two satellite revisits. We use the dynamical interpolation (DI) technique developed by Ubelmann et al. (2015). Based on potential vorticity (hereafter PV) conservation using a one and a half layer quasi-geostrophic model, it features an active advection of the SSH field. This model has been tested in energetic open ocean regions such as the Gulf Stream and the Californian Current, and has given promising results. Here, we test this model in the Western Mediterranean Sea, a lower energy region with complex small scale physics, and compare the SSH reconstruction with the high-resolution Symphonie model. We investigate an extension of the simple dynamical model including a separated mean circulation. We find that the DI gives a 16-18% improvement in the reconstruction of the surface height and eddy kinetic energy fields, compared with a simple linear interpolation, and a 37% improvement in the Northern Current subregion. Reconstruction errors are higher during winter and autumn but statistically, the improvement from the DI is also better for these seasons.

  4. Use of Context in Video Processing

    Science.gov (United States)

    Wu, Chen; Aghajan, Hamid

    Interpreting an event or a scene based on visual data often requires additional contextual information. Contextual information may be obtained from different sources. In this chapter, we discuss two broad categories of contextual sources: environmental context and user-centric context. Environmental context refers to information derived from domain knowledge or from concurrently sensed effects in the area of operation. User-centric context refers to information obtained and accumulated from the user. Both types of context can include static or dynamic contextual elements. Examples from a smart home environment are presented to illustrate how different types of contextual data can be applied to aid the decision-making process.

  5. Service Degradation in Context Management Frameworks

    DEFF Research Database (Denmark)

    Shawky, Ahmed; Olsen, Rasmus Løvenstein; Pedersen, Jens Myrup

    2011-01-01

    Context aware network services are a new and interesting way to enhance network users' experience. A context aware application/service enhances network performance in relation to dynamic context information, e.g. mobility, location and device information, as it senses and reacts to environment... changes. The reliability of the information accessed is a key factor in achieving reliable context aware applications. This paper will review the service degradation in Context Management Frameworks (CMF) and the effect of high network utilization, with particular focus on the reliability of the accessed...

  6. Applications of non-standard maximum likelihood techniques in energy and resource economics

    Science.gov (United States)

    Moeltner, Klaus

    Two important types of non-standard maximum likelihood techniques, Simulated Maximum Likelihood (SML) and Pseudo-Maximum Likelihood (PML), have only recently found consideration in the applied economic literature. The objective of this thesis is to demonstrate how these methods can be successfully employed in the analysis of energy and resource models. Chapter I focuses on SML. It constitutes the first application of this technique in the field of energy economics. The framework is as follows: Surveys on the cost of power outages to commercial and industrial customers usually capture multiple observations on the dependent variable for a given firm. The resulting pooled data set is censored and exhibits cross-sectional heterogeneity. We propose a model that addresses these issues by allowing regression coefficients to vary randomly across respondents and by using the Geweke-Hajivassiliou-Keane simulator and Halton sequences to estimate high-order cumulative distribution terms. This adjustment requires the use of SML in the estimation process. Our framework allows for a more comprehensive analysis of outage costs than existing models, which rely on the assumptions of parameter constancy and cross-sectional homogeneity. Our results strongly reject both of these restrictions. The central topic of the second Chapter is the use of PML, a robust estimation technique, in count data analysis of visitor demand for a system of recreation sites. PML has been popular with researchers in this context, since it guards against many types of mis-specification errors. We demonstrate, however, that estimation results will generally be biased even if derived through PML if the recreation model is based on aggregate, or zonal data. To countervail this problem, we propose a zonal model of recreation that captures some of the underlying heterogeneity of individual visitors by incorporating distributional information on per-capita income into the aggregate demand function. This adjustment

  7. H.264 SVC Complexity Reduction Based on Likelihood Mode Decision

    Directory of Open Access Journals (Sweden)

    L. Balaji

    2015-01-01

    Full Text Available H.264 Advanced Video Coding (AVC) was extended to Scalable Video Coding (SVC). SVC executes on different electronic gadgets such as personal computers, HDTV, SDTV, IPTV, and full-HDTV, in which users demand various scalings of the same content. The scalings include resolution, frame rate, quality, heterogeneous networks, bandwidth, and so forth. Scaling consumes more encoding time and computational complexity during mode selection. In this paper, to reduce encoding time and computational complexity, a fast mode decision algorithm based on likelihood mode decision (LMD) is proposed. LMD is evaluated in both temporal and spatial scaling. From the results, we conclude that LMD performs well when compared to previous fast mode decision algorithms. The comparison parameters are time, PSNR, and bit rate. LMD achieves a time saving of 66.65% with a 0.05% detriment in PSNR and a 0.17% increment in bit rate compared with the full search method.

  8. On the Performance of Maximum Likelihood Inverse Reinforcement Learning

    CERN Document Server

    Ratia, Héctor; Martinez-Cantin, Ruben

    2012-01-01

    Inverse reinforcement learning (IRL) addresses the problem of recovering a task description given a demonstration of the optimal policy used to solve such a task. The optimal policy is usually provided by an expert or teacher, making IRL specially suitable for the problem of apprenticeship learning. The task description is encoded in the form of a reward function of a Markov decision process (MDP). Several algorithms have been proposed to find the reward function corresponding to a set of demonstrations. One of the algorithms that has provided best results in different applications is a gradient method to optimize a policy squared error criterion. On a parallel line of research, other authors have presented recently a gradient approximation of the maximum likelihood estimate of the reward signal. In general, both approaches approximate the gradient estimate and the criteria at different stages to make the algorithm tractable and efficient. In this work, we provide a detailed description of the different metho...

  9. Maximum Likelihood Analysis of Low Energy CDMS II Germanium Data

    CERN Document Server

    Agnese, R; Balakishiyeva, D; Thakur, R Basu; Bauer, D A; Billard, J; Borgland, A; Bowles, M A; Brandt, D; Brink, P L; Bunker, R; Cabrera, B; Caldwell, D O; Cerdeno, D G; Chagani, H; Chen, Y; Cooley, J; Cornell, B; Crewdson, C H; Cushman, P; Daal, M; Di Stefano, P C F; Doughty, T; Esteban, L; Fallows, S; Figueroa-Feliciano, E; Fritts, M; Godfrey, G L; Golwala, S R; Graham, M; Hall, J; Harris, H R; Hertel, S A; Hofer, T; Holmgren, D; Hsu, L; Huber, M E; Jastram, A; Kamaev, O; Kara, B; Kelsey, M H; Kennedy, A; Kiveni, M; Koch, K; Leder, A; Loer, B; Asamar, E Lopez; Mahapatra, R; Mandic, V; Martinez, C; McCarthy, K A; Mirabolfathi, N; Moffatt, R A; Moore, D C; Nelson, R H; Oser, S M; Page, K; Page, W A; Partridge, R; Pepin, M; Phipps, A; Prasad, K; Pyle, M; Qiu, H; Rau, W; Redl, P; Reisetter, A; Ricci, Y; Rogers, H E; Saab, T; Sadoulet, B; Sander, J; Schneck, K; Schnee, R W; Scorza, S; Serfass, B; Shank, B; Speller, D; Upadhyayula, S; Villano, A N; Welliver, B; Wright, D H; Yellin, S; Yen, J J; Young, B A; Zhang, J

    2014-01-01

    We report on the results of a search for a Weakly Interacting Massive Particle (WIMP) signal in low-energy data of the Cryogenic Dark Matter Search (CDMS~II) experiment using a maximum likelihood analysis. A background model is constructed using GEANT4 to simulate the surface-event background from $^{210}$Pb decay-chain events, while using independent calibration data to model the gamma background. Fitting this background model to the data results in no statistically significant WIMP component. In addition, we perform fits using an analytic ad hoc background model proposed by Collar and Fields, who claimed to find a large excess of signal-like events in our data. We confirm the strong preference for a signal hypothesis in their analysis under these assumptions, but excesses are observed in both single- and multiple-scatter events, which implies the signal is not caused by WIMPs, but rather reflects the inadequacy of their background model.

  10. Maximum Likelihood Position Location with a Limited Number of References

    Directory of Open Access Journals (Sweden)

    D. Munoz-Rodriguez

    2011-04-01

    Full Text Available A Position Location (PL) scheme for mobile users on the outskirts of coverage areas is presented. The proposed methodology makes it possible to obtain location information with only two land-fixed references. We introduce a general formulation and show that maximum-likelihood estimation can provide adequate PL information in this scenario. The Root Mean Square (RMS) error and error-distribution characterization are obtained for different propagation scenarios. In addition, simulation results and comparisons to another method are provided, showing the accuracy and the robustness of the method proposed. We study accuracy limits of the proposed methodology for different propagation environments and show that even in the case of mismatch in the error variances, good PL estimation is feasible.
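
    Under Gaussian range errors the ML fix reduces to nonlinear least squares on the two range residuals. The geometry below, and the use of an initial guess on the correct side of the baseline to break the mirror ambiguity that two references leave, are our illustrative assumptions.

```python
# ML position fix from two fixed references with Gaussian range errors.
import numpy as np
from scipy.optimize import least_squares

refs = np.array([[0.0, 0.0], [10.0, 0.0]])    # two land-fixed references
true_pos = np.array([7.0, 5.0])

rng = np.random.default_rng(7)
ranges = np.linalg.norm(refs - true_pos, axis=1) + rng.normal(0.0, 0.1, 2)

# With two references the likelihood has mirror-image modes across the
# baseline; the initial guess selects the physically plausible one.
residual = lambda p: np.linalg.norm(refs - p, axis=1) - ranges
fit = least_squares(residual, x0=np.array([5.0, 1.0]))
print("ML estimate:", fit.x)
```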

  11. Does induction really reduce the likelihood of caesarean section?

    Science.gov (United States)

    Wickham, Sara

    2014-09-01

    Two recent systematic reviews have arrived at the same, rather surprising and somewhat counter-intuitive result. That is, contrary to the belief and experience of many people who work on labour wards every day, induction of labour doesn't increase the chance of caesarean section at all. In fact, the reviewers argue, their results demonstrate that induction of labour reduces the likelihood of caesarean section. It might be that our instincts are wrong, and that we need to reconsider what we think we know. But before we rush to recommend induction as the latest tool to promote normal birth, we might want to look a bit more closely at the evidence, as I am not at all certain that this apparently straightforward conclusion is quite as cut-and-dried as it sounds.

  12. The elaboration likelihood model and communication about food risks.

    Science.gov (United States)

    Frewer, L J; Howard, C; Hedderley, D; Shepherd, R

    1997-12-01

    Factors such as hazard type and source credibility have been identified as important in the establishment of effective strategies for risk communication. The elaboration likelihood model was adapted to investigate the potential impact of hazard type, information source, and persuasive content of information on individual engagement in elaborative, or thoughtful, cognitions about risk messages. One hundred sixty respondents were allocated to one of eight experimental groups, and the effects of source credibility, persuasive content of information, and hazard type were systematically varied. The impact of the different factors on beliefs about the information and on elaborative processing was examined. Low credibility was particularly important in reducing risk perceptions, although persuasive content and hazard type were also influential in determining whether elaborative processing occurred.

  13. Sparse-posterior Gaussian Processes for general likelihoods

    CERN Document Server

    Yuan; Abdel-Gawad, Ahmed H; Minka, Thomas P

    2012-01-01

    Gaussian processes (GPs) provide a probabilistic nonparametric representation of functions in regression, classification, and other problems. Unfortunately, exact learning with GPs is intractable for large datasets. A variety of approximate GP methods have been proposed that essentially map the large dataset into a small set of basis points. Among them, two state-of-the-art methods are the sparse pseudo-input Gaussian process (SPGP) (Snelson and Ghahramani, 2006) and the variable-sigma GP (VSGP) (Walder et al., 2008), which generalizes SPGP and allows each basis point to have its own length scale. However, VSGP was only derived for regression. In this paper, we propose a new sparse GP framework that uses expectation propagation to directly approximate general GP likelihoods using a sparse and smooth basis. It includes both SPGP and VSGP for regression as special cases. Moreover, as an EP algorithm, it inherits the ability to process data online. As a particular choice of approximating family, we blur each basis point with a...

  14. Evaluating maximum likelihood estimation methods to determine the hurst coefficients

    Science.gov (United States)

    Kendziorski, C. M.; Bassingthwaighte, J. B.; Tonellato, P. J.

    1999-12-01

    A maximum likelihood estimation method implemented in S-PLUS (S-MLE) to estimate the Hurst coefficient (H) is evaluated. The Hurst coefficient, with 0.5 < H < 1, characterizes long-memory time series by quantifying the rate of decay of the autocorrelation function. S-MLE was developed to estimate H for fractionally differenced (fd) processes. However, in practice it is difficult to distinguish between fd processes and fractional Gaussian noise (fGn) processes. Thus, the method is evaluated for estimating H for both fd and fGn processes. S-MLE gave biased results of H for fGn processes of any length and for fd processes of lengths less than 2^10. A modified method is proposed to correct for this bias. It gives reliable estimates of H for both fd and fGn processes of length greater than or equal to 2^11.
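
    For reference, exact Gaussian ML for H of an fGn series can be written directly from the fGn autocovariance; the dense O(n^3) implementation below is our own reference version, practical only for moderate n, and is not the S-PLUS code evaluated in the paper.

```python
# Exact Gaussian MLE of the Hurst coefficient for fractional Gaussian noise.
import numpy as np
from scipy.optimize import minimize_scalar

def fgn_acov(k, H):
    """Autocovariance of unit-variance fGn: 0.5(|k+1|^2H - 2|k|^2H + |k-1|^2H)."""
    k = np.abs(k).astype(float)
    return 0.5 * ((k + 1) ** (2 * H) - 2 * k ** (2 * H) + np.abs(k - 1) ** (2 * H))

def neg_loglik(H, x):
    n = len(x)
    C = fgn_acov(np.subtract.outer(np.arange(n), np.arange(n)), H)
    _, logdet = np.linalg.slogdet(C)
    alpha = np.linalg.solve(C, x)
    # Variance scale profiled out: sigma2_hat = x' C^{-1} x / n.
    return 0.5 * (logdet + n * np.log(x @ alpha / n))

# Synthetic fGn via the exact covariance (Cholesky); true H = 0.7.
rng = np.random.default_rng(8)
n, H_true = 512, 0.7
C_true = fgn_acov(np.subtract.outer(np.arange(n), np.arange(n)), H_true)
x = np.linalg.cholesky(C_true + 1e-10 * np.eye(n)) @ rng.normal(size=n)

res = minimize_scalar(neg_loglik, bounds=(0.51, 0.99), args=(x,), method="bounded")
print("H_hat =", round(res.x, 3))
```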

  15. Empirical likelihood for balanced ranked-set sampled data

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    Ranked-set sampling (RSS) often provides more efficient inference than simple random sampling (SRS). In this article, we propose a systematic nonparametric technique, RSS-EL, for hypothesis testing and interval estimation with balanced RSS data using empirical likelihood (EL). We detail the approach for interval estimation and hypothesis testing in one-sample and two-sample problems and general estimating equations. In all three cases, RSS is shown to provide more efficient inference than SRS of the same size. Moreover, the RSS-EL method does not require any easily violated assumptions needed by existing rank-based nonparametric methods for RSS data, such as perfect ranking, identical ranking scheme in two groups, and location shift between two population distributions. The merit of the RSS-EL method is also demonstrated through simulation studies.
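
    The EL machinery underneath is compact. The sketch below computes the classical one-sample empirical likelihood ratio for a mean under plain SRS, calibrated against chi-square(1) as in Wilks' theorem; it omits the ranked-set structure that RSS-EL adds.

```python
# One-sample empirical likelihood ratio test for a mean.
import numpy as np
from scipy.optimize import brentq
from scipy.stats import chi2

def el_ratio_stat(x, mu):
    d = x - mu
    if d.max() <= 0 or d.min() >= 0:
        return np.inf                       # mu outside the convex hull of the data
    # Lagrange multiplier: sum d_i / (1 + lam * d_i) = 0, with all weights positive.
    g = lambda lam: np.sum(d / (1.0 + lam * d))
    eps = 1e-8
    lam = brentq(g, -1.0 / d.max() + eps, -1.0 / d.min() - eps)
    return 2.0 * np.sum(np.log1p(lam * d))  # -2 log EL ratio, ~ chi2(1) under H0

rng = np.random.default_rng(9)
x = rng.exponential(2.0, size=80)
stat = el_ratio_stat(x, mu=2.0)
print("reject H0: mu = 2?", stat > chi2.ppf(0.95, df=1))
```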

  16. Marginal Maximum Likelihood Estimation of Item Response Models in R

    Directory of Open Access Journals (Sweden)

    Matthew S. Johnson

    2007-02-01

    Full Text Available Item response theory (IRT) models are a class of statistical models used by researchers to describe the response behaviors of individuals to a set of categorically scored items. The most common IRT models can be classified as generalized linear fixed- and/or mixed-effect models. Although IRT models appear most often in the psychological testing literature, researchers in other fields have successfully utilized IRT-like models in a wide variety of applications. This paper discusses the three major methods of estimation in IRT and develops R functions utilizing the built-in capabilities of the R environment to find the marginal maximum likelihood estimates of the generalized partial credit model. The currently available R package ltm is also discussed.

  17. Maximum likelihood identification of aircraft stability and control derivatives

    Science.gov (United States)

    Mehra, R. K.; Stepner, D. E.; Tyler, J. S.

    1974-01-01

    Application of a generalized identification method to flight test data analysis. The method is based on the maximum likelihood (ML) criterion and includes output error and equation error methods as special cases. Both the linear and nonlinear models with and without process noise are considered. The flight test data from lateral maneuvers of HL-10 and M2/F3 lifting bodies are processed to determine the lateral stability and control derivatives, instrumentation accuracies, and biases. A comparison is made between the results of the output error method and the ML method for M2/F3 data containing gusts. It is shown that better fits to time histories are obtained by using the ML method. The nonlinear model considered corresponds to the longitudinal equations of the X-22 VTOL aircraft. The data are obtained from a computer simulation and contain both process and measurement noise. The applicability of the ML method to nonlinear models with both process and measurement noise is demonstrated.

  18. Music genre classification via likelihood fusion from multiple feature models

    Science.gov (United States)

    Shiu, Yu; Kuo, C.-C. J.

    2005-01-01

    Music genre provides an efficient way to index songs in a music database and can be used as an effective means to retrieve music of a similar type, i.e. content-based music retrieval. A new two-stage scheme for music genre classification is proposed in this work. At the first stage, we examine a couple of different features, construct their corresponding parametric models (e.g. GMM and HMM) and compute their likelihood functions to yield soft classification results. In particular, the timbre, rhythm and temporal variation features are considered. Then, at the second stage, these soft classification results are integrated to produce a hard decision for final music genre classification. Experimental results are given to demonstrate the performance of the proposed scheme.
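
    A toy version of the two-stage fusion, with Gaussian blobs standing in for timbre/rhythm features and GMMs for the per-feature parametric models (the paper also uses HMMs); all data and dimensions below are invented for illustration.

```python
# Stage 1: per-(genre, feature) GMM log-likelihoods; stage 2: fuse by summing.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(10)
genres = ["rock", "jazz"]
train = {("rock", "timbre"): rng.normal(0.0, 1.0, (200, 4)),
         ("rock", "rhythm"): rng.normal(2.0, 1.0, (200, 3)),
         ("jazz", "timbre"): rng.normal(1.5, 1.0, (200, 4)),
         ("jazz", "rhythm"): rng.normal(-1.0, 1.0, (200, 3))}

models = {k: GaussianMixture(n_components=2, random_state=0).fit(v)
          for k, v in train.items()}

def classify(timbre, rhythm):
    feats = {"timbre": timbre, "rhythm": rhythm}
    # Stage 2: fuse stage-1 soft scores by summing log-likelihoods per genre.
    score = {g: sum(models[(g, f)].score_samples(v).sum()
                    for f, v in feats.items()) for g in genres}
    return max(score, key=score.get)

print(classify(rng.normal(0.0, 1.0, (50, 4)), rng.normal(2.0, 1.0, (50, 3))))
```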

  19. Analytical maximum likelihood estimation of stellar magnetic fields

    CERN Document Server

    González, M J Martínez; Ramos, A Asensio; Belluzzi, L

    2011-01-01

    The polarised spectrum of stellar radiation encodes valuable information on the conditions of stellar atmospheres and the magnetic fields that permeate them. In this paper, we give explicit expressions to estimate the magnetic field vector and its associated error from the observed Stokes parameters. We study the solar case, where specific intensities are observed, and then the stellar case, where we receive the polarised flux. In this second case, we concentrate on the explicit expression for the case of a slow rotator with a dipolar magnetic field geometry. Moreover, we also give explicit formulae to retrieve the magnetic field vector from the LSD profiles without assuming mean values for the LSD artificial spectral line. The formulae have been obtained assuming that the spectral lines can be described in the weak-field regime and using a maximum likelihood approach. The errors are recovered by means of the Hermitian matrix. The biases of the estimators are analysed in depth.
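
    Numerically, the weak-field ML estimate amounts to regressing Stokes V on the intensity derivative. The sketch below uses the textbook weak-field relation V = -C * B_los * dI/dlambda with C = 4.67e-13 * lambda0^2 * g_eff (lambda in Angstrom, B in Gauss) and a synthetic line; the constant and the profile are standard-reference assumptions of ours, not the paper's exact expressions.

```python
# Closed-form weak-field MLE of the line-of-sight field from Stokes V.
import numpy as np

lam0, g_eff = 6301.5, 1.67
C = 4.67e-13 * lam0**2 * g_eff                    # weak-field proportionality constant

lam = np.linspace(-0.4, 0.4, 200)                 # wavelength offset in Angstrom
I = 1.0 - 0.6 * np.exp(-(lam / 0.08) ** 2)        # synthetic intensity profile
dI = np.gradient(I, lam)

rng = np.random.default_rng(11)
B_true = 800.0                                    # Gauss
V = -C * B_true * dI + rng.normal(0.0, 1e-4, lam.size)

B_hat = -(V @ dI) / (C * (dI @ dI))               # Gaussian-noise MLE (linear regression)
sigma_B = 1e-4 / np.sqrt(C**2 * (dI @ dI))        # error from the Fisher information
print(f"B_los = {B_hat:.0f} +/- {sigma_B:.1f} G")
```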

  20. Narrow band interference cancelation in OFDM: A structured maximum likelihood approach

    KAUST Repository

    Sohail, Muhammad Sadiq

    2012-06-01

    This paper presents a maximum likelihood (ML) approach to mitigate the effect of narrow band interference (NBI) in a zero padded orthogonal frequency division multiplexing (ZP-OFDM) system. The NBI is assumed to be time variant and asynchronous with the frequency grid of the ZP-OFDM system. The proposed structure based technique uses the fact that the NBI signal is sparse as compared to the ZP-OFDM signal in the frequency domain. The structure is also useful in reducing the computational complexity of the proposed method. The paper also presents a data aided approach for improved NBI estimation. The suitability of the proposed method is demonstrated through simulations. © 2012 IEEE.

  1. Likelihood Approximation With Hierarchical Matrices For Large Spatial Datasets

    KAUST Repository

    Litvinenko, Alexander

    2017-09-03

    We use available measurements to estimate the unknown parameters (variance, smoothness parameter, and covariance length) of a covariance function by maximizing the joint Gaussian log-likelihood function. To overcome cubic complexity in the linear algebra, we approximate the discretized covariance function in the hierarchical (H-) matrix format. The H-matrix format has a log-linear computational cost and storage O(kn log n), where the rank k is a small integer and n is the number of locations. The H-matrix technique allows us to work with general covariance matrices in an efficient way, since H-matrices can approximate inhomogeneous covariance functions, with a fairly general mesh that is not necessarily axes-parallel, and neither the covariance matrix itself nor its inverse have to be sparse. We demonstrate our method with Monte Carlo simulations and an application to soil moisture data. The C, C++ codes and data are freely available.
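
    As a dense point of comparison for what the H-matrix format accelerates, the sketch below maximizes the joint Gaussian log-likelihood under an exponential covariance using O(n^3) linear algebra; the simulated field and the exponential kernel are our toy choices.

```python
# Dense Gaussian log-likelihood maximization over covariance parameters.
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.distance import cdist

rng = np.random.default_rng(12)
pts = rng.uniform(0.0, 1.0, (300, 2))             # irregular spatial locations
D = cdist(pts, pts)

def cov(params):
    var, ell = np.exp(params)                     # log-parametrization keeps them > 0
    return var * np.exp(-D / ell) + 1e-8 * np.eye(len(pts))

# Simulate one field realization with true variance 2.0 and length scale 0.3.
z = np.linalg.cholesky(cov(np.log([2.0, 0.3]))) @ rng.normal(size=len(pts))

def neg_loglik(params):
    C = cov(params)
    _, logdet = np.linalg.slogdet(C)              # O(n^3): what H-matrices avoid
    return 0.5 * (logdet + z @ np.linalg.solve(C, z))

res = minimize(neg_loglik, x0=np.log([1.0, 0.1]), method="Nelder-Mead")
print("variance, length scale:", np.exp(res.x))
```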

  2. CONSTRUCTING A FLEXIBLE LIKELIHOOD FUNCTION FOR SPECTROSCOPIC INFERENCE

    Energy Technology Data Exchange (ETDEWEB)

    Czekala, Ian; Andrews, Sean M.; Mandel, Kaisey S.; Green, Gregory M. [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Hogg, David W., E-mail: iczekala@cfa.harvard.edu [Center for Cosmology and Particle Physics, Department of Physics, New York University, 4 Washington Place, New York, NY, 10003 (United States)

    2015-10-20

    We present a modular, extensible likelihood framework for spectroscopic inference based on synthetic model spectra. The subtraction of an imperfect model from a continuously sampled spectrum introduces covariance between adjacent datapoints (pixels) into the residual spectrum. For the high signal-to-noise data with large spectral range that is commonly employed in stellar astrophysics, that covariant structure can lead to dramatically underestimated parameter uncertainties (and, in some cases, biases). We construct a likelihood function that accounts for the structure of the covariance matrix, utilizing the machinery of Gaussian process kernels. This framework specifically addresses the common problem of mismatches in model spectral line strengths (with respect to data) due to intrinsic model imperfections (e.g., in the atomic/molecular databases or opacity prescriptions) by developing a novel local covariance kernel formalism that identifies and self-consistently downweights pathological spectral line “outliers.” By fitting many spectra in a hierarchical manner, these local kernels provide a mechanism to learn about and build data-driven corrections to synthetic spectral libraries. An open-source software implementation of this approach is available at http://iancze.github.io/Starfish, including a sophisticated probabilistic scheme for spectral interpolation when using model libraries that are sparsely sampled in the stellar parameters. We demonstrate some salient features of the framework by fitting the high-resolution V-band spectrum of WASP-14, an F5 dwarf with a transiting exoplanet, and the moderate-resolution K-band spectrum of Gliese 51, an M5 field dwarf.

  3. Constructing a Flexible Likelihood Function for Spectroscopic Inference

    Science.gov (United States)

    Czekala, Ian; Andrews, Sean M.; Mandel, Kaisey S.; Hogg, David W.; Green, Gregory M.

    2015-10-01

    We present a modular, extensible likelihood framework for spectroscopic inference based on synthetic model spectra. The subtraction of an imperfect model from a continuously sampled spectrum introduces covariance between adjacent datapoints (pixels) into the residual spectrum. For the high signal-to-noise data with large spectral range that is commonly employed in stellar astrophysics, that covariant structure can lead to dramatically underestimated parameter uncertainties (and, in some cases, biases). We construct a likelihood function that accounts for the structure of the covariance matrix, utilizing the machinery of Gaussian process kernels. This framework specifically addresses the common problem of mismatches in model spectral line strengths (with respect to data) due to intrinsic model imperfections (e.g., in the atomic/molecular databases or opacity prescriptions) by developing a novel local covariance kernel formalism that identifies and self-consistently downweights pathological spectral line “outliers.” By fitting many spectra in a hierarchical manner, these local kernels provide a mechanism to learn about and build data-driven corrections to synthetic spectral libraries. An open-source software implementation of this approach is available at http://iancze.github.io/Starfish, including a sophisticated probabilistic scheme for spectral interpolation when using model libraries that are sparsely sampled in the stellar parameters. We demonstrate some salient features of the framework by fitting the high-resolution V-band spectrum of WASP-14, an F5 dwarf with a transiting exoplanet, and the moderate-resolution K-band spectrum of Gliese 51, an M5 field dwarf.

  4. Accelerated maximum likelihood parameter estimation for stochastic biochemical systems

    Directory of Open Access Journals (Sweden)

    Daigle Bernie J

    2012-05-01

    Full Text Available Abstract Background A prerequisite for the mechanistic simulation of a biochemical system is detailed knowledge of its kinetic parameters. Despite recent experimental advances, the estimation of unknown parameter values from observed data is still a bottleneck for obtaining accurate simulation results. Many methods exist for parameter estimation in deterministic biochemical systems; methods for discrete stochastic systems are less well developed. Given the probabilistic nature of stochastic biochemical models, a natural approach is to choose parameter values that maximize the probability of the observed data with respect to the unknown parameters, a.k.a. the maximum likelihood parameter estimates (MLEs). MLE computation for all but the simplest models requires the simulation of many system trajectories that are consistent with experimental data. For models with unknown parameters, this presents a computational challenge, as the generation of consistent trajectories can be an extremely rare occurrence. Results We have developed Monte Carlo Expectation-Maximization with Modified Cross-Entropy Method (MCEM2): an accelerated method for calculating MLEs that combines advances in rare event simulation with a computationally efficient version of the Monte Carlo expectation-maximization (MCEM) algorithm. Our method requires no prior knowledge regarding parameter values, and it automatically provides a multivariate parameter uncertainty estimate. We applied the method to five stochastic systems of increasing complexity, progressing from an analytically tractable pure-birth model to a computationally demanding model of yeast polarization. Our results demonstrate that MCEM2 substantially accelerates MLE computation on all tested models when compared to a stand-alone version of MCEM. Additionally, we show how our method identifies parameter values for certain classes of models more accurately than two recently proposed computationally efficient methods.
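
    A stripped-down Monte Carlo EM for the pure-birth case conveys the structure, with plain rejection sampling standing in for the cross-entropy rare-event machinery of MCEM2 (all numbers invented):

        # Pure-birth model X -> 2X at rate k, observing only the count at time T.
        # E-step: simulate trajectories and keep those matching the observation
        # (the rare-event bottleneck the paper accelerates).  M-step: the
        # complete-data MLE  k = (#births) / (integrated population time).
        import numpy as np

        rng = np.random.default_rng(3)
        x0, T, x_obs = 10, 1.0, 25

        def simulate(k):
            """Gillespie run; returns (final count, births, integral of n dt)."""
            t, n, births, area = 0.0, x0, 0, 0.0
            while True:
                dt = rng.exponential(1.0 / (k * n))
                if t + dt > T:
                    return n, births, area + (T - t) * n
                t += dt
                area += dt * n
                n += 1
                births += 1

        k = 0.5                                   # initial guess
        for _ in range(15):                       # MCEM iterations
            B = A = 0.0
            accepted = 0
            while accepted < 200:                 # E-step by rejection sampling
                n, births, area = simulate(k)
                if n == x_obs:
                    B, A, accepted = B + births, A + area, accepted + 1
            k = B / A                             # M-step
        print("estimated birth rate:", k)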

  5. Likelihood of tree topologies with fossils and diversification rate estimation.

    Science.gov (United States)

    Didier, Gilles; Fau, Marine; Laurin, Michel

    2017-04-18

    Since the diversification process cannot be directly observed at the human scale, it has to be studied from the information available, namely the extant taxa and the fossil record. In this sense, phylogenetic trees including both extant taxa and fossils are the most complete representations of the diversification process that one can get. Such phylogenetic trees can be reconstructed from molecular and morphological data, to some extent. Among the temporal information of such phylogenetic trees, fossil ages are by far the most precisely known (divergence times are inferences calibrated mostly with fossils). We propose here a method to compute the likelihood of a phylogenetic tree with fossils in which the only considered time information is the fossil ages, and apply it to the estimation of the diversification rates from such data. Since it is required in our computation, we provide a method for determining the probability of a tree topology under the standard diversification model. Testing our approach on simulated data shows that the maximum likelihood rate estimates from the phylogenetic tree topology and the fossil dates are almost as accurate as those obtained by taking into account all the data, including the divergence times. Moreover, they are substantially more accurate than the estimates obtained only from the exact divergence times (without taking into account the fossil record). We also provide an empirical example composed of 50 Permo-Carboniferous eupelycosaur (early synapsid) taxa ranging in age from about 315 Ma (Late Carboniferous) to 270 Ma (shortly after the end of the Early Permian). Our analyses suggest a speciation (cladogenesis, or birth) rate of about 0.1 per lineage and per My, a marginally lower extinction rate, and a considerable hidden paleobiodiversity of early synapsids. © The Author(s) 2017. Published by Oxford University Press, on behalf of the Society of Systematic Biologists. All rights reserved. For Permissions, please email

  6. Molecular clock fork phylogenies: closed form analytic maximum likelihood solutions.

    Science.gov (United States)

    Chor, Benny; Snir, Sagi

    2004-12-01

    Maximum likelihood (ML) is increasingly used as an optimality criterion for selecting evolutionary trees, but finding the global optimum is a hard computational task. Because no general analytic solution is known, numeric techniques such as hill climbing or expectation maximization (EM) are used in order to find optimal parameters for a given tree. So far, analytic solutions were derived only for the simplest model: three taxa, two-state characters, under a molecular clock. Quoting Ziheng Yang, who initiated the analytic approach, "this seems to be the simplest case, but has many of the conceptual and statistical complexities involved in phylogenetic estimation." In this work, we give general analytic solutions for a family of trees with four taxa, two-state characters, under a molecular clock. The change from three to four taxa incurs a major increase in the complexity of the underlying algebraic system, and requires novel techniques and approaches. We start by presenting the general maximum likelihood problem on phylogenetic trees as a constrained optimization problem, and the resulting system of polynomial equations. In full generality, it is infeasible to solve this system, therefore specialized tools for the molecular clock case are developed. Four-taxa rooted trees have two topologies: the fork (two subtrees with two leaves each) and the comb (one subtree with three leaves, the other with a single leaf). We combine the ultrametric properties of molecular clock fork trees with the Hadamard conjugation to derive a number of topology-dependent identities. Employing these identities, we substantially simplify the system of polynomial equations for the fork. We finally employ symbolic algebra software to obtain closed-form analytic solutions (expressed parametrically in the input data). In general, four-taxa trees can have multiple ML points. In contrast, we can now prove that each fork topology has a unique (local and global) ML point.

  7. Awareness of Entities, Activities and Contexts in Ambient Systems

    DEFF Research Database (Denmark)

    Kristensen, Bent Bruun

    2013-01-01

    Ambient systems are modeled by entities, activities and contexts, where entities exist in contexts and engage in activities. A context supports a dynamic collection of entities by services and offers awareness information about the entities. Activities also exist in contexts and model ongoing...

  8. Composite interval mapping of QTL for dynamic traits

    Institute of Scientific and Technical Information of China (English)

    GAO Huijiang; YANG Runqing

    2006-01-01

    Many economically important quantitative traits in animals and plants are measured repeatedly over time; these are called dynamic traits. Mapping QTL controlling the phenotypic profiles of dynamic traits has become an interesting topic for animal and plant breeders. However, statistical methods of QTL mapping for dynamic traits have not been well developed. We develop a composite interval mapping approach to detecting QTL for dynamic traits. We fit the profile of each QTL effect with Legendre polynomials. Parameter estimation and statistical testing are performed on the regression coefficients of the polynomials under the maximum likelihood framework. Maximum likelihood estimates of QTL parameters are obtained via the EM algorithm. Results of a simulation study showed that composite interval mapping can improve both the statistical power of QTL detection and the accuracy of parameter estimation relative to the simple interval mapping procedure, where only one QTL is fit in each model. The method is developed in the context of an F2 mapping population, but extension to other types of mapping populations is straightforward.
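
    A toy version of the central device (invented data): the genotype-specific effect profile is expanded in Legendre polynomials, and the polynomial coefficients, not the raw time points, are estimated and tested; least squares coincides with ML under i.i.d. Gaussian residuals:

        import numpy as np
        from numpy.polynomial import legendre

        rng = np.random.default_rng(4)
        t = np.linspace(-1, 1, 12)                 # measurement times, rescaled
        geno = rng.integers(0, 2, size=100)        # marker genotype, 0/1
        effect = 0.8 * t + 0.3                     # true dynamic QTL effect
        y = 5.0 + np.outer(geno, effect) + 0.5 * rng.standard_normal((100, 12))

        order = 2                                  # Legendre order of the fit
        basis = legendre.legvander(t, order)       # (12, order+1) basis matrix

        def rss(with_qtl):
            X = [np.kron(np.ones((100, 1)), basis)]        # population curve
            if with_qtl:
                X.append(np.kron(geno[:, None], basis))    # QTL effect curve
            X = np.hstack(X)
            beta, *_ = np.linalg.lstsq(X, y.ravel(), rcond=None)
            return np.sum((y.ravel() - X @ beta) ** 2)

        n = y.size
        lrt = n * (np.log(rss(False)) - np.log(rss(True)))
        print("likelihood-ratio statistic for the QTL effect:", lrt)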

  9. Empirical likelihood-based inference in a partially linear model for longitudinal data

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    A partially linear model with longitudinal data is considered; empirical likelihood inference for the regression coefficients and the baseline function is investigated. The empirical log-likelihood ratios are proven to be asymptotically chi-squared, and the corresponding confidence regions for the parameters of interest are then constructed. From the empirical likelihood ratio functions we also obtain the maximum empirical likelihood estimates of the regression coefficients and the baseline function, and prove their asymptotic normality. Numerical studies are conducted to compare the performance of the empirical likelihood and the normal approximation-based method, and a real example is analysed.
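
    For readers unfamiliar with the machinery, here is the empirical likelihood ratio in its simplest setting, the mean of an i.i.d. sample (Owen's scalar-Lagrange-multiplier construction); this illustrates the ingredient, not the partially linear estimator of the paper:

        import numpy as np
        from scipy.optimize import brentq
        from scipy.stats import chi2

        x = np.random.default_rng(5).standard_normal(60) + 0.2

        def neg2_log_elr(mu):
            z = x - mu                    # requires min(z) < 0 < max(z)
            # The multiplier lam solves sum z_i / (1 + lam z_i) = 0, with
            # all weights w_i = 1 / (n (1 + lam z_i)) kept positive.
            lo = (1.0 / len(z) - 1.0) / z.max()
            hi = (1.0 / len(z) - 1.0) / z.min()
            lam = brentq(lambda l: np.sum(z / (1 + l * z)),
                         lo + 1e-10, hi - 1e-10)
            return 2.0 * np.sum(np.log1p(lam * z))

        stat = neg2_log_elr(0.0)          # test H0: mean = 0
        print("-2 log ELR:", stat, " p-value:", chi2.sf(stat, df=1))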

  10. Empirical likelihood-based inference in a partially linear model for longitudinal data

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    A partially linear model with longitudinal data is considered; empirical likelihood inference for the regression coefficients and the baseline function is investigated. The empirical log-likelihood ratios are proven to be asymptotically chi-squared, and the corresponding confidence regions for the parameters of interest are then constructed. From the empirical likelihood ratio functions we also obtain the maximum empirical likelihood estimates of the regression coefficients and the baseline function, and prove their asymptotic normality. Numerical studies are conducted to compare the performance of the empirical likelihood and the normal approximation-based method, and a real example is analysed.

  11. The Dynamic Adaptability of Context and Translation: On the Title Translation of Several of Pearl S. Buck's English Originals

    Institute of Scientific and Technical Information of China (English)

    曾玉萍

    2012-01-01

    The theory of dynamic contextual adaptation holds that using language is a process in which language users continuously make linguistic choices according to the communicative circumstances and the communicative objects; the choices made in using language must adapt to the context. Translation, as an intercultural communication activity, likewise involves a continuous, dynamic process of making choices between the source language and the target language. The translation of the titles of several of Pearl S. Buck's well-known English originals should likewise conform to dynamic contextual adaptation.

  12. Prior event rate ratio adjustment for hidden confounding in observational studies of treatment effectiveness: a pairwise Cox likelihood approach.

    Science.gov (United States)

    Lin, Nan Xuan; Henley, William Edward

    2016-12-10

    Observational studies provide a rich source of information for assessing effectiveness of treatment interventions in many situations where it is not ethical or practical to perform randomized controlled trials. However, such studies are prone to bias from hidden (unmeasured) confounding. A promising approach to identifying and reducing the impact of unmeasured confounding is prior event rate ratio (PERR) adjustment, a quasi-experimental analytic method proposed in the context of electronic medical record database studies. In this paper, we present a statistical framework for using a pairwise approach to PERR adjustment that removes bias inherent in the original PERR method. A flexible pairwise Cox likelihood function is derived and used to demonstrate the consistency of the simple and convenient alternative PERR (PERR-ALT) estimator. We show how to estimate standard errors and confidence intervals for treatment effect estimates based on the observed information and provide R code to illustrate how to implement the method. Assumptions required for the pairwise approach (as well as PERR) are clarified, and the consequences of model misspecification are explored. Our results confirm the need for researchers to consider carefully the suitability of the method in the context of each problem. Extensions of the pairwise likelihood to more complex designs involving time-varying covariates or more than two periods are considered. We illustrate the application of the method using data from a longitudinal cohort study of enzyme replacement therapy for lysosomal storage disorders. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
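
    A back-of-envelope rendering of the PERR idea with Poisson rates and invented counts conveys the logic (the paper's pairwise Cox likelihood is the principled survival-analysis version of this):

        # The naive post-treatment rate ratio is divided by the rate ratio in
        # the period *before* treatment, cancelling time-fixed hidden
        # confounding that affects both periods equally.
        events = {"treated": {"prior": 30, "post": 45},
                  "control": {"prior": 20, "post": 60}}
        pyears = {"treated": {"prior": 1000.0, "post": 1000.0},
                  "control": {"prior": 1000.0, "post": 1000.0}}

        def rate(group, period):
            return events[group][period] / pyears[group][period]

        rr_post = rate("treated", "post") / rate("control", "post")
        rr_prior = rate("treated", "prior") / rate("control", "prior")
        print("naive RR:", round(rr_post, 3),
              " PERR-adjusted RR:", round(rr_post / rr_prior, 3))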

  13. Does the simple dynamical systems approach provide useful information about catchment hydrological functioning in a Mediterranean context? Application to the Ardèche catchment (France)

    Directory of Open Access Journals (Sweden)

    M. Adamovic

    2014-09-01

    Full Text Available This study explores how catchment heterogeneity and variability can be summarized in simplified models representing the dominant hydrological processes. It focuses on Mediterranean catchments, characterized by heterogeneous geology, pedology, and land use, as well as steep topography and a rainfall regime in which summer droughts contrast with high-rainfall periods in autumn. The Ardèche catchment (south-east France), typical of this environment, is chosen to explore the following questions: (1) Can such a Mediterranean catchment be adequately characterized by the simple dynamical systems approach, and what are the limits of the method under such conditions? (2) What information about dominant predictors of hydrological variability can be retrieved from this analysis in such catchments? In this work we apply the data-driven approach of Kirchner (WRR, 2009) to estimate discharge sensitivity functions that summarize the behavior of four sub-catchments of the Ardèche, using non-vegetation periods (November–March) from 9 years of data (2000–2008) from operational networks. The relevance of the inferred sensitivity function is assessed through hydrograph simulations, and through estimating precipitation rates from discharge fluctuations. We find that the discharge-sensitivity function is downward-curving in double-logarithmic space, thus allowing further simulation of discharge and non-divergence of the model, but only during non-vegetation periods. The analysis is complemented by a Monte-Carlo sensitivity analysis showing how the parameters summarizing the discharge sensitivity function impact the simulated hydrographs. Discharge simulation results are good for granite catchments, found to be predominantly characterized by saturation-excess runoff and sub-surface flow processes. The simple dynamical system hypothesis works especially well in wet conditions (peaks and recessions are well modeled). On the other hand, poor model performance is
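
    The core of the data-driven recipe can be sketched in a few lines: estimate the discharge sensitivity g(Q) = -dQ/dt / Q from recession points and fit a quadratic in log Q, giving the downward-curving form reported above. Synthetic data stand in for the Ardèche gauges:

        import numpy as np

        rng = np.random.default_rng(6)
        # Synthetic hourly recession obeying dQ/dt = -a * Q^b (b > 1)
        Q = [5.0]
        for _ in range(500):
            Q.append(Q[-1] - 0.02 * Q[-1] ** 1.5)
        Q = np.array(Q) * np.exp(0.01 * rng.standard_normal(501))  # noise

        dQdt = np.diff(Q)                    # per hourly time step
        Qmid = 0.5 * (Q[1:] + Q[:-1])
        rec = dQdt < 0                       # keep recession points only
        x = np.log(Qmid[rec])
        y = np.log(-dQdt[rec] / Qmid[rec])   # log of the sensitivity g(Q)

        c2, c1, c0 = np.polyfit(x, y, 2)     # quadratic in log Q
        print("log g(Q) = %.2f + %.2f logQ + %.2f (logQ)^2" % (c0, c1, c2))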

  14. Modeling trust context in networks

    CERN Document Server

    Adali, Sibel

    2013-01-01

    We make complex decisions every day, requiring trust in many different entities for different reasons. These decisions are not made by combining many isolated trust evaluations. Many interlocking factors play a role, each dynamically impacting the others. In this brief, 'trust context' is defined as the system-level description of how the trust evaluation process unfolds. Networks today are part of almost all human activity, supporting and shaping it. Applications increasingly incorporate new interdependencies and new trust contexts. Social networks connect people and organizations throughout

  15. Cultural Context and Translation

    Institute of Scientific and Technical Information of China (English)

    张敏

    2009-01-01

    Cultural context plays an important role in translation. Because translation is a cross-cultural activity, the cultural context that influences translating consists of the cultural contexts of both the source language and the target language. This article first analyzes the concepts of context and cultural context, and then, following the procedure of translating, classifies cultural context into two stages and discusses how each of them influences translating.

  16. Rayleigh-maximum-likelihood bilateral filter for ultrasound image enhancement.

    Science.gov (United States)

    Li, Haiyan; Wu, Jun; Miao, Aimin; Yu, Pengfei; Chen, Jianhua; Zhang, Yufeng

    2017-04-17

    Ultrasound imaging plays an important role in computer-aided diagnosis since it is non-invasive and cost-effective. However, ultrasound images are inevitably contaminated by noise and speckle during acquisition, which hampers physicians' interpretation of the images and decreases the accuracy of clinical diagnosis. Denoising is therefore an important component of enhancing the quality of ultrasound images, but current methods have a key limitation: they remove noise while ignoring the statistical characteristics of speckle, undermining the effectiveness of despeckling, or vice versa. In addition, most existing algorithms do not identify noise, speckle or edges before removing noise or speckle, and thus blur edge details while doing so. It is therefore challenging for traditional methods to remove noise and speckle in ultrasound images effectively while preserving edge details. To overcome these limitations, a novel method, the Rayleigh-maximum-likelihood switching bilateral filter (RSBF), is proposed to enhance ultrasound images in two steps: detection of noise, speckle and edges, followed by filtering. First, a sorted quadrant median vector scheme is used to compute a reference median in the filtering window, which is compared with the central pixel to classify the target pixel as noise, speckle or noise-free. Subsequently, noise is removed by a bilateral filter and speckle is suppressed by a Rayleigh-maximum-likelihood filter, while noise-free pixels are kept unchanged. To quantitatively evaluate the performance of the proposed method, synthetic ultrasound images contaminated by speckle are simulated using a speckle model that follows a Rayleigh distribution. Thereafter, the corrupted synthetic images are generated by multiplying the original image with Rayleigh-distributed speckle at various signal-to-noise ratio (SNR) levels and
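
    A much-simplified rendering of the switching step (window size, thresholds, and the use of the Rayleigh ML scale estimate as the restored value are all invented for illustration; this is not the authors' RSBF):

        import numpy as np

        def switching_filter(img, t_noise=60.0, t_speckle=20.0, w=1):
            """Classify each pixel against a window median, then route it."""
            out = img.astype(float).copy()
            H, W = img.shape
            for i in range(w, H - w):
                for j in range(w, W - w):
                    win = img[i - w:i + w + 1, j - w:j + w + 1].astype(float)
                    gap = abs(float(img[i, j]) - np.median(win))
                    if gap > t_noise:                # impulse-like noise
                        out[i, j] = np.median(win)   # stand-in for bilateral
                    elif gap > t_speckle:            # speckle-like deviation
                        # Rayleigh ML scale: sigma^2 = mean(x^2) / 2; the
                        # distribution's mode (= sigma) restores the pixel
                        out[i, j] = np.sqrt(np.mean(win ** 2) / 2.0)
            return out

        demo = np.full((16, 16), 100.0)
        demo[8, 8] = 255.0                       # an impulse to be removed
        print(switching_filter(demo)[8, 8])      # -> 100.0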

  17. Likelihood ratio model for classification of forensic evidence

    Energy Technology Data Exchange (ETDEWEB)

    Zadora, G., E-mail: gzadora@ies.krakow.pl [Institute of Forensic Research, Westerplatte 9, 31-033 Krakow (Poland); Neocleous, T., E-mail: tereza@stats.gla.ac.uk [University of Glasgow, Department of Statistics, 15 University Gardens, Glasgow G12 8QW (United Kingdom)

    2009-05-29

    One of the problems in the analysis of forensic evidence such as glass fragments is the determination of their use-type category, e.g. does a glass fragment originate from an unknown window or container? Very small glass fragments arise during various accidents and criminal offences, and can be carried on the clothes, shoes and hair of participants. It is therefore necessary to obtain information on their physicochemical composition in order to solve the classification problem. Scanning Electron Microscopy coupled with an Energy Dispersive X-ray Spectrometer (SEM-EDX) and the Glass Refractive Index Measurement method are routinely used in many forensic institutes for the investigation of glass. A natural form of glass evidence evaluation for forensic purposes is the likelihood ratio, LR = p(E|H_1)/p(E|H_2). The main aim of this paper was to study the performance of LR models for glass object classification which considered one or two sources of data variability, i.e. between-glass-object variability and/or within-glass-object variability. Within the proposed model, a multivariate kernel density approach was adopted for modelling the between-object distribution, and a multivariate normal distribution was adopted for modelling within-object distributions. Moreover, a graphical method of estimating the dependence structure was employed to reduce the highly multivariate problem to several lower-dimensional problems. The analysis showed that the best likelihood ratio model was the one that includes information about both between- and within-object variability, with variables derived from elemental compositions measured by SEM-EDX and refractive index values determined before (RI_b) and after (RI_a) the annealing process, in the form dRI = log_10|RI_a - RI_b|. This model gave better results than the model with only between-object variability considered. In addition, when dRI and variables derived from elemental compositions were used, this
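
    A one-variable caricature of the two-level model (all numbers invented): between-object variation is modelled by a kernel density over object means, within-object variation by a normal around each mean, and the likelihood under each hypothesis is the numerical integral of the two:

        import numpy as np
        from scipy.stats import gaussian_kde, norm

        rng = np.random.default_rng(7)
        # Hypothetical training means of a single feature (think dRI)
        window_means = rng.normal(-4.0, 0.4, size=50)
        container_means = rng.normal(-3.0, 0.5, size=50)
        sigma_within = 0.15                      # within-object spread

        kde_window = gaussian_kde(window_means)
        kde_container = gaussian_kde(container_means)

        def likelihood(x, kde):
            # p(x|H) = integral of N(x; m, sigma_within) * kde(m) dm
            grid = np.linspace(-6.0, -1.0, 400)
            dm = grid[1] - grid[0]
            return np.sum(norm.pdf(x, loc=grid, scale=sigma_within)
                          * kde(grid)) * dm

        x_evidence = -3.9
        LR = likelihood(x_evidence, kde_window) / likelihood(x_evidence, kde_container)
        print("LR in favour of the window hypothesis:", LR)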

  18. Remote sensing and modelling of vegetation dynamics for early estimation and spatial analysis of grain yields in a semiarid context in central Tunisia

    Science.gov (United States)

    Chahbi, Aicha; Zribi, Mehrez; Lili-Chabaane, Zohra

    2016-04-01

    In arid and semi-arid areas, population growth, urbanization, food security and climate change have an impact on agriculture in general and on cereal production in particular. Therefore, to improve food security in arid countries, crop canopy monitoring and cereal yield forecasting are needed. Many models, based on remote sensing or agro-meteorological modelling, have been developed to estimate the biomass and grain yield of cereals. Through the use of a rich database, acquired over a period of two years for more than 80 test fields, and from optical SPOT/HRV satellite images, the aim of the present study is to evaluate the feasibility of two yield prediction approaches. The first approach is based on the application of the semi-empirical growth model SAFY, developed to simulate the dynamics of the LAI and the grain yield at the field scale. The model is able to reproduce the time evolution of the leaf area index of all fields with acceptable error. However, an inter-comparison between ground yield measurements and SAFY model simulations reveals that yields are under-estimated by this model. The limits of the semi-empirical SAFY model can be explained by its simplicity and by various factors that were not considered (fertilization, irrigation, ...). To improve the yield estimation, a new approach is proposed: the grain yield is estimated as a function of the LAI in the growth period between 25 March and 5 April, with the LAI of this period estimated by the SAFY model. A linear relationship is developed between the measured grain yield and the LAI over this period of maximum growth. This approach is robust: the measured and estimated grain yields are well correlated. Following the validation of this approach, yield estimations are produced for the entire studied site using the SPOT/HRV images.
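
    The second approach reduces to a linear regression of measured yield on the LAI in the peak-growth window; a minimal sketch with invented numbers:

        import numpy as np

        rng = np.random.default_rng(8)
        lai_peak = rng.uniform(1.0, 4.0, size=40)       # LAI, 25 Mar - 5 Apr
        yield_obs = (0.9 * lai_peak + 0.4
                     + 0.3 * rng.standard_normal(40))   # grain yield (t/ha)

        slope, intercept = np.polyfit(lai_peak, yield_obs, 1)
        pred = slope * lai_peak + intercept
        rmse = np.sqrt(np.mean((pred - yield_obs) ** 2))
        print("yield = %.2f * LAI + %.2f  (RMSE %.2f t/ha)"
              % (slope, intercept, rmse))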

  19. Context-specific graphical models for discrete longitudinal data

    DEFF Research Database (Denmark)

    Edwards, David; Anantharama Ankinakatte, Smitha

    2015-01-01

    Ron et al. (1998) introduced a rich family of models for discrete longitudinal data called acyclic probabilistic finite automata. These may be represented as directed graphs that embody context-specific conditional independence relations. Here, the approach is developed from a statistical perspective. It is shown that likelihood ratio tests may be constructed using standard contingency table methods, a model selection procedure that minimizes a penalized likelihood criterion is described, and a way to extend the models to incorporate covariates is proposed. The methods are applied...
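
    The likelihood ratio tests mentioned here reduce to standard contingency table computations, for instance testing whether two contexts can be merged by comparing their next-symbol counts (counts invented):

        import numpy as np
        from scipy.stats import chi2

        # Next-symbol counts observed after two candidate contexts
        counts = np.array([[30, 10],     # context a: symbol 0, symbol 1
                           [12, 28]])    # context b: symbol 0, symbol 1

        row = counts.sum(axis=1, keepdims=True)
        col = counts.sum(axis=0, keepdims=True)
        expected = row * col / counts.sum()          # merged-context model
        G2 = 2.0 * np.sum(counts * np.log(counts / expected))
        df = (counts.shape[0] - 1) * (counts.shape[1] - 1)
        print("G^2 =", G2, " p =", chi2.sf(G2, df))  # small p: keep split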

  20. Evidence for extra radiation? Profile likelihood versus Bayesian posterior

    CERN Document Server

    Hamann, Jan

    2011-01-01

    A number of recent analyses of cosmological data have reported hints for the presence of extra radiation beyond the standard model expectation. In order to test the robustness of these claims under different methods of constructing parameter constraints, we perform a Bayesian posterior-based and a likelihood profile-based analysis of current data. We confirm the presence of a slight discrepancy between posterior- and profile-based constraints, with the marginalised posterior preferring higher values of the effective number of neutrino species N_eff. This can be traced back to a volume effect occurring during the marginalisation process, and we demonstrate that the effect is related to the fact that cosmic microwave background (CMB) data constrain N_eff only indirectly via the redshift of matter-radiation equality. Once present CMB data are combined with external information about, e.g., the Hubble parameter, the difference between the methods becomes small compared to the uncertainty of N_eff. We conclude tha...
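
    The volume effect can be reproduced in a toy model where the data constrain the parameter of interest only through a combination with a nuisance parameter (a stand-in for N_eff entering via the equality redshift); the profile and the marginal then weight the same parameter values differently:

        import numpy as np

        mu = np.linspace(0.5, 4.0, 176)      # parameter of interest
        nu = np.linspace(0.01, 4.0, 400)     # nuisance parameter, flat prior
        M, Nu = np.meshgrid(mu, nu, indexing="ij")

        # The data constrain only the product mu * nu
        L = np.exp(-0.5 * ((M * Nu - 2.0) / 0.3) ** 2)

        profile = L.max(axis=1)              # profile likelihood over nu
        marginal = L.sum(axis=1)             # marginalised posterior

        i1 = np.argmin(np.abs(mu - 1.0))
        i3 = np.argmin(np.abs(mu - 3.0))
        print("profile  L(1)/L(3): %.2f" % (profile[i1] / profile[i3]))   # ~1
        print("marginal p(1)/p(3): %.2f" % (marginal[i1] / marginal[i3])) # ~3

    The profile finds mu = 1 and mu = 3 equally good fits, while marginalisation prefers mu = 1 simply because a wider range of the nuisance parameter is compatible with it; this is the same kind of volume effect described in the record.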