WorldWideScience

Sample records for instrumental variables estimates

  1. A review of instrumental variable estimators for Mendelian randomization.

    Science.gov (United States)

    Burgess, Stephen; Small, Dylan S; Thompson, Simon G

    2017-10-01

    Instrumental variable analysis is an approach for obtaining causal inferences on the effect of an exposure (risk factor) on an outcome from observational data. It has gained in popularity over the past decade with the use of genetic variants as instrumental variables, known as Mendelian randomization. An instrumental variable is associated with the exposure, but not associated with any confounder of the exposure-outcome association, nor is there any causal pathway from the instrumental variable to the outcome other than via the exposure. Under the assumption that a single instrumental variable or a set of instrumental variables for the exposure is available, the causal effect of the exposure on the outcome can be estimated. There are several methods available for instrumental variable estimation; we consider the ratio method, two-stage methods, likelihood-based methods, and semi-parametric methods. Techniques for obtaining statistical inferences and confidence intervals are presented. The statistical properties of estimates from these methods are compared, and practical advice is given about choosing a suitable analysis method. In particular, bias and coverage properties of estimators are considered, especially with weak instruments. Settings particularly relevant to Mendelian randomization are prioritized in the paper, notably the scenario of a continuous exposure and a continuous or binary outcome.
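
    The ratio and two-stage estimators named above can be sketched in a few lines of R. This is a minimal simulated illustration (base R only; all data-generating values are invented for the example), not the paper's implementation:

      set.seed(1)
      n <- 5000
      G <- rbinom(n, 2, 0.3)               # genetic variant coded 0/1/2 (the instrument)
      U <- rnorm(n)                        # unmeasured confounder
      X <- 0.5 * G + U + rnorm(n)          # exposure
      Y <- 0.3 * X + U + rnorm(n)          # outcome; true causal effect = 0.3

      # Ratio (Wald) method: instrument-outcome over instrument-exposure association
      b_ratio <- coef(lm(Y ~ G))[["G"]] / coef(lm(X ~ G))[["G"]]

      # Two-stage method: regress the outcome on the first-stage fitted values.
      # With a single instrument this reproduces the ratio estimate; the naive
      # second-stage standard error is too small and needs correction.
      Xhat   <- fitted(lm(X ~ G))
      b_tsls <- coef(lm(Y ~ Xhat))[["Xhat"]]
      c(ratio = b_ratio, tsls = b_tsls)    # both close to 0.3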

  2. Comparison of variance estimators for meta-analysis of instrumental variable estimates

    NARCIS (Netherlands)

    Schmidt, A. F.; Hingorani, A. D.; Jefferis, B. J.; White, J.; Groenwold, R. H H; Dudbridge, F.; Ben-Shlomo, Y.; Chaturvedi, N.; Engmann, J.; Hughes, A.; Humphries, S.; Hypponen, E.; Kivimaki, M.; Kuh, D.; Kumari, M.; Menon, U.; Morris, R.; Power, C.; Price, J.; Wannamethee, G.; Whincup, P.

    2016-01-01

    Background: Mendelian randomization studies perform instrumental variable (IV) analysis using genetic IVs. Results of individual Mendelian randomization studies can be pooled through meta-analysis. We explored how different variance estimators influence the meta-analysed IV estimate. Methods: Two

  3. Instrumental variables I: instrumental variables exploit natural variation in nonexperimental data to estimate causal relationships.

    Science.gov (United States)

    Rassen, Jeremy A; Brookhart, M Alan; Glynn, Robert J; Mittleman, Murray A; Schneeweiss, Sebastian

    2009-12-01

    The gold standard of study design for treatment evaluation is widely acknowledged to be the randomized controlled trial (RCT). Trials allow for the estimation of causal effect by randomly assigning participants either to an intervention or comparison group; through the assumption of "exchangeability" between groups, comparing the outcomes will yield an estimate of causal effect. In the many cases where RCTs are impractical or unethical, instrumental variable (IV) analysis offers a nonexperimental alternative based on many of the same principles. IV analysis relies on finding a naturally varying phenomenon, related to treatment but not to outcome except through the effect of treatment itself, and then using this phenomenon as a proxy for the confounded treatment variable. This article demonstrates how IV analysis arises from an analogous but potentially impossible RCT design, and outlines the assumptions necessary for valid estimation. It gives examples of instruments used in clinical epidemiology and concludes with an outline on estimation of effects.

  4. Eliminating Survivor Bias in Two-stage Instrumental Variable Estimators.

    Science.gov (United States)

    Vansteelandt, Stijn; Walter, Stefan; Tchetgen Tchetgen, Eric

    2018-07-01

    Mendelian randomization studies commonly focus on elderly populations. This makes the instrumental variables analysis of such studies sensitive to survivor bias, a type of selection bias. A particular concern is that the instrumental variable conditions, even when valid for the source population, may be violated for the selective population of individuals who survive the onset of the study. This is potentially very damaging because Mendelian randomization studies are known to be sensitive to bias due to even minor violations of the instrumental variable conditions. Interestingly, the instrumental variable conditions continue to hold within certain risk sets of individuals who are still alive at a given age when the instrument and unmeasured confounders exert additive effects on the exposure, and moreover, the exposure and unmeasured confounders exert additive effects on the hazard of death. In this article, we will exploit this property to derive a two-stage instrumental variable estimator for the effect of exposure on mortality, which is insulated against the above described selection bias under these additivity assumptions.

  5. Censored Quantile Instrumental Variable Estimates of the Price Elasticity of Expenditure on Medical Care.

    Science.gov (United States)

    Kowalski, Amanda

    2016-01-02

    Efforts to control medical care costs depend critically on how individuals respond to prices. I estimate the price elasticity of expenditure on medical care using a censored quantile instrumental variable (CQIV) estimator. CQIV allows estimates to vary across the conditional expenditure distribution, relaxes traditional censored model assumptions, and addresses endogeneity with an instrumental variable. My instrumental variable strategy uses a family member's injury to induce variation in an individual's own price. Across the conditional deciles of the expenditure distribution, I find elasticities that vary from -0.76 to -1.49, which are an order of magnitude larger than previous estimates.

  6. Instrumental variable estimation of treatment effects for duration outcomes

    NARCIS (Netherlands)

    G.E. Bijwaard (Govert)

    2007-01-01

    In this article we propose and implement an instrumental variable estimation procedure to obtain treatment effects on duration outcomes. The method can handle the typical complications that arise with duration data of time-varying treatment and censoring. The treatment effect we

  7. LARF: Instrumental Variable Estimation of Causal Effects through Local Average Response Functions

    Directory of Open Access Journals (Sweden)

    Weihua An

    2016-07-01

    LARF is an R package that provides instrumental variable estimation of treatment effects when both the endogenous treatment and its instrument (i.e., the treatment inducement) are binary. The method (Abadie 2003) involves two steps. First, pseudo-weights are constructed from the probability of receiving the treatment inducement. By default LARF estimates the probability by a probit regression. It also provides semiparametric power series estimation of the probability and allows users to employ other external methods to estimate the probability. Second, the pseudo-weights are used to estimate the local average response function conditional on treatment and covariates. LARF provides both least squares and maximum likelihood estimates of the conditional treatment effects.
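
    The two steps described above can be sketched directly in base R. This is a minimal illustration of Abadie's (2003) procedure on simulated data, not the LARF package's own code; because the pseudo-weights can be negative, the weighted regression is solved explicitly rather than via lm(), which rejects negative weights:

      set.seed(2)
      n <- 4000
      W <- rnorm(n)                                        # covariate
      Z <- rbinom(n, 1, pnorm(0.3 + 0.4 * W))              # binary treatment inducement
      U <- rnorm(n)                                        # unmeasured confounder
      D <- as.integer(0.8 * Z + 0.5 * U + rnorm(n) > 0.8)  # endogenous binary treatment
      Y <- 1 + 0.5 * D + 0.3 * W + U + rnorm(n)

      # Step 1: probit for the probability of the inducement, then pseudo-weights
      pZ    <- fitted(glm(Z ~ W, family = binomial(link = "probit")))
      kappa <- 1 - D * (1 - Z) / (1 - pZ) - (1 - D) * Z / pZ

      # Step 2: kappa-weighted least squares for the local average response function
      Xm <- cbind(1, D, W)
      b  <- solve(t(Xm) %*% (kappa * Xm), t(Xm) %*% (kappa * Y))
      b[2]   # treatment effect among compliers, close to 0.5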

  8. Robust best linear estimation for regression analysis using surrogate and instrumental variables.

    Science.gov (United States)

    Wang, C Y

    2012-04-01

    We investigate methods for regression analysis when covariates are measured with errors. In a subset of the whole cohort, a surrogate variable is available for the true unobserved exposure variable. The surrogate variable satisfies the classical measurement error model, but it may not have repeated measurements. In addition to the surrogate variables that are available among the subjects in the calibration sample, we assume that there is an instrumental variable (IV) that is available for all study subjects. An IV is correlated with the unobserved true exposure variable and hence can be useful in the estimation of the regression coefficients. We propose a robust best linear estimator that uses all the available data, which is the most efficient among a class of consistent estimators. The proposed estimator is shown to be consistent and asymptotically normal under very weak distributional assumptions. For Poisson or linear regression, the proposed estimator is consistent even if the measurement error from the surrogate or IV is heteroscedastic. Finite-sample performance of the proposed estimator is examined and compared with other estimators via intensive simulation studies. The proposed method and other methods are applied to a bladder cancer case-control study.

  9. Bayesian methods for meta-analysis of causal relationships estimated using genetic instrumental variables

    DEFF Research Database (Denmark)

    Burgess, Stephen; Thompson, Simon G; Thompson, Grahame

    2010-01-01

    Genetic markers can be used as instrumental variables, in an analogous way to randomization in a clinical trial, to estimate the causal relationship between a phenotype and an outcome variable. Our purpose is to extend the existing methods for such Mendelian randomization studies to the context o...
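
    Although the paper's approach is Bayesian, the baseline it extends is standard inverse-variance pooling of per-study causal (Wald ratio) estimates. A minimal fixed-effect sketch in R, with invented study-level numbers:

      beta <- c(0.28, 0.35, 0.22)    # hypothetical per-study causal estimates
      se   <- c(0.10, 0.08, 0.12)    # their standard errors
      w    <- 1 / se^2               # inverse-variance weights
      est  <- sum(w * beta) / sum(w)
      sep  <- sqrt(1 / sum(w))
      c(pooled = est, lower = est - 1.96 * sep, upper = est + 1.96 * sep)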

  10. Instrumental Variables in the Long Run

    DEFF Research Database (Denmark)

    Casey, Gregory; Klemp, Marc Patrick Brag

    2017-01-01

    In the study of long-run economic growth, it is common to use historical or geographical variables as instruments for contemporary endogenous regressors. We study the interpretation of these conventional instrumental variable (IV) regressions in a general, yet simple, framework. Our aim is to estimate the long-run causal effect of changes in the endogenous explanatory variable. We find that conventional IV regressions generally cannot recover this parameter of interest. To estimate this parameter, therefore, we develop an augmented IV estimator that combines the conventional regression ... quantitative implications for the field of long-run economic growth. We also use our framework to examine related empirical techniques. We find that two prominent regression methodologies - using gravity-based instruments for trade and including ancestry-adjusted variables in linear regression models - have ...

  11. Instrumental variables estimation of exposure effects on a time-to-event endpoint using structural cumulative survival models

    DEFF Research Database (Denmark)

    Martinussen, Torben; Vansteelandt, Stijn; Tchetgen Tchetgen, Eric J.

    2017-01-01

    The use of instrumental variables for estimating the effect of an exposure on an outcome is popular in econometrics, and increasingly so in epidemiology. This increasing popularity may be attributed to the natural occurrence of instrumental variables in observational studies that incorporate elements of randomization, either by design or by nature...

  12. Instrumental variables estimation of exposure effects on a time-to-event endpoint using structural cumulative survival models.

    Science.gov (United States)

    Martinussen, Torben; Vansteelandt, Stijn; Tchetgen Tchetgen, Eric J; Zucker, David M

    2017-12-01

    The use of instrumental variables for estimating the effect of an exposure on an outcome is popular in econometrics, and increasingly so in epidemiology. This increasing popularity may be attributed to the natural occurrence of instrumental variables in observational studies that incorporate elements of randomization, either by design or by nature (e.g., random inheritance of genes). Instrumental variables estimation of exposure effects is well established for continuous outcomes and to some extent for binary outcomes. It is, however, largely lacking for time-to-event outcomes because of complications due to censoring and survivorship bias. In this article, we make a novel proposal under a class of structural cumulative survival models which parameterize time-varying effects of a point exposure directly on the scale of the survival function; these models are essentially equivalent with a semi-parametric variant of the instrumental variables additive hazards model. We propose a class of recursive instrumental variable estimators for these exposure effects, and derive their large sample properties along with inferential tools. We examine the performance of the proposed method in simulation studies and illustrate it in a Mendelian randomization study to evaluate the effect of diabetes on mortality using data from the Health and Retirement Study. We further use the proposed method to investigate potential benefit from breast cancer screening on subsequent breast cancer mortality based on the HIP-study. © 2017, The International Biometric Society.

  13. Instrumental variables estimates of peer effects in social networks.

    Science.gov (United States)

    An, Weihua

    2015-03-01

    Estimating peer effects with observational data is very difficult because of contextual confounding, peer selection, simultaneity bias, and measurement error. In this paper, I show that instrumental variables (IVs) can help to address these problems in order to provide causal estimates of peer effects. Based on data collected from over 4000 students in six middle schools in China, I use the IV methods to estimate peer effects on smoking. My design-based IV approach differs from previous ones in that it helps to construct potentially strong IVs and to directly test possible violation of exogeneity of the IVs. I show that measurement error in smoking can lead to both underestimation and imprecise estimation of peer effects. Based on a refined measure of smoking, I find consistent evidence for peer effects on smoking. If a student's best friend smoked within the past 30 days, the student was about one fifth (as indicated by the OLS estimate) or 40 percentage points (as indicated by the IV estimate) more likely to smoke in the same time period. The findings are robust to a variety of robustness checks. I also show that sharing cigarettes may be a mechanism for peer effects on smoking. A 10% increase in the number of cigarettes smoked by a student's best friend is associated with about a 4% increase in the number of cigarettes smoked by the student in the same time period. Copyright © 2014 Elsevier Inc. All rights reserved.

  14. Invited Commentary: Using Financial Credits as Instrumental Variables for Estimating the Causal Relationship Between Income and Health.

    Science.gov (United States)

    Pega, Frank

    2016-05-01

    Social epidemiologists are interested in determining the causal relationship between income and health. Natural experiments in which individuals or groups receive income randomly or quasi-randomly from financial credits (e.g., tax credits or cash transfers) are increasingly being analyzed using instrumental variable analysis. For example, in this issue of the Journal, Hamad and Rehkopf (Am J Epidemiol. 2016;183(9):775-784) used an in-work tax credit called the Earned Income Tax Credit as an instrument to estimate the association between income and child development. However, under certain conditions, the use of financial credits as instruments could violate 2 key instrumental variable analytic assumptions. First, some financial credits may directly influence health, for example, through increasing a psychological sense of welfare security. Second, financial credits and health may have several unmeasured common causes, such as politics, other social policies, and the motivation to maximize the credit. If epidemiologists pursue such instrumental variable analyses, using the amount of an unconditional, universal credit that an individual or group has received as the instrument may produce the most conceptually convincing and generalizable evidence. However, other natural income experiments (e.g., lottery winnings) and other methods that allow better adjustment for confounding might be more promising approaches for estimating the causal relationship between income and health. © The Author 2016. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  15. Instrumental variables estimation under a structural Cox model

    DEFF Research Database (Denmark)

    Martinussen, Torben; Nørbo Sørensen, Ditte; Vansteelandt, Stijn

    2017-01-01

    Instrumental variable (IV) analysis is an increasingly popular tool for inferring the effect of an exposure on an outcome, as witnessed by the growing number of IV applications in epidemiology, for instance. The majority of IV analyses of time-to-event endpoints are, however, dominated by heurist...

  16. Impact of instrumental response on observed ozonesonde profiles: First-order estimates and implications for measures of variability

    Science.gov (United States)

    Clifton, G. T.; Merrill, J. T.; Johnson, B. J.; Oltmans, S. J.

    2009-12-01

    Ozonesondes provide information on the ozone distribution up to the middle stratosphere. Ozone profiles often feature layers, with vertically discrete maxima and minima in the mixing ratio. Layers are especially common in the UT/LS regions and originate from wave breaking, shearing and other transport processes. ECC sondes, however, have a moderate response time to significant changes in ozone. A sonde can ascend over 350 meters before it responds fully to a step change in ozone. This results in an overestimate of the altitude assigned to layers and an underestimate of the underlying variability in the amount of ozone. An estimate of the response time is made for each instrument during the preparation for flight, but the profile data are typically not processed to account for the response. Here we present a method of categorizing the response time of ECC instruments and an analysis of a low-pass filter approximation to the effects on profile data. Exponential functions were fit to the step-up and step-down responses using laboratory data. The resulting response time estimates were consistent with results from standard procedures, with the up-step response time exceeding the down-step value somewhat. A single-pole Butterworth filter that approximates the instrumental effect was used with synthetic layered profiles to make first-order estimates of the impact of the finite response time. Using a layer analysis program previously applied to observed profiles we find that instrumental effects can attenuate ozone variability by 20-45% in individual layers, but that the vertical offset in layer altitudes is moderate, up to about 150 meters. We will present results obtained using this approach, coupled with data on the distribution of layer characteristics found using the layer analysis procedure on profiles from Narragansett, Rhode Island and other US sites to quantify the impact on overall variability estimates given ambient distributions of layer occurrence, thickness
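
    The single-pole low-pass behaviour described above is easy to reproduce. A minimal sketch in R with a synthetic layered profile, assuming (for illustration only) a 350 m e-folding response distance and 5 m sampling:

      dz    <- 5                                       # metres of ascent per sample
      z     <- seq(0, 5000, by = dz)
      prof  <- 50 + 40 * exp(-((z - 2500) / 150)^2)    # synthetic ozone layer
      tau   <- 350                                     # e-folding response distance (m)
      alpha <- 1 - exp(-dz / tau)                      # single-pole IIR coefficient
      sm <- Reduce(function(prev, x) prev + alpha * (x - prev), prof,
                   accumulate = TRUE)                  # simulated instrument output
      c(true_amp = max(prof) - 50, measured_amp = max(sm) - 50)   # attenuated layer
      (which.max(sm) - which.max(prof)) * dz           # upward altitude offset (m)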

  17. On the Interpretation of Instrumental Variables in the Presence of Specification Errors

    Directory of Open Access Journals (Sweden)

    P.A.V.B. Swamy

    2015-01-01

    The method of instrumental variables (IV) and the generalized method of moments (GMM), and their applications to the estimation of errors-in-variables and simultaneous equations models in econometrics, require data on a sufficient number of instrumental variables that are both exogenous and relevant. We argue that, in general, such instruments (weak or strong) cannot exist.

  18. Testing concordance of instrumental variable effects in generalized linear models with application to Mendelian randomization

    Science.gov (United States)

    Dai, James Y.; Chan, Kwun Chuen Gary; Hsu, Li

    2014-01-01

    Instrumental variable regression is one way to overcome unmeasured confounding and estimate causal effect in observational studies. Building on structural mean models, considerable work has recently been developed for consistent estimation of causal relative risk and causal odds ratio. Such models can sometimes suffer from identification issues for weak instruments. This has hampered the applicability of Mendelian randomization analysis in genetic epidemiology. When there are multiple genetic variants available as instrumental variables, and causal effect is defined in a generalized linear model in the presence of unmeasured confounders, we propose to test concordance between instrumental variable effects on the intermediate exposure and instrumental variable effects on the disease outcome, as a means to test the causal effect. We show that a class of generalized least squares estimators provide valid and consistent tests of causality. For causal effect of a continuous exposure on a dichotomous outcome in logistic models, the proposed estimators are shown to be asymptotically conservative. When the disease outcome is rare, such estimators are consistent due to the log-linear approximation of the logistic function. Optimality of such estimators relative to the well-known two-stage least squares estimator and the double-logistic structural mean model is further discussed. PMID:24863158

  19. Econometrics in outcomes research: the use of instrumental variables.

    Science.gov (United States)

    Newhouse, J P; McClellan, M

    1998-01-01

    We describe an econometric technique, instrumental variables, that can be useful in estimating the effectiveness of clinical treatments in situations when a controlled trial has not or cannot be done. This technique relies upon the existence of one or more variables that induce substantial variation in the treatment variable but have no direct effect on the outcome variable of interest. We illustrate the use of the technique with an application to aggressive treatment of acute myocardial infarction in the elderly.

  20. Does the Early Bird Catch the Worm? Instrumental Variable Estimates of Educational Effects of Age of School Entry in Germany

    OpenAIRE

    Puhani, Patrick A.; Weber, Andrea M.

    2006-01-01

    We estimate the effect of age of school entry on educational outcomes using two different data sets for Germany, sampling pupils at the end of primary school and in the middle of secondary school. Results are obtained based on instrumental variable estimation exploiting the exogenous variation in month of birth. We find robust and significant positive effects on educational outcomes for pupils who enter school at seven instead of six years of age: Test scores at the end of primary school incr...

  1. Bias and Bias Correction in Multi-Site Instrumental Variables Analysis of Heterogeneous Mediator Effects

    Science.gov (United States)

    Reardon, Sean F.; Unlu, Fatih; Zhu, Pei; Bloom, Howard

    2013-01-01

    We explore the use of instrumental variables (IV) analysis with a multi-site randomized trial to estimate the effect of a mediating variable on an outcome in cases where it can be assumed that the observed mediator is the only mechanism linking treatment assignment to outcomes, an assumption known in the instrumental variables literature as the exclusion restriction…

  2. Causal null hypotheses of sustained treatment strategies: What can be tested with an instrumental variable?

    Science.gov (United States)

    Swanson, Sonja A; Labrecque, Jeremy; Hernán, Miguel A

    2018-05-02

    Sometimes instrumental variable methods are used to test whether a causal effect is null rather than to estimate the magnitude of a causal effect. However, when instrumental variable methods are applied to time-varying exposures, as in many Mendelian randomization studies, it is unclear what causal null hypothesis is tested. Here, we consider different versions of causal null hypotheses for time-varying exposures, show that the instrumental variable conditions alone are insufficient to test some of them, and describe additional assumptions that can be made to test a wider range of causal null hypotheses, including both sharp and average causal null hypotheses. Implications for interpretation and reporting of instrumental variable results are discussed.

  3. Health insurance and the demand for medical care: Instrumental variable estimates using health insurer claims data.

    Science.gov (United States)

    Dunn, Abe

    2016-07-01

    This paper takes a different approach to estimating demand for medical care that uses the negotiated prices between insurers and providers as an instrument. The instrument is viewed as a textbook "cost shifting" instrument that impacts plan offerings, but is unobserved by consumers. The paper finds a price elasticity of demand of around -0.20, matching the elasticity found in the RAND Health Insurance Experiment. The paper also studies within-market variation in demand for prescription drugs and other medical care services and obtains comparable price elasticity estimates. Published by Elsevier B.V.

  4. Sensitivity analysis and power for instrumental variable studies.

    Science.gov (United States)

    Wang, Xuran; Jiang, Yang; Zhang, Nancy R; Small, Dylan S

    2018-03-31

    In observational studies to estimate treatment effects, unmeasured confounding is often a concern. The instrumental variable (IV) method can control for unmeasured confounding when there is a valid IV. To be a valid IV, a variable needs to be independent of unmeasured confounders and only affect the outcome through affecting the treatment. When applying the IV method, there is often concern that a putative IV is invalid to some degree. We present an approach to sensitivity analysis for the IV method which examines the sensitivity of inferences to violations of IV validity. Specifically, we consider sensitivity when the magnitude of association between the putative IV and the unmeasured confounders and the direct effect of the IV on the outcome are limited in magnitude by a sensitivity parameter. Our approach is based on extending the Anderson-Rubin test and is valid regardless of the strength of the instrument. A power formula for this sensitivity analysis is presented. We illustrate its usage via examples about Mendelian randomization studies and its implications via a comparison of using rare versus common genetic variants as instruments. © 2018, The International Biometric Society.
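
    The Anderson-Rubin construction that this sensitivity analysis extends can be sketched compactly: under the null that the causal effect equals beta0 (and a valid instrument, i.e., the sensitivity parameter set to zero), the outcome net of the hypothesized effect is independent of the IV. A minimal simulated sketch in R:

      set.seed(3)
      n <- 2000
      Z <- rbinom(n, 1, 0.5)            # instrument
      U <- rnorm(n)                     # unmeasured confounder
      D <- 0.4 * Z + U + rnorm(n)       # treatment
      Y <- 0.25 * D + U + rnorm(n)      # outcome; true effect = 0.25

      ar_test <- function(beta0) {
        resid0 <- Y - beta0 * D         # outcome net of hypothesized effect
        anova(lm(resid0 ~ 1), lm(resid0 ~ Z))[2, "Pr(>F)"]   # F-test on Z
      }
      ar_test(0)      # small p-value: the zero-effect null is rejected
      ar_test(0.25)   # large p-value: the true value is retained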

  5. Robotic-surgical instrument wrist pose estimation.

    Science.gov (United States)

    Fabel, Stephan; Baek, Kyungim; Berkelman, Peter

    2010-01-01

    The Compact Lightweight Surgery Robot from the University of Hawaii includes two teleoperated instruments and one endoscope manipulator which act in accord to perform assisted interventional medicine. The relative positions and orientations of the robotic instruments and endoscope must be known to the teleoperation system so that the directions of the instrument motions can be controlled to correspond closely to the directions of the motions of the master manipulators, as seen by the endoscope and displayed to the surgeon. If the manipulator bases are mounted in known locations and all manipulator joint variables are known, then the necessary coordinate transformations between the master and slave manipulators can be easily computed. The versatility and ease of use of the system can be increased, however, by allowing the endoscope or instrument manipulator bases to be moved to arbitrary positions and orientations without reinitializing each manipulator or remeasuring their relative positions. The aim of this work is to find the pose of the instrument end effectors using the video image from the endoscope camera. The P3P pose estimation algorithm is used with a Levenberg-Marquardt optimization to ensure convergence. The correct transformations between the master and slave coordinate frames can then be calculated and updated when the bases of the endoscope or instrument manipulators are moved to new, unknown, positions at any time before or during surgical procedures.

  6. The contextual effects of social capital on health: a cross-national instrumental variable analysis.

    Science.gov (United States)

    Kim, Daniel; Baum, Christopher F; Ganz, Michael L; Subramanian, S V; Kawachi, Ichiro

    2011-12-01

    Past research on the associations between area-level/contextual social capital and health has produced conflicting evidence. However, interpreting this rapidly growing literature is difficult because estimates using conventional regression are prone to major sources of bias including residual confounding and reverse causation. Instrumental variable (IV) analysis can reduce such bias. Using data on up to 167,344 adults in 64 nations in the European and World Values Surveys and applying IV and ordinary least squares (OLS) regression, we estimated the contextual effects of country-level social trust on individual self-rated health. We further explored whether these associations varied by gender and individual levels of trust. Using OLS regression, we found higher average country-level trust to be associated with better self-rated health in both women and men. Instrumental variable analysis yielded qualitatively similar results, although the estimates were more than double in size in both sexes when country population density and corruption were used as instruments. The estimated health effects of raising the percentage of a country's population that trusts others by 10 percentage points were at least as large as the estimated health effects of an individual developing trust in others. These findings were robust to alternative model specifications and instruments. Conventional regression and to a lesser extent IV analysis suggested that these associations are more salient in women and in women reporting social trust. In a large cross-national study, our findings, including those using instrumental variables, support the presence of beneficial effects of higher country-level trust on self-rated health. Previous findings for contextual social capital using traditional regression may have underestimated the true associations. Given the close linkages between self-rated health and all-cause mortality, the public health gains from raising social capital within and across

  7. Bias and Bias Correction in Multisite Instrumental Variables Analysis of Heterogeneous Mediator Effects

    Science.gov (United States)

    Reardon, Sean F.; Unlu, Fatih; Zhu, Pei; Bloom, Howard S.

    2014-01-01

    We explore the use of instrumental variables (IV) analysis with a multisite randomized trial to estimate the effect of a mediating variable on an outcome in cases where it can be assumed that the observed mediator is the only mechanism linking treatment assignment to outcomes, an assumption known in the IV literature as the exclusion restriction.…

  8. Semiparametric methods for estimation of a nonlinear exposure‐outcome relationship using instrumental variables with application to Mendelian randomization

    Science.gov (United States)

    Staley, James R.

    2017-01-01

    Mendelian randomization, the use of genetic variants as instrumental variables (IV), can test for and estimate the causal effect of an exposure on an outcome. Most IV methods assume that the function relating the exposure to the expected value of the outcome (the exposure‐outcome relationship) is linear. However, in practice, this assumption may not hold. Indeed, often the primary question of interest is to assess the shape of this relationship. We present two novel IV methods for investigating the shape of the exposure‐outcome relationship: a fractional polynomial method and a piecewise linear method. We divide the population into strata using the exposure distribution, and estimate a causal effect, referred to as a localized average causal effect (LACE), in each stratum of population. The fractional polynomial method performs metaregression on these LACE estimates. The piecewise linear method estimates a continuous piecewise linear function, the gradient of which is the LACE estimate in each stratum. Both methods were demonstrated in a simulation study to estimate the true exposure‐outcome relationship well, particularly when the relationship was a fractional polynomial (for the fractional polynomial method) or was piecewise linear (for the piecewise linear method). The methods were used to investigate the shape of relationship of body mass index with systolic blood pressure and diastolic blood pressure. PMID:28317167
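
    The stratification step shared by both methods can be sketched on simulated data: strata are formed on the exposure after removing the instrument's contribution, and a Wald ratio (the LACE) is computed within each stratum. A minimal R illustration with an invented quadratic exposure-outcome curve:

      set.seed(4)
      n <- 10000
      G <- rbinom(n, 2, 0.3)                      # genetic instrument
      U <- rnorm(n)
      X <- 0.5 * G + U + rnorm(n)                 # exposure
      Y <- 0.2 * X + 0.05 * X^2 + U + rnorm(n)    # nonlinear effect of X

      X0     <- resid(lm(X ~ G))                  # instrument-free exposure
      strata <- cut(X0, quantile(X0, 0:5 / 5), include.lowest = TRUE,
                    labels = FALSE)               # quintile strata
      lace <- sapply(1:5, function(s) {
        i <- strata == s
        coef(lm(Y[i] ~ G[i]))[[2]] / coef(lm(X[i] ~ G[i]))[[2]]   # Wald ratio
      })
      lace   # increasing across strata, tracing out the nonlinear curve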

  9. Instrumental variable methods in comparative safety and effectiveness research.

    Science.gov (United States)

    Brookhart, M Alan; Rassen, Jeremy A; Schneeweiss, Sebastian

    2010-06-01

    Instrumental variable (IV) methods have been proposed as a potential approach to the common problem of uncontrolled confounding in comparative studies of medical interventions, but IV methods are unfamiliar to many researchers. The goal of this article is to provide a non-technical, practical introduction to IV methods for comparative safety and effectiveness research. We outline the principles and basic assumptions necessary for valid IV estimation, discuss how to interpret the results of an IV study, provide a review of instruments that have been used in comparative effectiveness research, and suggest some minimal reporting standards for an IV analysis. Finally, we offer our perspective of the role of IV estimation vis-à-vis more traditional approaches based on statistical modeling of the exposure or outcome. We anticipate that IV methods will be often underpowered for drug safety studies of very rare outcomes, but may be potentially useful in studies of intended effects where uncontrolled confounding may be substantial.

  10. Instrumental variable methods in comparative safety and effectiveness research†

    Science.gov (United States)

    Brookhart, M. Alan; Rassen, Jeremy A.; Schneeweiss, Sebastian

    2010-01-01

    Instrumental variable (IV) methods have been proposed as a potential approach to the common problem of uncontrolled confounding in comparative studies of medical interventions, but IV methods are unfamiliar to many researchers. The goal of this article is to provide a non-technical, practical introduction to IV methods for comparative safety and effectiveness research. We outline the principles and basic assumptions necessary for valid IV estimation, discuss how to interpret the results of an IV study, provide a review of instruments that have been used in comparative effectiveness research, and suggest some minimal reporting standards for an IV analysis. Finally, we offer our perspective of the role of IV estimation vis-à-vis more traditional approaches based on statistical modeling of the exposure or outcome. We anticipate that IV methods will be often underpowered for drug safety studies of very rare outcomes, but may be potentially useful in studies of intended effects where uncontrolled confounding may be substantial. PMID:20354968

  11. Power calculator for instrumental variable analysis in pharmacoepidemiology.

    Science.gov (United States)

    Walker, Venexia M; Davies, Neil M; Windmeijer, Frank; Burgess, Stephen; Martin, Richard M

    2017-10-01

    Instrumental variable analysis, for example with physicians' prescribing preferences as an instrument for medications issued in primary care, is an increasingly popular method in the field of pharmacoepidemiology. Existing power calculators for studies using instrumental variable analysis, such as Mendelian randomization power calculators, do not allow for the structure of research questions in this field. This is because the analysis in pharmacoepidemiology will typically have stronger instruments and detect larger causal effects than in other fields. Consequently, there is a need for dedicated power calculators for pharmacoepidemiological research. The formula for calculating the power of a study using instrumental variable analysis in the context of pharmacoepidemiology is derived before being validated by a simulation study. The formula is applicable for studies using a single binary instrument to analyse the causal effect of a binary exposure on a continuous outcome. An online calculator, as well as packages in both R and Stata, are provided for the implementation of the formula by others. The statistical power of instrumental variable analysis in pharmacoepidemiological studies to detect a clinically meaningful treatment effect is an important consideration. Research questions in this field have distinct structures that must be accounted for when calculating power. The formula presented differs from existing instrumental variable power formulae due to its parametrization, which is designed specifically for ease of use by pharmacoepidemiologists. © The Author 2017. Published by Oxford University Press on behalf of the International Epidemiological Association
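
    The paper's formula itself is parametrized for a single binary instrument, binary exposure, and continuous outcome, and the authors distribute it as an online calculator and R/Stata packages. As a hedged point of reference, the generic large-sample approximation used in Mendelian randomization power calculations (standardized variables; not the authors' formula) looks like:

      iv_power <- function(n, r2_zx, beta, alpha = 0.05) {
        # n: sample size; r2_zx: exposure variance explained by the instrument;
        # beta: true causal effect in standard-deviation units
        pnorm(sqrt(n * r2_zx) * abs(beta) - qnorm(1 - alpha / 2))
      }
      iv_power(n = 5000, r2_zx = 0.02, beta = 0.2)   # about 0.52 power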

  12. The Effect of Birth Weight on Academic Performance: Instrumental Variable Analysis.

    Science.gov (United States)

    Lin, Shi Lin; Leung, Gabriel Matthew; Schooling, C Mary

    2017-05-01

    Observationally, lower birth weight is usually associated with poorer academic performance; whether this association is causal or the result of confounding is unknown. To investigate this question, we obtained an effect estimate, which can have a causal interpretation under specific assumptions, of birth weight on educational attainment using instrumental variable analysis based on single nucleotide polymorphisms determining birth weight combined with results from the Social Science Genetic Association Consortium study of 126,559 Caucasians. We similarly obtained an estimate of the effect of birth weight on academic performance in 4,067 adolescents from Hong Kong's (Chinese) Children of 1997 birth cohort (1997-2016), using twin status as an instrumental variable. Birth weight was not associated with years of schooling (per 100-g increase in birth weight, -0.006 years, 95% confidence interval (CI): -0.02, 0.01) or college completion (odds ratio = 1.00, 95% CI: 0.96, 1.03). Birth weight was also unrelated to academic performance in adolescents (per 100-g increase in birth weight, -0.004 grade, 95% CI: -0.04, 0.04) using instrumental variable analysis, although conventional regression gave a small positive association (0.02 higher grade, 95% CI: 0.01, 0.03). Observed associations of birth weight with academic performance may not be causal, suggesting that interventions should focus on the contextual factors generating this correlation. © The Author 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  13. Evaluating disease management programme effectiveness: an introduction to instrumental variables.

    Science.gov (United States)

    Linden, Ariel; Adams, John L

    2006-04-01

    This paper introduces the concept of instrumental variables (IVs) as a means of providing an unbiased estimate of treatment effects in evaluating disease management (DM) programme effectiveness. Model development is described using zip codes as the IV. Three diabetes DM outcomes were evaluated: annual diabetes costs, emergency department (ED) visits and hospital days. Both ordinary least squares (OLS) and IV estimates showed a significant treatment effect for diabetes costs (P = 0.011) but neither model produced a significant treatment effect for ED visits. However, the IV estimate showed a significant treatment effect for hospital days (P = 0.006) whereas the OLS model did not. These results illustrate the utility of IV estimation when the OLS model is sensitive to the confounding effect of hidden bias.

  14. The productivity of mental health care: an instrumental variable approach.

    Science.gov (United States)

    Lu, Mingshan

    1999-06-01

    BACKGROUND: Like many other medical technologies and treatments, there is a lack of reliable evidence on treatment effectiveness of mental health care. Increasingly, data from non-experimental settings are being used to study the effect of treatment. However, as in a number of studies using non-experimental data, a simple regression of outcome on treatment shows a puzzling negative and significant impact of mental health care on the improvement of mental health status, even after including a large number of potential control variables. The central problem in interpreting evidence from real-world or non-experimental settings is, therefore, the potential "selection bias" problem in observational data set. In other words, the choice/quantity of mental health care may be correlated with other variables, particularly unobserved variables, that influence outcome and this may lead to a bias in the estimate of the effect of care in conventional models. AIMS OF THE STUDY: This paper addresses the issue of estimating treatment effects using an observational data set. The information in a mental health data set obtained from two waves of data in Puerto Rico is explored. The results using conventional models - in which the potential selection bias is not controlled - and that from instrumental variable (IV) models - which is what was proposed in this study to correct for the contaminated estimation from conventional models - are compared. METHODS: Treatment effectiveness is estimated in a production function framework. Effectiveness is measured as the improvement in mental health status. To control for the potential selection bias problem, IV approaches are employed. The essence of the IV method is to use one or more instruments, which are observable factors that influence treatment but do not directly affect patient outcomes, to isolate the effect of treatment variation that is independent of unobserved patient characteristics. The data used in this study are the first (1992

  15. Falsification Testing of Instrumental Variables Methods for Comparative Effectiveness Research.

    Science.gov (United States)

    Pizer, Steven D

    2016-04-01

    To demonstrate how falsification tests can be used to evaluate instrumental variables methods applicable to a wide variety of comparative effectiveness research questions. Brief conceptual review of instrumental variables and falsification testing principles and techniques accompanied by an empirical application. Sample STATA code related to the empirical application is provided in the Appendix. Comparative long-term risks of sulfonylureas and thiazolidinediones for management of type 2 diabetes. Outcomes include mortality and hospitalization for an ambulatory care-sensitive condition. Prescribing pattern variations are used as instrumental variables. Falsification testing is an easily computed and powerful way to evaluate the validity of the key assumption underlying instrumental variables analysis. If falsification tests are used, instrumental variables techniques can help answer a multitude of important clinical questions. © Health Research and Educational Trust.

  16. Combining within and between instrument information to estimate precision

    International Nuclear Information System (INIS)

    Jost, J.W.; Devary, J.L.; Ward, J.E.

    1980-01-01

    When two instruments, both having replicated measurements, are used to measure the same set of items, between instrument information may be used to augment the within instrument precision estimate. A method is presented which combines the within and between instrument information to obtain an unbiased and minimum variance estimate of instrument precision. The method does not assume the instruments have equal precision
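
    The two information sources being combined can be sketched on simulated gauge-study-style data (m items, r replicates per instrument). This illustrates the inputs, not the paper's exact minimum-variance estimator:

      set.seed(5)
      m <- 30; r <- 3
      item  <- rep(1:m, each = r)
      truth <- rnorm(m, 100, 5)[item]
      inst1 <- truth + rnorm(m * r, sd = 1.0)        # instrument 1 replicates
      inst2 <- truth + rnorm(m * r, sd = 1.5)        # instrument 2 replicates

      # Within-instrument information: pooled replicate variance about item means
      within_var <- function(x) mean(tapply(x, item, var))
      v1 <- within_var(inst1); v2 <- within_var(inst2)

      # Between-instrument information: item-mean differences estimate v1/r + v2/r
      d <- tapply(inst1, item, mean) - tapply(inst2, item, mean)
      r * var(d) - v2    # a second, independent estimate of instrument 1's variance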

  17. Job demands and job strain as risk factors for employee wellbeing in elderly care: an instrumental-variables analysis.

    Science.gov (United States)

    Elovainio, Marko; Heponiemi, Tarja; Kuusio, Hannamaria; Jokela, Markus; Aalto, Anna-Mari; Pekkarinen, Laura; Noro, Anja; Finne-Soveri, Harriet; Kivimäki, Mika; Sinervo, Timo

    2015-02-01

    The association between psychosocial work environment and employee wellbeing has repeatedly been shown. However, as environmental evaluations have typically been self-reported, the observed associations may be attributable to reporting bias. Applying instrumental-variable regression, we used staffing level (the ratio of staff to residents) as an unconfounded instrument for self-reported job demands and job strain to predict various indicators of wellbeing (perceived stress, psychological distress and sleeping problems) among 1525 registered nurses, practical nurses and nursing assistants working in elderly care wards. In ordinary regression, higher self-reported job demands and job strain were associated with increased risk of perceived stress, psychological distress and sleeping problems. The effect estimates for the associations of these psychosocial factors with perceived stress and psychological distress were greater, but less precisely estimated, in an instrumental-variables analysis which took into account only the variation in self-reported job demands and job strain that was explained by staffing level. No association between psychosocial factors and sleeping problems was observed with the instrumental-variable analysis. These results support a causal interpretation of high self-reported job demands and job strain being risk factors for employee wellbeing. © The Author 2014. Published by Oxford University Press on behalf of the European Public Health Association. All rights reserved.

  18. Instrumental variable estimation in a survival context

    DEFF Research Database (Denmark)

    Tchetgen Tchetgen, Eric J; Walter, Stefan; Vansteelandt, Stijn

    2015-01-01

    The IV approach is very well developed in the context of linear regression and also for certain generalized linear models with a nonlinear link function. However, IV methods are not as well developed for regression analysis with a censored survival outcome. In this article, we develop the IV approach for regression analysis in a survival context, primarily under an additive hazards model, for which we describe 2 simple methods for estimating causal effects. The first method is a straightforward 2-stage regression approach analogous to 2-stage least squares commonly used for IV analysis in linear regression. In this approach, the fitted value from a first-stage regression of the exposure on the IV is entered in place of the exposure in the second-stage hazard model to recover a valid estimate of the treatment effect of interest. The second method is a so-called control function approach, which entails adding ...
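
    The first (two-stage) method can be sketched on simulated data. This assumes the timereg package, whose aalen() fits additive hazards models and marks time-constant effects with const(); as in linear 2SLS, the naive second-stage standard errors would need adjustment for the estimated first stage:

      library(survival)   # Surv()
      library(timereg)    # aalen()
      set.seed(6)
      n <- 3000
      Z <- rbinom(n, 1, 0.5)                         # instrument
      U <- runif(n)                                  # unmeasured confounder
      X <- 0.5 * Z + U + rnorm(n, sd = 0.5)          # exposure
      haz    <- pmax(0.1 + 0.2 * X + 0.3 * U, 0.01)  # additive hazard, truncated at 0
      time   <- rexp(n, rate = haz)
      status <- as.integer(time < 5)                 # administrative censoring at t = 5
      d <- data.frame(time = pmin(time, 5), status = status,
                      Xhat = fitted(lm(X ~ Z)))      # stage 1: exposure on instrument
      fit <- aalen(Surv(time, status) ~ const(Xhat), data = d)   # stage 2
      summary(fit)                                   # const(Xhat) estimate near 0.2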

  19. Institution, Financial Sector, and Economic Growth: Use The Institutions As An Instrument Variable

    Directory of Open Access Journals (Sweden)

    Albertus Girik Allo

    2016-06-01

    Institutions have been found to play an indirect role in economic growth. This paper aims to evaluate whether the quality of institutions matters for economic growth. Applying institutions as an instrumental variable for Foreign Direct Investment (FDI), we find that institutional quality significantly influences economic growth. This study uses two data periods, 1985-2013 and 2000-2013, available online from the World Bank (WB). The first data set, 1985-2013, is used to estimate the role of the financial sector in economic growth, focusing on 67 countries. The second data set, 2000-2013, determines the role of institutions in the financial sector and economic growth by applying the 2SLS estimation method. We define institutional variables as a set of indicators: Control of Corruption, Political Stability and Absence of Violence, and Voice and Accountability, which indicate a declining impact of FDI on economic growth.

  20. Institution, Financial Sector, and Economic Growth: Use The Institutions As An Instrument Variable

    OpenAIRE

    Albertus Girik Allo

    2016-01-01

    Institutions have been found to play an indirect role in economic growth. This paper aims to evaluate whether the quality of institutions matters for economic growth. Applying institutions as an instrumental variable for Foreign Direct Investment (FDI), we find that institutional quality significantly influences economic growth. This study uses two data periods, 1985-2013 and 2000-2013, available online from the World Bank (WB). The first data set, 1985-2013, is used to estimate the role of fin...

  1. Association of Body Mass Index with Depression, Anxiety and Suicide-An Instrumental Variable Analysis of the HUNT Study.

    Directory of Open Access Journals (Sweden)

    Johan Håkon Bjørngaard

    While high body mass index is associated with an increased risk of depression and anxiety, cumulative evidence indicates that it is a protective factor for suicide. The associations from conventional observational studies of body mass index with mental health outcomes are likely to be influenced by reverse causality or confounding by ill-health. In the present study, we investigated the associations between offspring body mass index and parental anxiety, depression and suicide in order to avoid problems with reverse causality and confounding by ill-health. We used data from 32,457 mother-offspring and 27,753 father-offspring pairs from the Norwegian HUNT-study. Anxiety and depression were assessed using the Hospital Anxiety and Depression Scale and suicide death from national registers. Associations between offspring and own body mass index and symptoms of anxiety and depression and suicide mortality were estimated using logistic and Cox regression. Causal effect estimates were estimated with a two-sample instrumental variable approach using offspring body mass index as an instrument for parental body mass index. Both own and offspring body mass index were positively associated with depression, while the results did not indicate any substantial association between body mass index and anxiety. Although precision was low, suicide mortality was inversely associated with own body mass index and the results from the analysis using offspring body mass index supported these results. Adjusted odds ratios per standard deviation body mass index from the instrumental variable analysis were 1.22 (95% CI: 1.05, 1.43) for depression, 1.10 (95% CI: 0.95, 1.27) for anxiety, and the instrumental variable estimated hazard ratio for suicide was 0.69 (95% CI: 0.30, 1.63). The present study's results indicate that suicide mortality is inversely associated with body mass index. We also found support for a positive association between body mass index and depression, but not
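
    The two-sample instrumental variable step described above reduces, in its simplest form, to a Wald ratio with a delta-method standard error. A minimal R sketch with invented summary statistics (not the HUNT estimates):

      b_zy <- -0.037; se_zy <- 0.018   # instrument-outcome association (sample 1)
      b_zx <-  0.105; se_zx <- 0.004   # instrument-exposure association (sample 2)
      wald <- b_zy / b_zx              # causal effect estimate
      se   <- sqrt(se_zy^2 / b_zx^2 + b_zy^2 * se_zx^2 / b_zx^4)   # delta method
      c(estimate = wald, lower = wald - 1.96 * se, upper = wald + 1.96 * se)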

  2. A selective review of the first 20 years of instrumental variables models in health-services research and medicine.

    Science.gov (United States)

    Cawley, John

    2015-01-01

    The method of instrumental variables (IV) is useful for estimating causal effects. Intuitively, it exploits exogenous variation in the treatment, sometimes called natural experiments or instruments. This study reviews the literature in health-services research and medical research that applies the method of instrumental variables, documents trends in its use, and offers examples of various types of instruments. A literature search of the PubMed and EconLit research databases for English-language journal articles published after 1990 yielded a total of 522 original research articles. Citation counts for each article were derived from the Web of Science. A selective review was conducted, with articles prioritized based on number of citations, validity and power of the instrument, and type of instrument. The average annual number of papers in health services research and medical research that apply the method of instrumental variables rose from 1.2 in 1991-1995 to 41.8 in 2006-2010. Commonly-used instruments (natural experiments) in health and medicine are relative distance to a medical care provider offering the treatment and the medical care provider's historic tendency to administer the treatment. Less common but still noteworthy instruments include randomization of treatment for reasons other than research, randomized encouragement to undertake the treatment, day of week of admission as an instrument for waiting time for surgery, and genes as an instrument for whether the respondent has a heritable condition. The use of the method of IV has increased dramatically in the past 20 years, and a wide range of instruments have been used. Applications of the method of IV have in several cases upended conventional wisdom that was based on correlations and led to important insights about health and healthcare. Future research should pursue new applications of existing instruments and search for new instruments that are powerful and valid.

  3. Fasting Glucose and the Risk of Depressive Symptoms: Instrumental-Variable Regression in the Cardiovascular Risk in Young Finns Study.

    Science.gov (United States)

    Wesołowska, Karolina; Elovainio, Marko; Hintsa, Taina; Jokela, Markus; Pulkki-Råback, Laura; Pitkänen, Niina; Lipsanen, Jari; Tukiainen, Janne; Lyytikäinen, Leo-Pekka; Lehtimäki, Terho; Juonala, Markus; Raitakari, Olli; Keltikangas-Järvinen, Liisa

    2017-12-01

    Type 2 diabetes (T2D) has been associated with depressive symptoms, but the causal direction of this association and the underlying mechanisms, such as increased glucose levels, remain unclear. We used instrumental-variable regression with a genetic instrument (Mendelian randomization) to examine a causal role of increased glucose concentrations in the development of depressive symptoms. Data were from the population-based Cardiovascular Risk in Young Finns Study (n = 1217). Depressive symptoms were assessed in 2012 using a modified Beck Depression Inventory (BDI-I). Fasting glucose was measured concurrently with depressive symptoms. A genetic risk score for fasting glucose (with 35 single nucleotide polymorphisms) was used as an instrumental variable for glucose. Glucose was not associated with depressive symptoms in the standard linear regression (B = -0.04, 95% CI [-0.12, 0.04], p = .34), but the instrumental-variable regression showed an inverse association between glucose and depressive symptoms (B = -0.43, 95% CI [-0.79, -0.07], p = .020). The difference between the estimates of standard linear regression and instrumental-variable regression was significant (p = .026). CONCLUSION: Our results suggest that the association between T2D and depressive symptoms is unlikely to be caused by increased glucose concentrations. It seems possible that T2D might be linked to depressive symptoms due to low glucose levels.
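
    The reported comparison of the two estimators is, in effect, a Durbin-Wu-Hausman-type endogeneity test, which can be computed via the control-function form of the IV regression. A minimal simulated sketch in R (variable names invented):

      set.seed(7)
      n   <- 1200
      grs <- rnorm(n)                        # genetic risk score (instrument)
      U   <- rnorm(n)                        # unmeasured confounding
      glu <- 0.3 * grs + U + rnorm(n)        # fasting glucose
      dep <- -0.4 * glu + 2 * U + rnorm(n)   # depressive symptoms (confounded)

      v  <- resid(lm(glu ~ grs))             # first-stage residual
      cf <- lm(dep ~ glu + v)                # control-function form of 2SLS
      coef(cf)[["glu"]]                      # IV estimate of the glucose effect
      summary(cf)$coefficients["v", ]        # significant v: OLS and IV differ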

  4. Variable Kernel Density Estimation

    OpenAIRE

    Terrell, George R.; Scott, David W.

    1992-01-01

    We investigate some of the possibilities for improvement of univariate and multivariate kernel density estimates by varying the window over the domain of estimation, pointwise and globally. Two general approaches are to vary the window width by the point of estimation and by point of the sample observation. The first possibility is shown to be of little efficacy in one variable. In particular, nearest-neighbor estimators in all versions perform poorly in one and two dimensions, but begin to b...
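
    A sample-point (k-th nearest-neighbour) variable-bandwidth estimator, one of the variants compared above, can be sketched in base R (k is an arbitrary illustrative choice):

      vkde <- function(x, obs, k = 20) {
        # bandwidth at each observation: distance to its k-th nearest neighbour
        h <- sapply(obs, function(o) sort(abs(obs - o))[k + 1])
        # each observation contributes a kernel scaled by its own bandwidth
        sapply(x, function(x0) mean(dnorm((x0 - obs) / h) / h))
      }
      set.seed(8)
      obs  <- c(rnorm(200), rnorm(100, 4, 0.5))    # bimodal sample
      xs   <- seq(-4, 7, length.out = 200)
      dens <- vkde(xs, obs)                        # compare with density(obs)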

  5. Optimal Inference for Instrumental Variables Regression with non-Gaussian Errors

    DEFF Research Database (Denmark)

    Cattaneo, Matias D.; Crump, Richard K.; Jansson, Michael

    This paper is concerned with inference on the coefficient on the endogenous regressor in a linear instrumental variables model with a single endogenous regressor, nonrandom exogenous regressors and instruments, and i.i.d. errors whose distribution is unknown. It is shown that under mild smoothness...

  6. Decoupling Solar Variability and Instrument Trends Using the Multiple Same-Irradiance-Level (MuSIL) Analysis Technique

    Science.gov (United States)

    Woods, Thomas N.; Eparvier, Francis G.; Harder, Jerald; Snow, Martin

    2018-05-01

    The solar spectral irradiance (SSI) dataset is a key record for studying and understanding the energetics and radiation balance in Earth's environment. Understanding the long-term variations of the SSI over timescales of the 11-year solar activity cycle and longer is critical for many Sun-Earth research topics. Satellite measurements of the SSI have been made since the 1970s, most of them in the ultraviolet, but recently also in the visible and near-infrared. A limiting factor for the accuracy of previous solar variability results is the uncertainties for the instrument degradation corrections, which need fairly large corrections relative to the amount of solar cycle variability at some wavelengths. The primary objective of this investigation has been to separate out solar cycle variability and any residual uncorrected instrumental trends in the SSI measurements from the Solar Radiation and Climate Experiment (SORCE) mission and the Thermosphere, Mesosphere, Ionosphere, Energetic, and Dynamics (TIMED) mission. A new technique called the Multiple Same-Irradiance-Level (MuSIL) analysis has been developed, which examines an SSI time series at different levels of solar activity to provide long-term trends in an SSI record, and the most common result is a downward trend that most likely stems from uncorrected instrument degradation. This technique has been applied to each wavelength in the SSI records from SORCE (2003 - present) and TIMED (2002 - present) to provide new solar cycle variability results between 27 nm and 1600 nm with a resolution of about 1 nm at most wavelengths. This technique, which was validated with the highly accurate total solar irradiance (TSI) record, has an estimated relative uncertainty of about 5% of the measured solar cycle variability. The MuSIL results are further validated with the comparison of the new solar cycle variability results from different solar cycles.
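
    A toy sketch of the idea on a synthetic record: sample the irradiance series whenever a degradation-free activity proxy returns to the same level; any residual trend across those samples then estimates uncorrected instrument drift. All numbers are invented:

      set.seed(9)
      t     <- seq(0, 14, by = 10 / 365)         # years, roughly 10-day cadence
      proxy <- sin(2 * pi * t / 11)              # degradation-free activity proxy
      ssi   <- 100 + 2 * proxy - 0.3 * t +       # true variability plus a linear
               rnorm(length(t), sd = 0.05)       # instrument drift of -0.3 per year
      hits  <- which(abs(proxy - 0.5) < 0.02)    # times at one same-irradiance level
      coef(lm(ssi[hits] ~ t[hits]))[[2]]         # recovers the drift, about -0.3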

  7. Reliability Estimation for Digital Instrument/Control System

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Yaguang; Sydnor, Russell [U.S. Nuclear Regulatory Commission, Washington, D.C. (United States)]

    2011-08-15

    Digital instrumentation and controls (DI and C) systems are widely adopted in various industries because of their flexibility and ability to implement various functions that can be used to automatically monitor, analyze, and control complicated systems. It is anticipated that the DI and C will replace the traditional analog instrumentation and controls (AI and C) systems in all future nuclear reactor designs. There is an increasing interest in reliability and risk analyses for safety-critical DI and C systems in regulatory organizations, such as the United States Nuclear Regulatory Commission. Developing reliability models and reliability estimation methods for digital reactor control and protection systems will involve every part of the DI and C system, such as sensors, signal conditioning and processing components, transmission lines and digital communication systems, D/A and A/D converters, computer system, signal processing software, control and protection software, power supply system, and actuators. Some of these components are hardware, such as sensors and actuators; their failure mechanisms are well understood, and the traditional reliability models and estimation methods can be directly applied. But many of these components are firmware, which has software embedded in the hardware, and software needs special consideration because its failure mechanism is unique, and the reliability estimation method for a software system will be different from the ones used for hardware systems. In this paper, we propose a reliability estimation method for the entire DI and C system using a recently developed software reliability estimation method and a traditional hardware reliability estimation method.
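
    The abstract only names the ingredients, but the composition step is standard: hardware components with classical constant-failure-rate models combine in series with a separately estimated software reliability. A minimal sketch under those textbook assumptions (the failure rates and the software figure below are placeholders, not values from the paper):

        import math

        def hardware_reliability(t_hours, failure_rates):
            """Series system of hardware components with constant failure
            rates: R_hw(t) = prod_i exp(-lambda_i * t)."""
            return math.exp(-sum(failure_rates) * t_hours)

        def system_reliability(t_hours, failure_rates, r_software):
            """Series combination of hardware and software reliability."""
            return hardware_reliability(t_hours, failure_rates) * r_software

        # Placeholder rates (per hour): sensor, A/D converter, CPU, actuator.
        rates = [2e-6, 1e-6, 5e-7, 3e-6]
        print(system_reliability(8760, rates, r_software=0.999))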

  8. Reliability Estimation for Digital Instrument/Control System

    International Nuclear Information System (INIS)

    Yang, Yaguang; Sydnor, Russell

    2011-01-01

    Digital instrumentation and controls (DI and C) systems are widely adopted in various industries because of their flexibility and ability to implement various functions that can be used to automatically monitor, analyze, and control complicated systems. It is anticipated that the DI and C will replace the traditional analog instrumentation and controls (AI and C) systems in all future nuclear reactor designs. There is an increasing interest in reliability and risk analyses for safety-critical DI and C systems in regulatory organizations, such as the United States Nuclear Regulatory Commission. Developing reliability models and reliability estimation methods for digital reactor control and protection systems will involve every part of the DI and C system, such as sensors, signal conditioning and processing components, transmission lines and digital communication systems, D/A and A/D converters, computer system, signal processing software, control and protection software, power supply system, and actuators. Some of these components are hardware, such as sensors and actuators; their failure mechanisms are well understood, and the traditional reliability models and estimation methods can be directly applied. But many of these components are firmware, which has software embedded in the hardware, and software needs special consideration because its failure mechanism is unique, and the reliability estimation method for a software system will be different from the ones used for hardware systems. In this paper, we propose a reliability estimation method for the entire DI and C system using a recently developed software reliability estimation method and a traditional hardware reliability estimation method.

  9. Estimations of natural variability between satellite measurements of trace species concentrations

    Science.gov (United States)

    Sheese, P.; Walker, K. A.; Boone, C. D.; Degenstein, D. A.; Kolonjari, F.; Plummer, D. A.; von Clarmann, T.

    2017-12-01

    In order to validate satellite measurements of atmospheric states, it is necessary to understand the range of random and systematic errors inherent in the measurements. On occasions where the measurements do not agree within those errors, a common "go-to" explanation is that the unexplained difference can be chalked up to "natural variability". However, the expected natural variability is often left ambiguous and rarely quantified. This study will look to quantify the expected natural variability of both O3 and NO2 between two satellite instruments: ACE-FTS (Atmospheric Chemistry Experiment - Fourier Transform Spectrometer) and OSIRIS (Optical Spectrograph and Infrared Imaging System). By sampling the CMAM30 (30-year specified dynamics simulation of the Canadian Middle Atmosphere Model) climate chemistry model throughout the upper troposphere and stratosphere at times and geolocations of coincident ACE-FTS and OSIRIS measurements at varying coincidence criteria, height-dependent expected values of O3 and NO2 variability will be estimated and reported on. The results could also be used to better optimize the coincidence criteria used in satellite measurement validation studies.
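
    Schematically, the procedure is: sample the model at the time and location of each member of a coincident measurement pair, difference the two model values, and treat the spread of those differences as the variability attributable to imperfect coincidence alone. A sketch of that bookkeeping; model_at stands in for an interpolator into the CMAM30 fields and is hypothetical:

        import numpy as np

        def expected_natural_variability(pairs, model_at):
            """pairs: iterable of (time_a, loc_a, time_b, loc_b) coincidences.
            model_at(time, loc): model O3 or NO2 profile at that sample point.
            Returns the per-level standard deviation of model differences."""
            diffs = np.array([model_at(ta, la) - model_at(tb, lb)
                              for ta, la, tb, lb in pairs])
            return diffs.std(axis=0)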

  10. College quality and hourly wages: evidence from the self-revelation model, sibling models and instrumental variables.

    Science.gov (United States)

    Borgen, Nicolai T

    2014-11-01

    This paper addresses the recent discussion on confounding in the returns to college quality literature using the Norwegian case. The main advantage of studying Norway is the quality of the data. Norwegian administrative data provide information on college applications, family relations and a rich set of control variables for all Norwegian citizens applying to college between 1997 and 2004 (N = 141,319) and their subsequent wages between 2003 and 2010 (676,079 person-year observations). With these data, this paper uses a subset of the models that have rendered mixed findings in the literature in order to investigate to what extent confounding biases the returns to college quality. I compare estimates obtained using standard regression models to estimates obtained using the self-revelation model of Dale and Krueger (2002), a sibling fixed effects model and the instrumental variable model used by Long (2008). Using these methods, I consistently find increasing returns to college quality over the course of students' work careers, with positive returns emerging only later in those careers. I conclude that the standard regression estimate provides a reasonable estimate of the returns to college quality. Copyright © 2014 Elsevier Inc. All rights reserved.

  11. Instrumental variable estimation of the causal effect of plasma 25-hydroxy-vitamin D on colorectal cancer risk: a mendelian randomization analysis.

    Directory of Open Access Journals (Sweden)

    Evropi Theodoratou

    Full Text Available Vitamin D deficiency has been associated with several common diseases, including cancer, and is being investigated as a possible risk factor for these conditions. We reported the striking prevalence of vitamin D deficiency in Scotland. Previous epidemiological studies have reported an association between low dietary vitamin D and colorectal cancer (CRC). Using a case-control study design, we tested the association between plasma 25-hydroxy-vitamin D (25-OHD) and CRC (2,001 cases, 2,237 controls). To determine whether plasma 25-OHD levels are causally linked to CRC risk, we applied the control function instrumental variable (IV) method of the mendelian randomization (MR) approach using four single nucleotide polymorphisms (rs2282679, rs12785878, rs10741657, rs6013897) previously shown to be associated with plasma 25-OHD. Low plasma 25-OHD levels were associated with CRC risk in the crude model (odds ratio (OR) 0.76, 95% confidence interval (CI) 0.71, 0.81, p = 1.4×10⁻¹⁴) and after adjusting for age, sex and other confounding factors. Using an allele score that combined all four SNPs as the IV, the estimated causal effect was OR 1.16 (95% CI 0.60, 2.23), whilst it was 0.94 (95% CI 0.46, 1.91) and 0.93 (0.53, 1.63) when using an upstream (rs12785878, rs10741657) and a downstream allele score (rs2282679, rs6013897), respectively. 25-OHD levels were inversely associated with CRC risk, in agreement with recent meta-analyses. The fact that this finding was not replicated when the MR approach was employed might be due to weak instruments, giving low power to demonstrate an effect (<0.35). The prevalence and degree of vitamin D deficiency amongst individuals living in northerly latitudes is of considerable importance because of its relationship to disease. To elucidate the effect of vitamin D on CRC risk, additional large studies of vitamin D and CRC risk are required and/or the application of alternative methods that are less sensitive to weak instrument
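
    The control function variant of IV differs from plain two-stage least squares in that the first-stage residual, not the fitted exposure, is carried into the outcome model, which accommodates a binary outcome such as CRC status. A minimal sketch of that scheme with statsmodels; the variable names (allele_score, ohd, crc, covars) are hypothetical:

        import numpy as np
        import statsmodels.api as sm

        def control_function_iv(instrument, exposure, outcome, covars):
            # Stage 1: regress the exposure on the instrument and covariates.
            X1 = sm.add_constant(np.column_stack([instrument, covars]))
            resid = sm.OLS(exposure, X1).fit().resid
            # Stage 2: logistic model for the binary outcome including the
            # exposure AND the first-stage residual (the control function).
            X2 = sm.add_constant(np.column_stack([exposure, resid, covars]))
            fit = sm.Logit(outcome, X2).fit(disp=0)
            return fit.params[1]  # causal log-odds per unit of exposure

        # effect = control_function_iv(allele_score, ohd, crc, covars)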

  12. College education and wages in the UK : Estimating conditional average structural functions in nonadditive models with binary endogenous variables

    NARCIS (Netherlands)

    Klein, T.J.

    2013-01-01

    Recent studies debate how the unobserved dependence between the monetary return to college education and selection into college can be characterised. This paper examines this question using British data. We develop a semiparametric local instrumental variables estimator for identified features of a

  13. Breastfeeding and the risk of childhood asthma: A two-stage instrumental variable analysis to address endogeneity.

    Science.gov (United States)

    Sharma, Nivita D

    2017-09-01

    Several explanations for the inconsistent results on the effects of breastfeeding on childhood asthma have been suggested. The purpose of this study was to investigate one unexplored explanation, which is the presence of a potential endogenous relationship between breastfeeding and childhood asthma. Endogeneity exists when an explanatory variable is correlated with the error term for reasons such as selection bias, reverse causality, and unmeasured confounders. Unadjusted endogeneity will bias the effect of breastfeeding on childhood asthma. To investigate potential endogeneity, a cross-sectional study of breastfeeding practices and incidence of childhood asthma in 87 pediatric patients in Georgia, the USA, was conducted using generalized linear modeling and a two-stage instrumental variable analysis. First, the relationship between breastfeeding and childhood asthma was analyzed without considering endogeneity. Second, tests for presence of endogeneity were performed and having detected endogeneity between breastfeeding and childhood asthma, a two-stage instrumental variable analysis was performed. The first stage of this analysis estimated the duration of breastfeeding and the second-stage estimated the risk of childhood asthma. When endogeneity was not taken into account, duration of breastfeeding was found to significantly increase the risk of childhood asthma (relative risk ratio [RR]=2.020, 95% confidence interval [CI]: [1.143-3.570]). After adjusting for endogeneity, duration of breastfeeding significantly reduced the risk of childhood asthma (RR=0.003, 95% CI: [0.000-0.240]). The findings suggest that researchers should consider evaluating how the presence of endogeneity could affect the relationship between duration of breastfeeding and the risk of childhood asthma. © 2017 EAACI and John Wiley and Sons A/S. Published by John Wiley and Sons Ltd.
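
    The endogeneity test alluded to here is commonly implemented as a Durbin-Wu-Hausman-style augmented regression: add the first-stage residual to the outcome equation and test its coefficient, a significant coefficient signalling endogeneity. A sketch under those standard assumptions (a linear-probability simplification of the paper's generalized linear setting; names hypothetical):

        import numpy as np
        import statsmodels.api as sm

        def dwh_endogeneity_test(instruments, exposure, outcome):
            """Returns the p-value for H0: the exposure is exogenous."""
            Z = sm.add_constant(np.asarray(instruments))
            resid = sm.OLS(exposure, Z).fit().resid
            X = sm.add_constant(np.column_stack([exposure, resid]))
            return sm.OLS(outcome, X).fit().pvalues[2]

        # A small p-value justifies the two-stage IV estimates reported
        # above rather than the naive single-equation model.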

  14. Variable selection and estimation for longitudinal survey data

    KAUST Repository

    Wang, Li

    2014-09-01

    There is wide interest in studying longitudinal surveys where sample subjects are observed successively over time. Longitudinal surveys have been used in many areas today, for example, in the health and social sciences, to explore relationships or to identify significant variables in regression settings. This paper develops a general strategy for the model selection problem in longitudinal sample surveys. A survey weighted penalized estimating equation approach is proposed to select significant variables and estimate the coefficients simultaneously. The proposed estimators are design consistent and perform as well as the oracle procedure when the correct submodel is known. The estimating function bootstrap is applied to obtain the standard errors of the estimated parameters with good accuracy. A fast and efficient variable selection algorithm is developed to identify significant variables for complex longitudinal survey data. Simulation examples illustrate the usefulness of the proposed methodology under various model settings and sampling designs. © 2014 Elsevier Inc.

  15. College Education and Wages in the U.K. : Estimating Conditional Average Structural Functions in Nonadditive Models with Binary Endogenous Variables

    NARCIS (Netherlands)

    Klein, T.J.

    2009-01-01

    Recent studies debate how the unobserved dependence between the monetary return to college education and selection into college can be characterized. This paper examines this question using British data. We develop a semiparametric local instrumental variables estimator for identified features of a

  16. Instrumented Impact Testing: Influence of Machine Variables and Specimen Position

    International Nuclear Information System (INIS)

    Lucon, E.; McCowan, C. N.; Santoyo, R. A.

    2008-01-01

    An investigation has been conducted on the influence of impact machine variables and specimen positioning on characteristic forces and absorbed energies from instrumented Charpy tests. Brittle and ductile fracture behavior has been investigated by testing NIST reference samples of low, high and super-high energy levels. Test machine variables included tightness of foundation, anvil and striker bolts, and the position of the center of percussion with respect to the center of strike. For specimen positioning, we tested samples which had been moved away or sideways with respect to the anvils. In order to assess the influence of the various factors, we compared mean values in the reference (unaltered) and altered conditions; for machine variables, t-test analyses were also performed in order to evaluate the statistical significance of the observed differences. Our results indicate that the only circumstance which resulted in variations larger than 5 percent for both brittle and ductile specimens is when the sample is not in contact with the anvils. These findings should be taken into account in future revisions of instrumented Charpy test standards.

  17. Instrumented Impact Testing: Influence of Machine Variables and Specimen Position

    Energy Technology Data Exchange (ETDEWEB)

    Lucon, E.; McCowan, C. N.; Santoyo, R. A.

    2008-09-15

    An investigation has been conducted on the influence of impact machine variables and specimen positioning on characteristic forces and absorbed energies from instrumented Charpy tests. Brittle and ductile fracture behavior has been investigated by testing NIST reference samples of low, high and super-high energy levels. Test machine variables included tightness of foundation, anvil and striker bolts, and the position of the center of percussion with respect to the center of strike. For specimen positioning, we tested samples which had been moved away or sideways with respect to the anvils. In order to assess the influence of the various factors, we compared mean values in the reference (unaltered) and altered conditions; for machine variables, t-test analyses were also performed in order to evaluate the statistical significance of the observed differences. Our results indicate that the only circumstance which resulted in variations larger than 5 percent for both brittle and ductile specimens is when the sample is not in contact with the anvils. These findings should be taken into account in future revisions of instrumented Charpy test standards.

  18. kVp estimate intercomparison between Unfors XI, Radcal 4075 and a new CDTN multipurpose instrument

    International Nuclear Information System (INIS)

    Baptista Neto, A.T.; Oliveira, B.B.; Faria, L.O.

    2015-01-01

    In this work we compare the kVp estimates of the CDTN multipurpose instrument, the Unfors XI and the Radcal 4075 meters under different combinations of voltage and filtration. The non-invasive measurements, made using x-ray diagnostic and interventional radiology devices, show a similar tendency to increase the kVp estimate when aluminum filters are placed in the path of the x-ray beam. The results reveal that the kVp estimate made by the CDTN multipurpose instrument is always satisfactory for highly filtered beam intensities. - Highlights: • We compare the kVp estimates of the CDTN instrument and 2 different kVp meters. • The new CDTN multipurpose instrument's performance was found to be satisfactory. • All instruments' kVp estimates increase with increasing additional filtration. • They are suitable for quality control routines in x-ray diagnostic radiology

  19. The XRF spectrometer and the selection of analysis conditions (instrumental variables)

    International Nuclear Information System (INIS)

    Willis, J.P.

    2002-01-01

    Full text: This presentation will begin with a brief discussion of EDXRF and flat- and curved-crystal WDXRF spectrometers, contrasting the major differences between the three types. The remainder of the presentation will contain a detailed overview of the choice and settings of the many instrumental variables contained in a modern WDXRF spectrometer, and will discuss critically the choices facing the analyst in setting up a WDXRF spectrometer for different elements and applications. In particular it will discuss the choice of tube target (when a choice is possible), the kV and mA settings, tube filters, collimator masks, collimators, analyzing crystals, secondary collimators, detectors, pulse height selection, X-ray path medium (air, nitrogen, vacuum or helium), counting times for peak and background positions and their effect on counting statistics and lower limit of detection (LLD). The use of Figure of Merit (FOM) calculations to objectively choose the best combination of instrumental variables also will be discussed. This presentation will be followed by a shorter session on a subsequent day entitled - A Selection of XRF Conditions - Practical Session, where participants will be given the opportunity to discuss in groups the selection of the best instrumental variables for three very diverse applications. Copyright (2002) Australian X-ray Analytical Association Inc

  20. Estimates of genetic variability in mutated population of Triticum aestivum

    International Nuclear Information System (INIS)

    Larik, A.S.; Siddiqui, K.A.; Soomoro, A.H.

    1980-01-01

    M₂ populations of four cultivars of Mexican origin (Mexipak-65, Nayab, Pak-70 and 6134 x C-271) and two locally bred cultivars (H-68 and C-591) of bread wheat, Triticum aestivum (2n = 6x = AA BB DD), derived from six irradiation treatments (gamma rays (⁶⁰Co): 10, 15 and 20 kR; fast neutrons: 300, 600 and 900 rads) were critically examined for spike length, spikelets per spike, grains per spike and grain yield. Genotypes varied significantly (P ≤ 0.01) for all the characters. Irradiation treatments were instrumental in creating significant variability for all the characters, indicating that varieties did not perform uniformly across the different gamma-ray and fast-neutron treatments. In the M₂ generation there was a considerable increase in variance for all four metrical traits. Comparisons were made between controls and treated populations. Mutagenic treatments shifted the mean values mostly in the negative direction, but the shift was neither unidirectional nor equally effective for all the characters. The differences in mean values and the nature of variability observed in M₂ indicated a possible preference for selection in the M₃ generation. In general, estimates of genetic variability and heritability (b.s.) increased with increasing doses of gamma rays and fast neutrons. Genetic advance also exhibited a similar trend. The observed variability can be utilized in the evolution of new varieties. (authors)

  1. Finite-sample instrumental variables inference using an asymptotically pivotal statistic

    NARCIS (Netherlands)

    Bekker, P; Kleibergen, F

    2003-01-01

    We consider the K-statistic, Kleibergen's (2002, Econometrica 70, 1781-1803) adaptation of the Anderson-Rubin (AR) statistic in instrumental variables regression. Whereas Kleibergen (2002) especially analyzes the asymptotic behavior of the statistic, we focus on finite-sample properties in, a

  2. Finite-sample instrumental variables Inference using an Asymptotically Pivotal Statistic

    NARCIS (Netherlands)

    Bekker, P.; Kleibergen, F.R.

    2001-01-01

    The paper considers the K-statistic, Kleibergen’s (2000) adaptation of the Anderson-Rubin (AR) statistic in instrumental variables regression. Compared to the AR-statistic this K-statistic shows improved asymptotic efficiency in terms of degrees of freedom in overidentified models and yet it shares,

  3. Finite-sample instrumental variables inference using an asymptotically pivotal statistic

    NARCIS (Netherlands)

    Bekker, Paul A.; Kleibergen, Frank

    2001-01-01

    The paper considers the K-statistic, Kleibergen’s (2000) adaptation of the Anderson-Rubin (AR) statistic in instrumental variables regression. Compared to the AR-statistic this K-statistic shows improved asymptotic efficiency in terms of degrees of freedom in overidentified models and yet it shares,

  4. Density dependence and climate effects in Rocky Mountain elk: an application of regression with instrumental variables for population time series with sampling error.

    Science.gov (United States)

    Creel, Scott; Creel, Michael

    2009-11-01

    1. Sampling error in annual estimates of population size creates two widely recognized problems for the analysis of population growth. First, if sampling error is mistakenly treated as process error, one obtains inflated estimates of the variation in true population trajectories (Staples, Taper & Dennis 2004). Second, treating sampling error as process error is thought to overestimate the importance of density dependence in population growth (Viljugrein et al. 2005; Dennis et al. 2006). 2. In ecology, state-space models are used to account for sampling error when estimating the effects of density and other variables on population growth (Staples et al. 2004; Dennis et al. 2006). In econometrics, regression with instrumental variables is a well-established method that addresses the problem of correlation between regressors and the error term, but requires fewer assumptions than state-space models (Davidson & MacKinnon 1993; Cameron & Trivedi 2005). 3. We used instrumental variables to account for sampling error and fit a generalized linear model to 472 annual observations of population size for 35 Elk Management Units in Montana, from 1928 to 2004. We compared this model with state-space models fit with the likelihood function of Dennis et al. (2006). We discuss the general advantages and disadvantages of each method. Briefly, regression with instrumental variables is valid with fewer distributional assumptions, but state-space models are more efficient when their distributional assumptions are met. 4. Both methods found that population growth was negatively related to population density and winter snow accumulation. Summer rainfall and wolf (Canis lupus) presence had much weaker effects on elk (Cervus elaphus) dynamics [though limitation by wolves is strong in some elk populations with well-established wolf populations (Creel et al. 2007; Creel & Christianson 2008)]. 5. Coupled with predictions for Montana from global and regional climate models, our results
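
    The mechanics can be seen in a toy version of the scheme: when counts are measured with error, the growth rate and the density regressor share the same sampling error, and an earlier count can serve as the instrument because its error is independent of the current one. A sketch for a Gompertz-style model on log counts (the lag-instrument choice is an illustration, not necessarily the authors' exact specification):

        import numpy as np

        def iv_density_dependence(log_counts):
            """Regress growth r_t = x_{t+1} - x_t on density x_t, using the
            earlier count x_{t-1} as an instrument for the noisy x_t."""
            x = np.asarray(log_counts, dtype=float)
            r = x[2:] - x[1:-1]   # growth rates
            d = x[1:-1]           # density regressor (with sampling error)
            z = x[:-2]            # instrument: previous count
            zc, dc, rc = z - z.mean(), d - d.mean(), r - r.mean()
            return (zc @ rc) / (zc @ dc)   # just-identified IV slope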

  5. The effect of patient satisfaction with pharmacist consultation on medication adherence: an instrumental variable approach

    Directory of Open Access Journals (Sweden)

    Gu NY

    2008-12-01

    Full Text Available There are limited studies on quantifying the impact of patient satisfaction with pharmacist consultation on patient medication adherence. Objectives: The objective of this study is to evaluate the effect of patient satisfaction with pharmacist consultation services on medication adherence in a large managed care organization. Methods: We analyzed data from a patient satisfaction survey of 6,916 patients who had used pharmacist consultation services in Kaiser Permanente Southern California from 1993 to 1996. We compared treating patient satisfaction as exogenous, in a single-equation probit model, with a bivariate probit model where patient satisfaction was treated as endogenous. Different sets of instrumental variables were employed, including measures of patients' emotional well-being and patients' propensity to fill their prescriptions at a non-Kaiser Permanente (KP) pharmacy. The Smith-Blundell test was used to test whether patient satisfaction was endogenous. Over-identification tests were used to test the validity of the instrumental variables. The Staiger-Stock weak instrument test was used to evaluate the explanatory power of the instrumental variables. Results: All tests indicated that the instrumental variables method was valid and the instrumental variables used have significant explanatory power. The single equation probit model indicated that the effect of patient satisfaction with pharmacist consultation was significant (p < 0.010). However, the bivariate probit models revealed that the marginal effect of pharmacist consultation on medication adherence was significantly greater than the single equation probit. The effect increased from 7% to 30% (p < 0.010) after controlling for endogeneity bias. Conclusion: After appropriate adjustment for endogeneity bias, patients satisfied with their pharmacy services are substantially more likely to adhere to their medication. The results have important policy implications given the increasing focus

  6. kVp estimate intercomparison between Unfors XI, Radcal 4075 and a new CDTN multipurpose instrument

    International Nuclear Information System (INIS)

    Baptista N, A. T.; Oliveira, B. B.; Faria, L. O.

    2014-08-01

    This work compares results obtained using three different instruments capable of non-invasively estimating the voltage applied to the electrodes of x-ray emission equipment, namely the Unfors model Xi R/F, the Radcal Corporation model 4075 R/F and a new CDTN multipurpose instrument. Tests were carried out using the Pantak Seifert Model 320 Hs x-ray machine with equal setups for all instruments undergoing comparison. Irradiations were performed for different conditions of voltage and filtration. Although all instruments show a similar tendency to increase the kVp estimate when aluminum filters are placed in the path of the x-ray beam, they may all be satisfactorily adopted in quality control routines of x-ray equipment by means of estimation of the applied voltage. The importance of using equally calibrated measurement instruments operated according to the manufacturers' instructions became clear; in case it is not possible to follow these requirements, measurement-correcting methods must be applied. Using the new multipurpose instrument, the kVp estimate is satisfactory even if the x-ray beam intensity is filtered by approximately one tenth-value layer. (author)

  7. kVp estimate intercomparison between Unfors XI, Radcal 4075 and a new CDTN multipurpose instrument

    Energy Technology Data Exchange (ETDEWEB)

    Baptista N, A. T.; Oliveira, B. B.; Faria, L. O., E-mail: annibal@cdtn.br [Centro de Desenvolvimento da Tecnologia Nuclear - CNEN, Av. Presidente Antonio Carlos 6627, Campus UFMG, Pampulha, CEP 31270-901 Belo Horizonte, Minas Gerais (Brazil)]

    2014-08-15

    This work compares results obtained using three different instruments capable of non-invasively estimating the voltage applied to the electrodes of x-ray emission equipment, namely the Unfors model Xi R/F, the Radcal Corporation model 4075 R/F and a new CDTN multipurpose instrument. Tests were carried out using the Pantak Seifert Model 320 Hs x-ray machine with equal setups for all instruments undergoing comparison. Irradiations were performed for different conditions of voltage and filtration. Although all instruments show a similar tendency to increase the kVp estimate when aluminum filters are placed in the path of the x-ray beam, they may all be satisfactorily adopted in quality control routines of x-ray equipment by means of estimation of the applied voltage. The importance of using equally calibrated measurement instruments operated according to the manufacturers' instructions became clear; in case it is not possible to follow these requirements, measurement-correcting methods must be applied. Using the new multipurpose instrument, the kVp estimate is satisfactory even if the x-ray beam intensity is filtered by approximately one tenth-value layer. (author)

  8. Estimation of biochemical variables using quantumbehaved particle ...

    African Journals Online (AJOL)

    To generate a more efficient neural network estimator, we employed the previously proposed quantum-behaved particle swarm optimization (QPSO) algorithm for neural network training. The experiment results of L-glutamic acid fermentation process showed that our established estimator could predict variables such as the ...

  9. Standard Errors of Estimated Latent Variable Scores with Estimated Structural Parameters

    Science.gov (United States)

    Hoshino, Takahiro; Shigemasu, Kazuo

    2008-01-01

    The authors propose a concise formula to evaluate the standard error of the estimated latent variable score when the true values of the structural parameters are not known and must be estimated. The formula can be applied to factor scores in factor analysis or ability parameters in item response theory, without bootstrap or Markov chain Monte…

  10. Pollen parameters estimates of genetic variability among newly ...

    African Journals Online (AJOL)

    Pollen parameters estimates of genetic variability among newly selected Nigerian roselle (Hibiscus sabdariffa L.) genotypes. ... Estimates of some pollen parameters were used to assess the genetic diversity among ...

  11. Improved Variable Window Kernel Estimates of Probability Densities

    OpenAIRE

    Hall, Peter; Hu, Tien Chung; Marron, J. S.

    1995-01-01

    Variable window width kernel density estimators, with the width varying proportionally to the square root of the density, have been thought to have superior asymptotic properties. The rate of convergence has been claimed to be as good as those typical for higher-order kernels, which makes the variable width estimators more attractive because no adjustment is needed to handle the negativity usually entailed by the latter. However, in a recent paper, Terrell and Scott show that these results ca...

  12. Auditory/visual distance estimation: accuracy and variability

    Directory of Open Access Journals (Sweden)

    Paul Wallace Anderson

    2014-10-01

    Full Text Available Past research has shown that auditory distance estimation improves when listeners are given the opportunity to see all possible sound sources when compared to no visual input. It has also been established that distance estimation is more accurate in vision than in audition. The present study investigates the degree to which auditory distance estimation is improved when matched with a congruent visual stimulus. Virtual sound sources based on binaural room impulse response (BRIR measurements made from distances ranging from approximately 0.3 to 9.8 m in a concert hall were used as auditory stimuli. Visual stimuli were photographs taken from the listener’s perspective at each distance in the impulse response measurement setup presented on a large HDTV monitor. Listeners were asked to estimate egocentric distance to the sound source in each of three conditions: auditory only (A, visual only (V, and congruent auditory/visual stimuli (A+V. Each condition was presented within its own block. Sixty-two listeners were tested in order to quantify the response variability inherent in auditory distance perception. Distance estimates from both the V and A+V conditions were found to be considerably more accurate and less variable than estimates from the A condition.

  13. Statistical Analysis for Multisite Trials Using Instrumental Variables with Random Coefficients

    Science.gov (United States)

    Raudenbush, Stephen W.; Reardon, Sean F.; Nomi, Takako

    2012-01-01

    Multisite trials can clarify the average impact of a new program and the heterogeneity of impacts across sites. Unfortunately, in many applications, compliance with treatment assignment is imperfect. For these applications, we propose an instrumental variable (IV) model with person-specific and site-specific random coefficients. Site-specific IV…

  14. CONSTRUCTING ACCOUNTING UNCERTAINTY ESTIMATES VARIABLE

    Directory of Open Access Journals (Sweden)

    Nino Serdarevic

    2012-10-01

    Full Text Available This paper presents research results on the financial reporting quality of B&H firms, utilizing the empirical relation between accounting conservatism, generated in critical accounting policy choices, and management's ability to make estimates and predictions in domestic private-sector accounting. The primary research is conducted on firms' financial statements, constructing the CAPCBIH (Critical Accounting Policy Choices relevant in B&H) variable, which represents a particular internal control system and risk assessment and influences financial reporting positions in accordance with the specific business environment. I argue that firms' management possesses no relevant capacity to determine risks and the true consumption of economic benefits, leading to the creation of hidden reserves in inventories and accounts payable, and latent losses for bad debt and asset revaluations. I draw special attention to recent IFRS convergences with US GAAP, especially in harmonizing with FAS 130 Reporting comprehensive income (in revised IAS 1) and FAS 157 Fair value measurement. The CAPCBIH variable resulted in very poor performance, indicating a considerable failure to recognize environment specifics. Furthermore, I underline the importance of the revised ISAE and the reinforced role of auditors in assessing the relevance of management estimates.

  15. Important variables for parents' postnatal sense of security: evaluating a new Swedish instrument (the PPSS instrument).

    Science.gov (United States)

    Persson, Eva K; Dykes, Anna-Karin

    2009-08-01

    to evaluate dimensions of both parents' postnatal sense of security the first week after childbirth, and to determine associations between the PPSS instrument and different sociodemographic and situational background variables. evaluative, cross-sectional design. 113 mothers and 99 fathers with children live born at term, from five hospitals in southern Sweden. mothers and fathers had similar feelings concerning postnatal sense of security. Of the dimensions in the PPSS instrument, a sense of midwives'/nurses' empowering behaviour, a sense of one's own general well-being and a sense of the mother's well-being as experienced by the father were the most important dimensions for parents' experienced security. A sense of affinity within the family (for both parents) and a sense of manageable breast feeding (for mothers) were not significantly associated with their experienced security. A sense of participation during pregnancy and general anxiety were significantly associated background variables for postnatal sense of security for both parents. For the mothers, parity and a sense that the father was participating during pregnancy were also significantly associated. more focus on parents' participation during pregnancy as well as midwives'/nurses' empowering behaviour during the postnatal period will be beneficial for both parents' postnatal sense of security.

  16. Variable kernel density estimation in high-dimensional feature spaces

    CSIR Research Space (South Africa)

    Van der Walt, Christiaan M

    2017-02-01

    Full Text Available Estimating the joint probability density function of a dataset is a central task in many machine learning applications. In this work we address the fundamental problem of kernel bandwidth estimation for variable kernel density estimation in high...

  17. Estimating water equivalent snow depth from related meteorological variables

    International Nuclear Information System (INIS)

    Steyaert, L.T.; LeDuc, S.K.; Strommen, N.D.; Nicodemus, M.L.; Guttman, N.B.

    1980-05-01

    Engineering design must take into consideration natural loads and stresses caused by meteorological elements, such as wind, snow, precipitation and temperature. The purpose of this study was to determine a relationship between water equivalent snow depth measurements and related meteorological variables. Several predictor models were evaluated for use in estimating water equivalent values. These models include linear regression, principal component regression, and non-linear regression models. Linear, non-linear and Scandinavian models are used to generate annual water equivalent estimates for approximately 1100 cooperative data stations where predictor variables are available, but which have no water equivalent measurements. These estimates are used to develop probability estimates of snow load for each station. Map analyses for 3 probability levels are presented

  18. Estimating net present value variability for deterministic models

    NARCIS (Netherlands)

    van Groenendaal, W.J.H.

    1995-01-01

    For decision makers the variability in the net present value (NPV) of an investment project is an indication of the project's risk. So-called risk analysis is one way to estimate this variability. However, risk analysis requires knowledge about the stochastic character of the inputs. For large,

  19. An Instrumental Variable Probit (IVP Analysis on Depressed Mood in Korea: The Impact of Gender Differences and Other Socio-Economic Factors

    Directory of Open Access Journals (Sweden)

    Lara Gitto

    2015-08-01

    Full Text Available Background Depression is a mental health state whose frequency has been increasing in modern societies. It imposes a great burden, because of the strong impact on people’s quality of life and happiness. Depression can be reliably diagnosed and treated in primary care: if more people could get effective treatments earlier, the costs related to depression would be reversed. The aim of this study was to examine the influence of socio-economic factors and gender on depressed mood, focusing on Korea. In fact, in spite of the great amount of empirical studies carried out for other countries, few epidemiological studies have examined the socio-economic determinants of depression in Korea and they were either limited to samples of employed women or did not control for individual health status. Moreover, as the likely data endogeneity (i.e. the possibility of correlation between the dependent variable and the error term as a result of autocorrelation or simultaneity, such as, in this case, the depressed mood due to health factors that, in turn, might be caused by depression) might bias the results, the present study proposes an empirical approach, based on instrumental variables, to deal with this problem. Methods Data for the year 2008 from the Korea National Health and Nutrition Examination Survey (KNHANES) were employed. About seven thousand people (N = 6,751, of which 43% were males and 57% females), aged from 19 to 75 years old, were included in the sample considered in the analysis. In order to take into account the possible endogeneity of some explanatory variables, two Instrumental Variables Probit (IVP) regressions were estimated; the variables for which instrumental equations were estimated were related to the participation of women in the workforce and to good health, as reported by people in the sample. Explanatory variables were related to age, gender, family factors (such as the number of family members and marital status) and socio

  20. An Instrumental Variable Probit (IVP) analysis on depressed mood in Korea: the impact of gender differences and other socio-economic factors.

    Science.gov (United States)

    Gitto, Lara; Noh, Yong-Hwan; Andrés, Antonio Rodríguez

    2015-04-16

    Depression is a mental health state whose frequency has been increasing in modern societies. It imposes a great burden, because of the strong impact on people's quality of life and happiness. Depression can be reliably diagnosed and treated in primary care: if more people could get effective treatments earlier, the costs related to depression would be reversed. The aim of this study was to examine the influence of socio-economic factors and gender on depressed mood, focusing on Korea. In fact, in spite of the great amount of empirical studies carried out for other countries, few epidemiological studies have examined the socio-economic determinants of depression in Korea and they were either limited to samples of employed women or did not control for individual health status. Moreover, as the likely data endogeneity (i.e. the possibility of correlation between the dependent variable and the error term as a result of autocorrelation or simultaneity, such as, in this case, the depressed mood due to health factors that, in turn, might be caused by depression) might bias the results, the present study proposes an empirical approach, based on instrumental variables, to deal with this problem. Data for the year 2008 from the Korea National Health and Nutrition Examination Survey (KNHANES) were employed. About seven thousand people (N = 6,751, of which 43% were males and 57% females), aged from 19 to 75 years old, were included in the sample considered in the analysis. In order to take into account the possible endogeneity of some explanatory variables, two Instrumental Variables Probit (IVP) regressions were estimated; the variables for which instrumental equations were estimated were related to the participation of women in the workforce and to good health, as reported by people in the sample. Explanatory variables were related to age, gender, family factors (such as the number of family members and marital status) and socio-economic factors (such as education

  1. Comparing surgical trays with redundant instruments with trays with reduced instruments: a cost analysis.

    Science.gov (United States)

    John-Baptiste, A; Sowerby, L J; Chin, C J; Martin, J; Rotenberg, B W

    2016-01-01

    When prearranged standard surgical trays contain instruments that are repeatedly unused, the redundancy can result in unnecessary health care costs. Our objective was to estimate potential savings by performing an economic evaluation comparing the cost of surgical trays with redundant instruments with surgical trays with reduced instruments ("reduced trays"). We performed a cost-analysis from the hospital perspective over a 1-year period. Using a mathematical model, we compared the direct costs of trays containing redundant instruments to reduced trays for 5 otolaryngology procedures. We incorporated data from several sources including local hospital data on surgical volume, the number of instruments on redundant and reduced trays, wages of personnel and time required to pack instruments. From the literature, we incorporated instrument depreciation costs and the time required to decontaminate an instrument. We performed 1-way sensitivity analyses on all variables, including surgical volume. Costs were estimated in 2013 Canadian dollars. The cost of redundant trays was $21 806 and the cost of reduced trays was $8803, for a 1-year cost saving of $13 003. In sensitivity analyses, cost savings ranged from $3262 to $21 395, based on the surgical volume at the institution. Variation in surgical volume resulted in a wider range of estimates, with a minimum of $3253 for low-volume to a maximum of $52 012 for high-volume institutions. Our study suggests moderate savings may be achieved by reducing surgical tray redundancy and, if applied to other surgical specialties, may result in savings to Canadian health care systems.
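
    The arithmetic behind such a comparison is straightforward once a per-instrument handling cost is fixed: each tray use costs (number of instruments) x (packing and decontamination time x wage) plus instrument depreciation. A back-of-the-envelope sketch; every parameter below is a placeholder, not a figure from the paper:

        def annual_tray_cost(n_instruments, trays_per_year,
                             minutes_per_instrument=0.5, wage_per_minute=0.5,
                             depreciation_per_use=0.05):
            """Yearly direct cost of reprocessing one tray configuration."""
            per_tray = n_instruments * (minutes_per_instrument * wage_per_minute
                                        + depreciation_per_use)
            return per_tray * trays_per_year

        redundant = annual_tray_cost(n_instruments=40, trays_per_year=500)
        reduced = annual_tray_cost(n_instruments=15, trays_per_year=500)
        print(f"estimated annual saving: ${redundant - reduced:,.0f}")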

  2. Variable disparity-motion estimation based fast three-view video coding

    Science.gov (United States)

    Bae, Kyung-Hoon; Kim, Seung-Cheol; Hwang, Yong Seok; Kim, Eun-Soo

    2009-02-01

    In this paper, variable disparity-motion estimation (VDME) based 3-view video coding is proposed. In the encoding, key-frame coding (KFC) based motion estimation and variable disparity estimation (VDE) are performed for effectively fast three-view video encoding. The proposed algorithms enhance the performance of the 3-D video encoding/decoding system in terms of accuracy of disparity estimation and computational overhead. Experiments on the stereo sequences 'Pot Plant' and 'IVO' show that the proposed algorithm's PSNRs are 37.66 and 40.55 dB, and its processing times are 0.139 and 0.124 sec/frame, respectively.

  3. Accuracy of latent-variable estimation in Bayesian semi-supervised learning.

    Science.gov (United States)

    Yamazaki, Keisuke

    2015-09-01

    Hierarchical probabilistic models, such as Gaussian mixture models, are widely used for unsupervised learning tasks. These models consist of observable and latent variables, which represent the observable data and the underlying data-generation process, respectively. Unsupervised learning tasks, such as cluster analysis, are regarded as estimations of latent variables based on the observable ones. The estimation of latent variables in semi-supervised learning, where some labels are observed, will be more precise than that in unsupervised, and one of the concerns is to clarify the effect of the labeled data. However, there has not been sufficient theoretical analysis of the accuracy of the estimation of latent variables. In a previous study, a distribution-based error function was formulated, and its asymptotic form was calculated for unsupervised learning with generative models. It has been shown that, for the estimation of latent variables, the Bayes method is more accurate than the maximum-likelihood method. The present paper reveals the asymptotic forms of the error function in Bayesian semi-supervised learning for both discriminative and generative models. The results show that the generative model, which uses all of the given data, performs better when the model is well specified. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. A NEW MODIFIED RATIO ESTIMATOR FOR ESTIMATION OF POPULATION MEAN WHEN MEDIAN OF THE AUXILIARY VARIABLE IS KNOWN

    Directory of Open Access Journals (Sweden)

    Jambulingam Subramani

    2013-10-01

    Full Text Available The present paper deals with a modified ratio estimator for estimation of population mean of the study variable when the population median of the auxiliary variable is known. The bias and mean squared error of the proposed estimator are derived and are compared with that of existing modified ratio estimators for certain known populations. Further we have also derived the conditions for which the proposed estimator performs better than the existing modified ratio estimators. From the numerical study it is also observed that the proposed modified ratio estimator performs better than the existing modified ratio estimators for certain known populations.
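
    For orientation, the classical ratio estimator and one common median-based modification from this literature take the following forms (a standard presentation assuming simple random sampling; here $\bar{y}$ and $\bar{x}$ are sample means, $\bar{X}$ is the known population mean and $M$ the known population median of the auxiliary variable):

        \hat{\bar{Y}}_R = \bar{y}\,\frac{\bar{X}}{\bar{x}},
        \qquad
        \hat{\bar{Y}}_M = \bar{y}\,\frac{\bar{X} + M}{\bar{x} + M}

    The bias and mean squared error comparisons in such papers then reduce to comparing the ratio constants, e.g. $\theta = \bar{X}/(\bar{X} + M)$ for the modified estimator against $\theta = 1$ for the classical one.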

  5. Explicit estimating equations for semiparametric generalized linear latent variable models

    KAUST Repository

    Ma, Yanyuan

    2010-07-05

    We study generalized linear latent variable models without requiring a distributional assumption of the latent variables. Using a geometric approach, we derive consistent semiparametric estimators. We demonstrate that these models have a property which is similar to that of a sufficient complete statistic, which enables us to simplify the estimating procedure and explicitly to formulate the semiparametric estimating equations. We further show that the explicit estimators have the usual root n consistency and asymptotic normality. We explain the computational implementation of our method and illustrate the numerical performance of the estimators in finite sample situations via extensive simulation studies. The advantage of our estimators over the existing likelihood approach is also shown via numerical comparison. We employ the method to analyse a real data example from economics. © 2010 Royal Statistical Society.

  6. Linear latent variable models: the lava-package

    DEFF Research Database (Denmark)

    Holst, Klaus Kähler; Budtz-Jørgensen, Esben

    2013-01-01

    An R package for specifying and estimating linear latent variable models is presented. The philosophy of the implementation is to separate the model specification from the actual data, which leads to a dynamic and easy way of modeling complex hierarchical structures. Several advanced features are implemented, including robust standard errors for clustered correlated data, multigroup analyses, non-linear parameter constraints, inference with incomplete data, maximum likelihood estimation with censored and binary observations, and instrumental variable estimators. In addition an extensive simulation

  7. Efficient Estimation of Spectral Moments and the Polarimetric Variables on Weather Radars, Sonars, Sodars, Acoustic Flow Meters, Lidars, and Similar Active Remote Sensing Instruments

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — A method for estimation of Doppler spectrum, its moments, and polarimetric variables on pulsed weather radars which uses oversampled echo components at a rate...

  8. On-line scheme for parameter estimation of nonlinear lithium ion battery equivalent circuit models using the simplified refined instrumental variable method for a modified Wiener continuous-time model

    International Nuclear Information System (INIS)

    Allafi, Walid; Uddin, Kotub; Zhang, Cheng; Mazuir Raja Ahsan Sha, Raja; Marco, James

    2017-01-01

    Highlights: •Off-line estimation approach for continuous-time domain for non-invertible function. •Model reformulated to multi-input-single-output; nonlinearity described by sigmoid. •Method directly estimates parameters of nonlinear ECM from the measured-data. •Iterative on-line technique leads to smoother convergence. •The model is validated off-line and on-line using NCA battery. -- Abstract: The accuracy of identifying the parameters of models describing lithium ion batteries (LIBs) in typical battery management system (BMS) applications is critical to the estimation of key states such as the state of charge (SoC) and state of health (SoH). In applications such as electric vehicles (EVs) where LIBs are subjected to highly demanding cycles of operation and varying environmental conditions leading to non-trivial interactions of ageing stress factors, this identification is more challenging. This paper proposes an algorithm that directly estimates the parameters of a nonlinear battery model from measured input and output data in the continuous time-domain. The simplified refined instrumental variable method is extended to estimate the parameters of a Wiener model where there is no requirement for the nonlinear function to be invertible. To account for nonlinear battery dynamics, in this paper, the typical linear equivalent circuit model (ECM) is enhanced by a block-oriented Wiener configuration where the nonlinear memoryless block following the typical ECM is defined to be a sigmoid static nonlinearity. The nonlinear Weiner model is reformulated in the form of a multi-input, single-output linear model. This linear form allows the parameters of the nonlinear model to be estimated using any linear estimator such as the well-established least squares (LS) algorithm. In this paper, the recursive least square (RLS) method is adopted for online parameter estimation. The approach was validated on experimental data measured from an 18650-type Graphite
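
    The on-line stage rests on recursive least squares, whose generic update is compact: keep a parameter vector and a covariance matrix and refresh both with each regressor/measurement pair. A stripped-down sketch of that update (not the paper's full simplified refined instrumental variable machinery; phi is whatever regressor vector the reformulated linear model supplies):

        import numpy as np

        class RecursiveLeastSquares:
            def __init__(self, n_params, forgetting=0.999):
                self.theta = np.zeros(n_params)   # parameter estimates
                self.P = np.eye(n_params) * 1e3   # estimate covariance
                self.lam = forgetting

            def update(self, phi, y):
                """phi: regressor vector; y: measured output this sample."""
                Pphi = self.P @ phi
                k = Pphi / (self.lam + phi @ Pphi)          # gain vector
                self.theta = self.theta + k * (y - phi @ self.theta)
                self.P = (self.P - np.outer(k, Pphi)) / self.lam
                return self.theta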

  9. Method and system to estimate variables in an integrated gasification combined cycle (IGCC) plant

    Science.gov (United States)

    Kumar, Aditya; Shi, Ruijie; Dokucu, Mustafa

    2013-09-17

    System and method to estimate variables in an integrated gasification combined cycle (IGCC) plant are provided. The system includes a sensor suite to measure respective plant input and output variables. An extended Kalman filter (EKF) receives sensed plant input variables and includes a dynamic model to generate a plurality of plant state estimates and a covariance matrix for the state estimates. A preemptive-constraining processor is configured to preemptively constrain the state estimates and covariance matrix to be free of constraint violations. A measurement-correction processor may be configured to correct constrained state estimates and a constrained covariance matrix based on processing of sensed plant output variables. The measurement-correction processor is coupled to update the dynamic model with corrected state estimates and a corrected covariance matrix. The updated dynamic model may be configured to estimate values for at least one plant variable not originally sensed by the sensor suite.
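
    At its core such an estimator runs the usual extended Kalman filter predict/correct cycle, with the "preemptive constraining" step amounting to projecting the state estimate back into its physically admissible region before it is used. A linearized, stripped-down sketch (known Jacobians F and H are passed in; this is an illustration, not the patented implementation):

        import numpy as np

        def ekf_step(x, P, u, z, f, h, F, H, Q, R, lo, hi):
            """One EKF cycle with box constraints on the state estimate.
            f, h: process and measurement models; F, H: their Jacobians."""
            # Predict from the plant inputs.
            x_pred = f(x, u)
            P_pred = F @ P @ F.T + Q
            # Correct with the sensed plant outputs.
            S = H @ P_pred @ H.T + R
            K = P_pred @ H.T @ np.linalg.inv(S)
            x_new = x_pred + K @ (z - h(x_pred))
            P_new = (np.eye(len(x)) - K @ H) @ P_pred
            # Preemptively constrain: clip states to physical bounds.
            return np.clip(x_new, lo, hi), P_new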

  10. Context Tree Estimation in Variable Length Hidden Markov Models

    OpenAIRE

    Dumont, Thierry

    2011-01-01

    We address the issue of context tree estimation in variable length hidden Markov models. We propose an estimator of the context tree of the hidden Markov process which needs no prior upper bound on the depth of the context tree. We prove that the estimator is strongly consistent. This uses information-theoretic mixture inequalities in the spirit of Finesso and Lorenzo(Consistent estimation of the order for Markov and hidden Markov chains(1990)) and E.Gassiat and S.Boucheron (Optimal error exp...

  11. Nonparametric instrumental regression with non-convex constraints

    International Nuclear Information System (INIS)

    Grasmair, M; Scherzer, O; Vanhems, A

    2013-01-01

    This paper considers the nonparametric regression model with an additive error that is dependent on the explanatory variables. As is common in empirical studies in epidemiology and economics, it also supposes that valid instrumental variables are observed. A classical example in microeconomics considers the consumer demand function as a function of the price of goods and the income, both variables often considered as endogenous. In this framework, the economic theory also imposes shape restrictions on the demand function, such as integrability conditions. Motivated by this illustration in microeconomics, we study an estimator of a nonparametric constrained regression function using instrumental variables by means of Tikhonov regularization. We derive rates of convergence for the regularized model both in a deterministic and stochastic setting under the assumption that the true regression function satisfies a projected source condition including, because of the non-convexity of the imposed constraints, an additional smallness condition. (paper)

  12. Nonparametric instrumental regression with non-convex constraints

    Science.gov (United States)

    Grasmair, M.; Scherzer, O.; Vanhems, A.

    2013-03-01

    This paper considers the nonparametric regression model with an additive error that is dependent on the explanatory variables. As is common in empirical studies in epidemiology and economics, it also supposes that valid instrumental variables are observed. A classical example in microeconomics considers the consumer demand function as a function of the price of goods and the income, both variables often considered as endogenous. In this framework, the economic theory also imposes shape restrictions on the demand function, such as integrability conditions. Motivated by this illustration in microeconomics, we study an estimator of a nonparametric constrained regression function using instrumental variables by means of Tikhonov regularization. We derive rates of convergence for the regularized model both in a deterministic and stochastic setting under the assumption that the true regression function satisfies a projected source condition including, because of the non-convexity of the imposed constraints, an additional smallness condition.
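
    A finite-dimensional caricature makes the construction concrete: expand the unknown regression function in basis functions of the endogenous variable, project those basis functions onto the instrument space (the first stage), and stabilize the resulting ill-posed least-squares problem with a Tikhonov penalty. A sketch that omits the papers' shape constraints (basis choice and penalty weight are assumptions):

        import numpy as np

        def tikhonov_iv_regression(x, w, y, degree=5, alpha=1e-2):
            """Estimate g in y = g(x) + u with E[u | w] = 0, via a
            polynomial sieve plus Tikhonov (ridge) regularization."""
            B = np.vander(x, degree + 1)   # basis functions of x
            Z = np.vander(w, degree + 1)   # instrument space
            # First stage: projections E[b_j(x) | w] by regression on Z.
            B_hat = Z @ np.linalg.lstsq(Z, B, rcond=None)[0]
            # Tikhonov-regularized second stage.
            coef = np.linalg.solve(B_hat.T @ B_hat + alpha * np.eye(degree + 1),
                                   B_hat.T @ y)
            return lambda x_new: np.vander(x_new, degree + 1) @ coef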

  13. Observer variability in estimating numbers: An experiment

    Science.gov (United States)

    Erwin, R.M.

    1982-01-01

    Census estimates of bird populations provide an essential framework for a host of research and management questions. However, with some exceptions, the reliability of numerical estimates and the factors influencing them have received insufficient attention. Independent of the problems associated with habitat type, weather conditions, cryptic coloration, etc., estimates may vary widely due only to intrinsic differences in observers' abilities to estimate numbers. Lessons learned in the field of perceptual psychology may be usefully applied to 'real world' problems in field ornithology. Based largely on dot discrimination tests in the laboratory, it was found that numerical abundance, density of objects, spatial configuration, color, background, and other variables influence individual accuracy in estimating numbers. The primary purpose of the present experiment was to assess the effects of observer, prior experience, and numerical range on accuracy in estimating numbers of waterfowl from black-and-white photographs. By using photographs of animals rather than black dots, I felt the results could be applied more meaningfully to field situations. Further, reinforcement was provided throughout some experiments to examine the influence of training on accuracy.

  14. Instrument Variables for Reducing Noise in Parallel MRI Reconstruction

    Directory of Open Access Journals (Sweden)

    Yuchou Chang

    2017-01-01

    Full Text Available Generalized autocalibrating partially parallel acquisition (GRAPPA) has been a widely used parallel MRI technique. However, noise deteriorates the reconstructed image as the reduction factor increases, and even at low reduction factors for some noisy datasets. Noise originating in the scanner propagates through GRAPPA's fitting and interpolation procedures and degrades the final reconstructed image quality. The basic idea we propose to improve GRAPPA is to remove this noise from a system-identification perspective. In this paper, we first analyze the GRAPPA noise problem from a noisy input-output system perspective; then, a new framework based on the errors-in-variables (EIV) model is developed for analyzing the noise generation mechanism in GRAPPA and designing a concrete method, instrumental variables (IV) GRAPPA, to remove the noise. The proposed EIV framework opens the possibility that noiseless GRAPPA reconstruction could also be achieved by other existing methods that solve the EIV problem. Experimental results show that the proposed reconstruction algorithm removes noise better than conventional GRAPPA, as validated with both phantom and in vivo brain data.
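
    The errors-in-variables mechanism the paper exploits can be seen in a scalar toy model: ordinary least squares on a noisily measured regressor is attenuated toward zero, while an instrument correlated with the true signal but not with either noise recovers the coefficient. All numbers below are illustrative assumptions, not MRI data.

        import numpy as np

        rng = np.random.default_rng(0)
        n, beta = 10_000, 2.0
        x_true = rng.normal(size=n)
        z = x_true + rng.normal(scale=1.0, size=n)   # instrument
        x = x_true + rng.normal(scale=0.8, size=n)   # noisy measurement
        y = beta * x_true + rng.normal(scale=0.5, size=n)

        beta_ols = (x @ y) / (x @ x)   # attenuated, roughly 1.2
        beta_iv = (z @ y) / (z @ x)    # consistent, roughly 2.0
        print(beta_ols, beta_iv)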

  15. Combining fixed effects and instrumental variable approaches for estimating the effect of psychosocial job quality on mental health: evidence from 13 waves of a nationally representative cohort study.

    Science.gov (United States)

    Milner, Allison; Aitken, Zoe; Kavanagh, Anne; LaMontagne, Anthony D; Pega, Frank; Petrie, Dennis

    2017-06-23

    Previous studies suggest that poor psychosocial job quality is a risk factor for mental health problems, but they use conventional regression analytic methods that cannot rule out reverse causation, unmeasured time-invariant confounding and reporting bias. This study combines two quasi-experimental approaches to improve causal inference by better accounting for these biases: (i) linear fixed effects regression analysis and (ii) linear instrumental variable analysis. We extract 13 annual waves of national cohort data including 13 260 working-age (18-64 years) employees. The exposure variable is self-reported level of psychosocial job quality. The instruments used are two common workplace entitlements. The outcome variable is the Mental Health Inventory (MHI-5). We adjust for measured time-varying confounders. In the fixed effects regression analysis adjusted for time-varying confounders, a 1-point increase in psychosocial job quality is associated with a 1.28-point improvement in mental health on the MHI-5 scale (95% CI: 1.17, 1.40; P < 0.001). In the instrumental variable analysis, a 1-point increase in psychosocial job quality is related to a 1.62-point improvement on the MHI-5 scale (95% CI: -0.24, 3.48; P = 0.088). Our quasi-experimental results provide evidence confirming job stressors as risk factors for mental ill health, using methods that improve causal inference.
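
    As a sketch of the within (fixed effects) transformation used in step (i), the snippet below demeans each person's exposure and outcome across waves before computing an OLS slope; the column names are assumptions, and the published analysis additionally adjusts for time-varying confounders, which this omits.

        import pandas as pd

        def fixed_effects_slope(df, person="id", x="job_quality", y="mhi5"):
            # Demean within person: removes all time-invariant confounding.
            within = df[[x, y]] - df.groupby(person)[[x, y]].transform("mean")
            xd, yd = within[x].to_numpy(), within[y].to_numpy()
            return float(xd @ yd / (xd @ xd))   # within estimator of the slope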

  16. Estimates and sampling schemes for the instrumentation of accountability systems

    International Nuclear Information System (INIS)

    Jewell, W.S.; Kwiatkowski, J.W.

    1976-10-01

    The problem of estimation of a physical quantity from a set of measurements is considered, where the measurements are made on samples with a hierarchical error structure, and where within-groups error variances may vary from group to group at each level of the structure; minimum mean squared-error estimators are developed, and the case where the physical quantity is a random variable with known prior mean and variance is included. Estimators for the error variances are also given, and optimization of experimental design is considered.

  17. Unifying parameter estimation and the Deutsch-Jozsa algorithm for continuous variables

    International Nuclear Information System (INIS)

    Zwierz, Marcin; Perez-Delgado, Carlos A.; Kok, Pieter

    2010-01-01

    We reveal a close relationship between quantum metrology and the Deutsch-Jozsa algorithm on continuous-variable quantum systems. We develop a general procedure, characterized by two parameters, that unifies parameter estimation and the Deutsch-Jozsa algorithm. Depending on which parameter we keep constant, the procedure implements either the parameter-estimation protocol or the Deutsch-Jozsa algorithm. The parameter-estimation part of the procedure attains the Heisenberg limit and is therefore optimal. Due to the use of approximate normalizable continuous-variable eigenstates, the Deutsch-Jozsa algorithm is probabilistic. The procedure estimates a value of an unknown parameter and solves the Deutsch-Jozsa problem without the use of any entanglement.

  18. Estimating particle number size distributions from multi-instrument observations with Kalman Filtering

    Energy Technology Data Exchange (ETDEWEB)

    Viskari, T.

    2012-07-01

    Atmospheric aerosol particles have several important effects on the environment and human society. The exact impact of aerosol particles is largely determined by their particle size distributions. However, no single instrument is able to measure the whole range of the particle size distribution. Estimating a particle size distribution from multiple simultaneous measurements remains a challenge in aerosol physical research. Current methods to combine different measurements require assumptions concerning the overlapping measurement ranges and have difficulties in accounting for measurement uncertainties. In this thesis, Extended Kalman Filter (EKF) is presented as a promising method to estimate particle number size distributions from multiple simultaneous measurements. The particle number size distribution estimated by EKF includes information from prior particle number size distributions as propagated by a dynamical model and is based on the reliabilities of the applied information sources. Known physical processes and dynamically evolving error covariances constrain the estimate both over time and particle size. The method was tested with measurements from Differential Mobility Particle Sizer (DMPS), Aerodynamic Particle Sizer (APS) and nephelometer. The particle number concentration was chosen as the state of interest. The initial EKF implementation presented here includes simplifications, yet the results are positive and the estimate successfully incorporated information from the chosen instruments. For particle sizes smaller than 4 micrometers, the estimate fits the available measurements and smooths the particle number size distribution over both time and particle diameter. The estimate has difficulties with particles larger than 4 micrometers due to issues with both measurements and the dynamical model in that particle size range. The EKF implementation appears to reduce the impact of measurement noise on the estimate, but has a delayed reaction to sudden

  19. Antimicrobial breakpoint estimation accounting for variability in pharmacokinetics

    Directory of Open Access Journals (Sweden)

    Nekka Fahima

    2009-06-01

    Full Text Available Background: Pharmacokinetic and pharmacodynamic (PK/PD) indices are increasingly being used in the microbiological field to assess the efficacy of a dosing regimen. In contrast to methods using the MIC, PK/PD-based methods reflect in vivo conditions and are more predictive of efficacy. Unfortunately, they entail the use of one PK-derived value such as AUC or Cmax and may thus lead to biased efficiency information when the variability is large. The aim of the present work was to evaluate the efficacy of a treatment by adjusting classical breakpoint estimation methods to the situation of variable PK profiles. Methods and results: We propose a logical generalisation of the usual AUC methods by introducing the concept of "efficiency" for a PK profile, which involves the efficacy function as a weight. We formulated these methods for both classes of concentration- and time-dependent antibiotics. Using drug models and in silico approaches, we provide a theoretical basis for characterizing the efficiency of a PK profile under in vivo conditions. We also used the particular case of variable drug intake to assess the effect of the variable PK profiles generated and to analyse the implications for breakpoint estimation. Conclusion: Compared to traditional methods, our weighted AUC approach gives a more powerful PK/PD link and reveals, through examples, interesting issues about the uniqueness of therapeutic outcome indices and antibiotic resistance problems.
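
    One plausible reading of the "efficiency" of a PK profile is sketched below: the concentration curve is integrated with an Emax-type efficacy function as the weight, instead of taking the raw AUC. The one-compartment profile and the Emax parameters are illustrative assumptions, not the paper's drug models.

        import numpy as np

        def concentration(t, dose=500.0, V=30.0, ke=0.2):
            # one-compartment IV-bolus profile (illustrative)
            return (dose / V) * np.exp(-ke * t)

        def emax(c, emax_val=1.0, ec50=5.0):
            # Emax efficacy function used as the weight
            return emax_val * c / (ec50 + c)

        t = np.linspace(0.0, 24.0, 2001)
        c = concentration(t)
        auc = np.trapz(c, t)                    # classical AUC
        efficiency = np.trapz(emax(c) * c, t)   # efficacy-weighted AUC
        print(auc, efficiency)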

  20. 8 years of Solar Spectral Irradiance Variability Observed from the ISS with the SOLAR/SOLSPEC Instrument

    Science.gov (United States)

    Damé, Luc; Bolsée, David; Meftah, Mustapha; Irbah, Abdenour; Hauchecorne, Alain; Bekki, Slimane; Pereira, Nuno; Cessateur, Gaël; Marchand, Marion; et al.

    2016-10-01

    Accurate measurements of Solar Spectral Irradiance (SSI) are of primary importance for a better understanding of solar physics and of the impact of solar variability on climate (via Earth's atmospheric photochemistry). The acquisition of a top-of-atmosphere reference solar spectrum and of its temporal and spectral variability during the unusual solar cycle 24 is of prime interest for these studies. These measurements have been performed since April 2008 with the SOLSPEC spectro-radiometer from the far ultraviolet to the infrared (166 nm to 3088 nm). This instrument, developed under a fruitful LATMOS/BIRA-IASB collaboration, is part of the Solar Monitoring Observatory (SOLAR) payload, externally mounted on the Columbus module of the International Space Station (ISS). The SOLAR mission, now 8 years in duration, will cover almost the entire solar cycle 24. We present here the in-flight operations and performance of the SOLSPEC instrument, including the engineering corrections, calibrations and improved procedures for aging corrections. Accordingly, an SSI reference spectrum from the UV to the NIR will be presented, together with its variability in the UV, as measured by SOLAR/SOLSPEC over 8 years. Uncertainties in these measurements and comparisons with other instruments will be briefly discussed.

  1. Linear solvation energy relationships: "rule of thumb" for estimation of variable values

    Science.gov (United States)

    Hickey, James P.; Passino-Reader, Dora R.

    1991-01-01

    For the linear solvation energy relationship (LSER), values are listed for each of the variables (Vi/100, π*, βm, αm) for fundamental organic structures and functional groups. We give guidelines to estimate LSER variable values quickly for a vast array of possible organic compounds such as those found in the environment. The difficulty in generating these variables has greatly discouraged the application of this quantitative structure-activity relationship (QSAR) method. This paper presents the first compilation of molecular functional group values together with a utilitarian set of LSER variable estimation rules. The availability of these variable values and rules should facilitate widespread application of LSER for hazard evaluation of environmental contaminants.

  2. Comparing proxy and model estimates of hydroclimate variability and change over the Common Era

    Science.gov (United States)

    Hydro2k Consortium, Pages

    2017-12-01

    Water availability is fundamental to societies and ecosystems, but our understanding of variations in hydroclimate (including extreme events, flooding, and decadal periods of drought) is limited because of a paucity of modern instrumental observations that are distributed unevenly across the globe and only span parts of the 20th and 21st centuries. Such data coverage is insufficient for characterizing hydroclimate and its associated dynamics because of its multidecadal to centennial variability and highly regionalized spatial signature. High-resolution (seasonal to decadal) hydroclimatic proxies that span all or parts of the Common Era (CE) and paleoclimate simulations from climate models are therefore important tools for augmenting our understanding of hydroclimate variability. In particular, the comparison of the two sources of information is critical for addressing the uncertainties and limitations of both while enriching each of their interpretations. We review the principal proxy data available for hydroclimatic reconstructions over the CE and highlight the contemporary understanding of how these proxies are interpreted as hydroclimate indicators. We also review the available last-millennium simulations from fully coupled climate models and discuss several outstanding challenges associated with simulating hydroclimate variability and change over the CE. A specific review of simulated hydroclimatic changes forced by volcanic events is provided, as is a discussion of expected improvements in estimated radiative forcings, models, and their implementation in the future. Our review of hydroclimatic proxies and last-millennium model simulations is used as the basis for articulating a variety of considerations and best practices for how to perform proxy-model comparisons of CE hydroclimate. This discussion provides a framework for how best to evaluate hydroclimate variability and its associated dynamics using these comparisons and how they can better inform

  3. A Design-Adaptive Local Polynomial Estimator for the Errors-in-Variables Problem

    KAUST Repository

    Delaigle, Aurore

    2009-03-01

    Local polynomial estimators are popular techniques for nonparametric regression estimation and have received great attention in the literature. Their simplest version, the local constant estimator, can be easily extended to the errors-in-variables context by exploiting its similarity with the deconvolution kernel density estimator. The generalization of the higher order versions of the estimator, however, is not straightforward and has remained an open problem for the last 15 years. We propose an innovative local polynomial estimator of any order in the errors-in-variables context, derive its design-adaptive asymptotic properties and study its finite sample performance on simulated examples. We not only provide a solution to a long-standing open problem, but also make methodological contributions to errors-in-variables regression, including local polynomial estimation of derivative functions.
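
    For orientation, the error-free local linear estimator that the paper generalizes can be written in a few lines; the deconvolution step needed in the errors-in-variables setting is omitted here, and the Gaussian kernel is an illustrative choice.

        import numpy as np

        def local_linear(x0, x, y, h):
            # Gaussian kernel weights centered at the evaluation point x0
            w = np.exp(-0.5 * ((x - x0) / h) ** 2)
            # local linear design: intercept plus centered slope term
            X = np.column_stack([np.ones_like(x), x - x0])
            WX = X * w[:, None]
            beta = np.linalg.solve(X.T @ WX, WX.T @ y)   # weighted least squares
            return beta[0]                               # fitted value at x0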

  4. Tyre-road grip coefficient assessment - Part II: online estimation using instrumented vehicle, extended Kalman filter, and neural network

    Science.gov (United States)

    Luque, Pablo; Mántaras, Daniel A.; Fidalgo, Eloy; Álvarez, Javier; Riva, Paolo; Girón, Pablo; Compadre, Diego; Ferran, Jordi

    2013-12-01

    The main objective of this work is to determine the limit of safe driving conditions by identifying the maximal friction coefficient in a real vehicle. The study focuses on finding a method to determine this limit before the skid is reached, which is valuable information in the context of traffic safety. Since it is not possible to measure the friction coefficient directly, it is estimated using the appropriate tools in order to get the most accurate information. A real vehicle is instrumented to collect information on general kinematics and steering tie-rod forces. A real-time algorithm is developed to estimate forces and aligning torque in the tyres using an extended Kalman filter and neural network techniques. The methodology is based on determining the aligning torque; this variable allows evaluation of the behaviour of the tyre. It conveys useful information from the tyre-road contact and can be used to predict the maximal tyre grip and safety margin. The maximal grip coefficient is estimated according to a knowledge base extracted from computer simulation of a highly detailed three-dimensional model, using Adams® software. The proposed methodology is validated and applied to real driving conditions, in which maximal grip and safety margin are properly estimated.

  5. Study on the Leak Rate Estimation of SG Tubes and Residual Stress Estimation based on Plastic Deformation

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Jin; Chang, Yoon Suk; Lee, Dock Jin; Lee, Tae Rin; Choi, Shin Beom; Jeong, Jae Uk; Yeum, Seung Won [Sungkyunkwan University, Seoul (Korea, Republic of)

    2009-02-15

    In this research project, a leak rate estimation model was developed for steam generator tubes with through-wall cracks. The modelling was based on leak data from 23 tube specimens. Also, a finite element analysis procedure was developed for residual stress calculation of the dissimilar metal weld in a bottom-mounted instrumentation. The effect of geometric variables related to the residual stress in the penetration weld part was investigated by using the developed analysis procedure. The key subjects dealt with in this research are: 1. Development of a leak rate estimation model for steam generator tubes with through-wall cracks; 2. Development of a program which can perform the structural and leakage integrity evaluation for steam generator tubes; 3. Development of an analysis procedure for bottom-mounted instrumentation weld residual stress; 4. Analysis of the effects of geometric variables on weld residual stress. It is anticipated that the technologies developed in this study are applicable for integrity estimation of steam generator tubes and weld parts in NPPs.

  6. Asymptotics of diagonal elements of projection matrices under many instruments/regressors

    Czech Academy of Sciences Publication Activity Database

    Anatolyev, Stanislav; Yaskov, P.

    2017-01-01

    Roč. 33, č. 3 (2017), s. 717-738 ISSN 0266-4666 Institutional support: RVO:67985998 Keywords : instrumental variable estimation * inference * models Subject RIV: AH - Economics OBOR OECD: Applied Economics, Econometrics Impact factor: 1.011, year: 2016

  8. Estimation of Finite Population Ratio When Other Auxiliary Variables are Available in the Study

    Directory of Open Access Journals (Sweden)

    Jehad Al-Jararha

    2014-12-01

    Full Text Available The estimation of the population total $t_y$ by using one or more auxiliary variables, and of the population ratio $\theta_{xy}=t_y/t_x$, where $t_x$ is the population total for the auxiliary variable $X$, for a finite population are heavily discussed in the literature. In this paper, the idea of estimating the finite population ratio $\theta_{xy}$ is extended to use the availability of an auxiliary variable $Z$ in the study, where this auxiliary variable is not used in the definition of the population ratio. This idea may be supported by the fact that the variable $Z$ is more highly correlated with the variable of interest $Y$ than $X$ is. The availability of such an auxiliary variable can be used to improve the precision of the estimation of the population ratio. To our knowledge, this idea has not been discussed in the literature. The bias, variance and mean squared error are given for our approach. In simulations from a real data set, the empirical relative bias and the empirical relative mean squared error are computed for our approach and for different estimators proposed in the literature for estimating the population ratio $\theta_{xy}$. Analytically and in the simulation results, we show that, with suitable choices, our approach gives negligible bias and has a smaller mean squared error.
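
    A hedged sketch of the classical set-up the paper extends: under simple random sampling, the ratio $\theta_{xy} = t_y/t_x$ is estimated by a ratio of sample means, and a second auxiliary variable $Z$ (with known population mean) can be used to adjust the numerator. The regression-type adjustment below is illustrative only, not the paper's estimator.

        import numpy as np

        def ratio_estimate(y, x):
            # estimates t_y / t_x under simple random sampling
            return y.mean() / x.mean()

        def z_adjusted_ratio(y, x, z, z_pop_mean):
            # regression estimator of mean(y) using auxiliary z, then the ratio
            b = np.cov(y, z, ddof=1)[0, 1] / np.var(z, ddof=1)
            y_adj = y.mean() + b * (z_pop_mean - z.mean())
            return y_adj / x.mean()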

  9. Estimating structural equation models with non-normal variables by using transformations

    NARCIS (Netherlands)

    Montfort, van K.; Mooijaart, A.; Meijerink, F.

    2009-01-01

    We discuss structural equation models for non-normal variables. In this situation the maximum likelihood and the generalized least-squares estimates of the model parameters can give incorrect estimates of the standard errors and the associated goodness-of-fit chi-squared statistics. If the sample

  10. Estimating variability in functional images using a synthetic resampling approach

    International Nuclear Information System (INIS)

    Maitra, R.; O'Sullivan, F.

    1996-01-01

    Functional imaging of biologic parameters like in vivo tissue metabolism is made possible by Positron Emission Tomography (PET). Many techniques, such as mixture analysis, have been suggested for extracting such images from dynamic sequences of reconstructed PET scans. Methods for assessing the variability in these functional images are of scientific interest. The nonlinearity of the methods used in the mixture analysis approach makes analytic formulae for estimating variability intractable. The usual resampling approach is infeasible because of the prohibitive computational effort of simulating a number of sinogram datasets, applying image reconstruction, and generating parametric images for each replication. Here we introduce an approach that approximates the distribution of the reconstructed PET images by a Gaussian random field and generates synthetic realizations in the imaging domain. This eliminates the reconstruction steps in generating each simulated functional image and is therefore practical. Results of experiments done to evaluate the approach on a model one-dimensional problem are very encouraging. Post-processing of the estimated variances is seen to improve the accuracy of the estimation method. Mixture analysis is used to estimate functional images; however, the suggested approach is general enough to extend to other parametric imaging methods.

  11. Solving the Omitted Variables Problem of Regression Analysis Using the Relative Vertical Position of Observations

    Directory of Open Access Journals (Sweden)

    Jonathan E. Leightner

    2012-01-01

    Full Text Available The omitted variables problem is one of regression analysis’ most serious problems. The standard approach to the omitted variables problem is to find instruments, or proxies, for the omitted variables, but this approach makes strong assumptions that are rarely met in practice. This paper introduces best projection reiterative truncated projected least squares (BP-RTPLS), the third generation of a technique that solves the omitted variables problem without using proxies or instruments. This paper presents a theoretical argument that BP-RTPLS produces unbiased reduced form estimates when there are omitted variables. This paper also provides simulation evidence that shows OLS produces between 250% and 2450% more errors than BP-RTPLS when there are omitted variables and when measurement and round-off error is 1 percent or less. In an example, the government spending multiplier is estimated using annual data for the USA between 1929 and 2010.

  12. Parameter Estimation of a Closed Loop Coupled Tank Time Varying System using Recursive Methods

    International Nuclear Information System (INIS)

    Basir, Siti Nora; Yussof, Hanafiah; Shamsuddin, Syamimi; Selamat, Hazlina; Zahari, Nur Ismarrubie

    2013-01-01

    This project investigates the direct identification of a closed-loop plant using a discrete-time approach. The use of Recursive Least Squares (RLS), the Recursive Instrumental Variable method (RIV) and the Recursive Instrumental Variable with Centre-Of-Triangle (RIV + COT) in the parameter estimation of a closed-loop time-varying system is considered. The algorithms were applied to a coupled tank system that employs a covariance resetting technique, where the times at which parameter changes occur are unknown. The performances of all the parameter estimation methods, RLS, RIV and RIV + COT, were compared. The estimation of the system whose output was corrupted with white and coloured noises was investigated. The covariance resetting technique executed successfully when the parameters changed. RIV + COT gives better estimates than RLS and RIV in terms of convergence and maximum overshoot.
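
    A minimal sketch of recursive least squares with covariance resetting, the mechanism described above: when a large residual suggests a parameter change, the covariance P is reset to a large value so the estimator becomes responsive again. The thresholding rule and all constants are assumptions for illustration.

        import numpy as np

        def rls_update(theta, P, phi, y, lam=1.0):
            # phi: regressor vector, y: new output sample, lam: forgetting factor
            err = y - phi @ theta
            k = P @ phi / (lam + phi @ P @ phi)
            theta = theta + k * err
            P = (P - np.outer(k, phi @ P)) / lam
            return theta, P, err

        def maybe_reset(P, err, threshold=5.0, p0=1e3):
            # crude change detection: a large residual triggers a reset
            if abs(err) > threshold:
                P = p0 * np.eye(P.shape[0])
            return P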

  13. Evaluation of Validity and Reliability for Hierarchical Scales Using Latent Variable Modeling

    Science.gov (United States)

    Raykov, Tenko; Marcoulides, George A.

    2012-01-01

    A latent variable modeling method is outlined, which accomplishes estimation of criterion validity and reliability for a multicomponent measuring instrument with hierarchical structure. The approach provides point and interval estimates for the scale criterion validity and reliability coefficients, and can also be used for testing composite or…

  14. Inconsistencies in net radiation estimates from use of several models of instruments in a desert environment

    International Nuclear Information System (INIS)

    Kustas, W.P.; Prueger, J.H.; Hipps, L.E.; Hatfield, J.L.; Meek, D.

    1998-01-01

    Studies of surface energy and water balance generally require an accurate estimate of net radiation and its spatial distribution. A project quantifying both short term and seasonal water use of shrub and grass vegetation in the Jornada Experimental Range in New Mexico prompted a study to compare net radiation observations using two types of net radiometers currently being used in research. A set of 12 REBS net radiometers were compared with each other and one Swissteco, over wet and dry surfaces in an arid landscape under clear skies. The set of REBS exhibited significant differences in output over both surfaces. However, they could be cross-calibrated to yield values within 10 W m⁻², on average. There was also a significant bias between the REBS and the Swissteco over a dry surface, but not over a wet one. The two makes of instrument could be made to agree under dry conditions by using regression or autoregression techniques. However, the resulting equations would induce bias for the wet surface condition. Thus, it is not possible to cross-calibrate these two makes of radiometer over the range of environmental conditions observed. This result indicates that determination of the spatial distribution of net radiation over a variable surface should be made with identical instruments which have been cross-calibrated. The need still exists for development of a radiometer and calibration procedures which will produce accurate and consistent measurements over a range of surface conditions. (author)

  15. Stable Graphical Model Estimation with Random Forests for Discrete, Continuous, and Mixed Variables

    OpenAIRE

    Fellinghauer, Bernd; Bühlmann, Peter; Ryffel, Martin; von Rhein, Michael; Reinhardt, Jan D.

    2011-01-01

    A conditional independence graph is a concise representation of pairwise conditional independence among many variables. Graphical Random Forests (GRaFo) are a novel method for estimating pairwise conditional independence relationships among mixed-type, i.e. continuous and discrete, variables. The number of edges is a tuning parameter in any graphical model estimator and there is no obvious number that constitutes a good choice. Stability Selection helps choosing this parameter with respect to...

  16. Parameter estimation of variable-parameter nonlinear Muskingum model using excel solver

    Science.gov (United States)

    Kang, Ling; Zhou, Liwei

    2018-02-01

    The Muskingum model is an effective flood routing technique in hydrology and water resources engineering. With the development of optimization technology, more and more variable-parameter Muskingum models have been presented in recent decades to improve the effectiveness of the Muskingum model. A variable-parameter nonlinear Muskingum model (NVPNLMM) is proposed in this paper. In two real and frequently used case studies, the NVPNLMM obtained better values of the evaluation criteria, which describe the quality of the estimated outflows and compare the accuracy of flood routing across models, and its optimal estimated outflows were closer to the observed outflows than those of other models.
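
    For context, a hedged sketch of nonlinear Muskingum routing with storage S = K(xI + (1 - x)O)^m follows. The parameters are held fixed here, whereas the paper's NVPNLMM lets them vary during the flood event; the explicit time-stepping and all values are simplifying assumptions.

        import numpy as np

        def route(inflow, K=0.8, x=0.3, m=1.5, dt=1.0):
            outflow = [float(inflow[0])]   # assume an initial steady state
            S = K * (x * inflow[0] + (1 - x) * outflow[0]) ** m
            for t in range(len(inflow) - 1):
                # continuity: storage change = inflow minus outflow over the step
                S = max(S + dt * (inflow[t] - outflow[-1]), 0.0)
                # invert the storage relation to obtain the new outflow
                O = ((S / K) ** (1.0 / m) - x * inflow[t + 1]) / (1.0 - x)
                outflow.append(max(O, 0.0))
            return np.array(outflow)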

  17. An improved estimator for the hydration of fat-free mass from in vivo measurements subject to additive technical errors

    International Nuclear Information System (INIS)

    Kinnamon, Daniel D; Ludwig, David A; Lipshultz, Steven E; Miller, Tracie L; Lipsitz, Stuart R

    2010-01-01

    The hydration of fat-free mass, or hydration fraction (HF), is often defined as a constant body composition parameter in a two-compartment model and then estimated from in vivo measurements. We showed that the widely used estimator for the HF parameter in this model, the mean of the ratios of measured total body water (TBW) to fat-free mass (FFM) in individual subjects, can be inaccurate in the presence of additive technical errors. We then proposed a new instrumental variables estimator that accurately estimates the HF parameter in the presence of such errors. In Monte Carlo simulations, the mean of the ratios of TBW to FFM was an inaccurate estimator of the HF parameter, and inferences based on it had actual type I error rates more than 13 times the nominal 0.05 level under certain conditions. The instrumental variables estimator was accurate and maintained an actual type I error rate close to the nominal level in all simulations. When estimating and performing inference on the HF parameter, the proposed instrumental variables estimator should yield accurate estimates and correct inferences in the presence of additive technical errors, but the mean of the ratios of TBW to FFM in individual subjects may not.

  18. Estimating the effects of wages on obesity.

    Science.gov (United States)

    Kim, DaeHwan; Leigh, John Paul

    2010-05-01

    The objective was to estimate the effects of wages on obesity and body mass. Data on household heads, aged 20 to 65 years, with full-time jobs, were drawn from the Panel Study of Income Dynamics for 2003 to 2007. The Panel Study of Income Dynamics is a nationally representative sample. Instrumental variables (IV) for wages were created using knowledge of computer software and state legal minimum wages. Least squares (linear regression) with corrected standard errors was used to estimate the equations. Statistical tests revealed that both instruments were strong, and tests for over-identifying restrictions were favorable. Wages were found to be predictive: low wages increase obesity prevalence and body mass.

  19. Social interactions and college enrollment: A combined school fixed effects/instrumental variables approach.

    Science.gov (United States)

    Fletcher, Jason M

    2015-07-01

    This paper provides some of the first evidence of peer effects in college enrollment decisions. There are several empirical challenges in assessing the influences of peers in this context, including the endogeneity of high school, shared group-level unobservables, and identifying policy-relevant parameters of social interactions models. This paper addresses these issues by using an instrumental variables/fixed effects approach that compares students in the same school but different grade-levels who are thus exposed to different sets of classmates. In particular, plausibly exogenous variation in peers' parents' college expectations are used as an instrument for peers' college choices. Preferred specifications indicate that increasing a student's exposure to college-going peers by ten percentage points is predicted to raise the student's probability of enrolling in college by 4 percentage points. This effect is roughly half the magnitude of growing up in a household with married parents (vs. an unmarried household).

  20. Estimation and variable selection for generalized additive partial linear models

    KAUST Repository

    Wang, Li

    2011-08-01

    We study generalized additive partial linear models, proposing the use of polynomial spline smoothing for estimation of nonparametric functions, and deriving quasi-likelihood based estimators for the linear parameters. We establish asymptotic normality for the estimators of the parametric components. The procedure avoids solving large systems of equations as in kernel-based procedures and thus results in gains in computational simplicity. We further develop a class of variable selection procedures for the linear parameters by employing a nonconcave penalized quasi-likelihood, which is shown to have an asymptotic oracle property. Monte Carlo simulations and an empirical example are presented for illustration.

  1. Incorporating Latent Variables into Discrete Choice Models - A Simultaneous Estimation Approach Using SEM Software

    Directory of Open Access Journals (Sweden)

    Dirk Temme

    2008-12-01

    Full Text Available Integrated choice and latent variable (ICLV) models represent a promising new class of models which merge classic choice models with the structural equation approach (SEM) for latent variables. Despite their conceptual appeal, applications of ICLV models in marketing remain rare. We extend previous ICLV applications by first estimating a multinomial choice model and, second, by estimating hierarchical relations between latent variables. An empirical study on travel mode choice clearly demonstrates the value of ICLV models in enhancing the understanding of choice processes. In addition to the usually studied directly observable variables such as travel time, we show how abstract motivations such as power and hedonism, as well as attitudes such as a desire for flexibility, impact travel mode choice. Furthermore, we show that it is possible to estimate such a complex ICLV model with the widely available structural equation modeling package Mplus. This finding is likely to encourage more widespread application of this appealing model class in the marketing field.

  2. Sleep Quality Estimation based on Chaos Analysis for Heart Rate Variability

    Science.gov (United States)

    Fukuda, Toshio; Wakuda, Yuki; Hasegawa, Yasuhisa; Arai, Fumihito; Kawaguchi, Mitsuo; Noda, Akiko

    In this paper, we propose an algorithm to estimate sleep quality based on heart rate variability using chaos analysis. Polysomnography (PSG) is a conventional and reliable system to diagnose sleep disorders and to evaluate their severity and therapeutic effect by estimating sleep quality from multiple channels. However, the recording process requires a lot of time and a controlled measurement environment, and analysing PSG data is laborious because the huge amount of sensed data must be evaluated manually. Meanwhile, people increasingly make mistakes or cause accidents owing to loss of regular sleep and homeostasis. A simple home system for checking one's own sleep is therefore required, and an estimation algorithm for such a system should be developed. We therefore propose an algorithm that estimates sleep quality based only on heart rate variability, which can be measured in an uncontrolled environment by simple sensors such as a pressure sensor or an infrared sensor, by experimentally finding the relationship between chaos indices and sleep quality. A system including the estimation algorithm can inform a user of the patterns and quality of daily sleep; the user can then arrange his or her life schedule in advance, pay more attention based on the sleep results, and consult a doctor.

  3. Feedback control of acoustic musical instruments: collocated control using physical analogs.

    Science.gov (United States)

    Berdahl, Edgar; Smith, Julius O; Niemeyer, Günter

    2012-01-01

    Traditionally, the average professional musician has owned numerous acoustic musical instruments, many of them having distinctive acoustic qualities. However, a modern musician could prefer to have a single musical instrument whose acoustics are programmable by feedback control, where acoustic variables are estimated from sensor measurements in real time and then fed back in order to influence the controlled variables. In this paper, theory is presented that describes stable feedback control of an acoustic musical instrument. The presentation should be accessible to members of the musical acoustics community who may have limited or no experience with feedback control. First, the only control strategy guaranteed to be stable subject to any musical instrument mobility is described: the sensors and actuators must be collocated, and the controller must emulate a physical analog system. Next, the most fundamental feedback controllers and the corresponding physical analog systems are presented. The effects that these controllers have on acoustic musical instruments are described. Finally, practical design challenges are discussed. A proof explains why changing the resonance frequency of a musical resonance requires much more control power than changing the decay time of the resonance.
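
    The decay-time claim can be illustrated with a collocated velocity-feedback loop around a single mass-spring-damper mode: feeding back velocity acts like a virtual damper and shortens the decay with a small gain, whereas shifting the resonance frequency would require stiffness-like feedback of far greater effort. All values below are illustrative assumptions, and only the damper case is simulated.

        import numpy as np

        m, c, k = 0.01, 0.05, 4000.0   # modal mass, damping, stiffness
        g_v = 0.10                     # velocity feedback gain (virtual damper)

        def simulate(gain, dt=1e-5, T=0.5):
            x, v = 1e-3, 0.0           # initial displacement, at rest
            out = []
            for _ in range(int(T / dt)):
                f = -gain * v                      # collocated velocity feedback
                a = (f - c * v - k * x) / m
                v += a * dt                        # semi-implicit Euler step
                x += v * dt
                out.append(x)
            return np.array(out)

        open_loop, closed_loop = simulate(0.0), simulate(g_v)
        # closed_loop decays roughly three times faster; the frequency is barely moved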

  4. Stochastic Optimal Estimation with Fuzzy Random Variables and Fuzzy Kalman Filtering

    Institute of Scientific and Technical Information of China (English)

    FENG Yu-hu

    2005-01-01

    By constructing a mean-square performance index in the case of fuzzy random variables, the optimal estimation theorem for an unknown fuzzy state using fuzzy observation data is given. The state and output of a linear discrete-time dynamic fuzzy system with Gaussian noise are Gaussian fuzzy random variable sequences. An approach to fuzzy Kalman filtering is discussed. Fuzzy Kalman filtering contains two parts: a real-valued non-random recurrence equation and standard Kalman filtering.

  5. Estimating Marginal Returns to Education. NBER Working Paper No. 16474

    Science.gov (United States)

    Carneiro, Pedro; Heckman, James J.; Vytlacil, Edward J.

    2010-01-01

    This paper estimates the marginal returns to college for individuals induced to enroll in college by different marginal policy changes. The recent instrumental variables literature seeks to estimate this parameter, but in general it does so only under strong assumptions that are tested and found wanting. We show how to utilize economic theory and…

  6. Estimation of pharmacokinetic parameters from non-compartmental variables using Microsoft Excel.

    Science.gov (United States)

    Dansirikul, Chantaratsamon; Choi, Malcolm; Duffull, Stephen B

    2005-06-01

    This study was conducted to develop a method, termed 'back analysis (BA)', for converting non-compartmental variables to compartment-model-dependent pharmacokinetic parameters for both one- and two-compartment models. A Microsoft Excel spreadsheet was implemented with the use of Solver and Visual Basic functions. The performance of the BA method in estimating pharmacokinetic parameter values was evaluated by comparing the parameter values obtained to those from a standard modelling software program, NONMEM, using simulated data. The results show that the BA method was reasonably precise and provided low bias in estimating fixed and random effect parameters for both one- and two-compartment models. The pharmacokinetic parameters estimated by the BA method were similar to those of the NONMEM estimation.
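
    A hedged sketch of the "back analysis" direction for the one-compartment case: clearance, elimination rate, and volume follow algebraically from the non-compartmental AUC and terminal half-life. The paper's Excel/Solver tool also covers two-compartment models and random effects, which this omits; the numbers are illustrative.

        import math

        def one_compartment_from_nca(dose, auc, t_half):
            cl = dose / auc              # clearance from AUC
            ke = math.log(2) / t_half    # elimination rate from half-life
            v = cl / ke                  # volume of distribution
            return {"CL": cl, "ke": ke, "V": v}

        print(one_compartment_from_nca(dose=100.0, auc=25.0, t_half=4.0))
        # CL = 4.0, ke ~ 0.173, V ~ 23.1 (units consistent with the inputs)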

  7. A simulation study on estimating biomarker-treatment interaction effects in randomized trials with prognostic variables.

    Science.gov (United States)

    Haller, Bernhard; Ulm, Kurt

    2018-02-20

    To individualize treatment decisions based on patient characteristics, identification of an interaction between a biomarker and treatment is necessary. Often such potential interactions are analysed using data from randomized clinical trials intended for comparison of two treatments. Tests of interactions often lack statistical power, and we investigated whether and how the consideration of further prognostic variables can improve power and decrease the bias of estimated biomarker-treatment interactions in randomized clinical trials with time-to-event outcomes. A simulation study was performed to assess how prognostic factors affect the estimate of the biomarker-treatment interaction for a time-to-event outcome when different approaches, such as ignoring other prognostic factors, including all available covariates or using variable selection strategies, are applied. Different scenarios regarding the proportion of censored observations, the correlation structure between the covariate of interest and further potential prognostic variables, and the strength of the interaction were considered. The simulation study revealed that, in a regression model for estimating a biomarker-treatment interaction, the probability of detecting the interaction can be increased by including prognostic variables that are associated with the outcome, and that the interaction estimate is biased when relevant prognostic variables are not considered. However, the probability of a false-positive finding increases if too many potential predictors are included or if variable selection is performed inadequately. We recommend undertaking an adequate literature search before data analysis to identify potential prognostic variables, to gain power for detecting true interaction effects, and pre-specifying analyses to avoid selective reporting and increased false-positive rates.

  8. Validity of Two New Brief Instruments to Estimate Vegetable Intake in Adults

    Directory of Open Access Journals (Sweden)

    Janine Wright

    2015-08-01

    Full Text Available Cost-effective population-based monitoring tools are needed for nutritional surveillance and interventions. The aim was to evaluate the relative validity of two new brief instruments (three-item: VEG3 and five-item: VEG5) for estimating usual total vegetable intake in comparison to a 7-day dietary record (7DDR). Sixty-four Australian adult volunteers aged 30 to 69 years participated (30 males, mean age ± SD 56.3 ± 9.2 years, and 34 females, mean age ± SD 55.3 ± 10.0 years). Pearson correlations between the 7DDR and VEG3 and VEG5 were modest, at 0.50 and 0.56, respectively. VEG3 significantly (p < 0.001) underestimated mean vegetable intake compared to 7DDR measures (2.9 ± 1.3 vs. 3.6 ± 1.6 serves/day, respectively), whereas mean vegetable intake assessed by VEG5 did not differ from 7DDR measures (3.3 ± 1.5 vs. 3.6 ± 1.6 serves/day). VEG5 was also able to correctly identify 95%, 88% and 75% of those subjects not consuming five, four and three serves/day of vegetables, respectively, according to their 7DDR classification. VEG5, but not VEG3, can estimate the usual total vegetable intake of population groups and had superior performance to VEG3 in identifying those not meeting different levels of vegetable intake. VEG5, a brief instrument, shows measurement characteristics useful for population-based monitoring and intervention targeting.

  9. Is it feasible to estimate radiosonde biases from interlaced measurements?

    Science.gov (United States)

    Kremser, Stefanie; Tradowsky, Jordis S.; Rust, Henning W.; Bodeker, Greg E.

    2018-05-01

    Upper-air measurements of essential climate variables (ECVs), such as temperature, are crucial for climate monitoring and climate change detection. Because of the internal variability of the climate system, many decades of measurements are typically required to robustly detect any trend in the climate data record. It is imperative for the records to be temporally homogeneous over many decades to confidently estimate any trend. Historically, records of upper-air measurements were primarily made for short-term weather forecasts and as such are seldom suitable for studying long-term climate change as they lack the required continuity and homogeneity. Recognizing this, the Global Climate Observing System (GCOS) Reference Upper-Air Network (GRUAN) has been established to provide reference-quality measurements of climate variables, such as temperature, pressure, and humidity, together with well-characterized and traceable estimates of the measurement uncertainty. To ensure that GRUAN data products are suitable to detect climate change, a scientifically robust instrument replacement strategy must always be adopted whenever there is a change in instrumentation. By fully characterizing any systematic differences between the old and new measurement system a temporally homogeneous data series can be created. One strategy is to operate both the old and new instruments in tandem for some overlap period to characterize any inter-instrument biases. However, this strategy can be prohibitively expensive at measurement sites operated by national weather services or research institutes. An alternative strategy that has been proposed is to alternate between the old and new instruments, so-called interlacing, and then statistically derive the systematic biases between the two instruments. Here we investigate the feasibility of such an approach specifically for radiosondes, i.e. flying the old and new instruments on alternating days. Synthetic data sets are used to explore the

  10. Generalized Spatial Two Stage Least Squares Estimation of Spatial Autoregressive Models with Autoregressive Disturbances in the Presence of Endogenous Regressors and Many Instruments

    Directory of Open Access Journals (Sweden)

    Fei Jin

    2013-05-01

    Full Text Available This paper studies the generalized spatial two-stage least squares (GS2SLS) estimation of spatial autoregressive models with autoregressive disturbances when there are endogenous regressors with many valid instruments. Using many instruments may improve the efficiency of estimators asymptotically, but the bias might be large in finite samples, making inference inaccurate. We consider the case where the number of instruments K increases with, but at a rate slower than, the sample size, and derive the approximate mean squared errors (MSE) that account for the trade-off between bias and variance, for both the GS2SLS estimator and a bias-corrected GS2SLS estimator. A criterion function for the optimal K selection can be based on the approximate MSEs. Monte Carlo experiments are provided to show the performance of our procedure for choosing K.

  11. Estimating dew formation in rice, using seasonally averaged diel patterns of weather variables

    NARCIS (Netherlands)

    Luo, W.; Goudriaan, J.

    2004-01-01

    If dew formation cannot be measured it has to be estimated. Available simulation models for estimating dew formation require hourly weather data as input. However, such data are not available for places without an automatic weather station. In such cases the diel pattern of weather variables might

  12. Surface Estimation, Variable Selection, and the Nonparametric Oracle Property.

    Science.gov (United States)

    Storlie, Curtis B; Bondell, Howard D; Reich, Brian J; Zhang, Hao Helen

    2011-04-01

    Variable selection for multivariate nonparametric regression is an important, yet challenging, problem due, in part, to the infinite dimensionality of the function space. An ideal selection procedure should be automatic, stable, easy to use, and have desirable asymptotic properties. In particular, we define a selection procedure to be nonparametric oracle (np-oracle) if it consistently selects the correct subset of predictors and at the same time estimates the smooth surface at the optimal nonparametric rate, as the sample size goes to infinity. In this paper, we propose a model selection procedure for nonparametric models, and explore the conditions under which the new method enjoys the aforementioned properties. Developed in the framework of smoothing spline ANOVA, our estimator is obtained via solving a regularization problem with a novel adaptive penalty on the sum of functional component norms. Theoretical properties of the new estimator are established. Additionally, numerous simulated and real examples further demonstrate that the new approach substantially outperforms other existing methods in the finite sample setting.

  13. The effects of competition on premiums: using United Healthcare's 2015 entry into Affordable Care Act's marketplaces as an instrumental variable.

    Science.gov (United States)

    Agirdas, Cagdas; Krebs, Robert J; Yano, Masato

    2018-01-08

    One goal of the Affordable Care Act is to increase insurance coverage by improving competition and lowering premiums. To facilitate this goal, the federal government enacted online marketplaces in the 395 rating areas spanning 34 states that chose not to establish their own state-run marketplaces. The few multivariate regression studies analyzing the effects of competition on premiums suffer from endogeneity, due to simultaneity and omitted-variable biases. However, United Healthcare's decision to enter these marketplaces in 2015 provides the researcher with an opportunity to address this endogeneity problem. Exploiting the variation caused by United Healthcare's entry decision as an instrument for competition, we study the impact of competition on premiums during the first 2 years of these marketplaces. Combining panel data from five different sources and controlling for 12 variables, we find that one more insurer in a rating area leads to a 6.97% reduction in the second-lowest-priced silver plan premium, which is larger than the estimated effects in the existing literature. Furthermore, we run a threshold analysis and find that competition's effects on premiums become statistically insignificant if there are four or more insurers in a rating area. These findings are robust to alternative measures of premiums, inclusion of a non-linear term in the regression models and a county-level analysis.

  14. The long view: Causes of climate change over the instrumental period

    Science.gov (United States)

    Hegerl, G. C.; Schurer, A. P.; Polson, D.; Iles, C. E.; Bronnimann, S.

    2016-12-01

    The period of instrumentally recorded data has seen remarkable changes in climate, with periods of rapid warming and periods of stagnation or cooling. A recent analysis of the observed temperature change from the instrumental record confirms that most of the warming recorded since the middle of the 20th century has been caused by human influences, but shows large uncertainty in separating the greenhouse gas from the aerosol response when accounting for model uncertainty. The contribution of natural forcing and internal variability to the recent warming is estimated to be small, but becomes more important when analysing climate change over earlier or shorter time periods. For example, the enigmatic early 20th century warming was a period of strong climate anomalies, including the US dustbowl drought and exceptional heat waves, and pronounced Arctic warming. Attribution results suggest that about half of the global warming over 1901-1950 was forced by greenhouse gas increases, with an anomalously strong contribution from climate variability and contributions from natural forcing. Long-term variations in circulation are important for some regional climate anomalies. Precipitation is important for the impacts of climate change, and precipitation changes are uncertain in models. Analysis of the instrumental record suggests a human influence on mean and heavy precipitation, and supports climate model estimates of the spatial pattern of precipitation sensitivity to warming. Broadly, and particularly over the ocean, wet regions are getting wetter and dry regions are getting drier. In conclusion, the historical record provides evidence for a strong response to external forcings, supports climate models, and raises questions about multi-decadal variability.

  15. Quantifying measurement uncertainty and spatial variability in the context of model evaluation

    Science.gov (United States)

    Choukulkar, A.; Brewer, A.; Pichugina, Y. L.; Bonin, T.; Banta, R. M.; Sandberg, S.; Weickmann, A. M.; Djalalova, I.; McCaffrey, K.; Bianco, L.; Wilczak, J. M.; Newman, J. F.; Draxl, C.; Lundquist, J. K.; Wharton, S.; Olson, J.; Kenyon, J.; Marquis, M.

    2017-12-01

    In an effort to improve wind forecasts for the wind energy sector, the Department of Energy and NOAA funded the second Wind Forecast Improvement Project (WFIP2). As part of the WFIP2 field campaign, a large suite of in-situ and remote sensing instrumentation was deployed to the Columbia River Gorge in Oregon and Washington from October 2015 to March 2017. The array of instrumentation deployed included 915-MHz wind profiling radars, sodars, wind-profiling lidars, and scanning lidars. The role of these instruments was to provide wind measurements at high spatial and temporal resolution for model evaluation and improvement of model physics. To properly determine model errors, the uncertainties in instrument-model comparisons need to be quantified accurately. These uncertainties arise from several factors, such as measurement uncertainty, spatial variability, and interpolation of model output to instrument locations, to name a few. In this presentation, we will introduce a formalism to quantify measurement uncertainty and spatial variability. The accuracy of this formalism will be tested using existing datasets such as the eXperimental Planetary boundary layer Instrumentation Assessment (XPIA) campaign. Finally, the uncertainties in wind measurement and the spatial variability estimates from the WFIP2 field campaign will be discussed to understand the challenges involved in model evaluation.

  16. Impact of energy policy instruments on the estimated level of underlying energy efficiency in the EU residential sector

    International Nuclear Information System (INIS)

    Filippini, Massimo; Hunt, Lester C.; Zorić, Jelena

    2014-01-01

    The promotion of energy efficiency is seen as one of the top priorities of EU energy policy (EC, 2010). In order to design and implement effective energy policy instruments, it is necessary to have information on energy demand price and income elasticities in addition to sound indicators of energy efficiency. This research combines the approaches taken in energy demand modelling and frontier analysis in order to econometrically estimate the level of energy efficiency for the residential sector in the EU-27 member states for the period 1996 to 2009. The estimates of energy efficiency confirm that the EU residential sector indeed holds a relatively high potential for energy savings from reduced inefficiency. Despite the common objective of decreasing 'wasteful' energy consumption, considerable variation in energy efficiency between the EU member states is established. Furthermore, an attempt is made to evaluate the impact of energy-efficiency measures undertaken in the EU residential sector by introducing an additional set of variables into the model, and the results suggest that financial incentives and energy performance standards play an important role in promoting energy efficiency improvements, whereas informative measures do not have a significant impact. - Highlights: • The level of energy efficiency of the EU residential sector is estimated. • Considerable potential for energy savings from reduced inefficiency is established. • The impact of introduced energy-efficiency policy measures is also evaluated. • Financial incentives are found to promote energy efficiency improvements. • Energy performance standards also play an important role.

  17. 26 CFR 1.1275-5 - Variable rate debt instruments.

    Science.gov (United States)

    2010-04-01

    ... nonpublicly traded property. A debt instrument (other than a tax-exempt obligation) that would otherwise... variations in the cost of newly borrowed funds in the currency in which the debt instrument is denominated... on the yield of actively traded personal property (within the meaning of section 1092(d)(1)). (ii...

  18. Multiengine Speech Processing Using SNR Estimator in Variable Noisy Environments

    Directory of Open Access Journals (Sweden)

    Ahmad R. Abu-El-Quran

    2012-01-01

    Full Text Available We introduce a multiengine speech processing system that can detect the location and the type of audio signal in variable noisy environments. This system detects the location of the audio source using a microphone array; it examines the audio first, determines if it is speech/nonspeech, then estimates the signal-to-noise ratio (SNR) using a Discrete-Valued SNR Estimator. Using this SNR value, instead of trying to adapt the speech signal to the speech processing system, we adapt the speech processing system to the surrounding environment of the captured speech signal. In this paper, we introduce the Discrete-Valued SNR Estimator and a multiengine classifier, using Multiengine Selection or Multiengine Weighted Fusion, and use speaker identification (SI) as the example speech processing task. The Discrete-Valued SNR Estimator achieves an accuracy of 98.4% in characterizing the environment's SNR. Compared to a conventional single-engine SI system, the improvement in accuracy was as high as 9.0% and 10.0% for Multiengine Selection and Multiengine Weighted Fusion, respectively.
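
    A toy Python sketch of the selection logic follows (a minimal illustration under stated assumptions: the discrete SNR grid, access to a noise-only reference, and the per-level engines are all placeholders, not the paper's estimator or SI engines):

```python
import numpy as np

# Hypothetical discrete SNR levels (dB) at which separate engines were trained.
SNR_LEVELS = np.array([0, 5, 10, 15, 20])

def estimate_discrete_snr(noisy, noise):
    """Quantize an empirical SNR estimate to the nearest trained level."""
    # SNR = (noisy power - noise power) / noise power, expressed in dB.
    snr_db = 10 * np.log10(np.mean(noisy**2) / np.mean(noise**2) - 1)
    return SNR_LEVELS[np.argmin(np.abs(SNR_LEVELS - snr_db))]

def select_engine(noisy, noise, engines):
    """Multiengine Selection: route the signal to the engine matching the SNR."""
    return engines[estimate_discrete_snr(noisy, noise)](noisy)

# Toy usage: each "engine" stands in for an SI classifier trained at one level.
rng = np.random.default_rng(0)
clean = np.sin(2 * np.pi * 440 * np.arange(8000) / 8000)
noise = 0.1 * rng.standard_normal(8000)
engines = {lvl: (lambda s, l=lvl: f"engine@{l}dB") for lvl in SNR_LEVELS}
print(select_engine(clean + noise, noise, engines))
```

    Multiengine Weighted Fusion would instead combine the outputs of several engines with weights derived from the same SNR estimate.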

  19. Robust total energy demand estimation with a hybrid Variable Neighborhood Search – Extreme Learning Machine algorithm

    International Nuclear Information System (INIS)

    Sánchez-Oro, J.; Duarte, A.; Salcedo-Sanz, S.

    2016-01-01

    Highlights: • The total energy demand in Spain is estimated with a Variable Neighborhood Search algorithm. • Socio-economic variables are used, and a one-year-ahead prediction horizon is considered. • Improvement of the prediction with an Extreme Learning Machine network is considered. • Experiments are carried out on real data for the case of Spain. - Abstract: Energy demand prediction is an important problem whose solution is evaluated by policy makers in order to take key decisions affecting the economy of a country. A number of previous approaches to improve the quality of this estimation have been proposed in the last decade, the majority of them applying different machine learning techniques. In this paper, the performance of a robust hybrid approach, composed of a Variable Neighborhood Search algorithm and a new class of neural network called the Extreme Learning Machine, is discussed. The Variable Neighborhood Search algorithm is focused on obtaining the most relevant features among the set of initial ones, by including an exponential prediction model. While previous approaches consider the number of macroeconomic variables used for prediction to be a parameter of the algorithm (i.e., fixed a priori), the proposed Variable Neighborhood Search method optimizes both the number of variables and which variables are selected. After this first step of feature selection, an Extreme Learning Machine network is applied to obtain the final energy demand prediction. Experiments on a real case of energy demand estimation in Spain show the excellent performance of the proposed approach. In particular, the whole method obtains an estimation of the energy demand with an error lower than 2%, even when considering the crisis years, which are a real challenge.
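
    For intuition, an Extreme Learning Machine reduces training to a single regularized linear solve once random hidden weights are fixed. Below is a minimal, self-contained Python sketch (the feature count, sizes and regularization are illustrative assumptions; the paper's Variable Neighborhood Search feature-selection step is not shown):

```python
import numpy as np

def elm_fit(X, y, n_hidden=50, reg=1e-3, seed=0):
    """Extreme Learning Machine: random hidden layer, least-squares readout."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))   # fixed random input weights
    b = rng.standard_normal(n_hidden)                 # fixed random biases
    H = np.tanh(X @ W + b)                            # hidden-layer activations
    # Ridge-regularized least squares for the output weights (the only training step).
    beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ y)
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy usage: macroeconomic-style features (assumed, unnamed) predicting demand.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 6))
y = X[:, 0] - 0.5 * X[:, 2] + 0.1 * rng.standard_normal(200)
W, b, beta = elm_fit(X[:150], y[:150])
rmse = np.sqrt(np.mean((elm_predict(X[150:], W, b, beta) - y[150:])**2))
print(f"hold-out RMSE: {rmse:.3f}")
```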

  20. A new virtual instrument for estimating punch velocity in combat sports.

    Science.gov (United States)

    Urbinati, K S; Scheeren, E; Nohama, P

    2013-01-01

    To improve performance in combat sports, especially percussion styles, it is necessary to achieve high velocity in punches and kicks. The aim of this study was to evaluate the applicability of 3D accelerometry in a Virtual Instrumentation System (VIS) designed for estimating punch velocity in combat sports. It was conducted in two phases: (1) integration of the 3D accelerometer with the communication interface and software for processing and visualization, and (2) assessment of the applicability of the system. Fifteen karate athletes performed five gyaku zuki type punches (with reverse leg) using the accelerometer on the 3rd metacarpal on the back of the hand. A nonparametric Mann-Whitney U-test was performed to determine differences in mean linear velocity among three punches performed sequentially.
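
    At its core, the velocity estimate comes from numerically integrating the calibrated 3D acceleration over the punch; a hedged Python sketch on a synthetic pulse (gravity compensation and drift correction are assumed to be handled upstream, and all numbers are illustrative):

```python
import numpy as np

def punch_velocity(acc, fs):
    """Estimate peak hand speed by integrating 3-axis acceleration.

    acc : (N, 3) array of accelerations in m/s^2 (gravity already removed),
    fs  : sampling rate in Hz. Returns the peak speed over the punch, in m/s.
    """
    dt = 1.0 / fs
    vel = np.cumsum(acc, axis=0) * dt        # rectangular-rule integration per axis
    speed = np.linalg.norm(vel, axis=1)      # magnitude of the 3D velocity vector
    return speed.max()

# Toy usage: a synthetic 0.2 s forward acceleration pulse sampled at 1 kHz.
fs = 1000
t = np.arange(0, 0.2, 1 / fs)
acc = np.zeros((t.size, 3))
acc[:, 0] = 80 * np.sin(np.pi * t / 0.2)     # smooth push, peak 80 m/s^2
print(f"peak punch speed ~ {punch_velocity(acc, fs):.2f} m/s")
```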

  1. Estimation of genetic variability level in inbred CF1 mouse lines ...

    Indian Academy of Sciences (India)

    To estimate the genetic variability levels maintained by inbred lines selected for body weight and to compare them with a nonselected population from which the lines were derived, we calculated the per cent polymorphic loci (P) and marker diversity (MD) index from data on 43 putative loci of inter simple sequence repeats ...

  2. Estimation of road profile variability from measured vehicle responses

    Science.gov (United States)

    Fauriat, W.; Mattrand, C.; Gayton, N.; Beakou, A.; Cembrzynski, T.

    2016-05-01

    When assessing the statistical variability of fatigue loads acting throughout the life of a vehicle, the question of the variability of road roughness naturally arises, as both quantities are strongly related. For car manufacturers, gathering information on the environment in which vehicles evolve is a long and costly but necessary process to adapt their products to durability requirements. In the present paper, a data processing algorithm is proposed in order to estimate the road profiles covered by a given vehicle, from the dynamic responses measured on this vehicle. The algorithm, based on Kalman filtering theory, aims to solve a so-called inverse problem in a stochastic framework. It is validated using both simulated data and real measurements. The proposed method is subsequently applied to extract valuable statistical information on road roughness from an existing load characterisation campaign carried out by Renault within one of its markets.
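
    As a rough illustration of the filtering machinery involved (not the authors' vehicle model: the scalar random-walk system, noise levels and matrices below are placeholder assumptions), a linear Kalman filter recursion in Python looks like this; in the road-profile application, the unknown profile is appended to the state vector so the filter reconstructs it from measured responses:

```python
import numpy as np

def kalman_filter(y, A, C, Q, R, x0, P0):
    """Standard linear Kalman filter; returns filtered state estimates."""
    x, P = x0.copy(), P0.copy()
    xs = []
    for yk in y:
        # Predict step: propagate state and covariance through the model.
        x = A @ x
        P = A @ P @ A.T + Q
        # Update step: correct with the measurement.
        S = C @ P @ C.T + R
        K = P @ C.T @ np.linalg.inv(S)
        x = x + K @ (yk - C @ x)
        P = (np.eye(len(x)) - K @ C) @ P
        xs.append(x.copy())
    return np.array(xs)

# Toy usage: track a random-walk "profile" from noisy observations.
rng = np.random.default_rng(2)
truth = np.cumsum(0.1 * rng.standard_normal(100))
y = (truth + 0.3 * rng.standard_normal(100)).reshape(-1, 1)
xs = kalman_filter(y, A=np.eye(1), C=np.eye(1),
                   Q=0.01 * np.eye(1), R=0.09 * np.eye(1),
                   x0=np.zeros(1), P0=np.eye(1))
print(f"final estimate {xs[-1, 0]:.2f} vs truth {truth[-1]:.2f}")
```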

  3. Application of a user-friendly comprehensive circulatory model for estimation of hemodynamic and ventricular variables

    NARCIS (Netherlands)

    Ferrari, G.; Kozarski, M.; Gu, Y. J.; De Lazzari, C.; Di Molfetta, A.; Palko, K. J.; Zielinski, K.; Gorczynska, K.; Darowski, M.; Rakhorst, G.

    2008-01-01

    Purpose: Application of a comprehensive, user-friendly, digital computer circulatory model to estimate hemodynamic and ventricular variables. Methods: The closed-loop lumped parameter circulatory model represents the circulation at the level of large vessels. A variable elastance model reproduces

  4. Estimating the costs of consumer-facing cybercrime: A tailored instrument and representative data for six EU countries.

    NARCIS (Netherlands)

    Riek, Markus; Boehme, Rainer; Ciere, M.; Hernandez Ganan, C.; van Eeten, M.J.G.

    2016-01-01

    While cybercrime has existed for many years and is still reported to be a growing problem, reliable estimates of the economic impacts are rare. We develop a survey instrument tailored to measure the costs of consumer-facing cybercrime systematically, by aggregating different cost factors into direct

  5. Center of gravity estimation using a reaction board instrumented with fiber Bragg gratings

    Science.gov (United States)

    Oliveira, Rui; Roriz, Paulo; Marques, Manuel B.; Frazão, Orlando

    2018-03-01

    The purpose of the present work is to construct a reaction board based on fiber Bragg gratings (FBGs) that can be used to estimate the 2D coordinates of the projection of the center of gravity (CG) of an object. The apparatus consists of a rigid equilateral triangular board mounted on three supports at the vertices, two of which have cantilevers instrumented with FBGs. When an object of known weight is placed on the board, the bending strain of the cantilevers is measured by a proportional wavelength shift of the FBGs. Applying the equilibrium conditions of a rigid body and proper calibration procedures, the wavelength shift is used to estimate the vertical reaction forces and moments of force at the supports and the coordinates of the object's CG projection on the board. This method can be used on a regular basis to estimate the CG of the human body or of objects with complex geometry and density distribution. An example is provided for the estimation of the CG projection coordinates of two orthopaedic femur bone models, one intact and the other with a hip stem implant encased. The clinical implications of changing the normal CG location by means of a prosthesis are also discussed.
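
    The underlying statics reduce to a small linear computation: the added reactions sum to the object's weight, and their moments locate the CG. A hedged Python sketch (the geometry and loads are toy values, and the calibration from FBG wavelength shift to reaction force is assumed already done):

```python
import numpy as np

def cg_from_reactions(support_xy, delta_R, W):
    """2D CG projection from changes in vertical support reactions.

    support_xy : (3, 2) coordinates of the three supports (m),
    delta_R    : (3,) reaction-force increases after placing the object (N),
    W          : object weight (N). Moment equilibrium about each axis gives
                 x_cg = sum(dR_i * x_i) / W, and likewise for y.
    """
    assert np.isclose(delta_R.sum(), W, rtol=1e-3), "reactions must sum to W"
    return (delta_R @ support_xy) / W

# Toy usage: equilateral board, 50 N object placed off-centre.
supports = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, np.sqrt(3) / 2]])
true_cg = np.array([0.6, 0.3])
# The reactions equal the barycentric coordinates of the CG times the weight.
lam = np.linalg.solve(
    np.vstack([supports.T, np.ones(3)]), np.append(true_cg, 1.0))
delta_R = 50.0 * lam
print(cg_from_reactions(supports, delta_R, 50.0))  # -> [0.6, 0.3]
```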

  6. Unit Root Testing in Heteroscedastic Panels Using the Cauchy Estimator

    NARCIS (Netherlands)

    Demetrescu, Matei; Hanck, Christoph

    The Cauchy estimator of an autoregressive root uses the sign of the first lag as an instrumental variable. The resulting IV t-type statistic follows a standard normal limiting distribution under the unit root null, even under unconditional heteroscedasticity, if the series to be tested has no
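
    A minimal Python sketch of the idea on toy data (illustrative only; the finite-sample refinements of the published test are not reproduced):

```python
import numpy as np

def cauchy_unit_root_t(y):
    """IV t-statistic for a unit root using sign(y_{t-1}) as instrument.

    In dy_t = phi * y_{t-1} + e_t, instrumenting y_{t-1} with its sign gives
    t = sum(z_t * dy_t) / (sigma_hat * sqrt(T)), which is standard normal
    under the unit root null phi = 0 (z_t^2 = 1, so sum(z_t^2) = T).
    """
    dy = np.diff(y)
    z = np.sign(y[:-1])
    phi_iv = (z @ dy) / (z @ y[:-1])          # IV slope estimate
    resid = dy - phi_iv * y[:-1]
    sigma = np.sqrt(np.mean(resid**2))
    t_stat = (z @ dy) / (sigma * np.sqrt(len(dy)))
    return phi_iv, t_stat

# Toy usage: a pure random walk should give t values consistent with N(0, 1).
rng = np.random.default_rng(3)
y = np.cumsum(rng.standard_normal(500))
print(cauchy_unit_root_t(y))
```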

  7. Latest NASA Instrument Cost Model (NICM): Version VI

    Science.gov (United States)

    Mrozinski, Joe; Habib-Agahi, Hamid; Fox, George; Ball, Gary

    2014-01-01

    The NASA Instrument Cost Model, NICM, is a suite of tools which allow for probabilistic cost estimation of NASA's space-flight instruments at both the system and subsystem level. NICM also includes the ability to perform cost by analogy as well as joint confidence level (JCL) analysis. The latest version of NICM, Version VI, was released in Spring 2014. This paper will focus on the new features released with NICM VI, which include: 1) The NICM-E cost estimating relationship, which is applicable for instruments flying on Explorer-like class missions; 2) The new cluster analysis ability which, alongside the results of the parametric cost estimation for the user's instrument, also provides a visualization of the user's instrument's similarity to previously flown instruments; and 3) includes new cost estimating relationships for in-situ instruments.

  8. “You Can’t Play a Sad Song on the Banjo:” Acoustic Factors in the Judgment of Instrument Capacity to Convey Sadness

    Directory of Open Access Journals (Sweden)

    David Huron

    2014-05-01

    Full Text Available Forty-four Western-enculturated musicians completed two studies. The first group was asked to judge the relative sadness of forty-four familiar Western instruments. An independent group was asked to assess a number of acoustical properties for those same instruments. Using the estimated acoustical properties as predictor variables in a multiple regression analysis, a significant correlation was found between those properties known to contribute to sad prosody in speech and the judged sadness of the instruments. The best predictor variable was the ability of the instrument to make small pitch movements. Other variables investigated included the darkness of the timbre, the ability to play low pitches, the ability to play quietly, and the capacity of the instrument to "mumble." Four of the acoustical factors were found to exhibit a considerable amount of shared variance, suggesting that they may originate in a common underlying factor. It is suggested that the shared proximal cause of these acoustical features may be low physical energy.

  9. Estimators for longitudinal latent exposure models: examining measurement model assumptions.

    Science.gov (United States)

    Sánchez, Brisa N; Kim, Sehee; Sammel, Mary D

    2017-06-15

    Latent variable (LV) models are increasingly being used in environmental epidemiology as a way to summarize multiple environmental exposures and thus minimize statistical concerns that arise in multiple regression. LV models may be especially useful when multivariate exposures are collected repeatedly over time. LV models can accommodate a variety of assumptions but, at the same time, present the user with many choices for model specification particularly in the case of exposure data collected repeatedly over time. For instance, the user could assume conditional independence of observed exposure biomarkers given the latent exposure and, in the case of longitudinal latent exposure variables, time invariance of the measurement model. Choosing which assumptions to relax is not always straightforward. We were motivated by a study of prenatal lead exposure and mental development, where assumptions of the measurement model for the time-changing longitudinal exposure have appreciable impact on (maximum-likelihood) inferences about the health effects of lead exposure. Although we were not particularly interested in characterizing the change of the LV itself, imposing a longitudinal LV structure on the repeated multivariate exposure measures could result in high efficiency gains for the exposure-disease association. We examine the biases of maximum likelihood estimators when assumptions about the measurement model for the longitudinal latent exposure variable are violated. We adapt existing instrumental variable estimators to the case of longitudinal exposures and propose them as an alternative to estimate the health effects of a time-changing latent predictor. We show that instrumental variable estimators remain unbiased for a wide range of data generating models and have advantages in terms of mean squared error. Copyright © 2017 John Wiley & Sons, Ltd.
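
    The instrumental variable machinery being adapted rests on the familiar two-stage least squares idea; a generic Python sketch on simulated data (not the authors' longitudinal latent-exposure estimator; all data-generating values are assumptions):

```python
import numpy as np

def two_stage_least_squares(y, x, Z):
    """2SLS for a scalar endogenous regressor x with instrument matrix Z."""
    Z1 = np.column_stack([np.ones_like(x), Z])          # first-stage design
    x_hat = Z1 @ np.linalg.lstsq(Z1, x, rcond=None)[0]  # fitted exposure
    X2 = np.column_stack([np.ones_like(x), x_hat])      # second-stage design
    return np.linalg.lstsq(X2, y, rcond=None)[0][1]     # causal slope

# Toy usage: confounded exposure, valid instrument; true effect = 0.5.
rng = np.random.default_rng(4)
n = 5000
u = rng.standard_normal(n)                 # unmeasured confounder
z = rng.standard_normal(n)                 # instrument
x = z + u + 0.5 * rng.standard_normal(n)   # exposure
y = 0.5 * x + u + rng.standard_normal(n)   # outcome
ols = np.polyfit(x, y, 1)[0]
iv = two_stage_least_squares(y, x, z.reshape(-1, 1))
print(f"OLS (biased): {ols:.2f}, 2SLS: {iv:.2f}")
```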

  10. On the shape of posterior densities and credible sets in instrumental variable regression models with reduced rank: an application of flexible sampling methods using neural networks

    NARCIS (Netherlands)

    Hoogerheide, L.F.; Kaashoek, J.F.; van Dijk, H.K.

    2007-01-01

    Likelihoods and posteriors of instrumental variable (IV) regression models with strong endogeneity and/or weak instruments may exhibit rather non-elliptical contours in the parameter space. This may seriously affect inference based on Bayesian credible sets. When approximating posterior

  11. Variability in dose estimates associated with the food-chain transport and ingestion of selected radionuclides

    International Nuclear Information System (INIS)

    Hoffman, F.O.; Gardner, R.H.; Eckerman, K.F.

    1982-06-01

    Dose predictions for the ingestion of 90Sr and 137Cs, using aquatic and terrestrial food chain transport models similar to those in the Nuclear Regulatory Commission's Regulatory Guide 1.109, are evaluated through estimating the variability of model parameters and determining the effect of this variability on model output. The variability in the predicted dose equivalent is determined using analytical and numerical procedures. In addition, a detailed discussion is included on 90Sr dosimetry. The overall estimates of uncertainty are most relevant to conditions where site-specific data are unavailable and when model structure and parameter estimates are unbiased. Based on the comparisons performed in this report, it is concluded that the use of the generic default parameters in Regulatory Guide 1.109 will usually produce conservative dose estimates that exceed the 90th percentile of the predicted distribution of dose equivalents. An exception is the meat pathway for 137Cs, in which use of generic default values results in a dose estimate at the 24th percentile. Among the terrestrial pathways of exposure, the non-leafy vegetable pathway is the most important for 90Sr. For 90Sr, the parameters for soil retention, soil-to-plant transfer, and internal dosimetry contribute most significantly to the variability in the predicted dose for the combined exposure to all terrestrial pathways. For 137Cs, the meat transfer coefficient, the mass interception factor for pasture forage, and the ingestion dose factor are the most important parameters. The freshwater finfish bioaccumulation factor is the most important parameter for the dose prediction of 90Sr and 137Cs transported over the water-fish-man pathway.
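
    The kind of variability propagation described can be illustrated with a generic Monte Carlo sketch in Python (the multiplicative pathway model, parameter names, distributions and values are illustrative assumptions, not the report's models or data):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 100_000
# Illustrative lognormal parameter distributions for one ingestion pathway.
transfer = rng.lognormal(mean=np.log(0.1), sigma=0.5, size=n)       # soil-to-plant
intake = rng.lognormal(mean=np.log(200), sigma=0.3, size=n)         # kg/y consumed
dose_factor = rng.lognormal(mean=np.log(3e-8), sigma=0.4, size=n)   # Sv/Bq
conc = 50.0                                                         # Bq/kg in soil

dose = conc * transfer * intake * dose_factor                       # Sv/y
print(f"median {np.median(dose):.2e} Sv, 90th pct {np.percentile(dose, 90):.2e} Sv")
# A fixed "generic default" prediction can then be located on this distribution
# to see which percentile it corresponds to, mirroring the report's comparison.
```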

  12. Bias correction by use of errors-in-variables regression models in studies with K-X-ray fluorescence bone lead measurements.

    Science.gov (United States)

    Lamadrid-Figueroa, Héctor; Téllez-Rojo, Martha M; Angeles, Gustavo; Hernández-Ávila, Mauricio; Hu, Howard

    2011-01-01

    In-vivo measurement of bone lead by means of K-X-ray fluorescence (KXRF) is the preferred biological marker of chronic exposure to lead. Unfortunately, considerable measurement error associated with KXRF estimations can introduce bias in estimates of the effect of bone lead when this variable is included as the exposure in a regression model. Estimates of uncertainty reported by the KXRF instrument reflect the variance of the measurement error and, although they can be used to correct the measurement error bias, they are seldom used in epidemiological statistical analyses. Errors-in-variables regression (EIV) allows for correction of bias caused by measurement error in predictor variables, based on the knowledge of the reliability of such variables. The authors propose a way to obtain reliability coefficients for bone lead measurements from uncertainty data reported by the KXRF instrument and compare, by the use of Monte Carlo simulations, results obtained using EIV regression models vs. those obtained by the standard procedures. Results of the simulations show that Ordinary Least Squares (OLS) regression models provide severely biased estimates of effect, and that EIV provides nearly unbiased estimates. Although EIV effect estimates are more imprecise, their mean squared error is much smaller than that of OLS estimates. In conclusion, EIV is a better alternative than OLS to estimate the effect of bone lead when measured by KXRF. Copyright © 2010 Elsevier Inc. All rights reserved.
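
    The proposed correction can be illustrated with the classical disattenuation identity, beta_true = beta_OLS / reliability, where the reliability is derived from the instrument-reported error variance; a minimal Python simulation with toy numbers:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 2000
true_lead = rng.normal(10, 3, n)                 # true bone lead (toy values)
sigma_u = 2.0                                    # KXRF measurement-error SD
obs_lead = true_lead + rng.normal(0, sigma_u, n) # noisy KXRF reading
outcome = 0.8 * true_lead + rng.normal(0, 1, n)  # health outcome, true slope 0.8

beta_ols = np.polyfit(obs_lead, outcome, 1)[0]   # attenuated by measurement error
# Reliability = var(true) / var(observed), obtained here from the known error
# variance, as the authors propose using instrument-reported uncertainties.
reliability = 1 - sigma_u**2 / np.var(obs_lead)
beta_eiv = beta_ols / reliability                # errors-in-variables correction
print(f"OLS: {beta_ols:.2f}, EIV-corrected: {beta_eiv:.2f} (true 0.8)")
```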

  13. Does social trust increase willingness to pay taxes to improve public healthcare? Cross-sectional cross-country instrumental variable analysis.

    Science.gov (United States)

    Habibov, Nazim; Cheung, Alex; Auchynnikava, Alena

    2017-09-01

    The purpose of this paper is to investigate the effect of social trust on the willingness to pay more taxes to improve public healthcare in post-communist countries. The well-documented association between higher levels of social trust and better health has traditionally been assumed to reflect the notion that social trust is positively associated with support for the public healthcare system through its encouragement of cooperative behaviour, social cohesion, social solidarity, and collective action. Hence, in this paper, we have explicitly tested the notion that social trust contributes to an increase in willingness to financially support public healthcare. We use micro data from the 2010 Life-in-Transition survey (N = 29,526). Classic binomial probit and instrumental variables ivprobit regressions are estimated to model the relationship between social trust and paying more taxes to improve public healthcare. We found that an increase in social trust is associated with a greater willingness to pay more taxes to improve public healthcare. From the perspective of policy-making, healthcare administrators, policy-makers, and international donors should be aware that social trust is an important factor in determining the willingness of the population to provide much-needed financial resources to support public healthcare. From a theoretical perspective, we found that estimating the effect of trust on support for healthcare without taking confounding and measurement error problems into consideration will likely lead to an underestimation of the true effect of trust. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. Effect of corruption on healthcare satisfaction in post-soviet nations: A cross-country instrumental variable analysis of twelve countries.

    Science.gov (United States)

    Habibov, Nazim

    2016-03-01

    There is a lack of consensus about the effect of corruption on healthcare satisfaction in transitional countries. Interpreting the burgeoning literature on this topic has proven difficult due to reverse causality and omitted variable bias. In this study, the effect of corruption on healthcare satisfaction is investigated in a set of 12 Post-Socialist countries using instrumental variable regression on the sample of the 2010 Life in Transition survey (N = 8655). The results indicate that experiencing corruption significantly reduces healthcare satisfaction. Copyright © 2016 The Author. Published by Elsevier Ltd. All rights reserved.

  15. On the shape of posterior densities and credible sets in instrumental variable regression models with reduced rank: an application of flexible sampling methods using neural networks

    NARCIS (Netherlands)

    L.F. Hoogerheide (Lennart); J.F. Kaashoek (Johan); H.K. van Dijk (Herman)

    2005-01-01

    Likelihoods and posteriors of instrumental variable regression models with strong endogeneity and/or weak instruments may exhibit rather non-elliptical contours in the parameter space. This may seriously affect inference based on Bayesian credible sets. When approximating such contours

  16. The estimation of soil parameters using observations on crop biophysical variables and the crop model STICS improve the predictions of agro environmental variables.

    Science.gov (United States)

    Varella, H.-V.

    2009-04-01

    Dynamic crop models are very useful to predict the behavior of crops in their environment and are widely used in agro-environmental work. These models have many parameters, and their spatial application requires good knowledge of these parameters, especially the soil parameters. These parameters can be estimated from soil analysis at different points, but this is very costly and requires a lot of experimental work. Nevertheless, observations on crops provided by new techniques like remote sensing or yield monitoring offer a possibility for estimating soil parameters through the inversion of crop models. In this work, the STICS crop model, which includes more than 200 parameters, is studied for wheat and sugar beet. After previous work based on a large experimental database to calibrate parameters related to the characteristics of the crop, a global sensitivity analysis of the observed variables (leaf area index LAI and absorbed nitrogen QN provided by remote sensing data, and yield at harvest provided by yield monitoring) to the soil parameters is made, in order to determine which of them have to be estimated. This study was made in different climatic and agronomic conditions and it reveals that 7 soil parameters (4 related to water and 3 related to nitrogen) have a clear influence on the variance of the observed variables and therefore have to be estimated. For estimating these 7 soil parameters, a Bayesian data assimilation method named Importance Sampling is chosen (because prior information on these parameters is available), using observations, on wheat and sugar beet crops, of LAI and QN at various dates and yield at harvest acquired in different climatic and agronomic conditions. The quality of parameter estimation is then determined by comparing the result of parameter estimation with only prior information and the result with the posterior information provided by the Bayesian data assimilation method.
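
    The Importance Sampling step itself is compact; a minimal Python sketch (not the STICS inversion: the prior, the toy "LAI" observation model and all numeric values are assumptions):

```python
import numpy as np

def importance_sampling(prior_draws, log_likelihood):
    """Bayesian importance sampling with the prior as proposal distribution."""
    logw = np.array([log_likelihood(theta) for theta in prior_draws])
    w = np.exp(logw - logw.max())
    w /= w.sum()                                  # normalized importance weights
    post_mean = (w[:, None] * prior_draws).sum(axis=0)
    return post_mean, w

# Toy usage: one "soil parameter" observed through a noisy crop variable.
rng = np.random.default_rng(7)
true_theta = 0.7
obs = true_theta + 0.05 * rng.standard_normal(10)      # e.g. remote-sensed LAI
prior = rng.uniform(0, 2, size=(5000, 1))              # prior draws
loglik = lambda th: -0.5 * np.sum((obs - th[0])**2) / 0.05**2
post_mean, w = importance_sampling(prior, loglik)
print(f"posterior mean {post_mean[0]:.3f} (true {true_theta})")
```

    The weights concentrate on parameter draws compatible with the observed crop variables, which is how the posterior sharpens relative to the prior.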

  17. Using small area estimation and Lidar-derived variables for multivariate prediction of forest attributes

    Science.gov (United States)

    F. Mauro; Vicente Monleon; H. Temesgen

    2015-01-01

    Small area estimation (SAE) techniques have been successfully applied in forest inventories to provide reliable estimates for domains where the sample size is small (i.e. small areas). Previous studies have explored the use of either Area Level or Unit Level Empirical Best Linear Unbiased Predictors (EBLUPs) in a univariate framework, modeling each variable of interest...

  18. Quantitative estimation of time-variable earthquake hazard by using fuzzy set theory

    Science.gov (United States)

    Deyi, Feng; Ichikawa, M.

    1989-11-01

    In this paper, the various methods of fuzzy set theory, called fuzzy mathematics, have been applied to the quantitative estimation of the time-variable earthquake hazard. The results obtained consist of the following. (1) Quantitative estimation of the earthquake hazard on the basis of seismicity data. By using some methods of fuzzy mathematics, seismicity patterns before large earthquakes can be studied more clearly and more quantitatively, highly active periods in a given region and quiet periods of seismic activity before large earthquakes can be recognized, similarities in temporal variation of seismic activity and seismic gaps can be examined and, on the other hand, the time-variable earthquake hazard can be assessed directly on the basis of a series of statistical indices of seismicity. Two methods of fuzzy clustering analysis, the method of fuzzy similarity, and the direct method of fuzzy pattern recognition, have been studied in particular. One method of fuzzy clustering analysis is based on fuzzy netting, and another is based on the fuzzy equivalent relation. (2) Quantitative estimation of the earthquake hazard on the basis of observational data for different precursors. The direct method of fuzzy pattern recognition has been applied to research on earthquake precursors of different kinds. On the basis of the temporal and spatial characteristics of recognized precursors, earthquake hazards in different terms can be estimated. This paper mainly deals with medium-short-term precursors observed in Japan and China.

  19. Climate Informed Economic Instruments to Enhance Urban Water Supply Resilience to Hydroclimatological Variability and Change

    Science.gov (United States)

    Brown, C.; Carriquiry, M.; Souza Filho, F. A.

    2006-12-01

    Hydroclimatological variability presents acute challenges to urban water supply providers. The impact is often most severe in developing nations where hydrologic and climate variability can be very high, water demand is unmet and increasing, and the financial resources to mitigate the social effects of that variability are limited. Furthermore, existing urban water systems face a reduced solution space, constrained by competing and conflicting interests, such as irrigation demand, recreation and hydropower production, and new (relative to system design) demands to satisfy environmental flow requirements. These constraints magnify the impacts of hydroclimatic variability and increase the vulnerability of urban areas to climate change. The high economic and social costs of structural responses to hydrologic variability, such as groundwater utilization and the construction or expansion of dams, create a need for innovative alternatives. Advances in hydrologic and climate forecasting, and the increasing sophistication and acceptance of incentive-based mechanisms for achieving economically efficient water allocation, offer potential for improving the resilience of existing water systems to the challenge of variable supply. This presentation will explore the performance of a system of climate-informed economic instruments designed to facilitate the reduction of hydroclimatologic variability-induced impacts on water-sensitive stakeholders. The system comprises bulk water option contracts between urban water suppliers and agricultural users, and insurance indexed on reservoir inflows designed to cover the financial needs of the water supplier in situations where the option is likely to be exercised. Contract and insurance parameters are linked to forecasts and the evolution of seasonal precipitation and streamflow and designed for financial and political viability. A simulation of system performance is presented based on ongoing work in Metro Manila, Philippines.

  20. Variability in abundance of temperate reef fishes estimated by visual census.

    Directory of Open Access Journals (Sweden)

    Alejo J Irigoyen

    Full Text Available Identifying sources of sampling variation and quantifying their magnitude is critical to the interpretation of ecological field data. Yet, most monitoring programs of reef fish populations based on underwater visual censuses (UVC) consider only a few of the factors that may influence fish counts, such as the diver or census methodology. Recent studies, however, have drawn attention to a broader range of processes that introduce variability at different temporal scales. This study analyzes the magnitude of different sources of variation in UVCs of temperate reef fishes off Patagonia (Argentina). The variability associated with time-of-day, tidal state, and time elapsed between censuses (minutes, days, weeks and months) was quantified for censuses conducted on the five most conspicuous and common species: Pinguipes brasilianus, Pseudopercis semifasciata, Sebastes oculatus, Acanthistius patachonicus and Nemadactylus bergi. Variance components corresponding to spatial heterogeneity and to the different temporal scales were estimated using nested random models. The levels of variability estimated for the different species were related to their life history attributes and behavior. Neither time-of-day nor tidal state had a significant effect on counts, except for the influence of tide on P. brasilianus. Spatial heterogeneity was the dominant source of variance in all but one species. Among the temporal scales, the intra-annual variation was the highest component for most species due to marked seasonal fluctuations in abundance, followed by the weekly and the instantaneous variation; the daily component was not significant. The variability between censuses conducted at different tidal levels and times of day was similar in magnitude to the instantaneous variation, reinforcing the conclusion that stochastic variation at very short time scales is non-negligible and should be taken into account in the design of monitoring programs and experiments.

  1. Estimating Causal Effects of Local Air Pollution on Daily Deaths: Effect of Low Levels.

    Science.gov (United States)

    Schwartz, Joel; Bind, Marie-Abele; Koutrakis, Petros

    2017-01-01

    Although many time-series studies have established associations of daily pollution variations with daily deaths, there are fewer at low concentrations, or focused on locally generated pollution, which is becoming more important as regulations reduce regional transport. Causal modeling approaches are also lacking. We used causal modeling to estimate the impact of local air pollution on mortality at low concentrations. Using an instrumental variable approach, we developed an instrument for variations in local pollution concentrations that is unlikely to be correlated with other causes of death, and examined its association with daily deaths in the Boston, Massachusetts, area. We combined height of the planetary boundary layer and wind speed, which affect concentrations of local emissions, to develop the instrument for particulate matter ≤ 2.5 μm (PM2.5), black carbon (BC), or nitrogen dioxide (NO2) variations that were independent of year, month, and temperature. We also used Granger causality to assess whether omitted variable confounding existed. We estimated that an interquartile range increase in the instrument for local PM2.5 was associated with a 0.90% increase in daily deaths (95% CI: 0.25, 1.56). A similar result was found for BC, and a weaker association with NO2. The Granger test found no evidence of omitted variable confounding for the instrument. A separate test confirmed the instrument was not associated with mortality independent of pollution. Furthermore, the association remained when all days with PM2.5 concentrations > 30 μg/m3 were excluded from the analysis (0.84% increase in daily deaths; 95% CI: 0.19, 1.50). We conclude that there is a causal association of local air pollution with daily deaths at concentrations below U.S. EPA standards. The estimated attributable risk in Boston exceeded 1,800 deaths during the study period, indicating that important public health benefits can follow from further control efforts.

  2. SECOND ORDER LEAST SQUARE ESTIMATION ON ARCH(1) MODEL WITH BOX-COX TRANSFORMED DEPENDENT VARIABLE

    Directory of Open Access Journals (Sweden)

    Herni Utami

    2014-03-01

    Full Text Available Box-Cox transformation is often used to reduce heterogeneity and to achieve a symmetric distribution of the response variable. In this paper, we estimate the parameters of the Box-Cox transformed ARCH(1) model using the second-order least squares method, and then we study the consistency and asymptotic normality of the second-order least squares (SLS) estimators. SLS estimation was introduced by Wang (2003, 2004) to estimate the parameters of nonlinear regression models with independent and identically distributed errors.

  3. Estimating the effect of treatment rate changes when treatment benefits are heterogeneous: antibiotics and otitis media.

    Science.gov (United States)

    Park, Tae-Ryong; Brooks, John M; Chrischilles, Elizabeth A; Bergus, George

    2008-01-01

    Contrast methods to assess the health effects of a treatment rate change when treatment benefits are heterogeneous across patients. Antibiotic prescribing for children with otitis media (OM) in Iowa Medicaid is the empirical example. Instrumental variable (IV) and linear probability model (LPM) are used to estimate the effect of antibiotic treatments on cure probabilities for children with OM in Iowa Medicaid. Local area physician supply per capita is the instrument in the IV models. Estimates are contrasted in terms of their ability to make inferences for patients whose treatment choices may be affected by a change in population treatment rates. The instrument was positively related to the probability of being prescribed an antibiotic. LPM estimates showed a positive effect of antibiotics on OM patient cure probability while IV estimates showed no relationship between antibiotics and patient cure probability. Linear probability model estimation yields the average effects of the treatment on patients that were treated. IV estimation yields the average effects for patients whose treatment choices were affected by the instrument. As antibiotic treatment effects are heterogeneous across OM patients, our estimates from these approaches are aligned with clinical evidence and theory. The average estimate for treated patients (higher severity) from the LPM model is greater than estimates for patients whose treatment choices are affected by the instrument (lower severity) from the IV models. Based on our IV estimates it appears that lowering antibiotic use in OM patients in Iowa Medicaid did not result in lost cures.

  4. Polyphonic pitch detection and instrument separation

    Science.gov (United States)

    Bay, Mert; Beauchamp, James W.

    2005-09-01

    An algorithm for polyphonic pitch detection and musical instrument separation is presented. Each instrument is represented as a time-varying harmonic series. Spectral information is obtained from a monaural input signal using a spectral peak tracking method. Fundamental frequencies (F0s) for each time frame are estimated from the spectral data using an Expectation Maximization (EM) algorithm with a Gaussian mixture model representing the harmonic series. The method first estimates the most predominant F0, suppresses its series in the input, and then the EM algorithm is run iteratively to estimate each next F0. Collisions between instrument harmonics, which frequently occur, are predicted from the estimated F0s, and the resulting corrupted harmonics are ignored. The amplitudes of these corrupted harmonics are replaced by harmonics taken from a library of spectral envelopes for different instruments, where the spectrum which most closely matches the important characteristics of each extracted spectrum is chosen. Finally, each voice is separately resynthesized by additive synthesis. This algorithm is demonstrated for a trio piece that consists of 3 different instruments.

  5. Secondary task for full flight simulation incorporating tasks that commonly cause pilot error: Time estimation

    Science.gov (United States)

    Rosch, E.

    1975-01-01

    The task of time estimation, an activity occasionally performed by pilots during actual flight, was investigated with the objective of providing human factors investigators with an unobtrusive and minimally loading additional task that is sensitive to differences in flying conditions and flight instrumentation associated with the main task of piloting an aircraft simulator. Previous research indicated that the duration and consistency of time estimates is associated with the cognitive, perceptual, and motor loads imposed by concurrent simple tasks. The relationships between the length and variability of time estimates and concurrent task variables under a more complex situation involving simulated flight were clarified. The wrap-around effect with respect to baseline duration, a consequence of mode switching at intermediate levels of concurrent task distraction, should contribute substantially to estimate variability and have a complex effect on the shape of the resulting distribution of estimates.

  6. Simple, efficient estimators of treatment effects in randomized trials using generalized linear models to leverage baseline variables.

    Science.gov (United States)

    Rosenblum, Michael; van der Laan, Mark J

    2010-04-01

    Models, such as logistic regression and Poisson regression models, are often used to estimate treatment effects in randomized trials. These models leverage information in variables collected before randomization, in order to obtain more precise estimates of treatment effects. However, there is the danger that model misspecification will lead to bias. We show that certain easy to compute, model-based estimators are asymptotically unbiased even when the working model used is arbitrarily misspecified. Furthermore, these estimators are locally efficient. As a special case of our main result, we consider a simple Poisson working model containing only main terms; in this case, we prove the maximum likelihood estimate of the coefficient corresponding to the treatment variable is an asymptotically unbiased estimator of the marginal log rate ratio, even when the working model is arbitrarily misspecified. This is the log-linear analog of ANCOVA for linear models. Our results demonstrate one application of targeted maximum likelihood estimation.
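
    This special case (a main-terms Poisson working model recovering the marginal log rate ratio under randomization, even when misspecified) can be checked numerically; a sketch using statsmodels, with an assumed toy data-generating process:

```python
import numpy as np
import statsmodels.api as sm

# Toy randomized trial: binary treatment, baseline covariate, count outcome.
rng = np.random.default_rng(8)
n = 10_000
treat = rng.integers(0, 2, n)
base = rng.standard_normal(n)
# The true outcome model is NOT log-linear in (treat, base) main terms, so the
# working model below is deliberately misspecified in the covariate.
rate = np.exp(0.3 * treat + 0.5 * np.tanh(base))
y = rng.poisson(rate)

# Main-terms Poisson working model; the treatment coefficient still estimates
# the marginal log rate ratio consistently because treatment is randomized.
X = sm.add_constant(np.column_stack([treat, base]))
fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
print(f"treatment coef: {fit.params[1]:.3f} (true marginal log RR = 0.3)")
```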

  7. Simple, Efficient Estimators of Treatment Effects in Randomized Trials Using Generalized Linear Models to Leverage Baseline Variables

    Science.gov (United States)

    Rosenblum, Michael; van der Laan, Mark J.

    2010-01-01

    Models, such as logistic regression and Poisson regression models, are often used to estimate treatment effects in randomized trials. These models leverage information in variables collected before randomization, in order to obtain more precise estimates of treatment effects. However, there is the danger that model misspecification will lead to bias. We show that certain easy to compute, model-based estimators are asymptotically unbiased even when the working model used is arbitrarily misspecified. Furthermore, these estimators are locally efficient. As a special case of our main result, we consider a simple Poisson working model containing only main terms; in this case, we prove the maximum likelihood estimate of the coefficient corresponding to the treatment variable is an asymptotically unbiased estimator of the marginal log rate ratio, even when the working model is arbitrarily misspecified. This is the log-linear analog of ANCOVA for linear models. Our results demonstrate one application of targeted maximum likelihood estimation. PMID:20628636

  8. Tyre effective radius and vehicle velocity estimation: a variable structure observer solution

    International Nuclear Information System (INIS)

    El Tannoury, C.; Plestan, F.; Moussaoui, S.; Romani, N. (Renault)

    2011-01-01

    This paper proposes an application of a variable structure observer for the wheel effective radius and velocity of automotive vehicles. This observer is based on a high-order sliding-mode approach allowing robustness and finite-time convergence. Its originality consists in assuming a nonlinear relation between the slip ratio and the friction coefficient and providing an estimation of both variables, wheel radius and vehicle velocity, from measurements of wheel angular velocity and torque. These signals being available on most modern vehicle CAN (Controller Area Network) buses, this system does not require additional sensors. A simulation example is given to illustrate the relevance of this approach.

  9. The OCO-3 Mission: Science Objectives and Instrument Performance

    Science.gov (United States)

    Eldering, A.; Basilio, R. R.; Bennett, M. W.

    2017-12-01

    The Orbiting Carbon Observatory 3 (OCO-3) will continue global measurements of CO2 and solar-induced chlorophyll fluorescence (SIF) using the flight spare instrument from OCO-2. The instrument is currently being tested and will be packaged for installation on the International Space Station (ISS) (launch readiness in early 2018). This talk will focus on the science objectives, updated simulations of the science data products, and the outcome of recent instrument performance tests. The low-inclination ISS orbit lets OCO-3 sample the tropics and sub-tropics across the full range of daylight hours with dense observations at northern and southern mid-latitudes (+/- 52º). The combination of these dense CO2 and SIF measurements provides continuity of data for global flux estimates as well as a unique opportunity to address key deficiencies in our understanding of the global carbon cycle. The instrument utilizes an agile, 2-axis pointing mechanism (PMA), providing the capability to look towards the bright reflection from the ocean and validation targets. The PMA also allows for a snapshot mapping mode to collect dense datasets over 100 km by 100 km areas. Measurements over urban centers could aid in making estimates of fossil fuel CO2 emissions. Similarly, the snapshot mapping mode can be used to sample regions of interest for the terrestrial carbon cycle. In addition, there is potential to utilize data from the ISS instruments ECOSTRESS (ECOsystem Spaceborne Thermal Radiometer Experiment on Space Station) and GEDI (Global Ecosystem Dynamics Investigation), which measure other key variables of the control of carbon uptake by plants, to complement OCO-3 data in science analysis. In 2017, the OCO-2 instrument was transformed into the ISS-ready OCO-3 payload. The transformed instrument was thoroughly tested and characterized. Key characteristics, such as instrument ILS, spectral resolution, and radiometric performance, will be described. Analysis of direct sun measurements taken during testing

  10. A SHARIA RETURN AS AN ALTERNATIVE INSTRUMENT FOR MONETARY POLICY

    Directory of Open Access Journals (Sweden)

    Ashief Hamam

    2011-09-01

    Full Text Available Rapid development in the Islamic financial industry has not been supported by sharia monetary policy instruments. This study looks at the possibility of sharia returns as such an instrument. Using both an error correction model and a vector error correction model to estimate the data from 2002(1) to 2010(12), this paper finds that sharia return has the same effect as the interest rate in the demand for money. The shock effect of sharia return on broad money supply, Gross Domestic Product, and Consumer Price Index is greater than that of the interest rate. In addition, these three variables become stable more quickly following a shock to sharia return. Keywords: sharia return, Islamic financial system, vector error correction model. JEL classification numbers: E52, G15

  11. Problems with radiological surveillance instrumentation

    International Nuclear Information System (INIS)

    Swinth, K.L.; Tanner, J.E.; Fleming, D.M.

    1984-09-01

    Many radiological surveillance instruments are in use at DOE facilities throughout the country. These instruments are an essential part of all health physics programs, and poor instrument performance can increase program costs or compromise program effectiveness. Generic data from simple tests on newly purchased instruments shows that many instruments will not meet requirements due to manufacturing defects. In other cases, lack of consideration of instrument use has resulted in poor acceptance of instruments and poor reliability. The performance of instruments is highly variable for electronic and mechanical performance, radiation response, susceptibility to interferences and response to environmental factors. Poor instrument performance in these areas can lead to errors or poor accuracy in measurements

  12. Problems with radiological surveillance instrumentation

    International Nuclear Information System (INIS)

    Swinth, K.L.; Tanner, J.E.; Fleming, D.M.

    1985-01-01

    Many radiological surveillance instruments are in use at DOE facilities throughout the country. These instruments are an essential part of all health physics programs, and poor instrument performance can increase program costs or compromise program effectiveness. Generic data from simple tests on newly purchased instruments shows that many instruments will not meet requirements due to manufacturing defects. In other cases, lack of consideration of instrument use has resulted in poor acceptance of instruments and poor reliability. The performance of instruments is highly variable for electronic and mechanical performance, radiation response, susceptibility to interferences and response to environmental factors. Poor instrument performance in these areas can lead to errors or poor accuracy in measurements

  13. Statistically extracted fundamental watershed variables for estimating the loads of total nitrogen in small streams

    Science.gov (United States)

    Kronholm, Scott C.; Capel, Paul D.; Terziotti, Silvia

    2016-01-01

    Accurate estimation of total nitrogen loads is essential for evaluating conditions in the aquatic environment. Extrapolation of estimates beyond measured streams will greatly expand our understanding of total nitrogen loading to streams. Recursive partitioning and random forest regression were used to assess 85 geospatial, environmental, and watershed variables across 636 small streams ... monitoring may be beneficial.
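
    A minimal sketch of that screening step with scikit-learn, on synthetic data standing in for the geospatial variables (only the stream and variable counts mirror the study; everything else is assumed):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Toy stand-in: 636 streams, 85 candidate predictors, of which only a few
# actually drive the simulated nitrogen load.
rng = np.random.default_rng(9)
n, p = 636, 85
X = rng.standard_normal((n, p))
load = 2 * X[:, 0] - 1.5 * X[:, 3] + 0.5 * X[:, 10] + rng.standard_normal(n)

# Random forest regression ranks variables by how much they reduce error.
rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, load)
top = np.argsort(rf.feature_importances_)[::-1][:5]
print("top variables by importance:", top)
```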

  14. A New Temperature-Vegetation Triangle Algorithm with Variable Edges (TAVE) for Satellite-Based Actual Evapotranspiration Estimation

    Directory of Open Access Journals (Sweden)

    Hua Zhang

    2016-09-01

    Full Text Available The estimation of spatially-variable actual evapotranspiration (AET) is a critical challenge to regional water resources management. We propose a new remote sensing method, the Triangle Algorithm with Variable Edges (TAVE), to generate daily AET estimates based on satellite-derived land surface temperature and the vegetation index NDVI. The TAVE captures heterogeneity in AET across elevation zones and permits variability in determining local values of wet and dry end-member classes (known as edges). Compared to traditional triangle methods, TAVE introduces three unique features: (i) the discretization of the domain as overlapping elevation zones; (ii) a variable wet edge that is a function of elevation zone; and (iii) variable values of a combined-effect parameter (that accounts for aerodynamic and surface resistance, vapor pressure gradient, and soil moisture availability) along both wet and dry edges. With these features, TAVE effectively addresses the combined influence of terrain and water stress on semi-arid environment AET estimates. We demonstrate the effectiveness of this method in one of the driest countries in the world, Jordan, and compare it to a traditional triangle method (TA) and a global AET product (MOD16) over different land use types. In irrigated agricultural lands, TAVE matched the results of the single crop coefficient model (−3%), in contrast to substantial overestimation by TA (+234%) and underestimation by MOD16 (−50%). In forested (non-irrigated), water-consuming regions, TA and MOD16 produced AET average deviations 15.5 times and −3.5 times those based on TAVE. As TAVE has a simple structure and low data requirements, it provides an efficient means to satisfy the increasing need for evapotranspiration estimation in data-scarce semi-arid regions. This study constitutes a much needed step towards the satellite-based quantification of agricultural water consumption in Jordan.

  15. Assessment of the quality and variability of health information on chronic pain websites using the DISCERN instrument

    Directory of Open Access Journals (Sweden)

    Buckley Norman

    2010-10-01

    Full Text Available Abstract Background The Internet is used increasingly by providers as a tool for disseminating pain-related health information and by patients as a resource about health conditions and treatment options. However, health information on the Internet remains unregulated and varies in quality, accuracy and readability. The objective of this study was to determine the quality of pain websites, and explain variability in quality and readability between pain websites. Methods Five key terms (pain, chronic pain, back pain, arthritis, and fibromyalgia) were entered into the Google, Yahoo and MSN search engines. Websites were assessed using the DISCERN instrument as a quality index. Grade level readability ratings were assessed using the Flesch-Kincaid Readability Algorithm. Univariate (using alpha = 0.20) and multivariable regression (using alpha = 0.05) analyses were used to explain the variability in DISCERN scores and grade level readability using potential for commercial gain, health related seals of approval, language(s) and multimedia features as independent variables. Results A total of 300 websites were assessed, 21 excluded in accordance with the exclusion criteria and 110 duplicate websites, leaving 161 unique sites. About 6.8% (11/161) of the websites offered patients commercial products for their pain condition, 36.0% (58/161) had a health related seal of approval, 75.8% (122/161) presented information in English only and 40.4% (65/161) offered an interactive multimedia experience. In assessing the quality of the unique websites, of a maximum score of 80, the overall average DISCERN score was 55.9 (13.6) and readability (grade level) 10.9 (3.9). The multivariable regressions demonstrated that website seals of approval (P = 0.015) and potential for commercial gain (P = 0.189) were contributing factors to higher DISCERN scores, while seals of approval (P = 0.168) and interactive multimedia (P = 0.244) contributed to

  16. BN-FLEMOps pluvial - A probabilistic multi-variable loss estimation model for pluvial floods

    Science.gov (United States)

    Roezer, V.; Kreibich, H.; Schroeter, K.; Doss-Gollin, J.; Lall, U.; Merz, B.

    2017-12-01

    Pluvial flood events, such as in Copenhagen (Denmark) in 2011, Beijing (China) in 2012 or Houston (USA) in 2016, have caused severe losses to urban dwellings in recent years. These floods are caused by storm events with high rainfall rates well above the design levels of urban drainage systems, which lead to inundation of streets and buildings. A projected increase in frequency and intensity of heavy rainfall events in many areas and an ongoing urbanization may increase pluvial flood losses in the future. For an efficient risk assessment and adaptation to pluvial floods, a quantification of the flood risk is needed. Few loss models have been developed particularly for pluvial floods. These models usually use simple waterlevel- or rainfall-loss functions and come with very high uncertainties. To account for these uncertainties and improve the loss estimation, we present a probabilistic multi-variable loss estimation model for pluvial floods based on empirical data. The model was developed in a two-step process using a machine learning approach and a comprehensive database comprising 783 records of direct building and content damage of private households. The data was gathered through surveys after four different pluvial flood events in Germany between 2005 and 2014. In a first step, linear and non-linear machine learning algorithms, such as tree-based and penalized regression models were used to identify the most important loss influencing factors among a set of 55 candidate variables. These variables comprise hydrological and hydraulic aspects, early warning, precaution, building characteristics and the socio-economic status of the household. In a second step, the most important loss influencing variables were used to derive a probabilistic multi-variable pluvial flood loss estimation model based on Bayesian Networks. Two different networks were tested: a score-based network learned from the data and a network based on expert knowledge. Loss predictions are made

  17. A comparison of methods to estimate daily global solar irradiation from other climatic variables on the Canadian prairies

    International Nuclear Information System (INIS)

    Barr, A.G.; McGinn, S.M.; Cheng, S.B.

    1996-01-01

    Historic estimates of daily global solar irradiation are often required for climatic impact studies. Regression equations with daily global solar irradiation, H, as the dependent variable and other climatic variables as the independent variables provide a practical way to estimate H at locations where it is not measured. They may also have potential to estimate H before 1953, the year of the first routine H measurements in Canada. This study compares several regression equations for calculating H on the Canadian prairies. Simple linear regression with daily bright sunshine duration as the independent variable accounted for 90% of the variation of H in summer and 75% of the variation of H in winter. Linear regression with the daily air temperature range as the independent variable accounted for 45% of the variation of H in summer and only 6% of the variation of H in winter. Linear regression with precipitation status (wet or dry) as the independent variable accounted for only 35% of the summer-time variation in H, but stratifying other regression analyses into wet and dry days reduced their root-mean-squared errors. For periods with sufficiently dense bright sunshine observations (i.e. after 1960), however, H was more accurately estimated from spatially interpolated bright sunshine duration than from locally observed air temperature range or precipitation status. The daily air temperature range and precipitation status may have utility for estimating H for periods before 1953, when they are the only widely available climatic data on the Canadian prairies. Between 1953 and 1989, a period of large climatic variation, the regression coefficients did not vary significantly between contrasting years with cool-wet, intermediate and warm-dry summers. They should apply equally well earlier in the century. (author)
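
    The workhorse here is a simple linear fit of H on bright sunshine duration; a minimal Python sketch on synthetic data (the coefficients and noise level are illustrative assumptions, not the paper's fitted values):

```python
import numpy as np

# Toy data: one year of daily sunshine hours and irradiation (MJ m^-2 day^-1).
rng = np.random.default_rng(10)
sunshine = rng.uniform(0, 14, 365)                   # bright sunshine duration
H = 2.0 + 1.6 * sunshine + rng.normal(0, 1.5, 365)   # assumed linear relation

slope, intercept = np.polyfit(sunshine, H, 1)
H_hat = intercept + slope * sunshine
r2 = 1 - np.sum((H - H_hat)**2) / np.sum((H - np.mean(H))**2)
print(f"H ~ {intercept:.2f} + {slope:.2f} * sunshine, R^2 = {r2:.2f}")
```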

  18. Variable selection for confounder control, flexible modeling and Collaborative Targeted Minimum Loss-based Estimation in causal inference

    Science.gov (United States)

    Schnitzer, Mireille E.; Lok, Judith J.; Gruber, Susan

    2015-01-01

    This paper investigates the appropriateness of the integration of flexible propensity score modeling (nonparametric or machine learning approaches) in semiparametric models for the estimation of a causal quantity, such as the mean outcome under treatment. We begin with an overview of some of the issues involved in knowledge-based and statistical variable selection in causal inference and the potential pitfalls of automated selection based on the fit of the propensity score. Using a simple example, we directly show the consequences of adjusting for pure causes of the exposure when using inverse probability of treatment weighting (IPTW). Such variables are likely to be selected when using a naive approach to model selection for the propensity score. We describe how the method of Collaborative Targeted minimum loss-based estimation (C-TMLE; van der Laan and Gruber, 2010) capitalizes on the collaborative double robustness property of semiparametric efficient estimators to select covariates for the propensity score based on the error in the conditional outcome model. Finally, we compare several approaches to automated variable selection in low- and high-dimensional settings through a simulation study. From this simulation study, we conclude that using IPTW with flexible prediction for the propensity score can result in inferior estimation, while Targeted minimum loss-based estimation and C-TMLE may benefit from flexible prediction and remain robust to the presence of variables that are highly correlated with treatment. However, in our study, standard influence function-based methods for the variance underestimated the standard errors, resulting in poor coverage under certain data-generating scenarios. PMID:26226129
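
    For contrast with the C-TMLE discussion, a bare-bones IPTW estimate of the mean outcome under treatment in Python (simulated data; the propensity model and all values are assumptions):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Minimal IPTW sketch: weight treated units by 1 / P(A=1 | X). Variable choice
# matters: including pure causes of exposure (instruments) in X inflates
# variance and can worsen the estimator, as the paper's simple example shows.
rng = np.random.default_rng(11)
n = 20_000
x = rng.standard_normal(n)                       # confounder
a = rng.binomial(1, 1 / (1 + np.exp(-x)))        # treatment depends on x
y = 1.0 + 0.5 * a + x + rng.standard_normal(n)   # outcome; true E[Y(1)] = 1.5

ps = LogisticRegression().fit(x.reshape(-1, 1), a).predict_proba(x.reshape(-1, 1))[:, 1]
mu1_iptw = np.sum((a / ps) * y) / np.sum(a / ps)  # Hajek-stabilized IPTW mean
print(f"IPTW estimate of E[Y(1)] = {mu1_iptw:.2f} (true 1.5)")
```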

  19. Disability as deprivation of capabilities: Estimation using a large-scale survey in Morocco and Tunisia and an instrumental variable approach.

    Science.gov (United States)

    Trani, Jean-Francois; Bakhshi, Parul; Brown, Derek; Lopez, Dominique; Gall, Fiona

    2018-05-25

    The capability approach pioneered by Amartya Sen and Martha Nussbaum offers a new paradigm to examine disability, poverty and their complex associations. Disability is hence defined as a situation in which a person with an impairment faces various forms of restrictions in functionings and capabilities. Additionally, poverty is not the mere absence of income but a lack of ability to achieve essential functionings; disability is consequently the poverty of capabilities of persons with impairment. It is the lack of opportunities in a given context, and of agency, that leads to persons with disabilities being poorer than other social groups. Consequently, the poverty of people with disabilities comprises complex processes of social exclusion and disempowerment. Despite growing evidence that persons with disabilities face higher levels of poverty, the literature from low- and middle-income countries that analyzes the causal link between disability and poverty remains limited. Drawing on data from a large case-control field survey carried out between December 24th, 2013 and February 16th, 2014 in Tunisia and between November 4th, 2013 and June 12th, 2014 in Morocco, we examined the effect of impairment on various basic capabilities, health-related quality of life and multidimensional poverty (indicators of poor wellbeing) in Morocco and Tunisia. To demonstrate a causal link between impairment and deprivation of capabilities, we used instrumental variable regression analyses. In both countries, we found lower access to jobs for persons with impairment. Health-related quality of life was also lower for this group, which also faced a higher risk of multidimensional poverty. There was no significant direct effect of impairment on access to school and acquiring literacy in both countries, or on access to health care and expenses in Tunisia, while having an impairment reduced access to healthcare facilities and out-of-pocket expenditures in Morocco. These results suggest that

  20. Instrumental variable analysis as a complementary analysis in studies of adverse effects: venous thromboembolism and second-generation versus third-generation oral contraceptives

    NARCIS (Netherlands)

    Boef, Anna G C; Souverein, Patrick C; Vandenbroucke, Jan P; van Hylckama Vlieg, Astrid; de Boer, Anthonius; le Cessie, Saskia; Dekkers, Olaf M

    2016-01-01

    PURPOSE: A potentially useful role for instrumental variable (IV) analysis may be as a complementary analysis to assess the presence of confounding when studying adverse drug effects. There has been discussion on whether the observed increased risk of venous thromboembolism (VTE) for

  1. Prevalence Estimation and Validation of New Instruments in Psychiatric Research: An Application of Latent Class Analysis and Sensitivity Analysis

    Science.gov (United States)

    Pence, Brian Wells; Miller, William C.; Gaynes, Bradley N.

    2009-01-01

    Prevalence and validation studies rely on imperfect reference standard (RS) diagnostic instruments that can bias prevalence and test characteristic estimates. The authors illustrate 2 methods to account for RS misclassification. Latent class analysis (LCA) combines information from multiple imperfect measures of an unmeasurable latent condition to…

  2. Spatial and temporal variability of rainfall and their effects on hydrological response in urban areas – a review

    OpenAIRE

    E. Cristiano; M.-C. ten Veldhuis; N. van de Giesen

    2017-01-01

    In urban areas, hydrological processes are characterized by high variability in space and time, making them sensitive to small-scale temporal and spatial rainfall variability. In the last decades new instruments, techniques, and methods have been developed to capture rainfall and hydrological processes at high resolution. Weather radars have been introduced to estimate high spatial and temporal rainfall variability. At the same time, new models have been proposed to reproduce hydrological res...

  3. Relations estimate between investment in digital media and financial variables of companies: a Colombian overview

    OpenAIRE

    Amalia Novoa Hoyos; Mauricio Sabogal Salamanca; Camilo Vargas Walteros

    2016-01-01

    This article presents a first estimate of the relationship between investment in digital media and selected financial variables in Colombia. First, the literature on the impact of marketing and digital marketing on company performance is reviewed. Then, an analysis of sectorial variables such as liquidity, profitability, indebtedness and concentration is made for sectors including food, personal grooming, automotive, drinking and tobacco, construction, entertainment, furniture, services, telecommunicat...

  4. Accurate Lithium-ion battery parameter estimation with continuous-time system identification methods

    International Nuclear Information System (INIS)

    Xia, Bing; Zhao, Xin; Callafon, Raymond de; Garnier, Hugues; Nguyen, Truong; Mi, Chris

    2016-01-01

    Highlights: • Continuous-time system identification is applied in Lithium-ion battery modeling. • Continuous-time and discrete-time identification methods are compared in detail. • The instrumental variable method is employed to further improve the estimation. • Simulations and experiments validate the advantages of continuous-time methods. - Abstract: The modeling of Lithium-ion batteries usually utilizes discrete-time system identification methods to estimate parameters of discrete models. However, in real applications, there is a fundamental limitation of the discrete-time methods in dealing with sensitivity when the system is stiff and the storage resolutions are limited. To overcome this problem, this paper adopts direct continuous-time system identification methods to estimate the parameters of equivalent circuit models for Lithium-ion batteries. Compared with discrete-time system identification methods, the continuous-time system identification methods provide more accurate estimates to both fast and slow dynamics in battery systems and are less sensitive to disturbances. A case of a 2nd-order equivalent circuit model is studied which shows that the continuous-time estimates are more robust to high sampling rates, measurement noises and rounding errors. In addition, the estimation by the conventional continuous-time least squares method is further improved in the case of noisy output measurement by introducing the instrumental variable method. Simulation and experiment results validate the analysis and demonstrate the advantages of the continuous-time system identification methods in battery applications.

  5. Final Report: Wireless Instrument for Automated Measurement of Clean Cookstove Usage and Black Carbon Emissions

    Energy Technology Data Exchange (ETDEWEB)

    Lukac, Martin [Cirrus Sense LLC, Los Angeles, CA (United States); Ramanathan, Nithya [Cirrus Sense LLC, Los Angeles, CA (United States); Graham, Eric [Cirrus Sense LLC, Los Angeles, CA (United States)

    2013-09-10

    Black carbon (BC) emissions from traditional cooking fires and other sources are significant anthropogenic drivers of radiative forcing. Clean cookstoves present a more energy-efficient and cleaner-burning vehicle for cooking than traditional wood-burning stoves, yet many existing cookstoves reduce emissions by only modest amounts. Further research into cookstove use, fuel types, and verification of emissions is needed as adoption rates for such stoves remain low. Accelerated innovation requires techniques for measuring and verifying such cookstove performance. The overarching goal of the proposed program was to develop a low-cost, wireless instrument to provide a high-resolution profile of the cookstove BC emissions and usage in the field. We proposed transferring the complexity of analysis away from the sampling hardware at the measurement site and to software at a centrally located server to easily analyze data from thousands of sampling instruments. We were able to build a low-cost field-based instrument that produces repeatable, low-cost estimates of cookstove usage, fuel estimates, and emission values with low variability. Emission values from our instrument were consistent with published ranges of emissions for similar stove and fuel types.

  6. BATEMANATER: a computer program to estimate and bootstrap mating system variables based on Bateman's principles.

    Science.gov (United States)

    Jones, Adam G

    2015-11-01

    Bateman's principles continue to play a major role in the characterization of genetic mating systems in natural populations. The modern manifestations of Bateman's ideas include the opportunity for sexual selection (i.e. I_s, the variance in relative mating success), the opportunity for selection (i.e. I, the variance in relative reproductive success) and the Bateman gradient (i.e. β_ss, the slope of the least-squares regression of reproductive success on mating success). These variables serve as the foundation for one convenient approach for the quantification of mating systems. However, their estimation presents at least two challenges, which I address here with a new Windows-based computer software package called BATEMANATER. The first challenge is that confidence intervals for these variables are not easy to calculate. BATEMANATER solves this problem using a bootstrapping approach. The second, more serious, problem is that direct estimates of mating system variables from open populations will typically be biased if some potential progeny or adults are missing from the analysed sample. BATEMANATER addresses this problem using a maximum-likelihood approach to estimate mating system variables from incompletely sampled breeding populations. The current version of BATEMANATER addresses the problem for systems in which progeny can be collected in groups of half- or full-siblings, as would occur when eggs are laid in discrete masses or offspring occur in pregnant females. BATEMANATER has a user-friendly graphical interface and thus represents a new, convenient tool for the characterization and comparison of genetic mating systems. © 2015 John Wiley & Sons Ltd.
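
    A minimal sketch, on synthetic fully sampled data, of the three Bateman variables with percentile-bootstrap confidence intervals. It deliberately omits BATEMANATER's maximum-likelihood correction for incompletely sampled populations, which is the program's main contribution.

```python
import numpy as np

# Sketch on synthetic, fully sampled data: the three Bateman variables
# with percentile-bootstrap CIs. The ML correction for incomplete
# sampling is deliberately omitted here.
rng = np.random.default_rng(2)
mates = rng.poisson(2.0, 60).astype(float)        # mating success
repro = mates * 3 + rng.poisson(1.0, 60)          # reproductive success

def bateman(ms, rs):
    rel_ms, rel_rs = ms / ms.mean(), rs / rs.mean()
    I_s = rel_ms.var(ddof=1)                      # opportunity for sexual selection
    I = rel_rs.var(ddof=1)                        # opportunity for selection
    beta_ss = np.polyfit(rel_ms, rel_rs, 1)[0]    # Bateman gradient
    return np.array([I_s, I, beta_ss])

boot = []
for _ in range(2000):
    i = rng.integers(0, len(mates), len(mates))   # resample individuals
    boot.append(bateman(mates[i], repro[i]))
lo, hi = np.percentile(np.array(boot), [2.5, 97.5], axis=0)

for name, est, l, h in zip(["I_s", "I", "beta_ss"], bateman(mates, repro), lo, hi):
    print(f"{name}: {est:.3f}  (95% CI {l:.3f} to {h:.3f})")
```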

  7. Increasing precision of turbidity-based suspended sediment concentration and load estimates.

    Science.gov (United States)

    Jastram, John D; Zipper, Carl E; Zelazny, Lucian W; Hyer, Kenneth E

    2010-01-01

    Turbidity is an effective tool for estimating and monitoring suspended sediments in aquatic systems. Turbidity can be measured in situ remotely and at fine temporal scales as a surrogate for suspended sediment concentration (SSC), providing opportunity for a more complete record of SSC than is possible with physical sampling approaches. However, there is variability in turbidity-based SSC estimates and in sediment loadings calculated from those estimates. This study investigated the potential to improve turbidity-based SSC, and by extension the resulting sediment loading estimates, by incorporating hydrologic variables that can be monitored remotely and continuously (typically 15-min intervals) into the SSC estimation procedure. On the Roanoke River in southwestern Virginia, hydrologic stage, turbidity, and other water-quality parameters were monitored with in situ instrumentation; suspended sediments were sampled manually during elevated turbidity events; samples were analyzed for SSC and physical properties including particle-size distribution and organic C content; and rainfall was quantified by geologic source area. The study identified physical properties of the suspended-sediment samples that contribute to SSC estimation variance and hydrologic variables that explained variability of those physical properties. Results indicated that the inclusion of any of the measured physical properties in turbidity-based SSC estimation models reduces unexplained variance. Further, the use of hydrologic variables to represent these physical properties, along with turbidity, resulted in a model, relying solely on data collected remotely and continuously, that estimated SSC with less variance than a conventional turbidity-based univariate model, allowing a more precise estimate of sediment loading. Modeling results are consistent with known mechanisms governing sediment transport in hydrologic systems.
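
    The gist of the modeling comparison can be illustrated in a few lines: a univariate turbidity rating versus a model that adds a continuously monitored hydrologic variable (stage is used here as a stand-in for the physical properties). All numbers are synthetic, not the Roanoke River data.

```python
import numpy as np

# Sketch: univariate turbidity rating vs. a multivariate model adding
# stage as a stand-in for the physical properties. Synthetic data.
rng = np.random.default_rng(3)
n = 300
turb = rng.lognormal(3.0, 0.8, n)                # turbidity (FNU)
stage = rng.normal(1.5, 0.4, n)                  # stage (m)
ssc = 0.9 * turb * (1 + 0.5 * (stage - 1.5)) + rng.normal(0, 15, n)  # SSC (mg/L)

def ols_rmse(X, y):
    """Fit OLS with intercept and return the in-sample RMSE."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return float(np.sqrt(np.mean((y - X1 @ beta) ** 2)))

print("turbidity only RMSE:   ", ols_rmse(turb[:, None], ssc))
print("turbidity + stage RMSE:", ols_rmse(np.column_stack([turb, stage, turb * stage]), ssc))
```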

  8. Neutron-multiplication measurement instrument

    Energy Technology Data Exchange (ETDEWEB)

    Nixon, K.V.; Dowdy, E.J.; France, S.W.; Millegan, D.R.; Robba, A.A.

    1982-01-01

    The Advanced Nuclear Technology Group of the Los Alamos National Laboratory is now using intelligent data-acquisition and analysis instrumentation for determining the multiplication of nuclear material. Earlier instrumentation, such as the large NIM-crate systems, depended on house power and required additional computation to determine multiplication or to estimate error. The portable, battery-powered multiplication measurement unit, with advanced computational power, acquires data, calculates multiplication, and completes error analysis automatically. Thus, the multiplication is determined easily and an available error estimate enables the user to judge the significance of results.

  9. Neutron multiplication measurement instrument

    International Nuclear Information System (INIS)

    Nixon, K.V.; Dowdy, E.J.; France, S.W.; Millegan, D.R.; Robba, A.A.

    1983-01-01

    The Advanced Nuclear Technology Group of the Los Alamos National Laboratory is now using intelligent data-acquisition and analysis instrumentation for determining the multiplication of nuclear material. Earlier instrumentation, such as the large NIM-crate systems, depended on house power and required additional computation to determine multiplication or to estimate error. The portable, battery-powered multiplication measurement unit, with advanced computational power, acquires data, calculates multiplication, and completes error analysis automatically. Thus, the multiplication is determined easily and an available error estimate enables the user to judge the significance of results

  10. Neutron-multiplication measurement instrument

    International Nuclear Information System (INIS)

    Nixon, K.V.; Dowdy, E.J.; France, S.W.; Millegan, D.R.; Robba, A.A.

    1982-01-01

    The Advanced Nuclear Technology Group of the Los Alamos National Laboratory is now using intelligent data-acquisition and analysis instrumentation for determining the multiplication of nuclear material. Earlier instrumentation, such as the large NIM-crate systems, depended on house power and required additional computation to determine multiplication or to estimate error. The portable, battery-powered multiplication measurement unit, with advanced computational power, acquires data, calculates multiplication, and completes error analysis automatically. Thus, the multiplication is determined easily and an available error estimate enables the user to judge the significance of results

  11. Essential climatic variables estimation with satellite imagery

    Science.gov (United States)

    Kolotii, A.; Kussul, N.; Shelestov, A.; Lavreniuk, M. S.

    2016-12-01

    According to Sendai Framework for Disaster Risk Reduction 2015 - 2030 Leaf Area Index (LAI) is considered as one of essential climatic variables. This variable represents the amount of leaf material in ecosystems and controls the links between biosphere and atmosphere through various processes and enables monitoring and quantitative assessment of vegetation state. LAI has added value for such important global resources monitoring tasks as drought mapping and crop yield forecasting with use of data from different sources [1-2]. Remote sensing data from space can be used to estimate such biophysical parameter at regional and national scale. High temporal satellite imagery is usually required to capture main parameters of crop growth [3]. The Sentinel-2 mission launched in 2015 by ESA is a source of high spatial and temporal resolution satellite imagery for mapping biophysical parameters. Products created with use of the automated Sen2-Agri system deployed during the Sen2-Agri country level demonstration project for Ukraine will be compared with our independent results of biophysical parameters mapping. References Shelestov, A., Kolotii, A., Camacho, F., Skakun, S., Kussul, O., Lavreniuk, M., & Kostetsky, O. (2015, July). Mapping of biophysical parameters based on high resolution EO imagery for JECAM test site in Ukraine. In 2015 IEEE International Geoscience and Remote Sensing Symposium (IGARSS), 1733-1736 Kolotii, A., Kussul, N., Shelestov, A., Skakun, S., Yailymov, B., Basarab, R., ... & Ostapenko, V. (2015). Comparison of biophysical and satellite predictors for wheat yield forecasting in Ukraine. The International Archives of Photogrammetry, Remote Sensing and Spatial Information Sciences, 40(7), 39-44. Kussul, N., Lemoine, G., Gallego, F. J., Skakun, S. V., Lavreniuk, M., & Shelestov, A. Y. Parcel-Based Crop Classification in Ukraine Using Landsat-8 Data and Sentinel-1A Data. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 9(6), 2500-2508.

  12. Cardiac parasympathetic outflow during dynamic exercise in humans estimated from power spectral analysis of P-P interval variability.

    Science.gov (United States)

    Takahashi, Makoto; Nakamoto, Tomoko; Matsukawa, Kanji; Ishii, Kei; Watanabe, Tae; Sekikawa, Kiyokazu; Hamada, Hironobu

    2016-03-01

    What is the central question of this study? Should we use the high-frequency (HF) component of the P-P interval as an index of cardiac parasympathetic nerve activity during moderate exercise? What is the main finding and its importance? The HF component of P-P interval variability remained even at a heart rate of 120-140 beats min⁻¹ and was further reduced by atropine, indicating incomplete cardiac vagal withdrawal during moderate exercise. The HF component of the R-R interval is invalid as an estimate of cardiac parasympathetic outflow during moderate exercise; instead, the HF component of P-P interval variability should be used. The high-frequency (HF) component of R-R interval variability has been widely used as an indirect estimate of cardiac parasympathetic (vagal) outflow to the sino-atrial node of the heart. However, we have recently found that the variability of the R-R interval becomes much smaller during dynamic exercise than that of the P-P interval above a heart rate (HR) of ∼100 beats min⁻¹. We hypothesized that cardiac parasympathetic outflow during dynamic exercise with a higher intensity may be better estimated using the HF component of P-P interval variability. To test this hypothesis, the HF components of both P-P and R-R interval variability were analysed using a wavelet transform during dynamic exercise. Twelve subjects performed ergometer exercise to increase HR from the baseline of 69 ± 3 beats min⁻¹ to three different levels of 100, 120 and 140 beats min⁻¹. We also examined the effect of atropine sulfate on the HF components in eight of the 12 subjects during exercise at an HR of 140 beats min⁻¹. The HF component of P-P interval variability was significantly greater than that of R-R interval variability during exercise, especially at the HRs of 120 and 140 beats min⁻¹. The HF component of P-P interval variability was more reduced by atropine than that of R-R interval variability. We conclude that cardiac parasympathetic outflow to the

  13. Variation in the estimations of ETo and crop water use due to the sensor accuracy of the meteorological variables

    Directory of Open Access Journals (Sweden)

    R. Moratiel

    2013-06-01

    Full Text Available In agricultural ecosystems the use of evapotranspiration (ET) to improve irrigation water management is generally widespread. Commonly, the crop ET (ETc) is estimated by multiplying the reference crop evapotranspiration (ETo) by a crop coefficient (Kc). Accurate estimation of ETo is critical because it is the main factor affecting the calculation of crop water use and water management. The ETo is generally estimated from recorded meteorological variables at reference weather stations. The main objective of this paper was assessing the effect of the uncertainty due to random noise in the sensors used for measurement of meteorological variables on the estimation of ETo, crop ET and net irrigation requirements of grain corn and alfalfa in three irrigation districts of the middle Ebro River basin. Five scenarios were simulated, four of them individually considering each recorded meteorological variable (temperature, relative humidity, solar radiation and wind speed) and a fifth scenario combining together the uncertainty of all sensors. The uncertainty in relative humidity for irrigation districts Riegos del Alto Aragón (RAA) and Bardenas (BAR), and temperature for irrigation district Canal de Aragón y Cataluña (CAC), were the two most important factors affecting the estimation of ETo, corn ET (ETc_corn), alfalfa ET (ETc_alf), net corn irrigation water requirements (IRncorn) and net alfalfa irrigation water requirements (IRnalf). Nevertheless, this effect was never greater than ±0.5% on an annual time scale. The wind speed variable (Scenario 3) was the third most influential variable in the fluctuations (±) of evapotranspiration, followed by solar radiation. Considering the accuracy of all sensors on an annual time scale, the variation was about ±1% of ETo, ETc_corn, ETc_alf, IRncorn, and IRnalf. The fluctuations of evapotranspiration were higher at shorter time scales. The daily fluctuation of ETo remained lower than 5% during the growing season of corn and
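
    The noise-propagation idea generalizes readily to a small Monte Carlo sketch. For brevity the example below perturbs temperatures in the Hargreaves ETo equation rather than the full Penman-Monteith formulation normally used for reference ETo, so the numbers are purely illustrative.

```python
import numpy as np

# Monte Carlo sketch of sensor-noise propagation into ETo. Hargreaves
# is used instead of Penman-Monteith for brevity; values are invented.
rng = np.random.default_rng(4)

def eto_hargreaves(tmax, tmin, ra=30.0):
    """Hargreaves ETo (mm/day); ra in MJ m-2 d-1, 0.408 converts to mm."""
    tmean = (tmax + tmin) / 2.0
    return 0.0023 * 0.408 * ra * (tmean + 17.8) * np.sqrt(tmax - tmin)

tmax, tmin = 32.0, 17.0                       # one summer day (made up)
base = eto_hargreaves(tmax, tmin)

noise = rng.uniform(-0.5, 0.5, (10_000, 2))   # +/-0.5 C sensor accuracy band
samples = eto_hargreaves(tmax + noise[:, 0], tmin + noise[:, 1])
print(f"ETo = {base:.2f} mm/day; noise-induced spread = "
      f"{100 * samples.std() / base:.2f}% (1 sigma)")
```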

  14. Direct estimation and correction of bias from temporally variable non-stationary noise in a channelized Hotelling model observer.

    Science.gov (United States)

    Fetterly, Kenneth A; Favazza, Christopher P

    2016-08-07

    Channelized Hotelling model observer (CHO) methods were developed to assess performance of an x-ray angiography system. The analytical methods included correction for known bias error due to finite sampling. Detectability indices (d′) corresponding to disk-shaped objects with diameters in the range 0.5-4 mm were calculated. Application of the CHO for variable detector target dose (DTD) in the range 6-240 nGy frame⁻¹ resulted in d′ estimates which were as much as 2.9× greater than expected of a quantum-limited system. Over-estimation of d′ was presumed to be a result of bias error due to temporally variable non-stationary noise. Statistical theory which allows for independent contributions of 'signal' from a test object (o) and temporally variable non-stationary noise (ns) was developed. The theory demonstrates that the biased d′ is the sum of the detectability indices associated with the test object (d′_o) and non-stationary noise (d′_ns). Given the nature of the imaging system and the experimental methods, d′_o cannot be directly determined independent of d′_ns. However, methods to estimate d′_ns independent of d′_o were developed. In accordance with the theory, d′_ns was subtracted from experimental estimates of d′, providing an unbiased estimate of d′_o. Estimates of d′_o exhibited trends consistent with expectations of an angiography system that is quantum limited for high DTD and compromised by detector electronic readout noise for low DTD conditions. Results suggest that these methods provide d′ estimates which are accurate and precise for [Formula: see text]. Further, results demonstrated that the source of bias was detector electronic readout noise. In summary, this work presents theory and methods to test for the

  15. NASA Instrument Cost/Schedule Model

    Science.gov (United States)

    Habib-Agahi, Hamid; Mrozinski, Joe; Fox, George

    2011-01-01

    NASA's Office of Independent Program and Cost Evaluation (IPCE) has established a number of initiatives to improve its cost and schedule estimating capabilities. One of these initiatives has resulted in the JPL-developed NASA Instrument Cost Model (NICM). NICM is a cost and schedule estimator that contains: a system level cost estimation tool; a subsystem level cost estimation tool; a database of cost and technical parameters of over 140 previously flown remote sensing and in-situ instruments; a schedule estimator; a set of rules to estimate cost and schedule by life cycle phases (B/C/D); and a novel tool for developing joint probability distributions for cost and schedule risk (Joint Confidence Level (JCL)). This paper describes the development and use of NICM, including the data normalization processes, data mining methods (cluster analysis, principal components analysis, regression analysis and bootstrap cross validation), the estimating equations themselves and a demonstration of the NICM tool suite.

  16. Estimating the Persistence and the Autocorrelation Function of a Time Series that is Measured with Error

    DEFF Research Database (Denmark)

    Hansen, Peter Reinhard; Lunde, Asger

    An economic time series can often be viewed as a noisy proxy for an underlying economic variable. Measurement errors will influence the dynamic properties of the observed process and may conceal the persistence of the underlying time series. In this paper we develop instrumental variable (IV) ... application despite the large sample. Unit root tests based on the IV estimator have better finite sample properties in this context.
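
    The mechanics can be shown on an AR(1) signal observed with measurement error: OLS of y_t on y_{t-1} is attenuated by the noise, while instrumenting y_{t-1} with y_{t-2} recovers the persistence, because the measurement errors are serially uncorrelated. This is a sketch of the general idea, not the paper's exact estimator.

```python
import numpy as np

# Sketch: AR(1) persistence under measurement error. OLS on the noisy
# series is attenuated; instrumenting y_{t-1} with y_{t-2} is consistent.
rng = np.random.default_rng(5)
rho, n = 0.9, 200_000
x = np.zeros(n)
for t in range(1, n):
    x[t] = rho * x[t - 1] + rng.normal()      # latent AR(1) process
y = x + rng.normal(0, 1.5, n)                 # noisy proxy of the true series

y0, y1, y2 = y[2:], y[1:-1], y[:-2]
ols = np.cov(y0, y1)[0, 1] / y1.var()                 # biased toward zero
iv = np.cov(y0, y2)[0, 1] / np.cov(y1, y2)[0, 1]      # consistent for rho
print(f"true rho = {rho}, OLS = {ols:.3f} (attenuated), IV = {iv:.3f}")
```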

  17. Fuzzy associative memories for instrument fault detection

    International Nuclear Information System (INIS)

    Heger, A.S.

    1996-01-01

    A fuzzy logic instrument fault detection scheme is developed for systems having two or three redundant sensors. In the fuzzy logic approach the deviation between each signal pairing is computed and classified into three fuzzy sets. A rule base is created allowing the human perception of the situation to be represented mathematically. Fuzzy associative memories are then applied. Finally, a defuzzification scheme is used to find the centroid location, and hence the signal status. Real-time analyses are carried out to evaluate the instantaneous signal status as well as the long-term results for the sensor set. Instantaneous signal validation results are used to compute a best estimate for the measured state variable. The long-term sensor validation method uses a frequency fuzzy variable to determine the signal condition over a specific period. To corroborate the methodology synthetic data representing various anomalies are analyzed with both the fuzzy logic technique and the parity space approach. (Author)

  18. An empirical analysis of women’s working time, and an estimation of female labour supply in Italy

    Directory of Open Access Journals (Sweden)

    Maria Gabriella Campolo

    2013-05-01

    Full Text Available Empirical studies show that misspecification of the married (or cohabiting) women's labour supply equation may produce inefficient wage-elasticity estimates. In order to reduce the variability of these estimates, we suggest a new approach based on instrumental variables given by the economic value of the domestic unpaid work of women. Using Italian micro-data on time use (ISTAT Survey on Time Use), and applying both a parametric and a semiparametric procedure, we estimate robust wage-elasticity coefficients of married women's labour supply. Our results suggest that women's labour supply is negatively influenced by the indirect cost of their informal activity of childcare and domestic work.

  19. Instrument uncertainty predictions

    International Nuclear Information System (INIS)

    Coutts, D.A.

    1991-07-01

    The accuracy of measurements and correlations should normally be provided for most experimental activities. The uncertainty is a measure of the accuracy of a stated value or equation. The uncertainty term reflects a combination of instrument errors, modeling limitations, and deficiencies in phenomena understanding. This report provides several methodologies to estimate an instrument's uncertainty when used in experimental work. Methods are shown to predict both pre-test and post-test uncertainties.
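
    One common methodology of this kind, shown here as a hedged sketch rather than the report's own procedure, is to combine elemental bias and precision terms by root-sum-square and report a total uncertainty at roughly 95% coverage:

```python
import math

# Hedged sketch: root-sum-square (RSS) combination of elemental
# uncertainties, with total uncertainty at ~95% coverage (t ~ 2).
# Component values are illustrative, not from the report.
bias_terms = [0.5, 0.2]          # systematic (bias) limits, e.g. % of full scale
precision_terms = [0.3, 0.4]     # random (precision) standard uncertainties

B = math.sqrt(sum(b ** 2 for b in bias_terms))
S = math.sqrt(sum(p ** 2 for p in precision_terms))
U95 = math.sqrt(B ** 2 + (2 * S) ** 2)
print(f"bias B = {B:.2f}, precision S = {S:.2f}, total U95 = {U95:.2f}")
```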

  20. How the 2SLS/IV estimator can handle equality constraints in structural equation models: a system-of-equations approach.

    Science.gov (United States)

    Nestler, Steffen

    2014-05-01

    Parameters in structural equation models are typically estimated using the maximum likelihood (ML) approach. Bollen (1996) proposed an alternative non-iterative, equation-by-equation estimator that uses instrumental variables. Although this two-stage least squares/instrumental variables (2SLS/IV) estimator has good statistical properties, one problem with its application is that parameter equality constraints cannot be imposed. This paper presents a mathematical solution to this problem that is based on an extension of the 2SLS/IV approach to a system of equations. We present an example in which our approach was used to examine strong longitudinal measurement invariance. We also investigated the new approach in a simulation study that compared it with ML in the examination of the equality of two latent regression coefficients and strong measurement invariance. Overall, the results show that the suggested approach is a useful extension of the original 2SLS/IV estimator and allows for the effective handling of equality constraints in structural equation models. © 2013 The British Psychological Society.

  1. Estimating search engine index size variability: a 9-year longitudinal study.

    Science.gov (United States)

    van den Bosch, Antal; Bogers, Toine; de Kunder, Maurice

    One of the determining factors of the quality of Web search engines is the size of their index. In addition to its influence on search result quality, the size of the indexed Web can also tell us something about which parts of the WWW are directly accessible to the everyday user. We propose a novel method of estimating the size of a Web search engine's index by extrapolating from document frequencies of words observed in a large static corpus of Web pages. In addition, we provide a unique longitudinal perspective on the size of Google and Bing's indices over a nine-year period, from March 2006 until January 2015. We find that index size estimates of these two search engines tend to vary dramatically over time, with Google generally possessing a larger index than Bing. This result raises doubts about the reliability of previous one-off estimates of the size of the indexed Web. We find that much, if not all of this variability can be explained by changes in the indexing and ranking infrastructure of Google and Bing. This casts further doubt on whether Web search engines can be used reliably for cross-sectional webometric studies.
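
    The extrapolation itself is a one-liner per word: if a word appears in a known fraction of a large static corpus, the engine's reported hit count for that word scales up to an index-size estimate, and averaging over many words stabilizes it. The document frequencies and hit counts below are invented.

```python
# Sketch: extrapolate index size from word document frequencies.
# Corpus frequencies and engine hit counts below are invented.
corpus_df = {"the": 980_000, "variable": 52_000, "turbidity": 900}
N_corpus = 1_000_000                      # documents in the static corpus

engine_hits = {"the": 24_100_000_000,     # reported hits per query word
               "variable": 1_300_000_000,
               "turbidity": 21_000_000}

# Each word gives one estimate: hits * (corpus size / corpus df).
estimates = [engine_hits[w] * N_corpus / corpus_df[w] for w in corpus_df]
index_size = sum(estimates) / len(estimates)
print(f"estimated index size ~ {index_size:,.0f} documents")
```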

  2. Estimating variability in placido-based topographic systems.

    Science.gov (United States)

    Kounis, George A; Tsilimbaris, Miltiadis K; Kymionis, George D; Ginis, Harilaos S; Pallikaris, Ioannis G

    2007-10-01

    To describe a new software tool for the detailed presentation of corneal topography measurement variability by means of color-coded maps. Software was developed in Visual Basic to analyze and process a series of 10 consecutive measurements obtained by a topographic system on calibration spheres, and individuals with emmetropic, low, high, and irregular astigmatic corneas. The corneal surface was segmented into 1200 segments and the coefficient of variation of each segment's keratometric dioptric power was used as the measure of variability. The results were presented graphically in color-coded maps (Variability Maps). Two topographic systems, the TechnoMed C-Scan and the TOMEY Topographic Modeling System (TMS-2N), were examined to demonstrate our method. Graphic representation of the coefficient of variation offered a detailed representation of examination variability both in calibration surfaces and human corneas. It was easy to recognize an increase in variability as the irregularity of examination surfaces increased. In individuals with high and irregular astigmatism, a variability pattern correlated with the pattern of corneal topography: steeper corneal areas possessed higher variability values compared with flatter areas of the same cornea. Numerical data permitted direct comparisons and statistical analysis. We propose a method that permits a detailed evaluation of the variability of corneal topography measurements. The representation of the results both graphically and quantitatively improves interpretability and facilitates a spatial correlation of variability maps with original topography maps. Given the popularity of topography based custom refractive ablations of the cornea, it is possible that variability maps may assist clinicians in the evaluation of corneal topography maps of patients with very irregular corneas, before custom ablation procedures.
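
    The core computation behind a variability map is small. A sketch with simulated exams (the segment count follows the description above; everything else is invented):

```python
import numpy as np

# Sketch: per-segment coefficient of variation across 10 repeated
# exams and 1200 segments as in the paper; the data are simulated.
rng = np.random.default_rng(6)
n_exams, n_segments = 10, 1200
power = rng.normal(43.5, 0.15, (n_exams, n_segments))   # keratometric power (D)
power[:, :100] += rng.normal(0, 0.6, (n_exams, 100))    # a simulated irregular zone

cv = power.std(axis=0, ddof=1) / power.mean(axis=0) * 100   # CV (%) per segment
print(f"median CV = {np.median(cv):.3f}%, max CV = {cv.max():.3f}%")
# Reshaped onto the corneal grid, cv is exactly what a color-coded
# variability map displays.
```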

  3. An automated performance budget estimator: a process for use in instrumentation

    Science.gov (United States)

    Laporte, Philippe; Schnetler, Hermine; Rees, Phil

    2016-08-01

    Present-day astronomy projects continue to grow in size and complexity, regardless of the wavelength domain, while risks in terms of safety, cost and operability have to be reduced to ensure an affordable total cost of ownership. All of these drivers have to be considered carefully during the development of an astronomy project, at the same time as there is a strong drive to shorten the development life-cycle. From the systems engineering point of view, this evolution is a significant challenge. Big instruments imply management of interfaces within large consortia and dealing with tight design-phase schedules, which necessitates efficient and rapid interactions between all the stakeholders, firstly to ensure that the system is defined correctly and secondly that the designs will meet all the requirements. It is essential that team members respond quickly so that the time available to the design team is maximised. In this context, performance prediction tools can be very helpful during the concept phase of a project to help select the best design solution. In the first section of this paper we present the development of such a prediction tool that can be used by the systems engineer to determine the overall performance of the system and to evaluate the impact on the science based on the proposed design. This tool can also be used in "what-if" design analyses to assess the impact on the overall performance of the system based on the simulated numbers calculated by the automated system performance prediction tool. Having such a tool available from the beginning of a project allows, firstly, for a faster turn-around between the design engineers and the systems engineer and, secondly, between the systems engineer and the instrument scientist. Following the first section we describe the process for constructing a performance estimator tool, and then describe three projects in which such a tool has been utilised to illustrate

  4. “You Can’t Play a Sad Song on the Banjo:” Acoustic Factors in the Judgment of Instrument Capacity to Convey Sadness

    OpenAIRE

    David Huron; Neesha Anderson; Daniel Shanahan

    2014-01-01

    Forty-four Western-enculturated musicians completed two studies. The first group was asked to judge the relative sadness of forty-four familiar Western instruments. An independent group was asked to assess a number of acoustical properties for those same instruments. Using the estimated acoustical properties as predictor variables in a multiple regression analysis, a significant correlation was found between those properties known to contribute to sad prosody in speech and the judged sadness ...

  5. A method for estimating spatially variable seepage and hydraulic conductivity in channels with very mild slopes

    Science.gov (United States)

    Shanafield, Margaret; Niswonger, Richard G.; Prudic, David E.; Pohll, Greg; Susfalk, Richard; Panday, Sorab

    2014-01-01

    Infiltration along ephemeral channels plays an important role in groundwater recharge in arid regions. A model is presented for estimating spatial variability of seepage due to streambed heterogeneity along channels based on measurements of streamflow-front velocities in initially dry channels. The diffusion-wave approximation to the Saint-Venant equations, coupled with Philip's equation for infiltration, is connected to the groundwater model MODFLOW and is calibrated by adjusting the saturated hydraulic conductivity of the channel bed. The model is applied to portions of two large water delivery canals, which serve as proxies for natural ephemeral streams. Estimated seepage rates compare well with previously published values. Possible sources of error stem from uncertainty in Manning's roughness coefficients, soil hydraulic properties and channel geometry. Model performance would be most improved through more frequent longitudinal estimates of channel geometry and thalweg elevation, and with measurements of stream stage over time to constrain wave timing and shape. This model is a potentially valuable tool for estimating spatial variability in longitudinal seepage along intermittent and ephemeral channels over a wide range of bed slopes and the influence of seepage rates on groundwater levels.

  6. Estimation of Foot Plantar Center of Pressure Trajectories with Low-Cost Instrumented Insoles Using an Individual-Specific Nonlinear Model

    Directory of Open Access Journals (Sweden)

    Xinyao Hu

    2018-02-01

    Full Text Available Postural control is a complex skill based on the interaction of dynamic sensorimotor processes, and can be challenging for people with deficits in sensory functions. The foot plantar center of pressure (COP) has often been used for quantitative assessment of postural control. Previously, the foot plantar COP was mainly measured by force plates or complicated and expensive insole-based measurement systems. Although some low-cost instrumented insoles have been developed, their ability to accurately estimate the foot plantar COP trajectory was not robust. In this study, a novel individual-specific nonlinear model was proposed to estimate the foot plantar COP trajectories with an instrumented insole based on low-cost force sensitive resistors (FSRs). The model coefficients were determined by a least square error approximation algorithm. Model validation was carried out by comparing the estimated COP data with the reference data in a variety of postural control assessment tasks. We also compared our data with the COP trajectories estimated by the previously well accepted weighted mean approach. Comparing with the reference measurements, the average root mean square errors of the COP trajectories of both feet were 2.23 mm (±0.64) (left foot) and 2.72 mm (±0.83) (right foot) along the medial–lateral direction, and 9.17 mm (±1.98) (left foot) and 11.19 mm (±2.98) (right foot) along the anterior–posterior direction. The results are superior to those reported in previous relevant studies, and demonstrate that our proposed approach can be used for accurate foot plantar COP trajectory estimation. This study could provide an inexpensive solution to fall risk assessment in home settings or community healthcare center for the elderly. It has the potential to help prevent future falls in the elderly.

  7. Estimation of Foot Plantar Center of Pressure Trajectories with Low-Cost Instrumented Insoles Using an Individual-Specific Nonlinear Model.

    Science.gov (United States)

    Hu, Xinyao; Zhao, Jun; Peng, Dongsheng; Sun, Zhenglong; Qu, Xingda

    2018-02-01

    Postural control is a complex skill based on the interaction of dynamic sensorimotor processes, and can be challenging for people with deficits in sensory functions. The foot plantar center of pressure (COP) has often been used for quantitative assessment of postural control. Previously, the foot plantar COP was mainly measured by force plates or complicated and expensive insole-based measurement systems. Although some low-cost instrumented insoles have been developed, their ability to accurately estimate the foot plantar COP trajectory was not robust. In this study, a novel individual-specific nonlinear model was proposed to estimate the foot plantar COP trajectories with an instrumented insole based on low-cost force sensitive resistors (FSRs). The model coefficients were determined by a least square error approximation algorithm. Model validation was carried out by comparing the estimated COP data with the reference data in a variety of postural control assessment tasks. We also compared our data with the COP trajectories estimated by the previously well accepted weighted mean approach. Comparing with the reference measurements, the average root mean square errors of the COP trajectories of both feet were 2.23 mm (±0.64) (left foot) and 2.72 mm (±0.83) (right foot) along the medial-lateral direction, and 9.17 mm (±1.98) (left foot) and 11.19 mm (±2.98) (right foot) along the anterior-posterior direction. The results are superior to those reported in previous relevant studies, and demonstrate that our proposed approach can be used for accurate foot plantar COP trajectory estimation. This study could provide an inexpensive solution to fall risk assessment in home settings or community healthcare center for the elderly. It has the potential to help prevent future falls in the elderly.
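
    A hedged sketch of the comparison described above: a weighted-mean COP computed directly from the sensor readings versus an individual-specific model fitted by least squares. The sensor layout, the assumed FSR nonlinearity and the quadratic feature set are illustrative assumptions; the paper's exact model form may differ.

```python
import numpy as np

# Hedged sketch: weighted-mean COP versus an individual-specific model
# fitted by least squares. Sensor layout, the F**0.6 nonlinearity and
# the quadratic feature set are assumptions of this sketch.
rng = np.random.default_rng(7)
n, k = 2000, 8
sensor_x = np.linspace(-40, 40, k)              # FSR positions along one axis (mm)
F = rng.uniform(0.05, 1.0, (n, k))              # raw FSR readings per frame
true_force = F ** 0.6                           # assumed sensor nonlinearity
cop_ref = (true_force @ sensor_x) / true_force.sum(axis=1)  # reference COP

# Weighted-mean estimate straight from the raw readings.
cop_wm = (F @ sensor_x) / F.sum(axis=1)

# Individual-specific model: intercept + linear + squared sensor terms,
# calibrated on half the frames and evaluated on the other half.
X = np.column_stack([np.ones(n), F, F ** 2])
train, test = slice(0, n // 2), slice(n // 2, None)
beta, *_ = np.linalg.lstsq(X[train], cop_ref[train], rcond=None)
cop_model = X[test] @ beta

rmse = lambda a, b: float(np.sqrt(np.mean((a - b) ** 2)))
print("weighted-mean RMSE (mm):", rmse(cop_wm[test], cop_ref[test]))
print("fitted-model RMSE (mm): ", rmse(cop_model, cop_ref[test]))
```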

  8. Variability in the carbon storage of seagrass habitats and its implications for global estimates of blue carbon ecosystem service.

    Directory of Open Access Journals (Sweden)

    Paul S Lavery

    Full Text Available The recent focus on carbon trading has intensified interest in 'Blue Carbon': carbon sequestered by coastal vegetated ecosystems, particularly seagrasses. Most information on seagrass carbon storage is derived from studies of a single species, Posidonia oceanica, from the Mediterranean Sea. We surveyed 17 Australian seagrass habitats to assess the variability in their sedimentary organic carbon (C_org) stocks. The habitats encompassed 10 species, in mono-specific or mixed meadows, depositional to exposed habitats and temperate to tropical habitats. There was an 18-fold difference in the C_org stock (1.09-20.14 mg C_org cm⁻³) for a temperate Posidonia sinuosa and a temperate, estuarine P. australis meadow, respectively. Integrated over the top 25 cm of sediment, this equated to an areal stock of 262-4833 g C_org m⁻². For some species, there was an effect of water depth on the C_org stocks, with greater stocks in deeper sites; no differences were found among sub-tidal and inter-tidal habitats. The estimated carbon storage in Australian seagrass ecosystems, taking into account inter-habitat variability, was 155 Mt. At a 2014-15 fixed carbon price of A$25.40 t⁻¹ and an estimated market price of $35 t⁻¹ in 2020, the C_org stock in the top 25 cm of seagrass habitats has a potential value of A$3.9-5.4 billion. The estimates of annual C_org accumulation by Australian seagrasses ranged from 0.093 to 6.15 Mt, with a most probable estimate of 0.93 Mt y⁻¹ (10.1 t km⁻² y⁻¹). These estimates, while large, were one-third of those that would be calculated if inter-habitat variability in carbon stocks were not taken into account. We conclude that there is an urgent need for more information on the variability in seagrass carbon stock and accumulation rates, and the factors driving this variability, in order to improve global estimates of seagrass Blue Carbon storage.

  9. Estimating risks of radiotherapy complications as part of informed consent: the high degree of variability between radiation oncologists may be related to experience

    International Nuclear Information System (INIS)

    Shakespeare, Thomas Philip; Dwyer, Mary; Mukherjee, Rahul; Yeghiaian-Alvandi, Roland; Gebski, Val

    2002-01-01

    Purpose: Estimating the risks of radiotherapy (RT) toxicity is important for informed consent; however, the consistency in estimates has not been studied. This study aimed to explore the variability and factors affecting risk estimates (REs). Methods and Materials: A survey was mailed to Australian radiation oncologists, who were asked to estimate risks of RT complications given 49 clinical scenarios. The REs were assessed for association with oncologist experience, subspecialization, and private practice. Results: The REs were extremely variable, with a 50-fold median variability. The least variability (sevenfold) was for estimates of late, small intestinal perforation/obstruction after a one-third volume received 50 Gy with concurrent 5-fluorouracil (RE range 5-35%). The variation between the smallest and largest REs in 17 scenarios was ≥100-fold. The years of experience was significantly associated with REs of soft/connective-tissue toxicity (p=0.01) but inversely associated with estimates of neurologic/central nervous system toxicity (p=0.08). Ninety-six percent of respondents believed REs were important to RT practice; only 24% rated evidence to support their estimates as good. Sixty-seven percent believed national/international groups should pursue the issue further. Conclusion: Enormous variability exists in REs for normal tissue complications due to RT that is influenced by the years of experience. Risk estimation is perceived as an important issue without a good evidence base. Additional studies are strongly recommended

  10. Skills, earnings, and employment: exploring causality in the estimation of returns to skills

    Directory of Open Access Journals (Sweden)

    Franziska Hampf

    2017-04-01

    Full Text Available Ample evidence indicates that a person's human capital is important for success on the labor market in terms of both wages and employment prospects. However, unlike the efforts to identify the impact of school attainment on labor-market outcomes, the literature on returns to cognitive skills has not yet provided convincing evidence that the estimated returns can be causally interpreted. Using the PIAAC Survey of Adult Skills, this paper explores several approaches that aim to address potential threats to causal identification of returns to skills, in terms of both higher wages and better employment chances. We address measurement error by exploiting the fact that PIAAC measures skills in several domains. Furthermore, we estimate instrumental-variable models that use skill variation stemming from school attainment and parental education to circumvent reverse causation. Results show a strikingly similar pattern across the diverse set of countries in our sample. In fact, the instrumental-variable estimates are consistently larger than those found in standard least-squares estimations. The same is true in two "natural experiments," one of which exploits variation in skills from changes in compulsory-schooling laws across U.S. states. The other one identifies technologically induced variation in broadband Internet availability that gives rise to variation in ICT skills across German municipalities. Together, the results suggest that least-squares estimates may provide a lower bound of the true returns to skills in the labor market.

  11. Simultaneous Estimation of Model State Variables and Observation and Forecast Biases Using a Two-Stage Hybrid Kalman Filter

    Science.gov (United States)

    Pauwels, V. R. N.; DeLannoy, G. J. M.; Hendricks Franssen, H.-J.; Vereecken, H.

    2013-01-01

    In this paper, we present a two-stage hybrid Kalman filter to estimate both observation and forecast bias in hydrologic models, in addition to state variables. The biases are estimated using the discrete Kalman filter, and the state variables using the ensemble Kalman filter. A key issue in this multi-component assimilation scheme is the exact partitioning of the difference between observation and forecasts into state, forecast bias and observation bias updates. Here, the error covariances of the forecast bias and the unbiased states are calculated as constant fractions of the biased state error covariance, and the observation bias error covariance is a function of the observation prediction error covariance. In a series of synthetic experiments, focusing on the assimilation of discharge into a rainfall-runoff model, it is shown that both static and dynamic observation and forecast biases can be successfully estimated. The results indicate a strong improvement in the estimation of the state variables and resulting discharge as opposed to the use of a bias-unaware ensemble Kalman filter. Furthermore, minimal code modification in existing data assimilation software is needed to implement the method. The results suggest that a better performance of data assimilation methods should be possible if both forecast and observation biases are taken into account.

  12. Simultaneous estimation of model state variables and observation and forecast biases using a two-stage hybrid Kalman filter

    Directory of Open Access Journals (Sweden)

    V. R. N. Pauwels

    2013-09-01

    Full Text Available In this paper, we present a two-stage hybrid Kalman filter to estimate both observation and forecast bias in hydrologic models, in addition to state variables. The biases are estimated using the discrete Kalman filter, and the state variables using the ensemble Kalman filter. A key issue in this multi-component assimilation scheme is the exact partitioning of the difference between observation and forecasts into state, forecast bias and observation bias updates. Here, the error covariances of the forecast bias and the unbiased states are calculated as constant fractions of the biased state error covariance, and the observation bias error covariance is a function of the observation prediction error covariance. In a series of synthetic experiments, focusing on the assimilation of discharge into a rainfall-runoff model, it is shown that both static and dynamic observation and forecast biases can be successfully estimated. The results indicate a strong improvement in the estimation of the state variables and resulting discharge as opposed to the use of a bias-unaware ensemble Kalman filter. Furthermore, minimal code modification in existing data assimilation software is needed to implement the method. The results suggest that a better performance of data assimilation methods should be possible if both forecast and observation biases are taken into account.
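
    As a compact stand-in for the two-stage scheme, the sketch below augments a scalar model state with a constant forecast bias and runs a single linear Kalman filter; the paper's actual method keeps separate discrete (bias) and ensemble (state) filters and also handles observation bias. All parameter values are invented.

```python
import numpy as np

# Stand-in sketch: one augmented-state Kalman filter estimating a
# scalar state x (known dynamics x' = a*x + bias) plus a constant
# forecast bias, from observations of x alone. Invented parameters.
rng = np.random.default_rng(12)
a, bias_true, q, r = 0.9, 0.8, 0.05, 0.5

F = np.array([[a, 1.0],        # [state, bias] transition: x' = a*x + b
              [0.0, 1.0]])     # bias is constant
H = np.array([[1.0, 0.0]])     # only the state is observed
Q = np.diag([q, 1e-6])
R = np.array([[r]])

x, s, P = 0.0, np.zeros(2), np.eye(2)
for _ in range(300):
    x = a * x + bias_true + rng.normal(0, np.sqrt(q))   # truth
    y = x + rng.normal(0, np.sqrt(r))                   # observation
    s, P = F @ s, F @ P @ F.T + Q                       # forecast step
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)        # Kalman gain
    s = s + (K @ (y - H @ s)).ravel()                   # update step
    P = (np.eye(2) - K @ H) @ P

print(f"state estimate {s[0]:.2f} (truth {x:.2f}); "
      f"bias estimate {s[1]:.2f} (truth {bias_true})")
```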

  13. Estimation of indirect effect when the mediator is a censored variable.

    Science.gov (United States)

    Wang, Jian; Shete, Sanjay

    2017-01-01

    A mediation model explores the direct and indirect effects of an initial variable (X) on an outcome variable (Y) by including a mediator (M). In many realistic scenarios, investigators observe censored data instead of the complete data. Current research in mediation analysis for censored data focuses mainly on censored outcomes, but not censored mediators. In this study, we proposed a strategy based on the accelerated failure time model and a multiple imputation approach. We adapted a measure of the indirect effect for the mediation model with a censored mediator, which can assess the indirect effect at both the group and individual levels. Based on simulation, we established the bias in the estimations of different paths (i.e. the effects of X on M [a], of M on Y [b] and of X on Y given mediator M [c']) and indirect effects when analyzing the data using the existing approaches, including a naïve approach implemented in software such as Mplus, complete-case analysis, and the Tobit mediation model. We conducted simulation studies to investigate the performance of the proposed strategy compared to that of the existing approaches. The proposed strategy accurately estimates the coefficients of different paths, indirect effects and percentages of the total effects mediated. We applied these mediation approaches to the study of SNPs, age at menopause and fasting glucose levels. Our results indicate that there is no indirect effect of association between SNPs and fasting glucose level that is mediated through the age at menopause.

  14. THE QUADRANTS METHOD TO ESTIMATE QUANTITATIVE VARIABLES IN MANAGEMENT PLANS IN THE AMAZON

    Directory of Open Access Journals (Sweden)

    Gabriel da Silva Oliveira

    2015-12-01

    Full Text Available This work aimed to evaluate the accuracy of estimates of abundance, basal area and commercial volume per hectare obtained by the quadrants method applied to an area of 1,000 hectares of rain forest in the Amazon. Samples were simulated by random and systematic processes with different sample sizes, ranging from 100 to 200 sampling points. The values estimated from the samples were compared with the parametric values recorded in the census. In the analysis, we considered as the population all trees with diameter at breast height equal to or greater than 40 cm. The quadrants method did not reach the desired level of accuracy for the variables basal area and commercial volume, overestimating the values recorded in the census. However, the accuracy of the estimates of abundance, basal area and commercial volume was satisfactory for applying the method in forest inventories for management plans in the Amazon.
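
    For reference, a sketch of the classical point-centred quarter ("quadrants") computation on synthetic distances, using the Cottam and Curtis density estimator (an assumption of this sketch; the paper's exact estimator may differ):

```python
import numpy as np

# Sketch of the point-centred quarter ("quadrants") computation on
# synthetic data; density via Cottam & Curtis: 1 / mean_distance^2.
rng = np.random.default_rng(8)
n_points = 150
dists = rng.gamma(4.0, 2.0, (n_points, 4))   # point-to-tree distance (m), 4 quadrants
dbh = rng.uniform(40, 90, (n_points, 4))     # DBH (cm) of trees >= 40 cm

mean_d = dists.mean()
density_ha = 10_000 / mean_d ** 2            # trees per hectare
ba_per_tree = np.pi * (dbh / 200.0) ** 2     # basal area per tree (m^2)
ba_ha = density_ha * ba_per_tree.mean()      # basal area (m^2/ha)
print(f"density ~ {density_ha:.1f} trees/ha, basal area ~ {ba_ha:.2f} m2/ha")
```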

  15. Using Copulas in the Estimation of the Economic Project Value in the Mining Industry, Including Geological Variability

    Science.gov (United States)

    Krysa, Zbigniew; Pactwa, Katarzyna; Wozniak, Justyna; Dudek, Michal

    2017-12-01

    Geological variability is one of the main factors influencing the viability of mining investment projects and the technical risk of geology projects. To date, analyses of the economic viability of new extraction fields for the KGHM Polska Miedź S.A. underground copper mine at the Fore-Sudetic Monocline have been performed under the assumption of a constant, averaged content of useful elements. The research presented in this article aims to verify the value of production from copper and silver ore for the same economic background using variable cash flows resulting from the local variability of useful-element content. Furthermore, the ore economic model is investigated for a significant difference in model value between estimates that use a linear correlation between useful-element content and the height of the mine face, and an approach in which the correlation of model parameters is based upon the copula best matched according to an information capacity criterion. The use of a copula allows the simulation to take multi-variable dependencies into account at the same time, thereby reflecting the dependency structure better than linear correlation does. Calculation results of the economic model used for deposit value estimation indicate that the correlation between copper and silver estimated with the use of a copula generates higher variation of possible project value, as compared to modelling the correlation with linear correlation. The average deposit value remains unchanged.
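
    The copula step can be sketched as follows: a Gaussian copula imposes the dependence while each grade keeps its own marginal distribution, and the correlated grades are then pushed through a toy per-tonne value model. The dependence parameter, marginals, prices and costs are all invented.

```python
import numpy as np
from scipy import stats

# Sketch: Gaussian copula sampling of correlated Cu and Ag grades, then
# a toy per-tonne value model. rho, marginals, prices, cost: all invented.
rng = np.random.default_rng(9)
n, rho = 100_000, 0.6
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], n)
u = stats.norm.cdf(z)                               # dependent uniforms (the copula)

cu = stats.lognorm.ppf(u[:, 0], s=0.25, scale=1.8)  # Cu grade (%), lognormal marginal
ag = stats.lognorm.ppf(u[:, 1], s=0.50, scale=45.0) # Ag grade (g/t), lognormal marginal

# Toy economics: metal revenue per tonne of ore minus operating cost.
value = cu / 100 * 6500 + ag / 1e6 * 750_000 - 60   # $/t
p10, p90 = np.percentile(value, [10, 90])
print(f"mean {value.mean():.1f} $/t, P10 {p10:.1f}, P90 {p90:.1f}")
```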

  16. Optimal estimation of spatially variable recharge and transmissivity fields under steady-state groundwater flow. Part 1. Theory

    Science.gov (United States)

    Graham, Wendy D.; Tankersley, Claude D.

    1994-05-01

    Stochastic methods are used to analyze two-dimensional steady groundwater flow subject to spatially variable recharge and transmissivity. Approximate partial differential equations are developed for the covariances and cross-covariances between the random head, transmissivity and recharge fields. Closed-form solutions of these equations are obtained using Fourier transform techniques. The resulting covariances and cross-covariances can be incorporated into a Bayesian conditioning procedure which provides optimal estimates of the recharge, transmissivity and head fields given available measurements of any or all of these random fields. Results show that head measurements contain valuable information for estimating the random recharge field. However, when recharge is treated as a spatially variable random field, the value of head measurements for estimating the transmissivity field can be reduced considerably. In a companion paper, the method is applied to a case study of the Upper Floridan Aquifer in NE Florida.

  17. Another look at the Grubbs estimators

    KAUST Repository

    Lombard, F.

    2012-01-01

    We consider estimation of the precision of a measuring instrument without the benefit of replicate observations on heterogeneous sampling units. Grubbs (1948) proposed an estimator which involves the use of a second measuring instrument, resulting in a pair of observations on each sampling unit. Since the precisions of the two measuring instruments are generally different, these observations cannot be treated as replicates. Very large sample sizes are often required if the standard error of the estimate is to be within reasonable bounds and if negative precision estimates are to be avoided. We show that the two instrument Grubbs estimator can be improved considerably if fairly reliable preliminary information regarding the ratio of sampling unit variance to instrument variance is available. Our results are presented in the context of the evaluation of on-line analyzers. A data set from an analyzer evaluation is used to illustrate the methodology. © 2011 Elsevier B.V.
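
For readers unfamiliar with the two-instrument estimator, here is a short sketch of its textbook form (the data and error variances below are invented for illustration; the paper's improvement using preliminary variance-ratio information is not reproduced):

```python
import numpy as np

def grubbs_precision(x, y):
    """Two-instrument Grubbs (1948) estimates of instrument variances.

    Model: x_i = u_i + e_i, y_i = u_i + f_i, with unit values u_i and
    independent instrument errors e_i, f_i. The sample covariance of
    (x, y) estimates var(u), so each instrument's error variance is its
    own variance minus that covariance. The estimates can come out
    negative in small samples, one of the problems noted above.
    """
    c = np.cov(x, y, ddof=1)
    var_u = c[0, 1]
    return var_u, c[0, 0] - var_u, c[1, 1] - var_u

# Illustrative simulation (all parameter values are assumptions).
rng = np.random.default_rng(0)
u = rng.normal(50.0, 4.0, size=200)          # heterogeneous sampling units
x = u + rng.normal(0.0, 1.0, size=u.size)    # e.g. on-line analyzer
y = u + rng.normal(0.0, 0.5, size=u.size)    # e.g. reference laboratory method
print(grubbs_precision(x, y))                # roughly (16, 1.0, 0.25)
```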

  18. Using Indirect Turbulence Measurements for Real-Time Parameter Estimation in Turbulent Air

    Science.gov (United States)

    Martos, Borja; Morelli, Eugene A.

    2012-01-01

The use of indirect turbulence measurements for real-time estimation of parameters in a linear longitudinal dynamics model in atmospheric turbulence was studied. It is shown that measuring the atmospheric turbulence makes it possible to treat the turbulence as a measured explanatory variable in the parameter estimation problem. Commercial off-the-shelf sensors were researched and evaluated, then compared to air data booms. Sources of colored noise in the explanatory variables resulting from typical turbulence measurement techniques were identified and studied. A major source of colored noise in the explanatory variables was identified as frequency-dependent upwash and time delay. The resulting upwash and time delay corrections were analyzed and compared to previous time-shift dynamic modeling research. Simulation data as well as flight test data in atmospheric turbulence were used to verify the time delay behavior. Recommendations are given for follow-on flight research and instrumentation.

  19. Abstract: Inference and Interval Estimation for Indirect Effects With Latent Variable Models.

    Science.gov (United States)

    Falk, Carl F; Biesanz, Jeremy C

    2011-11-30

Models specifying indirect effects (or mediation) and structural equation modeling are both popular in the social sciences. Yet relatively little research has compared methods that test for indirect effects among latent variables and provided precise estimates of the effectiveness of different methods. This simulation study provides an extensive comparison of methods for constructing confidence intervals and for making inferences about indirect effects with latent variables. We compared the percentile (PC) bootstrap, bias-corrected (BC) bootstrap, bias-corrected accelerated (BCa) bootstrap, likelihood-based confidence intervals (Neale & Miller, 1997), partial posterior predictive (Biesanz, Falk, and Savalei, 2010), and joint significance tests based on Wald tests or likelihood ratio tests. All models included three reflective latent variables representing the independent, dependent, and mediating variables. The design included the following fully crossed conditions: (a) sample size: 100, 200, and 500; (b) number of indicators per latent variable: 3 versus 5; (c) reliability per set of indicators: .7 versus .9; (d) and 16 different path combinations for the indirect effect (α = 0, .14, .39, or .59; and β = 0, .14, .39, or .59). Simulations were performed using a WestGrid cluster of 1680 3.06 GHz Intel Xeon processors running R and OpenMx. Results based on 1,000 replications per cell and 2,000 resamples per bootstrap method indicated that the BC and BCa bootstrap methods have inflated Type I error rates. Likelihood-based confidence intervals and the PC bootstrap emerged as methods that adequately control Type I error and have good coverage rates.
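
The study concerns latent-variable (SEM) mediation, but the percentile bootstrap logic it recommends is easy to show with observed variables. A minimal sketch with synthetic data (path values chosen to echo the design, not taken from it):

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated mediation data: x -> m (alpha path) -> y (beta path).
n = 200
x = rng.standard_normal(n)
m = 0.39 * x + rng.standard_normal(n)
y = 0.39 * m + rng.standard_normal(n)

def indirect_effect(x, m, y):
    """Indirect effect a*b from two OLS fits (observed-variable version)."""
    a = np.polyfit(x, m, 1)[0]                    # slope of m ~ x
    X = np.column_stack([np.ones_like(x), m, x])  # y ~ 1 + m + x
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    return a * beta[1]

# Percentile (PC) bootstrap confidence interval.
boot = np.empty(2000)
for i in range(boot.size):
    idx = rng.integers(0, n, n)
    boot[i] = indirect_effect(x[idx], m[idx], y[idx])
ci_lo, ci_hi = np.percentile(boot, [2.5, 97.5])
print(f"PC bootstrap 95% CI for a*b: [{ci_lo:.3f}, {ci_hi:.3f}]")
```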

  20. Treating pre-instrumental data as "missing" data: using a tree-ring-based paleoclimate record and imputations to reconstruct streamflow in the Missouri River Basin

    Science.gov (United States)

    Ho, M. W.; Lall, U.; Cook, E. R.

    2015-12-01

    Advances in paleoclimatology in the past few decades have provided opportunities to expand the temporal perspective of the hydrological and climatological variability across the world. The North American region is particularly fortunate in this respect where a relatively dense network of high resolution paleoclimate proxy records have been assembled. One such network is the annually-resolved Living Blended Drought Atlas (LBDA): a paleoclimate reconstruction of the Palmer Drought Severity Index (PDSI) that covers North America on a 0.5° × 0.5° grid based on tree-ring chronologies. However, the use of the LBDA to assess North American streamflow variability requires a model by which streamflow may be reconstructed. Paleoclimate reconstructions have typically used models that first seek to quantify the relationship between the paleoclimate variable and the environmental variable of interest before extrapolating the relationship back in time. In contrast, the pre-instrumental streamflow is here considered as "missing" data. A method of imputing the "missing" streamflow data, prior to the instrumental record, is applied through multiple imputation using chained equations for streamflow in the Missouri River Basin. In this method, the distribution of the instrumental streamflow and LBDA is used to estimate sets of plausible values for the "missing" streamflow data resulting in a ~600 year-long streamflow reconstruction. Past research into external climate forcings, oceanic-atmospheric variability and its teleconnections, and assessments of rare multi-centennial instrumental records demonstrate that large temporal oscillations in hydrological conditions are unlikely to be captured in most instrumental records. The reconstruction of multi-centennial records of streamflow will enable comprehensive assessments of current and future water resource infrastructure and operations under the existing scope of natural climate variability.
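
As a rough illustration of the imputation strategy, the sketch below uses scikit-learn's IterativeImputer, a chained-equations-style imputer, on synthetic data; the variable layout (three LBDA grid cells plus a gauged flow series with pre-1930 values missing) is hypothetical, not the study's actual configuration:

```python
import numpy as np
import pandas as pd
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

# Hypothetical layout: annual rows; PDSI covers 1500-2000,
# gauged streamflow only 1930-2000, so pre-instrumental flow is NaN.
years = np.arange(1500, 2001)
rng = np.random.default_rng(7)
pdsi = rng.normal(0, 2, size=(years.size, 3))              # 3 nearby LBDA grid cells
flow = 1000 + 150 * pdsi.mean(axis=1) + rng.normal(0, 50, years.size)
flow[years < 1930] = np.nan                                # treat past flow as "missing"

df = pd.DataFrame(pdsi, columns=["pdsi_a", "pdsi_b", "pdsi_c"], index=years)
df["flow"] = flow

# Draw several plausible completed datasets (multiple imputation).
imputations = [
    IterativeImputer(sample_posterior=True, random_state=s).fit_transform(df)[:, -1]
    for s in range(5)
]
recon = np.mean(imputations, axis=0)       # pooled ~500-year reconstruction
spread = np.std(imputations, axis=0)       # between-imputation uncertainty
```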

  1. Estimation of exhaust gas aerodynamic force on the variable geometry turbocharger actuator: 1D flow model approach

    International Nuclear Information System (INIS)

    Ahmed, Fayez Shakil; Laghrouche, Salah; Mehmood, Adeel; El Bagdouri, Mohammed

    2014-01-01

Highlights: • Estimation of aerodynamic force on variable turbine geometry vanes and actuator. • Method based on exhaust gas flow modeling. • Simulation tool for integration of aerodynamic force in automotive simulation software. - Abstract: This paper provides a reliable tool for simulating the effects of exhaust gas flow through the variable turbine geometry section of a variable geometry turbocharger (VGT) on the flow control mechanism. The main objective is to estimate the resistive aerodynamic force exerted by the flow upon the variable geometry vanes and the controlling actuator, in order to improve the control of vane angles. To achieve this, a 1D model of the exhaust flow is developed using Navier–Stokes equations. As the flow characteristics depend upon the volute geometry, impeller blade force and the existing viscous friction, the related source terms (losses) are also included in the model. In order to guarantee stability, an implicit numerical solver has been developed for the resolution of the Navier–Stokes problem. The resulting simulation tool has been validated through comparison with experimentally obtained values of turbine inlet pressure and the aerodynamic force as measured at the actuator shaft. The simulator shows good agreement with experimental results.

  2. Measurement Error in Income and Schooling and the Bias of Linear Estimators

    DEFF Research Database (Denmark)

    Bingley, Paul; Martinello, Alessandro

    2017-01-01

We propose a general framework for determining the extent of measurement error bias in ordinary least squares and instrumental variable (IV) estimators of linear models while allowing for measurement error in the validation source. We apply this method by validating Survey of Health, Ageing and Retirement in Europe data with Danish administrative registers. Contrary to most validation studies, we find that measurement error in income is classical once we account for imperfect validation data. We find nonclassical measurement error in schooling, causing a 38% amplification bias in IV estimators.

  3. Estimation of the shape parameter of a generalized Pareto distribution based on a transformation to Pareto distributed variables

    OpenAIRE

    van Zyl, J. Martin

    2012-01-01

Random variables of the generalized Pareto distribution can be transformed to those of the Pareto distribution. Explicit expressions exist for the maximum likelihood estimators of the parameters of the Pareto distribution. The performance of shape-parameter estimation for the generalized Pareto distribution using transformed observations, based on the probability weighted moment method, is tested. The transformation was found to improve the performance of the probability weighted moment estimator and to perform well.
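
A minimal numerical sketch of the transformation idea, using scipy's maximum likelihood fit as a stand-in for the probability weighted moment estimates used in the paper (all parameter values are illustrative):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
xi_true, sigma_true = 0.4, 1.0
x = stats.genpareto.rvs(c=xi_true, scale=sigma_true, size=5000, random_state=rng)

# Step 1: preliminary (xi, sigma) estimates; scipy's MLE fit stands in here
# for the probability weighted moment estimates used in the paper.
xi0, _, sigma0 = stats.genpareto.fit(x, floc=0)

# Step 2: transform GPD observations to standard Pareto variables:
# if X ~ GPD(sigma, xi) with xi > 0, then W = 1 + xi*X/sigma is Pareto
# with scale 1 and shape alpha = 1/xi.
w = 1.0 + xi0 * x / sigma0

# Step 3: closed-form Pareto MLE of the shape, mapped back to xi.
alpha_hat = w.size / np.log(w).sum()
print(f"xi via transformation: {1.0 / alpha_hat:.3f} (true {xi_true})")
```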

  4. System and method for correcting attitude estimation

    Science.gov (United States)

    Josselson, Robert H. (Inventor)

    2010-01-01

A system includes an angular rate sensor disposed in a vehicle for providing angular rates of the vehicle, and an instrument disposed in the vehicle for providing line-of-sight control with respect to a line-of-sight reference. The instrument includes an integrator which is configured to integrate the angular rates of the vehicle to form non-compensated attitudes. Also included is a compensator coupled across the integrator, in a feed-forward loop, for receiving the angular rates of the vehicle and outputting compensated angular rates of the vehicle. A summer combines the non-compensated attitudes and the compensated angular rates of the vehicle to form estimated vehicle attitudes for controlling the instrument with respect to the line-of-sight reference. The compensator is configured to provide error compensation to the instrument free of any feedback loop that uses an error signal. The compensator may include a transfer function providing a fixed gain to the received angular rates of the vehicle. The compensator may, alternatively, include a transfer function providing a variable gain as a function of frequency to operate on the received angular rates of the vehicle.

  5. Estimating Catchment-Scale Snowpack Variability in Complex Forested Terrain, Valles Caldera National Preserve, NM

    Science.gov (United States)

    Harpold, A. A.; Brooks, P. D.; Biederman, J. A.; Swetnam, T.

    2011-12-01

Difficulty estimating snowpack variability across complex forested terrain currently hinders the prediction of water resources in the semi-arid Southwestern U.S. Catchment-scale estimates of snowpack variability are necessary for addressing ecological, hydrological, and water resources issues, but are often interpolated from a small number of point-scale observations. In this study, we used LiDAR-derived distributed datasets to investigate how elevation, aspect, topography, and vegetation interact to control catchment-scale snowpack variability. The study area is the Redondo massif in the Valles Caldera National Preserve, NM, a resurgent dome that varies from 2500 to 3430 m and drains from all aspects. Mean LiDAR-derived snow depths from four catchments (2.2 to 3.4 km²) draining different aspects of the Redondo massif varied by 30%, despite similar mean elevations and mixed conifer forest cover. To better quantify this variability in snow depths we performed a multiple linear regression (MLR) on a 7.3 by 7.3 km study area (5 × 10⁶ snow depth measurements) comprising the four catchments. The MLR showed that elevation explained 45% of the variability in snow depths across the study area, aspect explained 18% (dominated by N-S aspect), and vegetation 2% (canopy density and height). This linear relationship was not transferable to the catchment scale, however, where additional MLR analyses showed that the influence of aspect and elevation differed between the catchments. The strong influence of north-south aspect in most catchments indicated that solar radiation is an important control on snow depth variability. To explore the role of solar radiation, a model was used to generate winter solar forcing index (SFI) values based on the local and remote topography. The SFI was able to explain a large amount of snow depth variability in areas with similar elevation and aspect. Finally, the SFI was modified to include the effects of shading from vegetation.
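
A minimal sketch of the multiple linear regression step on gridded predictors; the predictor set (elevation, a north-south aspect component, canopy density) follows the description above, but the data and coefficients are synthetic:

```python
import numpy as np

rng = np.random.default_rng(11)
n = 50_000  # LiDAR grid cells (illustrative subset)

# Hypothetical gridded predictors.
elev = rng.uniform(2500, 3430, n)                 # elevation, m
northness = np.cos(rng.uniform(0, 2 * np.pi, n))  # N-S aspect component
canopy = rng.uniform(0, 1, n)                     # canopy density

# Synthetic snow depth consistent with the reported ordering of importance.
depth = 0.002 * (elev - 2500) + 0.4 * northness + 0.1 * canopy + rng.normal(0, 0.3, n)

# Ordinary least squares fit of depth on the predictors plus an intercept.
X = np.column_stack([np.ones(n), elev, northness, canopy])
beta, *_ = np.linalg.lstsq(X, depth, rcond=None)
resid = depth - X @ beta
r2 = 1 - resid.var() / depth.var()
print(f"betas: {beta.round(4)}, R^2 = {r2:.2f}")
```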

  6. Industrial instrumentation principles and design

    CERN Document Server

    Padmanabhan, Tattamangalam R

    2000-01-01

Pneumatic, hydraulic and allied instrumentation schemes have given way to electronic schemes in recent years thanks to the rapid strides in electronics and allied areas. Principles, design and applications of such state-of-the-art instrumentation schemes form the subject matter of this book. Through representative examples, the basic building blocks of instrumentation schemes are identified and each of these building blocks discussed in terms of its design and interface characteristics. The common generic schemes synthesized with such building blocks are dealt with subsequently. This forms the scope of Part I. The focus in Part II is on application. Displacement and allied instrumentation, force and allied instrumentation and process instrumentation in terms of temperature, flow, pressure, level and other common process variables are dealt with separately and exhaustively. Despite the diversity in the sensor principles and characteristics and the variety in the applications and their environments, it is possible to treat them in terms of a common set of building blocks.

  7. Reference Evapotranspiration Retrievals from a Mesoscale Model Based Weather Variables for Soil Moisture Deficit Estimation

    Directory of Open Access Journals (Sweden)

    Prashant K. Srivastava

    2017-10-01

Reference evapotranspiration (ETo) and soil moisture deficit (SMD) are vital for understanding hydrological processes, particularly in the context of sustainable water use efficiency around the globe. Precise estimation of ETo and SMD is required for developing appropriate forecasting systems, for hydrological modeling and for precision agriculture. In this study, the surface temperature downscaled from the Weather Research and Forecasting (WRF) model is used to estimate ETo, with boundary conditions provided by the European Centre for Medium-Range Weather Forecasts (ECMWF). To assess performance, Hamon's method is employed to estimate ETo using the temperature from a meteorological station and from WRF-derived variables. After estimating ETo, a range of linear and non-linear models is utilized to retrieve SMD. Performance statistics such as RMSE, %Bias, and Nash-Sutcliffe Efficiency (NSE) indicate that the exponential model (RMSE = 0.226; %Bias = −0.077; NSE = 0.616) is efficient for SMD estimation when using the observed ETo, in comparison to the other linear and non-linear models (RMSE range = 0.019–0.667; %Bias range = 2.821–6.894; NSE = 0.013–0.419) used in this study. On the other hand, when SMD is estimated using ETo based on WRF-downscaled meteorological variables, the linear model is found promising (RMSE = 0.017; %Bias = 5.280; NSE = 0.448) as compared to the non-linear models (RMSE range = 0.022–0.707; %Bias range = −0.207 to −6.088; NSE range = 0.013–0.149). Our findings also suggest that all the models perform better during the growing season (RMSE range = 0.024–0.025; %Bias range = −4.982 to −3.431; r = 0.245–0.281) than the non-growing season (RMSE range = 0.011–0.12; %Bias range = 33.073–32.701; r = 0.161–0.244) for SMD estimation.
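
The study summarized above uses Hamon's temperature-based method to compute ETo. As a rough illustration, the sketch below implements one common formulation of Hamon's equation; coefficient conventions vary across the literature, so the constants here should be treated as assumptions rather than the paper's exact parameterization:

```python
import numpy as np

def hamon_pet(t_mean_c, daylength_hr):
    """Hamon reference evapotranspiration (mm/day), one common formulation.

    PET = 0.1651 * Ld * rho_sat, where Ld is daylength in multiples of
    12 h and rho_sat is the saturated vapour density (g/m^3) at the daily
    mean temperature. Coefficient conventions differ between sources.
    """
    e_sat = 6.108 * np.exp(17.27 * t_mean_c / (t_mean_c + 237.3))  # hPa
    rho_sat = 216.7 * e_sat / (t_mean_c + 273.3)                   # g/m^3
    return 0.1651 * (daylength_hr / 12.0) * rho_sat

# Example: daily mean temperature of 18 C with 14 h of daylight.
print(f"{hamon_pet(18.0, 14.0):.2f} mm/day")   # roughly 3 mm/day
```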

  8. Identification and estimation of nonlinear models using two samples with nonclassical measurement errors

    KAUST Repository

    Carroll, Raymond J.

    2010-05-01

    This paper considers identification and estimation of a general nonlinear Errors-in-Variables (EIV) model using two samples. Both samples consist of a dependent variable, some error-free covariates, and an error-prone covariate, for which the measurement error has unknown distribution and could be arbitrarily correlated with the latent true values; and neither sample contains an accurate measurement of the corresponding true variable. We assume that the regression model of interest - the conditional distribution of the dependent variable given the latent true covariate and the error-free covariates - is the same in both samples, but the distributions of the latent true covariates vary with observed error-free discrete covariates. We first show that the general latent nonlinear model is nonparametrically identified using the two samples when both could have nonclassical errors, without either instrumental variables or independence between the two samples. When the two samples are independent and the nonlinear regression model is parameterized, we propose sieve Quasi Maximum Likelihood Estimation (Q-MLE) for the parameter of interest, and establish its root-n consistency and asymptotic normality under possible misspecification, and its semiparametric efficiency under correct specification, with easily estimated standard errors. A Monte Carlo simulation and a data application are presented to show the power of the approach.

  9. Estimating decadal variability in sea level from tide gauge records: An application to the North Sea

    OpenAIRE

    Frederikse, Thomas; Riva, R.E.M.; Slobbe, Cornelis; Broerse, D.B.T.; Verlaan, Martin

    2016-01-01

One of the primary observational data sets of sea level is represented by the tide gauge record. We propose a new method to estimate variability on decadal time scales from tide gauge data by using a state space formulation, which couples the direct observations to a predefined state space model by using a Kalman filter. The model consists of a time-varying trend and seasonal cycle, and variability induced by several physical processes, such as wind, atmospheric pressure changes and teleconnections.
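
As a rough illustration, statsmodels can fit a comparable structural time series model (local linear trend plus seasonal cycle) by Kalman filtering; the paper's additional regression components for wind and pressure are omitted here, and the series below is synthetic:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
t = np.arange(240)                             # 20 years of monthly values
sea_level = (0.3 * t / 12                      # slow trend, cm
             + 5 * np.sin(2 * np.pi * t / 12)  # seasonal cycle
             + rng.normal(0, 2, t.size))       # residual variability

# Local linear trend + seasonal cycle, estimated by Kalman filter / MLE.
model = sm.tsa.UnobservedComponents(sea_level, level="lltrend", seasonal=12)
res = model.fit(disp=False)
trend = res.level.smoothed                     # smoothed trend component
print(res.summary())
```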

  10. An Improved Estimation Using Polya-Gamma Augmentation for Bayesian Structural Equation Models with Dichotomous Variables

    Science.gov (United States)

    Kim, Seohyun; Lu, Zhenqiu; Cohen, Allan S.

    2018-01-01

Bayesian algorithms have been used successfully in the social and behavioral sciences to analyze dichotomous data, particularly with complex structural equation models. In this study, we investigate the use of the Polya-Gamma data augmentation method with Gibbs sampling to improve estimation of structural equation models with dichotomous variables.

  11. A method to combine non-probability sample data with probability sample data in estimating spatial means of environmental variables

    NARCIS (Netherlands)

    Brus, D.J.; Gruijter, de J.J.

    2003-01-01

In estimating spatial means of environmental variables of a region from data collected by convenience or purposive sampling, validity of the results can be ensured by collecting additional data through probability sampling. The precision of the π estimator that uses the probability sample can then be increased by exploiting the data from the non-probability sample.

  12. Design and Implementation of Data Collection Instruments for Neonatology Research

    Directory of Open Access Journals (Sweden)

    Monica G. HĂŞMĂŞANU

    2014-12-01

Aim: The aim of our research was to design and implement data collection instruments to be used in the context of an observational prospective clinical study with follow-up, conducted on newborns with intrauterine growth restriction. Methods: The structure of the data collection forms (paper-based and electronic-based) was first identified, and for each variable the type best suited to the research aim was established. The coding of categorical variables was also decided, as were the units of measurement for quantitative variables. In line with good practice, a set of confounding factors (gender, date of birth, etc.) was also identified and integrated into the data collection instruments. Data-entry validation rules were implemented for each variable to reduce data input errors when the electronic data collection instrument was created. Results: Two data collection instruments were developed and successfully implemented: a paper-based form and an electronic data collection instrument. The developed forms included demographics, neonatal complications (hypoglycemia, hypocalcemia, etc.), biochemical data at birth and follow-up, immunological data, as well as baseline and follow-up echocardiographic data. Data-entry validation criteria were implemented in the electronic data collection instrument to assure validity and precision when paper-based data are transcribed into electronic form. Furthermore, to assure subjects' confidentiality, careful attention was given to HIPAA identifiers when the electronic data collection instrument was developed. Conclusion: Data collection instruments were successfully developed and implemented as an a priori step in clinical research, assisting data collection and management in an observational prospective study with follow-up visits.

  13. A computationally inexpensive model for estimating dimensional measurement uncertainty due to x-ray computed tomography instrument misalignments

    Science.gov (United States)

    Ametova, Evelina; Ferrucci, Massimiliano; Chilingaryan, Suren; Dewulf, Wim

    2018-06-01

    The recent emergence of advanced manufacturing techniques such as additive manufacturing and an increased demand on the integrity of components have motivated research on the application of x-ray computed tomography (CT) for dimensional quality control. While CT has shown significant empirical potential for this purpose, there is a need for metrological research to accelerate the acceptance of CT as a measuring instrument. The accuracy in CT-based measurements is vulnerable to the instrument geometrical configuration during data acquisition, namely the relative position and orientation of x-ray source, rotation stage, and detector. Consistency between the actual instrument geometry and the corresponding parameters used in the reconstruction algorithm is critical. Currently available procedures provide users with only estimates of geometrical parameters. Quantification and propagation of uncertainty in the measured geometrical parameters must be considered to provide a complete uncertainty analysis and to establish confidence intervals for CT dimensional measurements. In this paper, we propose a computationally inexpensive model to approximate the influence of errors in CT geometrical parameters on dimensional measurement results. We use surface points extracted from a computer-aided design (CAD) model to model discrepancies in the radiographic image coordinates assigned to the projected edges between an aligned system and a system with misalignments. The efficacy of the proposed method was confirmed on simulated and experimental data in the presence of various geometrical uncertainty contributors.

  14. A state-and-transition simulation modeling approach for estimating the historical range of variability

    Directory of Open Access Journals (Sweden)

    Kori Blankenship

    2015-04-01

Reference ecological conditions offer important context for land managers as they assess the condition of their landscapes and provide benchmarks for desired future conditions. State-and-transition simulation models (STSMs) are commonly used to estimate reference conditions that can be used to evaluate current ecosystem conditions and to guide land management decisions and activities. The LANDFIRE program created more than 1,000 STSMs and used them to assess departure from a mean reference value for ecosystems in the United States. While the mean provides a useful benchmark, land managers and researchers are often interested in the range of variability around the mean. This range, frequently referred to as the historical range of variability (HRV), offers model users improved understanding of ecosystem function, more information with which to evaluate ecosystem change and potentially greater flexibility in management options. We developed a method for using LANDFIRE STSMs to estimate the HRV around the mean reference condition for each model state in ecosystems by varying the fire probabilities. The approach is flexible and can be adapted for use in a variety of ecosystems. HRV analysis can be combined with other information to help guide complex land management decisions.

  15. Impact on mortality of prompt admission to critical care for deteriorating ward patients: an instrumental variable analysis using critical care bed strain.

    Science.gov (United States)

    Harris, Steve; Singer, Mervyn; Sanderson, Colin; Grieve, Richard; Harrison, David; Rowan, Kathryn

    2018-05-07

To estimate the effect of prompt admission to critical care on mortality for deteriorating ward patients, we performed a prospective cohort study of consecutive ward patients assessed for critical care. Prompt admissions (within 4 h of assessment) were compared to a 'watchful waiting' cohort. We used critical care strain (bed occupancy) as a natural randomisation event that would predict prompt transfer to critical care. Strain was classified as low, medium or high (2+, 1 or 0 empty beds). This instrumental variable (IV) analysis was repeated for the subgroup of referrals with a recommendation for critical care once assessed. Risk-adjusted 90-day survival models were also constructed. A total of 12,380 patients from 48 hospitals were available for analysis. There were 2411 (19%) prompt admissions (median delay 1 h, IQR 1-2) and 9969 (81%) controls; 1990 (20%) controls were admitted later (median delay 11 h, IQR 6-26). Prompt admissions were less frequent as critical care strain increased. In the risk-adjusted survival model, 90-day mortality was similar. After allowing for unobserved prognostic differences between the groups, we find that prompt admission to critical care leads to lower 90-day mortality for patients assessed and recommended to critical care.
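
A minimal sketch of the IV logic, with bed strain instrumenting the confounded admission decision. It uses the linearmodels package's two-stage least squares on synthetic data and a linear probability outcome model, which is a simplification of the paper's actual estimator; all variable names and magnitudes are invented:

```python
import numpy as np
import pandas as pd
from linearmodels.iv import IV2SLS

rng = np.random.default_rng(9)
n = 12_380

# Synthetic stand-in for the cohort (all variables hypothetical).
confounder = rng.standard_normal(n)              # unobserved severity
strain = rng.integers(0, 3, n)                   # empty beds: 0 / 1 / 2+
prompt = ((strain + rng.normal(0, 1, n) - 0.5 * confounder) > 1).astype(float)
severity_score = rng.standard_normal(n)          # observed risk adjustment
died_90d = ((0.8 * confounder - 0.3 * prompt
             + 0.5 * severity_score + rng.normal(0, 1, n)) > 1).astype(float)

df = pd.DataFrame({"died_90d": died_90d, "prompt": prompt,
                   "strain": strain, "severity_score": severity_score})

# Bed strain instruments the (confounded) prompt-admission decision;
# [endog ~ instrument] marks the first stage in linearmodels' formula syntax.
res = IV2SLS.from_formula(
    "died_90d ~ 1 + severity_score + [prompt ~ strain]", df
).fit(cov_type="robust")
print(res.summary)
```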

  16. On-line calibration of process instrumentation channels in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Hashemian, H.M.; Farmer, J.P. [Analysis and Measurement Services Corp., Knoxville, TN (United States)

    1995-04-01

An on-line instrumentation monitoring system was developed and validated for use in nuclear power plants. This system continuously monitors the calibration status of instrument channels and determines whether or not they require manual calibrations. This is accomplished by comparing the output of each instrument channel to an estimate of the process it is monitoring. If the deviation of the instrument channel from the process estimate is greater than an allowable limit, then the instrument is said to be "out of calibration" and manual adjustments are made to correct the calibration. The success of the on-line monitoring system depends on the accuracy of the process estimation. The system described in this paper incorporates both simple intercomparison techniques as well as analytical approaches in the form of data-driven empirical modeling to estimate the process. On-line testing of the calibration of process instrumentation channels will reduce the number of manual calibrations currently performed, thereby reducing both costs to utilities and radiation exposure to plant personnel.
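
A minimal sketch of the intercomparison idea, assuming the process estimate is simply the median of redundant channels (the real system also uses data-driven empirical models); the signals and the deviation limit are invented:

```python
import numpy as np

def check_channels(readings, limit):
    """Flag redundant instrument channels that drift from the process estimate.

    The process estimate here is the per-sample median of the redundant
    channels; any channel deviating from it by more than `limit` at any
    point is flagged for manual calibration.
    """
    estimate = np.median(readings, axis=0)      # simple process estimate
    deviation = np.abs(readings - estimate)
    return deviation.max(axis=1) > limit        # per-channel flags

# Four redundant channels, 1000 samples; channel 2 slowly drifts.
rng = np.random.default_rng(2)
process = 250 + 2 * np.sin(np.linspace(0, 6, 1000))    # e.g. degrees C
readings = process + rng.normal(0, 0.2, (4, 1000))
readings[2] += np.linspace(0, 1.5, 1000)               # drift out of calibration
print(check_channels(readings, limit=1.2))             # typically [False False True False]
```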

  17. Focus on variability : New tools to study intra-individual variability in developmental data

    NARCIS (Netherlands)

    van Geert, P; van Dijk, M

    2002-01-01

In accordance with dynamic systems theory, we assume that variability is an important developmental phenomenon. However, the standard methodological toolkit of the developmental psychologist offers few instruments for the study of variability. In this article we will present several new methods for studying intra-individual variability in developmental data.

  18. Estimating severity of sideways fall using a generic multi linear regression model based on kinematic input variables.

    Science.gov (United States)

    van der Zijden, A M; Groen, B E; Tanck, E; Nienhuis, B; Verdonschot, N; Weerdesteyn, V

    2017-03-21

    Many research groups have studied fall impact mechanics to understand how fall severity can be reduced to prevent hip fractures. Yet, direct impact force measurements with force plates are restricted to a very limited repertoire of experimental falls. The purpose of this study was to develop a generic model for estimating hip impact forces (i.e. fall severity) in in vivo sideways falls without the use of force plates. Twelve experienced judokas performed sideways Martial Arts (MA) and Block ('natural') falls on a force plate, both with and without a mat on top. Data were analyzed to determine the hip impact force and to derive 11 selected (subject-specific and kinematic) variables. Falls from kneeling height were used to perform a stepwise regression procedure to assess the effects of these input variables and build the model. The final model includes four input variables, involving one subject-specific measure and three kinematic variables: maximum upper body deceleration, body mass, shoulder angle at the instant of 'maximum impact' and maximum hip deceleration. The results showed that estimated and measured hip impact forces were linearly related (explained variances ranging from 46 to 63%). Hip impact forces of MA falls onto the mat from a standing position (3650±916N) estimated by the final model were comparable with measured values (3698±689N), even though these data were not used for training the model. In conclusion, a generic linear regression model was developed that enables the assessment of fall severity through kinematic measures of sideways falls, without using force plates. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Discrete factor approximations in simultaneous equation models: estimating the impact of a dummy endogenous variable on a continuous outcome.

    Science.gov (United States)

    Mroz, T A

    1999-10-01

    This paper contains a Monte Carlo evaluation of estimators used to control for endogeneity of dummy explanatory variables in continuous outcome regression models. When the true model has bivariate normal disturbances, estimators using discrete factor approximations compare favorably to efficient estimators in terms of precision and bias; these approximation estimators dominate all the other estimators examined when the disturbances are non-normal. The experiments also indicate that one should liberally add points of support to the discrete factor distribution. The paper concludes with an application of the discrete factor approximation to the estimation of the impact of marriage on wages.

  20. Recalcitrant soil organic matter : how useful is radiocarbon for estimating its amount and variability?

    International Nuclear Information System (INIS)

    Tate, K.; Parshotam, A.; Scott, Neal

    1997-01-01

The role of the terrestrial biosphere in the global carbon (C) cycle is poorly understood because of the complex biology underlying C storage, the spatial variability of vegetation and soils, and the effects of land use. Little is known about the nature, amount and variability of recalcitrant C in soils, despite the importance of determining whether soils behave as sources or sinks of CO₂. ¹⁴C dating indicates that most soils contain this very stable C fraction, with turnover times of millennia. The amount of this fraction, named the Inert Organic Matter (IOM) in one model, is estimated indirectly using the 'bomb' ¹⁴C content of soil. In nine New Zealand grassland and forest ecosystems, amounts of IOM-C ranged between 0.03 and 2.9 kg C m⁻² (1-18% of soil C to 0.25 m depth). A decomposable C fraction, considered to be more susceptible to the effects of climate and land use, was estimated by subtracting the IOM-C fraction from the total soil organic C. Turnover times ranged between 8 and 36 years, and were inversely related to mean annual temperature (R² = 0.91). The recalcitrant fraction was characterised by ¹³C NMR and pyrolysis-mass spectrometry as alkyl C. Paradoxically, for some ecosystems, the variation in IOM-C appears to be best explained by differences in soil hydrological conditions rather than by the accumulation of a discrete C fraction. Thus characterisation of the environmental factors that constrain decomposition could be most useful for explaining the differences observed in IOM across different ecosystems, climates and soils. Despite the insights the modelling approach using 'bomb' ¹⁴C provides into mechanisms of organic matter stabilisation, on theoretical grounds the validity of using ¹⁴C measurements to estimate a recalcitrant C fraction that by definition contains no ¹⁴C is questionable. We conclude that more rigorous models are needed, with pools that can be experimentally verified, to improve understanding of the spatial variability of soil C storage. (author)

  1. Organizational learning, pilot test of Likert-type instruments

    Directory of Open Access Journals (Sweden)

    Manuel Alfonso Garzón Castrillón

    2010-09-01

This paper presents the results obtained in the pilot study of instruments created to meet the specific objective of designing and validating instruments to study the capacity for organizational learning. The Likert measurement scale was used because it allowed the pertinence of each dimension as a variable in the context of organizational learning to be established. A one-way analysis of variance (ANOVA) was used, with the statistical package SPSS. In all, 138 variables were simplified into 3 factors and 40 statements.

  2. A method of estimating GPS instrumental biases with a convolution algorithm

    Science.gov (United States)

    Li, Qi; Ma, Guanyi; Lu, Weijun; Wan, Qingtao; Fan, Jiangtao; Wang, Xiaolan; Li, Jinghua; Li, Changhua

    2018-03-01

This paper presents a method of deriving the instrumental differential code biases (DCBs) of GPS satellites and dual frequency receivers. Considering that the total electron content (TEC) varies smoothly over a small area, one ionospheric pierce point (IPP) and four more nearby IPPs were selected to build an equation with a convolution algorithm. In addition, unknown DCB parameters were arranged into a set of equations with GPS observations in a day unit by assuming that DCBs do not vary within a day. Then, the DCBs of satellites and receivers were determined by solving the equation set with the least-squares fitting technique. The performance of this method is examined by applying it to 361 days in 2014 using the observation data from 1311 GPS Earth Observation Network (GEONET) receivers. The result was cross-compared with the DCBs estimated by the mesh method and the IONEX products from the Center for Orbit Determination in Europe (CODE). The DCB values derived by this method agree with those of the mesh method and the CODE products, with biases of 0.091 ns and 0.321 ns, respectively. The convolution method's accuracy and stability were quite good and showed improvements over the mesh method.

  3. Portable radiation instrumentation traceability of standards and measurements

    International Nuclear Information System (INIS)

    Wiserman, A.; Walke, M.

    1995-01-01

    Portable radiation measuring instruments are used to estimate and control doses for workers. Calibration of these instruments must be sufficiently accurate to ensure that administrative and legal dose limits are not likely to be exceeded due to measurement uncertainties. An instrument calibration and management program is established which permits measurements made with an instrument to be traced to a national standard. This paper describes the establishment and maintenance of calibration standards for gamma survey instruments and an instrument management program which achieves traceability of measurement for uniquely identified field instruments. (author)

  4. Estimation of the Effects of Statistical Discrimination on the Gender Wage Gap

    OpenAIRE

    Atsuko Tanaka

    2015-01-01

How much of the gender wage gap can be attributed to statistical discrimination? Applying an employer learning model and an Instrumental Variable (IV) estimation strategy to Japanese panel data, I examine how women's generally weak labor force attachment affects wages when employers cannot easily observe an individual's labor force intentions. To overcome endogeneity issues, I use survey information on individual workers' intentions to continue working after having children, together with Japanese panel data.

  5. Modeling, estimation and optimal filtration in signal processing

    CERN Document Server

    Najim, Mohamed

    2010-01-01

The purpose of this book is to provide graduate students and practitioners with traditional methods and more recent results for model-based approaches in signal processing. Firstly, discrete-time linear models such as AR, MA and ARMA models, their properties and their limitations are introduced. In addition, sinusoidal models are addressed. Secondly, estimation approaches based on least squares methods and instrumental variable techniques are presented. Finally, the book deals with optimal filters, i.e. Wiener and Kalman filtering, and adaptive filters such as the RLS and the LMS algorithms.
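
As a small illustration of the adaptive filtering material, here is a textbook LMS filter used for system identification (a generic example, not taken from the book):

```python
import numpy as np

def lms_filter(x, d, order=4, mu=0.05):
    """Least mean squares adaptive filter, standard textbook form.

    x: input signal, d: desired signal; returns (output, error, weights).
    """
    n = len(x)
    w = np.zeros(order)
    y = np.zeros(n)
    e = np.zeros(n)
    for k in range(order - 1, n):
        u = x[k - order + 1 : k + 1][::-1]  # x[k], x[k-1], ..., x[k-order+1]
        y[k] = w @ u                        # filter output
        e[k] = d[k] - y[k]                  # estimation error
        w += 2 * mu * e[k] * u              # stochastic gradient update
    return y, e, w

# Identify an unknown FIR system from noisy observations.
rng = np.random.default_rng(4)
x = rng.standard_normal(5000)
h = np.array([0.6, -0.3, 0.1, 0.05])                    # unknown system
d = np.convolve(x, h, mode="full")[: len(x)] + rng.normal(0, 0.01, len(x))
_, _, w = lms_filter(x, d, order=4, mu=0.02)
print(np.round(w, 3))                                   # should approach h
```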

  6. Relations estimate between investment in digital media and financial variables of companies: a Colombian overview

    Directory of Open Access Journals (Sweden)

    Amalia Novoa Hoyos

    2016-06-01

This article presents a first estimate of the relationship between investment in digital media and selected financial variables in Colombia. First, the literature on the impact of marketing and digital marketing on company performance is reviewed. Then, sectoral variables such as liquidity, profitability, indebtedness and concentration are analysed for sectors including food, personal grooming, automotive, beverages and tobacco, construction, entertainment, furniture, services, telecommunications, tourism and clothing, using ordinary least squares (OLS) for the years 2011, 2012, 2013 and 2014. Investment in digital media in those years is also taken into account.

  7. Variable disparity estimation based intermediate view reconstruction in dynamic flow allocation over EPON-based access networks

    Science.gov (United States)

    Bae, Kyung-Hoon; Lee, Jungjoon; Kim, Eun-Soo

    2008-06-01

In this paper, a variable disparity estimation (VDE)-based intermediate view reconstruction (IVR) in dynamic flow allocation (DFA) over an Ethernet passive optical network (EPON)-based access network is proposed. In the proposed system, the stereoscopic images are estimated by a variable block-matching algorithm (VBMA), and they are transmitted to the receiver through DFA over EPON. This scheme improves a priority-based access network by converting it to a flow-based access network with a new access mechanism and scheduling algorithm, and 16-view images are then synthesized by the IVR using VDE. Experimental results indicate that the proposed system improves the peak signal-to-noise ratio (PSNR) by as much as 4.86 dB and reduces the processing time to 3.52 s. Additionally, the network service provider can guarantee upper limits on transmission delays per flow. The modeling and simulation results, including mathematical analyses, from this scheme are also provided.

  8. Imprecision in estimates of dose from ingested 137Cs due to variability in human biological characteristics

    International Nuclear Information System (INIS)

    Schwarz, G.; Dunning, D.E. Jr.

    1982-01-01

An attempt has been made to quantify the variability in human biological parameters determining dose to man from ingestion of a unit activity of soluble ¹³⁷Cs and the resulting imprecision in the predicted total-body dose commitment. The analysis is based on an extensive review of the literature along with the application of statistical methods to determine parameter variability, correlations between parameters, and predictive imprecision. The variability in the principal biological parameters (biological half-time and total-body mass) involved can be described by a geometric standard deviation of 1.2-1.5 for adults and 1.6-1.9 for children/adolescents of age 0.1-18 yr. The estimated predictive imprecision (using a Monte Carlo technique) in the total-body dose commitment from ingested ¹³⁷Cs can be described by a geometric standard deviation on the order of 1.3-1.4, meaning that the 99th percentile of the predicted distribution of dose is within approximately 2.1 times the mean value. The mean dose estimate is 0.009 Sv/MBq (34 mrem/μCi) for children/adolescents and 0.01 Sv/MBq (38 mrem/μCi) for adults. Little evidence of age dependence in the total-body dose from ingested ¹³⁷Cs is observed. (author)

  9. Contribution to the discussion of P.M. Fayers and David J. Hand: Causal variables, indicator variables and measurement scales: an example from quality of life

    DEFF Research Database (Denmark)

    Keiding, Niels

    2002-01-01

Causal variables; Clinimetric scales; Composite scales; Construct validity; Measurement scales; Multi-item scales; Quality-of-life instruments.

  10. Environmental economic variables - what has been measured until now?

    International Nuclear Information System (INIS)

    Ahlroth, S.; Palm, V.

    2001-01-01

Environmental accounting encompasses a variety of economic variables. They range from production values of different branches of industry, through fiscal instruments such as environmental taxes, to valuation studies of external effects of the economy. This paper tries to map out the different aspects of these variables and to point out their linkages and uses, viewed from an environmental accounting perspective. The estimated size of the different types of variables is also discussed, based mainly on Swedish studies and on a national scale. Included variables are GDP, exports and imports, environmental taxes, subsidies, environmental costs, remediation costs, environmental damage costs and examples of prevention costs. We divide the economic variables into four different types: 1. Those that are recorded as the actors' payments on the market 2. Those that are part of the government budget 3. Those that serve as a valuation of the costs incurred on society 4. Those that could be invested to prevent environmental damage. The size of the different costs is taken from a variety of studies, mainly Swedish, and put in relation to GDP or similar aggregates. A brief discussion of the Swedish situation as compared to international figures is also given.

  11. Quantitative precipitation estimation in complex orography using quasi-vertical profiles of dual polarization radar variables

    Science.gov (United States)

    Montopoli, Mario; Roberto, Nicoletta; Adirosi, Elisa; Gorgucci, Eugenio; Baldini, Luca

    2017-04-01

Weather radars are nowadays a unique tool to quantitatively estimate rain precipitation near the surface. This is an important task for many applications: for example, feeding hydrological models, mitigating the impact of severe storms at the ground using radar information in modern warning tools, and aiding validation studies of satellite-based rain products. With respect to the latter application, several ground validation studies of the Global Precipitation Mission (GPM) products have recently highlighted the importance of accurate QPE from ground-based weather radars. To date, many studies have analyzed the performance of various QPE algorithms using actual and synthetic experiments, possibly trained by measurements of particle size distributions and electromagnetic models. Most of these studies support the use of dual polarization variables not only to ensure a good level of radar data quality but also as a direct input to the rain estimation equations. Among others, one of the most important factors limiting radar QPE accuracy is the vertical variability of the particle size distribution, which affects, at different levels, all the acquired radar variables as well as the rain rates. This is particularly impactful in mountainous areas, where the altitude of the radar sampling is likely several hundred meters above the surface. In this work, we analyze the impact of vertical variations in the rain precipitation profile on several dual polarization radar QPE algorithms when they are tested in a complex orography scenario. So far, weather radar studies have placed more emphasis on extrapolation strategies that use the signature of the vertical profiles in terms of radar co-polar reflectivity. This may limit the use of radar vertical profiles when dual polarization QPE algorithms are considered, because in that case all the radar variables used in the rain estimation process should be consistently extrapolated to the surface.

  12. A novel recursive Fourier transform for nonuniform sampled signals: application to heart rate variability spectrum estimation.

    Science.gov (United States)

    Holland, Alexander; Aboy, Mateo

    2009-07-01

We present a novel method to iteratively calculate discrete Fourier transforms for discrete time signals with sample time intervals that may be widely nonuniform. The proposed recursive Fourier transform (RFT) does not require interpolation of the samples to uniform time intervals, and each iterative transform update of N frequencies has computational order N. Because of the inherent non-uniformity in the time between successive heart beats, an application particularly well suited for this transform is power spectral density (PSD) estimation for heart rate variability. We compare RFT based spectrum estimation with Lomb-Scargle Transform (LST) based estimation. PSD estimation based on the LST also does not require uniform time samples, but the LST has a computational order greater than N log(N). We conducted an assessment study involving the analysis of quasi-stationary signals with various levels of randomly missing heart beats. Our results indicate that the RFT leads to comparable estimation performance to the LST with significantly less computational overhead and complexity for applications requiring iterative spectrum estimations.
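
The Lomb-Scargle baseline against which the RFT is compared is available in SciPy. A minimal sketch on a synthetic RR-interval series (the beat timing and modulation frequency are invented):

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(8)

# Synthetic beat times: ~60 bpm with a 0.1 Hz (Mayer-wave-like) HRV modulation.
n_beats = 600
rr = (1.0 + 0.05 * np.sin(2 * np.pi * 0.1 * np.arange(n_beats))
      + rng.normal(0, 0.01, n_beats))        # RR intervals, seconds
t = np.cumsum(rr)                            # nonuniform sample times

# PSD of the mean-removed RR series on a 0.01-0.5 Hz grid, no interpolation.
freqs_hz = np.linspace(0.01, 0.5, 500)
pgram = lombscargle(t, rr - rr.mean(), 2 * np.pi * freqs_hz)
print(f"peak at {freqs_hz[pgram.argmax()]:.3f} Hz")   # ~0.1 Hz expected
```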

  13. Satellite-based Analysis of CO Variability over the Amazon Basin

    Science.gov (United States)

    Deeter, M. N.; Emmons, L. K.; Martinez-Alonso, S.; Tilmes, S.; Wiedinmyer, C.

    2017-12-01

    Pyrogenic emissions from the Amazon Basin exert significant influence on both climate and air quality but are highly variable from year to year. The ability of models to simulate the impact of biomass burning emissions on downstream atmospheric concentrations depends on (1) the quality of surface flux estimates (i.e., emissions inventories), (2) model dynamics (e.g., horizontal winds, large-scale convection and mixing) and (3) the representation of atmospheric chemical processes. With an atmospheric lifetime of a few months, carbon monoxide (CO) is a commonly used diagnostic for biomass burning. CO products are available from several satellite instruments and allow analyses of CO variability over extended regions such as the Amazon Basin with useful spatial and temporal sampling characteristics. The MOPITT ('Measurements of Pollution in the Troposphere') instrument was launched on the NASA Terra platform near the end of 1999 and is still operational. MOPITT is uniquely capable of measuring tropospheric CO concentrations using both thermal-infrared and near-infrared observations, resulting in the ability to independently retrieve lower- and upper-troposphere CO concentrations. We exploit the 18-year MOPITT record and related datasets to analyze the variability of CO over the Amazon Basin and evaluate simulations performed with the CAM-chem chemical transport model. We demonstrate that observed differences between MOPITT observations and model simulations provide important clues regarding emissions inventories, convective mixing and long-range transport.

  14. Estimating decadal variability in sea level from tide gauge records: An application to the North Sea

    NARCIS (Netherlands)

    Frederikse, Thomas; Riva, R.E.M.; Slobbe, Cornelis; Broerse, D.B.T.; Verlaan, Martin

    2016-01-01

One of the primary observational data sets of sea level is represented by the tide gauge record. We propose a new method to estimate variability on decadal time scales from tide gauge data by using a state space formulation, which couples the direct observations to a predefined state space model by using a Kalman filter.

  15. Estimating decadal variability in sea level from tide gauge records : An application to the North Sea

    NARCIS (Netherlands)

    Frederikse, T.; Riva, R.E.M.; Slobbe, D.C.; Broerse, D.B.T.; Verlaan, M.

    2016-01-01

One of the primary observational data sets of sea level is represented by the tide gauge record. We propose a new method to estimate variability on decadal time scales from tide gauge data by using a state space formulation, which couples the direct observations to a predefined state space model by using a Kalman filter.

  16. A Systematic Review of Statistical Methods Used to Test for Reliability of Medical Instruments Measuring Continuous Variables

    Directory of Open Access Journals (Sweden)

    Rafdzah Zaki

    2013-06-01

Objective(s): Reliability measures precision, or the extent to which test results can be replicated. This is the first systematic review to identify statistical methods used to measure the reliability of equipment measuring continuous variables. This study also aims to highlight inappropriate statistical methods used in reliability analyses and their implications for medical practice. Materials and Methods: In 2010, five electronic databases were searched for reliability studies published between 2007 and 2009. A total of 5,795 titles were initially identified. Only 282 titles were potentially related, and 42 finally fitted the inclusion criteria. Results: The intra-class correlation coefficient (ICC) is the most popular method, used in 25 (60%) studies, followed by comparison of means (8 studies, or 19%). Of the 25 studies using the ICC, only 7 (28%) reported the confidence intervals and the type of ICC used. Most studies (71%) also tested the agreement of instruments. Conclusion: This study finds that the intra-class correlation coefficient is the most popular method used to assess the reliability of medical instruments measuring continuous outcomes. There are also inappropriate applications and interpretations of statistical methods in some studies. It is important for medical researchers to be aware of this issue and to correctly perform analyses in reliability studies.
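
For concreteness, one common ICC variant, ICC(2,1) in the Shrout and Fleiss convention, can be computed directly from two-way ANOVA mean squares; a minimal sketch with synthetic paired-instrument data (which specific ICC form a given study should report depends on its design):

```python
import numpy as np

def icc_2_1(ratings):
    """Two-way random effects, absolute agreement, single-rater ICC(2,1).

    ratings: (n subjects x k raters/instruments) array, computed from the
    standard two-way ANOVA mean squares (Shrout & Fleiss convention).
    """
    n, k = ratings.shape
    grand = ratings.mean()
    ms_r = k * ((ratings.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # subjects
    ms_c = n * ((ratings.mean(axis=0) - grand) ** 2).sum() / (k - 1)   # raters
    sse = ((ratings - ratings.mean(axis=1, keepdims=True)
            - ratings.mean(axis=0, keepdims=True) + grand) ** 2).sum()
    ms_e = sse / ((n - 1) * (k - 1))
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

# Two instruments measuring the same 50 subjects (synthetic example).
rng = np.random.default_rng(6)
truth = rng.normal(100, 15, 50)
ratings = np.column_stack([truth + rng.normal(0, 5, 50),
                           truth + 2 + rng.normal(0, 5, 50)])  # small bias
print(f"ICC(2,1) = {icc_2_1(ratings):.2f}")
```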

  17. Estimating Search Engine Index Size Variability

    DEFF Research Database (Denmark)

    Van den Bosch, Antal; Bogers, Toine; De Kunder, Maurice

    2016-01-01

One of the determining factors of the quality of Web search engines is the size of their index. In addition to its influence on search result quality, the size of the indexed Web can also tell us something about which parts of the WWW are directly accessible to the everyday user. We propose a novel method of estimating the size of a Web search engine's index by extrapolating from document frequencies of words observed in a large static corpus of Web pages. In addition, we provide a unique longitudinal perspective on the size of Google and Bing's indices over a nine-year period, from March 2006 until January 2015. We find that index size estimates of these two search engines tend to vary dramatically over time, with Google generally possessing a larger index than Bing. This result raises doubts about the reliability of previous one-off estimates of the size of the indexed Web.

  18. Mode choice endogeneity in value of travel time estimation

    DEFF Research Database (Denmark)

    Mabit, Stefan Lindhard; Fosgerau, Mogens

The current way to estimate the value of travel time is to use a mode-specific sample and hence to estimate mode-specific values of travel time. This approach raises certain questions concerning how to generalise the values to a population. A problem arises if there is an uncontrolled sample selection mechanism, which is the case if there is correlation between mode choice and the value of travel time that is not controlled for by explanatory variables. What could confound the estimated values is the difficulty of separating mode effects from user effects; an example would be the effect of income. To investigate mode choice endogeneity in value of travel time estimation, we use a stated choice dataset. These data include binary choices within mode for car and bus. The first approach is to use a probit model of mode choice with instruments and then use this in the estimation of the value of travel time. A second, alternative approach is also investigated.

  19. Radar rainfall estimation of stratiform winter precipitation in the Belgian Ardennes

    Science.gov (United States)

    Hazenberg, P.; Leijnse, H.; Uijlenhoet, R.

    2011-02-01

    Radars are known for their ability to obtain a wealth of information about spatial storm field characteristics. Unfortunately, rainfall estimates obtained by this instrument are known to be affected by multiple sources of error. Especially for stratiform precipitation systems, the quality of radar rainfall estimates starts to decrease at relatively close ranges. In the current study, the hydrological potential of weather radar is analyzed during a winter half-year for the hilly region of the Belgian Ardennes. A correction algorithm is proposed which corrects the radar data for errors related to attenuation, ground clutter, anomalous propagation, the vertical profile of reflectivity (VPR), and advection. No final bias correction with respect to rain gauge data was implemented because such an adjustment would not add to a better understanding of the quality of the radar data. The impact of the different corrections is assessed using rainfall information sampled by 42 hourly rain gauges. The largest improvement in the quality of the radar data is obtained by correcting for ground clutter. The impact of VPR correction and advection depends on the spatial variability and velocity of the precipitation system. Overall during the winter period, the radar underestimates the amount of precipitation as compared to the rain gauges. Remaining differences between both instruments can be attributed to spatial and temporal variability in the type of precipitation, which has not been taken into account.

  20. Unit-specific calibration of Actigraph accelerometers in a mechanical setup - is it worth the effort? The effect on random output variation caused by technical inter-instrument variability in the laboratory and in the field

    DEFF Research Database (Denmark)

    Moeller, Niels C; Korsholm, Lars; Kristensen, Peter L

    2008-01-01

BACKGROUND: Potentially, unit-specific in-vitro calibration of accelerometers could increase field data quality and study power. However, reduced inter-unit variability would only be important if random instrument variability contributes considerably to the total variation in field data. Therefore, this study examined the effect of unit-specific calibration on random output variation caused by technical inter-instrument variability, both in the laboratory and in the field.

  1. Reliability of Instruments Measuring At-Risk and Problem Gambling Among Young Individuals

    DEFF Research Database (Denmark)

    Edgren, Robert; Castrén, Sari; Mäkelä, Marjukka

    2016-01-01

This review aims to clarify which instruments measuring at-risk and problem gambling (ARPG) among youth are reliable and valid in light of reported estimates of internal consistency, classification accuracy, and psychometric properties. A systematic search was conducted in PubMed, Medline, and PsycInfo covering the years 2009-2015. In total, 50 original research articles fulfilled the inclusion criteria: target age under 29 years, using an instrument designed for youth, and reporting a reliability estimate. Articles were evaluated with the revised Quality Assessment of Diagnostic Accuracy Studies tool. Reliability estimates were reported for five ARPG instruments. Most studies (66%) evaluated the South Oaks Gambling Screen Revised for Adolescents. The Gambling Addictive Behavior Scale for Adolescents was the only novel instrument. In general, the evaluation of instrument reliability was superficial.

  2. Estimating elasticity for residential electricity demand in China.

    Science.gov (United States)

    Shi, G; Zheng, X; Song, F

    2012-01-01

    Residential demand for electricity is estimated for China using a unique household level dataset. Household electricity demand is specified as a function of local electricity price, household income, and a number of social-economic variables at household level. We find that the residential demand for electricity responds rather sensitively to its own price in China, which implies that there is significant potential to use the price instrument to conserve electricity consumption. Electricity elasticities across different heterogeneous household groups (e.g., rich versus poor and rural versus urban) are also estimated. The results show that the high income group is more price elastic than the low income group, while rural families are more price elastic than urban families. These results have important policy implications for designing an increasing block tariff.
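
    A minimal sketch of the constant-elasticity specification implied above, with simulated data standing in for the household survey (the paper's model includes additional socio-economic controls):

```python
# Log-log demand regression: the coefficient on log price is read directly
# as the own-price elasticity. Data are simulated for illustration.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
log_price = rng.normal(0.0, 0.2, n)
log_income = rng.normal(10.0, 0.5, n)
log_kwh = 2.0 - 0.6 * log_price + 0.3 * log_income + rng.normal(0, 0.1, n)

X = sm.add_constant(np.column_stack([log_price, log_income]))
fit = sm.OLS(log_kwh, X).fit()
print(fit.params)  # second entry ~ -0.6 = own-price elasticity
```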

  3. Surveillance instrumentation for spent-fuel safeguards

    International Nuclear Information System (INIS)

    McKenzie, J.M.; Holmes, J.P.; Gillman, L.K.; Schmitz, J.A.; McDaniel, P.J.

    1978-01-01

    The movement, in a facility, of spent reactor fuel may be tracked using simple instrumentation together with a real time unfolding algorithm. Experimental measurements, from multiple radiation monitors and crane weight and position monitors, were obtained during spent fuel movements at the G.E. Morris Spent-Fuel Storage Facility. These data and a preliminary version of an unfolding algorithm were used to estimate the position of the centroid and the magnitude of the spent fuel radiation source. Spatial location was estimated to +-1.5 m and source magnitude to +-10% of their true values. Application of this surveillance instrumentation to spent-fuel safeguards is discussed
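
    The unfolding step can be illustrated, under an assumed inverse-square detector response and a made-up monitor geometry (neither is taken from the facility's actual algorithm), as a small nonlinear least-squares problem for the source centroid and magnitude:

```python
# Illustrative source unfolding: recover a source position and magnitude
# from several monitor readings, assuming an inverse-square response.
# Monitor layout, readings, and the response model are all assumptions.
import numpy as np
from scipy.optimize import least_squares

monitors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
true_src, true_mag = np.array([4.0, 6.0]), 500.0
readings = true_mag / np.sum((monitors - true_src) ** 2, axis=1)

def residuals(p):
    src, mag = p[:2], p[2]
    pred = mag / np.sum((monitors - src) ** 2, axis=1)
    return pred - readings

sol = least_squares(residuals, x0=[5.0, 5.0, 100.0])
print(sol.x)  # recovers [4, 6, 500]
```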

  4. Natural radioactivity of bedrock bath instruments and hot spring instruments in Japan

    International Nuclear Information System (INIS)

    Kazuki Iwaoka; Hiroyuki Tabe; Hidenori Yonehara

    2013-01-01

    In Japan, bedrock bath instruments and hot spring instruments that contain natural radioactive nuclides are commercially available. In this study, such instruments containing natural radioactive nuclides, currently distributed in Japan, were collected and the radioactivity concentrations of the ²³⁸U series, the ²³²Th series, and ⁴⁰K in them were determined by gamma-ray spectrum analyses. Effective doses to workers and general consumers handling the materials were estimated; the radioactivity concentrations of the ²³⁸U series, the ²³²Th series, and ⁴⁰K were found to be lower than the critical values given in the IAEA Safety Guide. The maximum effective doses to workers and general consumers were 210 and 6.1 μSv y⁻¹, respectively. These values are lower than the intervention exemption level (1,000 μSv y⁻¹) given in ICRP Publ. 82. (author)

  5. Testing a statistical method of global mean paleotemperature estimations in a long climate simulation

    Energy Technology Data Exchange (ETDEWEB)

    Zorita, E.; Gonzalez-Rouco, F. [GKSS-Forschungszentrum Geesthacht GmbH (Germany). Inst. fuer Hydrophysik

    2001-07-01

    Current statistical methods of reconstructing the climate of the last centuries are based on statistical models linking climate observations (temperature, sea-level pressure) and proxy climate data (tree-ring chronologies, ice-core isotope concentrations, varved sediments, etc.). These models are calibrated in the instrumental period, and the longer time series of proxy data are then used to estimate the past evolution of the climate variables. Using such methods, the global mean temperature of the last 600 years has recently been estimated. In this work, this method of reconstruction is tested using data from a very long simulation with a climate model. This testing allows the errors of the estimations to be quantified as a function of the number of proxy records, and indicates the time scales at which the estimations are probably reliable. (orig.)

  6. NASA Instrument Cost Model for Explorer-Like Mission Instruments (NICM-E)

    Science.gov (United States)

    Habib-Agahi, Hamid; Fox, George; Mrozinski, Joe; Ball, Gary

    2013-01-01

    NICM-E is a cost estimating relationship that supplements the traditional NICM System Level CERs for instruments flown on NASA Explorer-like missions that have the following three characteristics: 1) fly on Class C missions, 2) major development led and performed by universities or research foundations, and 3) have significant level of inheritance.

  7. Predicting College Women's Career Plans: Instrumentality, Work, and Family

    Science.gov (United States)

    Savela, Alexandra E.; O'Brien, Karen M.

    2016-01-01

    This study examined how college women's instrumentality and expectations about combining work and family predicted early career development variables. Specifically, 177 undergraduate women completed measures of instrumentality (i.e., traits such as ambition, assertiveness, and risk taking), willingness to compromise career for family, anticipated…

  8. Comparison of Three Plot Selection Methods for Estimating Change in Temporally Variable, Spatially Clustered Populations.

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, William L. [Bonneville Power Administration, Portland, OR (US). Environment, Fish and Wildlife

    2001-07-01

    Monitoring population numbers is important for assessing trends and meeting various legislative mandates. However, sampling across time introduces a temporal aspect to survey design in addition to the spatial one. For instance, a sample that is initially representative may lose this attribute if there is a shift in numbers and/or spatial distribution in the underlying population that is not reflected in later sampled plots. Plot selection methods that account for this temporal variability will produce the best trend estimates. Consequently, I used simulation to compare bias and relative precision of estimates of population change among stratified and unstratified sampling designs based on permanent, temporary, and partial replacement plots under varying levels of spatial clustering, density, and temporal shifting of populations. Permanent plots produced more precise estimates of change than temporary plots across all factors. Further, permanent plots performed better than partial replacement plots except for high density (5 and 10 individuals per plot) and 25% - 50% shifts in the population. Stratified designs always produced less precise estimates of population change for all three plot selection methods, and often produced biased change estimates and greatly inflated variance estimates under sampling with partial replacement. Hence, stratification that remains fixed across time should be avoided when monitoring populations that are likely to exhibit large changes in numbers and/or spatial distribution during the study period. Key words: bias; change estimation; monitoring; permanent plots; relative precision; sampling with partial replacement; temporary plots.

  9. Microprocessor-based, on-line decision aid for resolving conflicting nuclear reactor instrumentation

    International Nuclear Information System (INIS)

    Alesso, H.P.

    1981-01-01

    We describe one design for a microprocessor-based, on-line decision aid for identifying and resolving false, conflicting, or misleading instrument indications resulting from certain systems interactions for a pressurized water reactor. The system processes sensor signals from groups of instruments that track together under nominal transient and certain accident conditions, and alarms when they do not track together. We examine multiple-casualty systems interactions and formulate a trial grouping of variables that track together under specified conditions. A two-of-three type redundancy check of key variables provides alarm and indication of conflicting information when one signal suddenly tracks in opposition due to a multiple casualty, instrument failure, and/or locally abnormal conditions. Since a vote count of two of three variables in conflict is inconclusive evidence, the system is not designed to provide tripping or corrective action, but improves the operator/instrument interface by providing additional and partially digested information
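
    A toy version of the two-of-three consistency check (thresholds and the exact criterion are illustrative placeholders, not the plant system's actual alarm logic):

```python
# Two-of-three consistency check in the spirit described above: flag the
# channel whose deviation from the other two exceeds a tolerance while
# those two agree. The tolerance value is an arbitrary placeholder.
def two_of_three_alarm(a: float, b: float, c: float, tol: float = 5.0):
    """Return the index (0, 1, 2) of a channel conflicting with the
    other two, or None if all three track together."""
    readings = [a, b, c]
    for i in range(3):
        others = [readings[j] for j in range(3) if j != i]
        if abs(others[0] - others[1]) <= tol and \
           min(abs(readings[i] - o) for o in others) > tol:
            return i          # two agree, one deviates -> flag it
    return None               # no clear 2-of-3 conflict

print(two_of_three_alarm(300.0, 302.0, 355.0))  # -> 2
```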

  10. Estimation of Paddy Rice Variables with a Modified Water Cloud Model and Improved Polarimetric Decomposition Using Multi-Temporal RADARSAT-2 Images

    Directory of Open Access Journals (Sweden)

    Zhi Yang

    2016-10-01

    Rice growth monitoring is very important as rice is one of the staple crops of the world. Rice variables, as quantitative indicators of rice growth, are critical for farming management and yield estimation, and synthetic aperture radar (SAR) has great advantages for monitoring rice variables due to its all-weather observation capability. In this study, eight temporal RADARSAT-2 full-polarimetric SAR images were acquired during the rice growth cycle and a modified water cloud model (MWCM) was proposed, in which the heterogeneity of the rice canopy in the horizontal direction and its phenological changes were considered, and the double-bounce scattering between the rice canopy and the underlying surface was taken into account for the first time. Then, three scattering components from an improved polarimetric decomposition were coupled with the MWCM, instead of the backscattering coefficients. Using a genetic algorithm, eight rice variables were estimated, such as the leaf area index (LAI), rice height (h), and the fresh and dry biomass of ears (Fe and De). The accuracy validation showed that the MWCM was suitable for the estimation of rice variables during the whole growth season. The validation results showed that the MWCM could predict the temporal behaviour of the rice variables well during the growth cycle (R² > 0.8). Compared with the original water cloud model (WCM), the relative errors of rice variables with the MWCM were much smaller, especially in the vegetation phase (approximately 15% smaller). Finally, it was noted that the MWCM could, theoretically, be used in extensive applications, since its empirical coefficients were determined for general cases, but more applications of the MWCM are necessary in future work.

  11. Estimation of transmission mechanism of monetary policy in Serbia

    Directory of Open Access Journals (Sweden)

    Bungin Sanja

    2015-01-01

    The transmission mechanism of monetary policy has recently been the subject of several studies in Serbia. The so-called 'black box' of monetary policy is investigated with the aim of identifying the effects of transmission channels in an environment where the exchange rate plays a dominant role in central bank operations. It is therefore a challenge to approach this problem under an inflation-targeting regime, where the key interest rate is expected to prevail as the main policy instrument. The study employs an unrestricted vector autoregression (VAR) model to estimate the significance of the exchange rate and interest rate channels. As expected, the exchange rate has a far stronger influence on inflation, even though there are some signs that an interest rate channel exists. Introducing the Euribor as an endogenous variable in the VAR system displayed an important impact on real variables.

  12. Improving the Accuracy of Laplacian Estimation with Novel Variable Inter-Ring Distances Concentric Ring Electrodes

    Directory of Open Access Journals (Sweden)

    Oleksandr Makeyev

    2016-06-01

    Noninvasive concentric ring electrodes are a promising alternative to conventional disc electrodes. Currently, the superiority of tripolar concentric ring electrodes over disc electrodes, in particular, in accuracy of Laplacian estimation, has been demonstrated in a range of applications. In our recent work, we have shown that accuracy of Laplacian estimation can be improved with multipolar concentric ring electrodes using a general approach to estimation of the Laplacian for an (n + 1)-polar electrode with n rings using the (4n + 1)-point method for n ≥ 2. This paper takes the next step toward further improving the Laplacian estimate by proposing novel variable inter-ring distances concentric ring electrodes. Derived using a modified (4n + 1)-point method, linearly increasing and decreasing inter-ring distances tripolar (n = 2) and quadripolar (n = 3) electrode configurations are compared to their constant inter-ring distances counterparts. Finite element method modeling and analytic results are consistent and suggest that increasing inter-ring distances electrode configurations may decrease the truncation error resulting in more accurate Laplacian estimates compared to respective constant inter-ring distances configurations. For currently used tripolar electrode configuration, the truncation error may be decreased more than two-fold, while for the quadripolar configuration more than a six-fold decrease is expected.

  13. Improving the Accuracy of Laplacian Estimation with Novel Variable Inter-Ring Distances Concentric Ring Electrodes

    Science.gov (United States)

    Makeyev, Oleksandr; Besio, Walter G.

    2016-01-01

    Noninvasive concentric ring electrodes are a promising alternative to conventional disc electrodes. Currently, the superiority of tripolar concentric ring electrodes over disc electrodes, in particular, in accuracy of Laplacian estimation, has been demonstrated in a range of applications. In our recent work, we have shown that accuracy of Laplacian estimation can be improved with multipolar concentric ring electrodes using a general approach to estimation of the Laplacian for an (n + 1)-polar electrode with n rings using the (4n + 1)-point method for n ≥ 2. This paper takes the next step toward further improving the Laplacian estimate by proposing novel variable inter-ring distances concentric ring electrodes. Derived using a modified (4n + 1)-point method, linearly increasing and decreasing inter-ring distances tripolar (n = 2) and quadripolar (n = 3) electrode configurations are compared to their constant inter-ring distances counterparts. Finite element method modeling and analytic results are consistent and suggest that increasing inter-ring distances electrode configurations may decrease the truncation error resulting in more accurate Laplacian estimates compared to respective constant inter-ring distances configurations. For currently used tripolar electrode configuration, the truncation error may be decreased more than two-fold, while for the quadripolar configuration more than a six-fold decrease is expected. PMID:27294933
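
    For the tripolar case (n = 2), the estimate that the (4n + 1)-point method generalizes can be written down directly: ring averages satisfy v̄(r) = v0 + (r²/4)∇²v + O(r⁴), so the combination 16(v̄(r) − v0) − (v̄(2r) − v0) cancels the fourth-order term. The sketch below verifies this on a field with a known constant Laplacian; it is a numerical illustration of the principle, not a model of the electrode hardware.

```python
# Tripolar (n = 2) Laplacian estimate from ring averages at radii r and 2r:
# lap(v) ~ [16*(v_r - v_0) - (v_2r - v_0)] / (3 * r^2).
# Checked on a test field whose Laplacian is 8 everywhere.
import numpy as np

def ring_average(f, x0, y0, radius, n_samples=720):
    theta = np.linspace(0.0, 2.0 * np.pi, n_samples, endpoint=False)
    return np.mean(f(x0 + radius * np.cos(theta), y0 + radius * np.sin(theta)))

def tripolar_laplacian(f, x0, y0, r):
    v0 = f(x0, y0)
    v_r, v_2r = ring_average(f, x0, y0, r), ring_average(f, x0, y0, 2 * r)
    return (16.0 * (v_r - v0) - (v_2r - v0)) / (3.0 * r ** 2)

f = lambda x, y: x ** 2 + 3.0 * y ** 2     # Laplacian = 2 + 6 = 8 everywhere
print(tripolar_laplacian(f, 0.3, -0.2, r=0.05))  # ~ 8.0
```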

  14. Estimating the association between metabolic risk factors and marijuana use in U.S. adults using data from the continuous National Health and Nutrition Examination Survey.

    Science.gov (United States)

    Thompson, Christin Ann; Hay, Joel W

    2015-07-01

    More research is needed on the health effects of marijuana use. Results of previous studies indicate that marijuana could alleviate certain factors of metabolic syndrome, such as obesity. Data on 6281 persons from National Health and Nutrition Examination Survey from 2005 to 2012 were used to estimate the effect of marijuana use on cardiometabolic risk factors. The reliability of ordinary least squares (OLS) regression models was tested by replacing marijuana use as the risk factor of interest with alcohol and carbohydrate consumption. Instrumental variable methods were used to account for the potential endogeneity of marijuana use. OLS models show lower fasting insulin, insulin resistance, body mass index, and waist circumference in users compared with nonusers. However, when alcohol and carbohydrate intake substitute for marijuana use in OLS models, similar metabolic benefits are estimated. The Durbin-Wu-Hausman tests provide evidence of endogeneity of marijuana use in OLS models, but instrumental variables models do not yield significant estimates for marijuana use. These findings challenge the robustness of OLS estimates of a positive relationship between marijuana use and fasting insulin, insulin resistance, body mass index, and waist circumference. Copyright © 2015 Elsevier Inc. All rights reserved.
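
    Schematically, the IV correction referred to above is two-stage least squares. In this hypothetical sketch the instrument, effect sizes, and data are invented; the contrast between the OLS and IV slopes mirrors the kind of discrepancy the Durbin-Wu-Hausman test probes.

```python
# Two-stage least squares on simulated data with an endogenous exposure:
# OLS is biased by the confounder u, the IV slope recovers the true -0.5.
# The instrument and all effect sizes are invented for illustration.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 3000
z = rng.normal(size=n)                     # instrument (e.g., policy exposure)
u = rng.normal(size=n)                     # confounder
use = 0.7 * z + u + rng.normal(size=n)     # endogenous exposure
bmi = 27.0 - 0.5 * use + 1.5 * u + rng.normal(size=n)

x_hat = sm.OLS(use, sm.add_constant(z)).fit().fittedvalues  # stage 1
iv_fit = sm.OLS(bmi, sm.add_constant(x_hat)).fit()          # stage 2
ols_fit = sm.OLS(bmi, sm.add_constant(use)).fit()
print("OLS slope:", ols_fit.params[1], "IV slope:", iv_fit.params[1])
```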

  15. Estimation of Chinese surface NO2 concentrations combining satellite data and Land Use Regression

    Science.gov (United States)

    Anand, J.; Monks, P.

    2016-12-01

    Monitoring surface-level air quality is often limited by in-situ instrument placement and issues arising from harmonisation over long timescales. Satellite instruments can offer a synoptic view of regional pollution sources, but in many cases only a total or tropospheric column can be measured. In this work a new technique of estimating surface NO2 combining both satellite and in-situ data is presented, in which a Land Use Regression (LUR) model is used to create high resolution pollution maps based on known predictor variables such as population density, road networks, and land cover. By employing a mixed effects approach, it is possible to take advantage of the spatiotemporal variability in the satellite-derived column densities to account for daily and regional variations in surface NO2 caused by factors such as temperature, elevation, and wind advection. In this work, surface NO2 maps are modelled over the North China Plain and Pearl River Delta during high-pollution episodes by combining in-situ measurements and tropospheric columns from the Ozone Monitoring Instrument (OMI). The modelled concentrations show good agreement with in-situ data and surface NO2 concentrations derived from the MACC-II global reanalysis.
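
    A hedged sketch of the mixed-effects LUR idea: land-use predictors enter as fixed effects and a per-day random intercept absorbs the daily variation that the satellite columns help constrain. Predictor names and data are synthetic, not the study's.

```python
# Mixed-effects land-use regression sketch using statsmodels MixedLM:
# fixed effects for road and population density, random intercept per day.
# All data are simulated placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n_sites, n_days = 40, 25
df = pd.DataFrame({
    "road_density": np.tile(rng.uniform(0, 1, n_sites), n_days),
    "pop_density": np.tile(rng.uniform(0, 1, n_sites), n_days),
    "day": np.repeat(np.arange(n_days), n_sites),
})
day_effect = rng.normal(0, 2, n_days)[df["day"]]
df["no2"] = 10 + 8 * df["road_density"] + 5 * df["pop_density"] \
            + day_effect + rng.normal(0, 1, len(df))

X = sm.add_constant(df[["road_density", "pop_density"]])
model = sm.MixedLM(df["no2"], X, groups=df["day"]).fit()
print(model.params)
```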

  16. Assessing Mucoadhesion in Polymer Gels: The Effect of Method Type and Instrument Variables

    Directory of Open Access Journals (Sweden)

    Jéssica Bassi da Silva

    2018-03-01

    The process of mucoadhesion has been widely studied using a wide variety of methods, which are influenced by instrumental variables and experimental design, making the comparison between the results of different studies difficult. The aim of this work was to standardize the conditions of the detachment test and the rheological methods of mucoadhesion assessment for semisolids, and to introduce a texture profile analysis (TPA) method. A factorial design was developed to suggest standard conditions for performing the detachment force method. To evaluate the method, binary polymeric systems were prepared containing poloxamer 407 and Carbopol 971P®, Carbopol 974P®, or Noveon® Polycarbophil. The mucoadhesion of the systems was evaluated, and the reproducibility of these measurements investigated. The detachment force method was demonstrated to be reproducible, and gave different adhesion results when a mucin disk or ex vivo oral mucosa was used. The factorial design demonstrated that all evaluated parameters had an effect on measurements of mucoadhesive force, but the same was not observed for the work of adhesion. It is suggested that the work of adhesion is a more appropriate metric for evaluating mucoadhesion. Oscillatory rheology was more capable of investigating adhesive interactions than flow rheology. The TPA method was demonstrated to be reproducible and can evaluate the adhesiveness interaction parameter. This investigation demonstrates the need for standardized methods to evaluate mucoadhesion and makes suggestions for a standard study design.

  17. An Attempt To Estimate The Contribution Of Variability Of Wetland Extent On The Variability Of The Atmospheric Methane Growth Rate In The Years 1993-2000.

    Science.gov (United States)

    Ringeval, B.; de Noblet-Ducoudre, N.; Prigent, C.; Bousquet, P.

    2006-12-01

    The atmospheric methane growth rate exhibits considerable seasonal and year-to-year variation. Large uncertainties still exist in the relative contributions of different sources and sinks to these variations. In this study, we consider the main natural source of methane, and the one presumed most variable, i.e. wetlands, and try to simulate the variations of their emissions as driven by the variability of wetland extent and of climate. We use the methane emission model of Walter et al. (2001) and the quantification of flooded areas for the years 1993-2000 obtained from a suite of satellite observations by Prigent et al. (2001). The inputs required by Walter's model are obtained from simulations of the dynamic global vegetation model ORCHIDEE (Krinner et al. (2005)) constrained by the NCC climate data (Ngo-Duc et al. (2005)), after imposing a water-saturated soil to approximate the productivity of wetlands. We calculate global annual methane emissions from wetlands to be 400 Tg per year, which is higher than previous results obtained with fixed wetland extent. Simulations were carried out to estimate the part of the variability in emissions explained by the variability of wetland extent. The year-to-year emission variability appears to be mainly explained by the interannual variability of wetland extent. The seasonal variability is explained to about 75% in the tropics, but only to about 40% north of 30°N, by the variability of wetland extent. Finally, we compare the results with the top-down approach of Bousquet et al. (2006).

  18. Nuclear instrumentation

    International Nuclear Information System (INIS)

    Weill, Jacky; Fabre, Rene.

    1981-01-01

    This article sums up the research and development effort at present being carried out in the five following fields of application: health physics and radio-prospection, control of nuclear reactors, plant control (preparation and reprocessing of the fuel, testing of nuclear substances, etc.), research laboratory instrumentation, and detectors. It also situates French industrial activities by means of an estimate of the French market, production, and flow of trading with other countries [fr]

  19. Distributions of component failure rates, estimated from LER data

    International Nuclear Information System (INIS)

    Atwood, C.L.

    1985-01-01

    Past analyses of Licensee Event Report (LER) data have noted that component failure rates vary from plant to plant, and have estimated the distributions by two-parameter γ distributions. In this study, a more complicated distributional form is considered, a mixture of γs. This could arise if the plants' failure rates cluster into distinct groups. The method was applied to selected published LER data for diesel generators, pumps, valves, and instrumentation and control assemblies. The improved fits from using a mixture rather than a single γ distribution were minimal, and not statistically significant. There seem to be two possibilities: either explanatory variables affect the failure rates only in a gradual way, not a qualitative way; or, for estimating individual component failure rates, the published LER data have been analyzed to the limit of resolution
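
    The single-gamma versus gamma-mixture comparison can be prototyped as below. The EM M-step here uses weighted moment matching rather than full maximum likelihood, so this is a cruder stand-in for the study's fitting procedure, on invented data rather than LER records.

```python
# Sketch: single gamma vs. two-component gamma mixture for plant-to-plant
# failure-rate variability. M-steps use weighted moment matching (cruder
# than full MLE); the simulated "rates" are invented, not LER data.
import numpy as np
from scipy import stats

rates = np.concatenate([
    stats.gamma.rvs(2.0, scale=0.5, size=80, random_state=5),
    stats.gamma.rvs(8.0, scale=0.5, size=40, random_state=6),
])

def mom_gamma(x, w=None):
    m = np.average(x, weights=w)
    v = np.average((x - m) ** 2, weights=w)
    return m * m / v, v / m                      # (shape, scale)

k, th = mom_gamma(rates)                         # single-gamma fit
ll_single = stats.gamma.logpdf(rates, k, scale=th).sum()

lo = rates < np.median(rates)                    # crude initialization
(k1, t1), (k2, t2), pi = mom_gamma(rates[lo]), mom_gamma(rates[~lo]), 0.5
for _ in range(200):                             # EM iterations
    p1 = pi * stats.gamma.pdf(rates, k1, scale=t1)
    p2 = (1.0 - pi) * stats.gamma.pdf(rates, k2, scale=t2)
    r1 = p1 / (p1 + p2)                          # E-step responsibilities
    pi = r1.mean()                               # M-step
    (k1, t1), (k2, t2) = mom_gamma(rates, r1), mom_gamma(rates, 1.0 - r1)

ll_mix = np.log(pi * stats.gamma.pdf(rates, k1, scale=t1)
                + (1.0 - pi) * stats.gamma.pdf(rates, k2, scale=t2)).sum()
print(ll_single, ll_mix)                         # compare fits (e.g., via AIC)
```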

  20. [A method to estimate the short-term fractal dimension of heart rate variability based on wavelet transform].

    Science.gov (United States)

    Zhonggang, Liang; Hong, Yan

    2006-10-01

    A new method of calculating the fractal dimension of short-term heart rate variability (HRV) signals is presented. The method is based on the wavelet transform and filter banks. The implementation of the method is as follows: first, the fractal component is extracted from the HRV signal using the wavelet transform. Next, the power spectrum distribution of the fractal component is estimated using an auto-regressive model, and the spectral exponent γ is estimated using the least squares method. Finally, the fractal dimension of the HRV signal is estimated according to the formula D = 2 − (γ − 1)/2. To validate the stability and reliability of the proposed method, 24 fractal signals with a fractal value of 1.6 were simulated using fractional Brownian motion; the results show that the method is stable and reliable.
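
    A simplified stand-in for that pipeline replaces the wavelet separation and AR spectrum with a plain Welch periodogram: estimate the spectral exponent γ from the log-log slope and map it to D = 2 − (γ − 1)/2. The data are a synthetic 1/f^γ process, not real HRV.

```python
# Estimate a 1/f^gamma spectral exponent by a least-squares line fit to the
# log-log Welch spectrum, then map it to a fractal dimension with
# D = 2 - (gamma - 1)/2. Synthetic data; a cruder stand-in for the
# wavelet + AR pipeline described in the record above.
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(7)
gamma_true, n = 1.2, 4096
# Synthesize a 1/f^gamma process by shaping white noise in frequency space
spec = rng.normal(size=n // 2 + 1) + 1j * rng.normal(size=n // 2 + 1)
freqs = np.fft.rfftfreq(n, d=1.0)
spec[1:] *= freqs[1:] ** (-gamma_true / 2.0)   # PSD ~ f^-gamma
spec[0] = 0.0
x = np.fft.irfft(spec, n)

f, pxx = welch(x, nperseg=1024)
mask = (f > 0.005) & (f < 0.2)                 # fit within a scaling band
slope, _ = np.polyfit(np.log(f[mask]), np.log(pxx[mask]), 1)
gamma_hat = -slope
print("gamma:", gamma_hat, "D:", 2.0 - (gamma_hat - 1.0) / 2.0)
```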

  1. Measuring Instrument Constructs of Return Factors for Green Office Building Investments Variables Using Rasch Measurement Model

    Directory of Open Access Journals (Sweden)

    Isa Mona

    2016-01-01

    This paper is a preliminary study on rationalising green office building investments in Malaysia. Its aim is to introduce the application of Rasch measurement model analysis to determine the validity and reliability of each construct in the questionnaire. To achieve this objective, a questionnaire survey consisting of six sections was developed, and a total of 106 responses were received from various investors who own and lease office buildings in Kuala Lumpur. The Rasch measurement analysis is used to check the quality of the item constructs in the instrument by measuring the specific objectivity within the same dimension, to reduce ambiguous measures, and to obtain a realistic estimation of precision and implicit quality. The Rasch analysis consists of the summary statistics, item unidimensionality, and item measures. The results show that item and respondent (person) reliability are 0.91 and 0.95, respectively.

  2. Providing nuclear reactor control information in the presence of instrument failures

    International Nuclear Information System (INIS)

    Tylee, J.L.; Purviance, J.E.

    1986-01-01

    A technique for using unfailed instrument outputs to generate optimal estimates of failed sensor outputs is presented and evaluated. The technique uses a bank of discrete, linear Kalman filters, each dedicated to one instrument, and a combinatory logic to perform the output estimation. The technique is tested using measurement data from a university research reactor

  3. Effect of input data variability on estimations of the equivalent constant temperature time for microbial inactivation by HTST and retort thermal processing.

    Science.gov (United States)

    Salgado, Diana; Torres, J Antonio; Welti-Chanes, Jorge; Velazquez, Gonzalo

    2011-08-01

    Consumer demand for food safety and quality improvements, combined with new regulations, requires determining the processor's confidence level that processes lowering safety risks while retaining quality will meet consumer expectations and regulatory requirements. Monte Carlo calculation procedures incorporate input data variability to obtain the statistical distribution of the output of prediction models. This advantage was used to analyze the survival risk of Mycobacterium avium subspecies paratuberculosis (M. paratuberculosis) and Clostridium botulinum spores in high-temperature short-time (HTST) milk and canned mushrooms, respectively. The results showed an estimated 68.4% probability that the 15 sec HTST process would not achieve at least 5 decimal reductions in M. paratuberculosis counts. Although estimates of the raw milk load of this pathogen are not available to estimate the probability of finding it in pasteurized milk, the wide range of the estimated decimal reductions, reflecting the variability of the experimental data available, should be a concern to dairy processors. Knowledge of the C. botulinum initial load and decimal thermal time variability was used to estimate an 8.5 min thermal process time at 110 °C for canned mushrooms reducing the risk to 10⁻⁹ spores/container with a 95% confidence. This value was substantially higher than the one estimated using average values (6.0 min) with an unacceptable 68.6% probability of missing the desired processing objective. Finally, the benefit of reducing the variability in initial load and decimal thermal time was confirmed, achieving a 26.3% reduction in processing time when standard deviation values were lowered by 90%. In spite of novel technologies, commercialized or under development, thermal processing continues to be the most reliable and cost-effective alternative to deliver safe foods. However, the severity of the process should be assessed to avoid under- and over
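
    The Monte Carlo logic can be condensed to a few lines: sample the initial load and D-value from assumed distributions, convert each draw to a required process time via t = D · log10(N0/N_target), and read off a high percentile. All distribution parameters below are placeholders, not the paper's data.

```python
# Monte Carlo estimate of the thermal process time needed to reach a
# 10^-9 spores/container target with a chosen confidence. Distribution
# parameters are invented placeholders.
import numpy as np

rng = np.random.default_rng(8)
n = 100_000
log10_n0 = rng.normal(-1.0, 0.5, n)          # log10 initial spores / container
d110 = rng.normal(0.6, 0.08, n)              # D-value at 110 degC, minutes
d110 = np.clip(d110, 0.3, None)              # keep D-values physical

target = -9.0                                 # log10 spores per container
t_required = (log10_n0 - target) * d110      # t = D * (log10 N0 - log10 N)
print("mean:", t_required.mean())
print("95th percentile:", np.percentile(t_required, 95))
```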

  4. Using latent variable approach to estimate China's economy-wide energy rebound effect over 1954–2010

    International Nuclear Information System (INIS)

    Shao, Shuai; Huang, Tao; Yang, Lili

    2014-01-01

    The energy rebound effect has been a significant issue in China, which is undergoing economic transition, since it reflects the effectiveness of energy-saving policy relying on improved energy efficiency. Based on the IPAT equation and Brookes' explanation of the rebound effect, this paper develops an alternative estimation model of the rebound effect. By using the estimation model and latent variable approach, which is achieved through a time-varying coefficient state space model, we estimate China's economy-wide energy rebound effect over 1954–2010. The results show that the rebound effect evidently exists in China as a result of the annual average of 39.73% over 1954–2010. Before and after the implementation of China's reform and opening-up policy in 1978, the rebound effects are 47.24% and 37.32%, with a strong fluctuation and a circuitously downward trend, respectively, indicating that a stable political environment and the development of market economy system facilitate the effectiveness of energy-saving policy. Although the energy-saving effect of improving energy efficiency has been partly realised, there remains a large energy-saving potential in China. - Highlights: • We present an improved estimation methodology of economy-wide energy rebound effect. • We use the latent variable approach to estimate China's economy-wide rebound effect. • The rebound exists in China and varies before and after reform and opening-up. • After 1978, the average rebound is 37.32% with a circuitously downward trend. • Traditional Solow remainder method underestimates the rebound in most cases

  5. Estimation of power system variability due to wind power

    NARCIS (Netherlands)

    Papaefthymiou, G.; Verboomen, J.; Van der Sluis, L.

    2007-01-01

    The incorporation of wind power generation to the power system leads to an increase in the variability of the system power flows. The assessment of this variability is necessary for the planning of the necessary system reinforcements. For the assessment of this variability, the uncertainty in the

  6. Spatial and temporal variability of rainfall and their effects on hydrological response in urban areas - a review

    Science.gov (United States)

    Cristiano, Elena; ten Veldhuis, Marie-claire; van de Giesen, Nick

    2017-07-01

    In urban areas, hydrological processes are characterized by high variability in space and time, making them sensitive to small-scale temporal and spatial rainfall variability. In the last decades new instruments, techniques, and methods have been developed to capture rainfall and hydrological processes at high resolution. Weather radars have been introduced to estimate high spatial and temporal rainfall variability. At the same time, new models have been proposed to reproduce hydrological response, based on small-scale representation of urban catchment spatial variability. Despite these efforts, interactions between rainfall variability, catchment heterogeneity, and hydrological response remain poorly understood. This paper presents a review of our current understanding of hydrological processes in urban environments as reported in the literature, focusing on their spatial and temporal variability aspects. We review recent findings on the effects of rainfall variability on hydrological response and identify gaps where knowledge needs to be further developed to improve our understanding of and capability to predict urban hydrological response.

  7. Electrochemical state and internal variables estimation using a reduced-order physics-based model of a lithium-ion cell and an extended Kalman filter

    Energy Technology Data Exchange (ETDEWEB)

    Stetzel, KD; Aldrich, LL; Trimboli, MS; Plett, GL

    2015-03-15

    This paper addresses the problem of estimating the present value of electrochemical internal variables in a lithium-ion cell in real time, using readily available measurements of cell voltage, current, and temperature. The variables that can be estimated include any desired set of reaction flux and solid and electrolyte potentials and concentrations at any set of one-dimensional spatial locations, in addition to more standard quantities such as state of charge. The method uses an extended Kalman filter along with a one-dimensional physics-based reduced-order model of cell dynamics. Simulations show excellent and robust predictions having dependable error bounds for most internal variables. (C) 2014 Elsevier B.V. All rights reserved.
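
    Stripped to one state, the estimator structure looks as follows: a Coulomb-counting prediction step plus a measurement update that linearizes an assumed open-circuit-voltage curve. This is a deliberately minimal sketch of an extended Kalman filter, not the paper's reduced-order physics-based model; the OCV curve, cell parameters, and noise levels are invented.

```python
# Minimal one-state EKF for SOC with a nonlinear voltage measurement.
# OCV curve, capacity, resistance, and noise variances are assumptions.
import numpy as np

ocv = lambda s: 3.0 + 1.2 * s - 0.4 * s ** 2        # assumed OCV(SOC) curve
docv = lambda s: 1.2 - 0.8 * s                       # its derivative
Q, R0, dt = 3600.0 * 10.0, 0.05, 1.0                 # capacity (A s), ohmic R, step (s)
q_proc, r_meas = 1e-8, 1e-3                          # process / measurement variances

def ekf_step(s_hat, p, current, v_meas):
    s_pred = s_hat - current * dt / Q                # predict: Coulomb counting
    p_pred = p + q_proc
    h = docv(s_pred)                                 # linearize measurement
    v_pred = ocv(s_pred) - R0 * current
    k = p_pred * h / (h * p_pred * h + r_meas)       # Kalman gain
    return s_pred + k * (v_meas - v_pred), (1.0 - k * h) * p_pred

rng = np.random.default_rng(9)
s_true, s_hat, p = 0.9, 0.6, 0.1                     # start with a poor SOC guess
for _ in range(3600):                                # constant-current discharge
    i = 2.0                                          # amps
    s_true -= i * dt / Q
    v = ocv(s_true) - R0 * i + rng.normal(0, np.sqrt(r_meas))
    s_hat, p = ekf_step(s_hat, p, i, v)
print("true:", round(s_true, 3), "estimated:", round(s_hat, 3))
```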

  8. A Miniaturized Variable Pressure Scanning Electron Microscope (MVP-SEM) for the Surface of Mars: An Instrument for the Planetary Science Community

    Science.gov (United States)

    Edmunson, J.; Gaskin, J. A.; Danilatos, G.; Doloboff, I. J.; Effinger, M. R.; Harvey, R. P.; Jerman, G. A.; Klein-Schoder, R.; Mackie, W.; Magera, B.

    2016-01-01

    The Miniaturized Variable Pressure Scanning Electron Microscope (MVP-SEM) project, funded by the NASA Planetary Instrument Concepts for the Advancement of Solar System Observations (PICASSO) Research Opportunities in Space and Earth Science (ROSES) program, will build upon previous miniaturized SEM designs for lunar and International Space Station (ISS) applications and recent advancements in variable-pressure SEMs to design and build a SEM to complete analyses of samples on the surface of Mars using the atmosphere as an imaging medium. By the end of the PICASSO work, a prototype of the primary proof-of-concept components (i.e., the electron gun, focusing optics, and scanning system) will be assembled, and preliminary testing in a Mars analog chamber at the Jet Propulsion Laboratory will be completed to partially fulfill Technology Readiness Level 5 requirements for those components. The team plans to have Secondary Electron Imaging (SEI), Backscattered Electron (BSE) detection, and Energy Dispersive Spectroscopy (EDS) capabilities through the MVP-SEM.

  9. Penetrating eye injuries from writing instruments

    Directory of Open Access Journals (Sweden)

    Kelly SP

    2011-12-01

    Simon P Kelly, Graham MB Reeves. The Royal Bolton Hospital, Bolton, UK. Purpose: To consider the potential for ocular injury from writing implements by presenting four such cases, and to consider the incidence of such eye injuries from analysis of a national trauma database. Methods: The Home and Leisure Accident Surveillance System was searched for records of eye injuries from writing instruments to provide UK estimates of such injuries. Four patients with penetrating ocular injury from pens or pencils (especially injuries caused by children), examined by the authors, are described to illustrate mechanisms of injury. Results: It is estimated that around 748 ocular pen injuries and 892 ocular pencil injuries of undetermined severity occurred annually in the UK during the database surveillance period 2000–2002. No eye injuries from swords, including toy swords and fencing foils, were reported. Conclusion: Ocular perforation sometimes occurs from writing instruments that are thrown in the community, especially by children. Implications for policy and prevention are discussed. Non-specialists should have a low threshold for referring patients with eye injuries if suspicious of ocular penetration, even where caused by everyday objects, such as writing instruments. Keywords: eye injury, eye, children, mechanism, writing instruments, prevention

  10. Geostatistical Analysis of Mesoscale Spatial Variability and Error in SeaWiFS and MODIS/Aqua Global Ocean Color Data

    Science.gov (United States)

    Glover, David M.; Doney, Scott C.; Oestreich, William K.; Tullo, Alisdair W.

    2018-01-01

    Mesoscale (10-300 km, weeks to months) physical variability strongly modulates the structure and dynamics of planktonic marine ecosystems via both turbulent advection and environmental impacts upon biological rates. Using structure function analysis (geostatistics), we quantify the mesoscale biological signals within global 13 year SeaWiFS (1998-2010) and 8 year MODIS/Aqua (2003-2010) chlorophyll a ocean color data (Level-3, 9 km resolution). We present geographical distributions, seasonality, and interannual variability of key geostatistical parameters: unresolved variability or noise, resolved variability, and spatial range. Resolved variability is nearly identical for both instruments, indicating that geostatistical techniques isolate a robust measure of biophysical mesoscale variability largely independent of measurement platform. In contrast, unresolved variability in MODIS/Aqua is substantially lower than in SeaWiFS, especially in oligotrophic waters where previous analysis identified a problem for the SeaWiFS instrument likely due to sensor noise characteristics. Both records exhibit a statistically significant relationship between resolved mesoscale variability and the low-pass filtered chlorophyll field horizontal gradient magnitude, consistent with physical stirring acting on large-scale gradient as an important factor supporting observed mesoscale variability. Comparable horizontal length scales for variability are found from tracer-based scaling arguments and geostatistical decorrelation. Regional variations between these length scales may reflect scale dependence of biological mechanisms that also create variability directly at the mesoscale, for example, enhanced net phytoplankton growth in coastal and frontal upwelling and convective mixing regions. Global estimates of mesoscale biophysical variability provide an improved basis for evaluating higher resolution, coupled ecosystem-ocean general circulation models, and data assimilation.
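
    The core geostatistical tool here is the empirical structure function γ(h) = ½ · mean[(z(x+h) − z(x))²]: its intercept plays the role of unresolved variability (noise), its plateau the resolved variability, and its decorrelation length the spatial range. A bare-bones 1-D version on synthetic data:

```python
# Empirical structure function (semivariogram) along one dimension.
# Synthetic correlated data with added noise stand in for chlorophyll.
import numpy as np

rng = np.random.default_rng(10)
n = 2000
# Correlated field: moving average of white noise (range ~ window size)
signal = np.convolve(rng.normal(size=n + 50), np.ones(50) / 50.0, mode="valid")
z = signal + rng.normal(0, 0.05, size=signal.size)   # add "instrument noise"

def semivariogram(z, max_lag):
    lags = np.arange(1, max_lag + 1)
    gamma = [0.5 * np.mean((z[h:] - z[:-h]) ** 2) for h in lags]
    return lags, np.array(gamma)

lags, gamma = semivariogram(z, 150)
print("nugget ~", gamma[0], "sill ~", gamma[-1])
```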

  11. 12 CFR 956.6 - Use of hedging instruments.

    Science.gov (United States)

    2010-01-01

    ... counterparty for over-the-counter derivative contracts shall include: (i) A requirement that market value... reasonable estimate of the market value of the over-the-counter derivative contract at termination (standard.... Derivative instruments that do not qualify as hedging instruments pursuant to GAAP may be used only if a non...

  12. Technical Note: Error metrics for estimating the accuracy of needle/instrument placement during transperineal magnetic resonance/ultrasound-guided prostate interventions.

    Science.gov (United States)

    Bonmati, Ester; Hu, Yipeng; Villarini, Barbara; Rodell, Rachael; Martin, Paul; Han, Lianghao; Donaldson, Ian; Ahmed, Hashim U; Moore, Caroline M; Emberton, Mark; Barratt, Dean C

    2018-04-01

    Image-guided systems that fuse magnetic resonance imaging (MRI) with three-dimensional (3D) ultrasound (US) images for performing targeted prostate needle biopsy and minimally invasive treatments for prostate cancer are of increasing clinical interest. To date, a wide range of different accuracy estimation procedures and error metrics have been reported, which makes comparing the performance of different systems difficult. A set of nine measures is presented to assess the accuracy of MRI-US image registration, needle positioning, needle guidance, and overall system error, with the aim of providing a methodology for estimating the accuracy of instrument placement using a MR/US-guided transperineal approach. Using the SmartTarget fusion system, an MRI-US image alignment error was determined to be 2.0 ± 1.0 mm (mean ± SD), and an overall system instrument targeting error of 3.0 ± 1.2 mm. Three needle deployments for each target phantom lesion were found to result in a 100% lesion hit rate and a median predicted cancer core length of 5.2 mm. The application of a comprehensive, unbiased validation assessment for MR/US guided systems can provide useful information on system performance for quality assurance and system comparison. Furthermore, such an analysis can be helpful in identifying relationships between these errors, providing insight into the technical behavior of these systems. © 2018 American Association of Physicists in Medicine.
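
    One of the simplest measures of this kind, a targeting error summarized as mean ± SD, reduces to a few lines (the coordinates below are fabricated for illustration):

```python
# Per-target placement error as the Euclidean distance between intended
# and achieved needle-tip positions, summarized as mean +/- SD (mm).
# Coordinates are fabricated for illustration.
import numpy as np

planned = np.array([[10.0, 42.0, 7.0], [12.5, 40.1, 9.8], [11.0, 44.2, 6.5]])
achieved = np.array([[11.2, 43.1, 8.1], [13.9, 41.0, 11.2], [12.4, 45.8, 7.9]])

tre = np.linalg.norm(achieved - planned, axis=1)   # per-target error (mm)
print(f"targeting error: {tre.mean():.1f} +/- {tre.std(ddof=1):.1f} mm")
```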

  13. Estimating the atmospheric concentration of Criegee intermediates and their possible interference in a FAGE-LIF instrument

    Science.gov (United States)

    Novelli, Anna; Hens, Korbinian; Tatum Ernest, Cheryl; Martinez, Monica; Nölscher, Anke C.; Sinha, Vinayak; Paasonen, Pauli; Petäjä, Tuukka; Sipilä, Mikko; Elste, Thomas; Plass-Dülmer, Christian; Phillips, Gavin J.; Kubistin, Dagmar; Williams, Jonathan; Vereecken, Luc; Lelieveld, Jos; Harder, Hartwig

    2017-06-01

    We analysed the extensive dataset from the HUMPPA-COPEC 2010 and the HOPE 2012 field campaigns in the boreal forest and rural environments of Finland and Germany, respectively, and estimated the abundance of stabilised Criegee intermediates (SCIs) in the lower troposphere. Based on laboratory tests, we propose that the background OH signal observed in our IPI-LIF-FAGE instrument during the aforementioned campaigns is caused at least partially by SCIs. This hypothesis is based on observed correlations with temperature and with concentrations of unsaturated volatile organic compounds and ozone. Just like SCIs, the background OH concentration can be removed through the addition of sulfur dioxide. SCIs also add to the previously underestimated production rate of sulfuric acid. An average estimate of the SCI concentration of ˜ 5.0 × 104 molecules cm-3 (with an order of magnitude uncertainty) is calculated for the two environments. This implies a very low ambient concentration of SCIs, though, over the boreal forest, significant for the conversion of SO2 into H2SO4. The large uncertainties in these calculations, owing to the many unknowns in the chemistry of Criegee intermediates, emphasise the need to better understand these processes and their potential effect on the self-cleaning capacity of the atmosphere.

  14. A critical appraisal of instruments to measure outcomes of interprofessional education.

    Science.gov (United States)

    Oates, Matthew; Davidson, Megan

    2015-04-01

    Interprofessional education (IPE) is believed to prepare health professional graduates for successful collaborative practice. A range of instruments have been developed to measure the outcomes of IPE. An understanding of the psychometric properties of these instruments is important if they are to be used to measure the effectiveness of IPE. This review set out to identify instruments available to measure outcomes of IPE and collaborative practice in pre-qualification health professional students and to critically appraise the psychometric properties of validity, responsiveness and reliability against contemporary standards for instrument design. Instruments were selected from a pool of extant instruments and subjected to critical appraisal to determine whether they satisfied inclusion criteria. The qualitative and psychometric attributes of the included instruments were appraised using a checklist developed for this review. Nine instruments were critically appraised, including the widely adopted Readiness for Interprofessional Learning Scale (RIPLS) and the Interdisciplinary Education Perception Scale (IEPS). Validity evidence for instruments was predominantly based on test content and internal structure. Ceiling effects and lack of scale width contribute to the inability of some instruments to detect change in variables of interest. Limited reliability data were reported for two instruments. Scale development and scoring protocols were generally reported by instrument developers, but the inconsistent application of scoring protocols for some instruments was apparent. A number of instruments have been developed to measure outcomes of IPE in pre-qualification health professional students. Based on reported validity evidence and reliability data, the psychometric integrity of these instruments is limited. The theoretical test construction paradigm on which instruments have been developed may be contributing to the failure of some instruments to detect change in

  15. Nitrogen concentration estimation with hyperspectral LiDAR

    Directory of Open Access Journals (Sweden)

    O. Nevalainen

    2013-10-01

    Agricultural lands have a strong impact on global carbon dynamics and nitrogen availability. Monitoring changes in agricultural lands requires more efficient and accurate methods. The first prototype of a full-waveform hyperspectral Light Detection and Ranging (LiDAR) instrument has been developed at the Finnish Geodetic Institute (FGI). The instrument efficiently combines the benefits of passive and active remote sensing sensors. It is able to produce 3D point clouds with spectral information included for every point, which offers great potential in the field of remote sensing of the environment. This study investigates the performance of the hyperspectral LiDAR instrument in nitrogen estimation. The investigation was conducted by finding vegetation indices sensitive to nitrogen concentration using hyperspectral LiDAR data and validating their performance in nitrogen estimation. The nitrogen estimation was performed by applying 28 published vegetation indices to ten oat samples grown in different fertilization conditions. Reference data were acquired by laboratory nitrogen concentration analysis. The performance of the indices in nitrogen estimation was determined by linear regression and leave-one-out cross-validation. The results indicate that the hyperspectral LiDAR instrument holds a good capability to estimate plant biochemical parameters such as nitrogen concentration. The instrument holds much potential in various environmental applications and provides a significant improvement to the remote sensing of the environment.
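
    The evaluation loop described above (index computation, linear regression, leave-one-out cross-validation) can be sketched as follows, with a generic normalized-difference index and ten synthetic samples standing in for the oat data:

```python
# Normalized-difference index -> linear regression -> LOOCV score.
# The two spectral bands and all sample values are synthetic assumptions.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(11)
n_samples = 10
band_nir = rng.uniform(0.4, 0.8, n_samples)      # reflectance, band 1 (assumed)
band_red = rng.uniform(0.05, 0.2, n_samples)     # reflectance, band 2 (assumed)
index = (band_nir - band_red) / (band_nir + band_red)
nitrogen = 1.0 + 3.0 * index + rng.normal(0, 0.1, n_samples)

X = index.reshape(-1, 1)
pred = cross_val_predict(LinearRegression(), X, nitrogen, cv=LeaveOneOut())
r2 = 1 - np.sum((nitrogen - pred) ** 2) / np.sum((nitrogen - nitrogen.mean()) ** 2)
print("LOOCV R^2:", round(r2, 2))
```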

  16. Development of Instrumentation for Direct Validation of Regional Carbon Flux Estimates

    Data.gov (United States)

    National Aeronautics and Space Administration — We are pursuing three tasks under internal research and development: 1) procure a state-of-the-art, commercial instrument for measuring atmospheric methane (CH4) in...

  17. Estimation of Subjective Mental Work Load Level with Heart Rate Variability by Tolerance to Driver's Mental Load

    Science.gov (United States)

    Yokoi, Toshiyuki; Itoh, Michimasa; Oguri, Koji

    Most traffic accidents are caused by an inappropriate mental state of the driver. Therefore, driver monitoring is one of the most important challenges in preventing traffic accidents. Some studies on evaluating the driver's mental state while driving have been reported; however, the driver's mental state will ultimately need to be estimated in real time. This paper proposes a way to quantitatively estimate the driver's mental workload using heart rate variability. It is assumed that tolerance to mental workload differs between individuals. Therefore, we classify people based on their individual tolerance to mental workload. Our estimation method is multiple linear regression analysis, and we compare it to the NASA-TLX, which is used as a subjective evaluation method for mental workload. As a result, the coefficient of correlation improved from 0.83 to 0.91, and the standard deviation of the error also improved. Our proposed method therefore demonstrates the possibility of estimating mental workload.
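
    A hypothetical sketch of the feature-to-workload mapping: standard HRV features (SDNN, RMSSD) are computed from RR intervals and regressed against a NASA-TLX-like score. The tolerance-based grouping of the paper is omitted, and all numbers are simulated.

```python
# HRV features (SDNN, RMSSD) from RR intervals, then multiple linear
# regression against a subjective workload score. Simulated data only.
import numpy as np

rng = np.random.default_rng(12)
n_subjects = 30
features, workload = [], []
for _ in range(n_subjects):
    stress = rng.uniform(0.0, 1.0)                       # latent workload level
    rr = rng.normal(0.8 - 0.2 * stress, 0.05 * (1.2 - stress), size=300)
    sdnn = rr.std(ddof=1)                                # overall variability
    rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))           # short-term variability
    features.append([sdnn, rmssd])
    workload.append(60.0 * stress + rng.normal(0, 5.0))  # NASA-TLX-like score

X = np.column_stack([np.ones(n_subjects), np.array(features)])
beta, *_ = np.linalg.lstsq(X, np.array(workload), rcond=None)
print("regression coefficients:", beta)
```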

  18. Spatial and temporal variability of rainfall and their effects on hydrological response in urban areas – a review

    Directory of Open Access Journals (Sweden)

    E. Cristiano

    2017-07-01

    In urban areas, hydrological processes are characterized by high variability in space and time, making them sensitive to small-scale temporal and spatial rainfall variability. In the last decades new instruments, techniques, and methods have been developed to capture rainfall and hydrological processes at high resolution. Weather radars have been introduced to estimate high spatial and temporal rainfall variability. At the same time, new models have been proposed to reproduce hydrological response, based on small-scale representation of urban catchment spatial variability. Despite these efforts, interactions between rainfall variability, catchment heterogeneity, and hydrological response remain poorly understood. This paper presents a review of our current understanding of hydrological processes in urban environments as reported in the literature, focusing on their spatial and temporal variability aspects. We review recent findings on the effects of rainfall variability on hydrological response and identify gaps where knowledge needs to be further developed to improve our understanding of and capability to predict urban hydrological response.

  19. Observers for vehicle tyre/road forces estimation: experimental validation

    Science.gov (United States)

    Doumiati, M.; Victorino, A.; Lechner, D.; Baffet, G.; Charara, A.

    2010-11-01

    The motion of a vehicle is governed by the forces generated between the tyres and the road. Knowledge of these vehicle dynamic variables is important for vehicle control systems that aim to enhance vehicle stability and passenger safety. This study introduces a new estimation process for tyre/road forces. It presents many benefits over the existing state-of-the-art works within the dynamic estimation framework. One of these major contributions consists of discussing in detail the vertical and lateral tyre forces at each tyre. The proposed method is based on the dynamic response of a vehicle instrumented with potentially integrated sensors. The estimation process is separated into two principal blocks. The role of the first block is to estimate vertical tyre forces, whereas in the second block two observers are proposed and compared for the estimation of lateral tyre/road forces. The different observers are based on a prediction/estimation Kalman filter. The performance of this concept is tested and compared with real experimental data using a laboratory car. Experimental results show that the proposed approach is a promising technique to provide accurate estimation. Thus, it can be considered as a practical low-cost solution for calculating vertical and lateral tyre/road forces.

  20. Rapid Estimation Method for State of Charge of Lithium-Ion Battery Based on Fractional Continual Variable Order Model

    Directory of Open Access Journals (Sweden)

    Xin Lu

    2018-03-01

    In recent years, the fractional-order model has been employed for state of charge (SOC) estimation. The non-integer differentiation order is expressed as a function of recursive factors defining the fractality of the charge distribution on porous electrodes. The battery SOC affects the fractal dimension of the charge distribution; therefore, the order of the fractional-order model varies with the SOC under otherwise identical conditions. This paper proposes a new method to estimate the SOC. A fractional continuous variable-order model is used to characterize the fractal morphology of the charge distribution. The order identification results showed that there is a stable monotonic relationship between the fractional order and the SOC once the battery's internal electrochemical reactions reach equilibrium. This feature makes the proposed model particularly suitable for SOC estimation when the battery is in the resting state. Moreover, a fast iterative method based on the proposed model is introduced for SOC estimation. The experimental results showed that the proposed iterative method can quickly estimate the SOC within several iterations while maintaining high estimation accuracy.

  1. Instrumentation for the follow-up of severe accidents

    International Nuclear Information System (INIS)

    Munoz Sanchez, A.; Nino Perote, R.

    2000-01-01

    During severe accidents, it is foreseeable that the instrumentation installed in a plant is subjected to conditions which are more hostile than those for which the instrumentation was designed and qualified. Moreover, new, specific instrumentation is required to monitor variables which have not been considered until now, and to control systems which lessen the consequences of severe accidents. Both existing instrumentation used to monitor critical functions in design basis accident conditions and additional instrumentation which provides the information necessary to control and mitigate the consequences of severe accidents, have to be designed to withstand such conditions, especially in terms of measurements range, functional characteristics and qualification to withstand pressure and temperature loads resulting from steam explosion, hydrogen combustion/explosion and high levels of radiation over long periods of time. (Author)

  2. A model for estimating pathogen variability in shellfish and predicting minimum depuration times.

    Science.gov (United States)

    McMenemy, Paul; Kleczkowski, Adam; Lees, David N; Lowther, James; Taylor, Nick

    2018-01-01

    Norovirus is a major cause of viral gastroenteritis, with shellfish consumption being identified as one potential norovirus entry point into the human population. Minimising shellfish norovirus levels is therefore important for both the consumer's protection and the shellfish industry's reputation. One method used to reduce microbiological risks in shellfish is depuration; however, this process also presents additional costs to industry. Providing a mechanism to estimate norovirus levels during depuration would therefore be useful to stakeholders. This paper presents a mathematical model of the depuration process and its impact on norovirus levels found in shellfish. Two fundamental stages of norovirus depuration are considered: (i) the initial distribution of norovirus loads within a shellfish population and (ii) the way in which the initial norovirus loads evolve during depuration. Realistic assumptions are made about the dynamics of norovirus during depuration, and mathematical descriptions of both stages are derived and combined into a single model. Parameters to describe the depuration effect and norovirus load values are derived from existing norovirus data obtained from U.K. harvest sites. However, obtaining population estimates of norovirus variability is time-consuming and expensive; this model addresses the issue by assuming a 'worst case scenario' for variability of pathogens, which is independent of mean pathogen levels. The model is then used to predict minimum depuration times required to achieve norovirus levels which fall within possible risk management levels, as well as predictions of minimum depuration times for other water-borne pathogens found in shellfish. Times for Escherichia coli predicted by the model all fall within the minimum 42 hours required for class B harvest sites, whereas minimum depuration times for norovirus and FRNA+ bacteriophage are substantially longer. Thus this study provides relevant information and tools to assist
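
    The minimum-time calculation at the heart of such a model reduces, under an assumed first-order decay during depuration, to solving N(t) = N0·exp(−kt) for the time at which a worst-case (high-percentile) initial load reaches a management limit. All parameters below are placeholders, not the paper's estimates.

```python
# Minimum depuration time from first-order decay and a lognormal initial
# load: rate constant, limit, and load distribution are assumed values.
import numpy as np

k = 0.05                  # assumed first-order decay rate, per hour
limit = 200.0             # assumed management limit (genome copies / g)
# "Worst case" initial load: 95th percentile of a lognormal population
n0_95 = np.exp(np.log(1000.0) + 1.645 * 1.0)   # median 1000, sigma_log = 1

t_min = np.log(n0_95 / limit) / k              # from N(t) = N0 * exp(-k t)
print(f"minimum depuration time: {t_min:.0f} h")
```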

  3. A thermal control system for long-term survival of scientific instruments on lunar surface

    Energy Technology Data Exchange (ETDEWEB)

    Ogawa, K., E-mail: ogawa@astrobio.k.u-tokyo.ac.jp [Department of Complexity Science and Engineering, The University of Tokyo, 5-1-5 Kashiwanoha, Kashiwa, Chiba (Japan); Iijima, Y.; Tanaka, S. [Institute of Space and Astronautical Science, Japan Aerospace Exploration Agency, 3-1-1 Yoshinodai, Chuo, Sagamihara, Kanagawa (Japan); Sakatani, N. [The Graduate University for Advanced Studies, Shonan Village, Hayama, Kanagawa (Japan); Otake, H. [JAXA Space Exploration Center, Japan Aerospace Exploration Agency, 3-1-1 Yoshinodai, Chuo, Sagamihara, Kanagawa (Japan)

    2014-03-15

    A thermal control system is being developed for scientific instruments placed on the lunar surface. This thermal control system, Lunar Mission Survival Module (MSM), was designed for scientific instruments that are planned to be operated for over a year in the future Japanese lunar landing mission SELENE-2. For the long-term operations, the lunar surface is a severe environment because the soil (regolith) temperature varies widely from nighttime −200 degC to daytime 100 degC approximately in which space electronics can hardly survive. The MSM has a tent of multi-layered insulators and performs a “regolith mound”. Temperature of internal devices is less variable just like in the lunar underground layers. The insulators retain heat in the regolith soil in the daylight, and it can keep the device warm in the night. We conducted the concept design of the lunar survival module, and estimated its potential by a thermal mathematical model on the assumption of using a lunar seismometer designed for SELENE-2. Thermal vacuum tests were also conducted by using a thermal evaluation model in order to estimate the validity of some thermal parameters assumed in the computed thermal model. The numerical and experimental results indicated a sufficient survivability potential of the concept of our thermal control system.

  6. Distributions of component failure rates estimated from LER data

    International Nuclear Information System (INIS)

    Atwood, C.L.

    1985-01-01

    Past analyses of Licensee Event Report (LER) data have noted that component failure rates vary from plant to plant, and have estimated the distributions by two-parameter gamma distributions. In this study, a more complicated distributional form is considered, a mixture of gammas. This could arise if the plants' failure rates cluster into distinct groups. The method was applied to selected published LER data for diesel generators, pumps, valves, and instrumentation and control assemblies. The improved fits from using a mixture rather than a single gamma distribution were minimal, and not statistically significant. There seem to be two possibilities: either explanatory variables affect the failure rates only in a gradual way, not a qualitative way; or, for estimating individual component failure rates, the published LER data have been analyzed to the limit of resolution. 9 refs
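
    A sketch of the comparison described above: fit a single gamma and a two-component gamma mixture to plant-level failure rates by maximum likelihood, then compare the fits, here via AIC. The data, starting values and optimizer choice are synthetic and illustrative; the study's actual estimation procedure is not given in this record:

```python
import numpy as np
from scipy import stats, optimize

# Synthetic plant-level failure rates, for illustration only.
rng = np.random.default_rng(0)
rates = np.concatenate([rng.gamma(2.0, 0.5e-3, 40), rng.gamma(8.0, 1e-3, 20)])

def nll_single(p):
    a, scale = np.exp(p)                     # log-parameterised for positivity
    return -stats.gamma.logpdf(rates, a, scale=scale).sum()

def nll_mix(p):
    a1, s1, a2, s2 = np.exp(p[:4])
    w = 1.0 / (1.0 + np.exp(-p[4]))          # mixing weight via logistic
    pdf = (w * stats.gamma.pdf(rates, a1, scale=s1)
           + (1 - w) * stats.gamma.pdf(rates, a2, scale=s2))
    return -np.log(pdf).sum()

fit1 = optimize.minimize(nll_single, np.log([1.0, 1e-3]), method="Nelder-Mead")
fitm = optimize.minimize(nll_mix, np.r_[np.log([1.0, 5e-4, 5.0, 1e-3]), 0.0],
                         method="Nelder-Mead", options={"maxiter": 5000})
# AIC penalises the mixture's three extra parameters, mirroring the record's
# finding that the improvement from a mixture may not be worth it.
print("AIC single:", 2 * fit1.fun + 2 * 2, " AIC mixture:", 2 * fitm.fun + 2 * 5)
```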

  7. Microdiamond grade as a regionalised variable - some basic requirements for successful local microdiamond resource estimation of kimberlites

    Science.gov (United States)

    Stiefenhofer, Johann; Thurston, Malcolm L.; Bush, David E.

    2018-04-01

    Microdiamonds offer several advantages as a resource estimation tool, such as access to deeper parts of a deposit which may be beyond the reach of large diameter drilling (LDD) techniques, the recovery of the total diamond content in the kimberlite, and a cost benefit due to the cheaper treatment cost compared to large diameter samples. In this paper we take the first step towards local estimation by showing that microdiamond samples can be treated as a regionalised variable suitable for use in geostatistical applications, and we show examples of such output. Examples of microdiamond variograms are presented, the variance-support relationship for microdiamonds is demonstrated, and consistency of the diamond size frequency distribution (SFD) is shown with the aid of real datasets. The focus therefore is on why local microdiamond estimation should be possible, not how to generate such estimates. Data from our case studies and examples demonstrate a positive correlation between micro- and macrodiamond sample grades as well as block estimates. This relationship can be demonstrated repeatedly across multiple mining operations. The smaller sample support size for microdiamond samples is a key difference between micro- and macrodiamond estimates, and this aspect must be taken into account during the estimation process. We discuss three methods which can be used to validate or reconcile the estimates against macrodiamond data, either as estimates or in the form of production grades: (i) reconciliation using production data, (ii) comparing LDD-based grade estimates against microdiamond-based estimates and (iii) using simulation techniques.
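
    Treating sample grade as a regionalised variable means summarising its spatial continuity with a variogram. A minimal empirical (Matheron) semivariogram sketch, independent of any particular deposit; in practice a geostatistics library would be used:

```python
import numpy as np

def empirical_variogram(coords, values, bin_edges):
    """Classical (Matheron) semivariogram estimate:
    gamma(h) = 1 / (2 * N(h)) * sum over pairs at lag h of (z_i - z_j)^2,
    with pairs grouped into distance-lag bins.
    `coords` is an (n, 2) array of sample locations, `values` is (n,).
    """
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    iu = np.triu_indices(len(values), k=1)          # each pair counted once
    lags, sq = d[iu], (values[:, None] - values[None, :])[iu] ** 2
    gamma = np.full(len(bin_edges) - 1, np.nan)
    for k in range(len(bin_edges) - 1):
        m = (lags >= bin_edges[k]) & (lags < bin_edges[k + 1])
        if m.any():
            gamma[k] = sq[m].mean() / 2.0
    return gamma
```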

  8. The Impact of Clinical and Cognitive Variables on Social Functioning in Parkinson's Disease: Patient versus Examiner Estimates

    Directory of Open Access Journals (Sweden)

    Patrick McNamara

    2010-01-01

    Results. Patients' estimates of their own social functioning did not differ significantly from examiners' estimates. Analysis of the impact of clinical variables revealed depression to be the strongest correlate of social functioning in PD on both the patient and the examiner versions of the Social Adaptation Self-Evaluation Scale. Conclusions. PD patients appear to be well aware of their social strengths and weaknesses. Depression and motor symptom severity are significant predictors of both self- and examiner-reported social functioning in patients with PD. Assessment and treatment of depression in patients with PD may improve social functioning and overall quality of life.

  9. Artificial Intelligence Estimation of Carotid-Femoral Pulse Wave Velocity using Carotid Waveform.

    Science.gov (United States)

    Tavallali, Peyman; Razavi, Marianne; Pahlevan, Niema M

    2018-01-17

    In this article, we offer an artificial intelligence method to estimate the carotid-femoral Pulse Wave Velocity (PWV) non-invasively from one uncalibrated carotid waveform measured by tonometry and a few routine clinical variables. Since the signal-processing inputs to this machine learning algorithm are sensor agnostic, the presented method can accompany any medical instrument that provides a calibrated or uncalibrated carotid pressure waveform. Our results show that, for a held-out test population in the age range of 20 to 69, our model can estimate PWV with a Root-Mean-Square Error (RMSE) of 1.12 m/sec compared to the reference method. These results indicate that the model is a reliable surrogate for PWV. Our study also showed that the estimated PWV was significantly associated with an increased risk of cardiovascular diseases (CVDs).

  10. A Comparison of seismic instrument noise coherence analysis techniques

    Science.gov (United States)

    Ringler, A.T.; Hutt, C.R.; Evans, J.R.; Sandoval, L.D.

    2011-01-01

    The self-noise of a seismic instrument is a fundamental characteristic used to evaluate the quality of the instrument. It is important to be able to measure this self-noise robustly, to understand how differences among test configurations affect the tests, and to understand how different processing techniques and isolation methods (from nonseismic sources) can contribute to differences in results. We compare two popular coherence methods used for calculating incoherent noise, which is widely used as an estimate of instrument self-noise (incoherent noise and self-noise are not strictly identical but in observatory practice are approximately equivalent; Holcomb, 1989; Sleeman et al., 2006). Beyond directly comparing these two coherence methods on similar models of seismometers, we compare how small changes in test conditions can contribute to incoherent-noise estimates. These conditions include timing errors, signal-to-noise ratio changes (ratios between background noise and instrument incoherent noise), relative sensor locations, misalignment errors, processing techniques, and different configurations of sensor types.
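
    For the two-sensor case, the coherence approach estimates self-noise as the portion of one sensor's spectrum that is incoherent with a co-located reference recording the same ground motion. A sketch in the spirit of Holcomb (1989); the function name and parameters are illustrative, and a real test would also need the alignment and response corrections the paper discusses:

```python
import numpy as np
from scipy import signal

def incoherent_noise(x1, x2, fs, nperseg=4096):
    """Two-sensor incoherent-noise estimate:
    N11(f) = P11(f) * (1 - gamma^2(f)),
    where P11 is sensor 1's power spectrum and gamma^2 is the magnitude-
    squared coherence with co-located sensor 2. The incoherent part is used
    as a proxy for sensor 1's self-noise.
    """
    f, p11 = signal.welch(x1, fs, nperseg=nperseg)
    _, coh = signal.coherence(x1, x2, fs, nperseg=nperseg)
    return f, p11 * (1.0 - coh)
```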

  11. A Gaussian IV estimator of cointegrating relations

    DEFF Research Database (Denmark)

    Bårdsen, Gunnar; Haldrup, Niels

    2006-01-01

    In static single-equation cointegration regression models the OLS estimator will have a non-standard distribution unless regressors are strictly exogenous. In the literature a number of estimators have been suggested to deal with this problem, especially by the use of semi-nonparametric estimators. An IV estimator is suggested here, using instruments that are almost ideal in cointegrating regressions, and simulations show that the IV estimator using such instruments alleviates the endogeneity problem extremely well in both finite and large samples.
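
    For concreteness, the generic IV/2SLS estimator that such proposals build on can be written in a few lines; this is the textbook estimator, not Bårdsen and Haldrup's specific almost-ideal-instrument construction:

```python
import numpy as np

def two_stage_least_squares(y, X, Z):
    """Generic 2SLS/IV estimator:
    beta = (X' Pz X)^{-1} X' Pz y, with Pz = Z (Z'Z)^{-1} Z'.
    y is (n,), X is (n, k) endogenous regressors, Z is (n, m) instruments
    with m >= k.
    """
    Pz_X = Z @ np.linalg.solve(Z.T @ Z, Z.T @ X)   # first stage: project X on Z
    return np.linalg.solve(Pz_X.T @ X, Pz_X.T @ y) # second stage
```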

  12. Minimally invasive instrumentation without fusion during posterior thoracic corpectomies: a comparison of percutaneously instrumented nonfused segments with open instrumented fused segments.

    Science.gov (United States)

    Lau, Darryl; Chou, Dean

    2017-07-01

    OBJECTIVE During the mini-open posterior corpectomy, percutaneous instrumentation without fusion is performed above and below the corpectomy level. In this study, the authors' goal was to compare the perioperative and long-term implant failure rates of patients who underwent nonfused percutaneous instrumentation with those of patients who underwent traditional open instrumented fusion. METHODS Adult patients who underwent posterior thoracic corpectomies with cage reconstruction between 2009 and 2014 were identified. Patients who underwent mini-open corpectomy had percutaneous instrumentation without fusion, and patients who underwent open corpectomy had instrumented fusion above and below the corpectomy site. The authors compared perioperative outcomes and rates of implant failure requiring reoperation between the open (fused) and mini-open (unfused) groups. RESULTS A total of 75 patients were identified, and 53 patients (32 open and 21 mini-open) were available for follow-up. The mean patient age was 52.8 years, and 56.6% of patients were male. There were no significant differences in baseline variables between the 2 groups. The overall perioperative complication rate was 15.1%, with no significant difference between the open and mini-open groups (18.8% vs 9.5%; p = 0.359). The mean hospital stay was 10.5 days; the open group required a significantly longer stay than the mini-open group (12.8 vs 7.1 days). Implant failure rates did not differ significantly between the open and mini-open groups at 6 months (3.1% vs 0.0%, p = 0.413), 1 year (10.7% vs 6.2%, p = 0.620), or 2 years (18.2% vs 8.3%, p = 0.438). The overall mean follow-up was 29.2 months. CONCLUSIONS These findings suggest that percutaneous instrumentation without fusion in mini-open transpedicular corpectomies offers implant failure and reoperation rates similar to those of open instrumented fusion as far out as 2 years of follow-up.

  13. Importance of Intrinsic and Instrumental Value of Education in Pakistan

    Science.gov (United States)

    Kumar, Mahendar

    2017-01-01

    Normally, the effectiveness of any object or thing is judged by two values: intrinsic and instrumental. To compare the intrinsic value of education with its instrumental value, this study used the following variables: getting knowledge for its own sake, getting knowledge for social status, getting knowledge for job or business endeavor, and getting…

  14. Global Ocean Evaporation: How Well Can We Estimate Interannual to Decadal Variability?

    Science.gov (United States)

    Robertson, Franklin R.; Bosilovich, Michael G.; Roberts, Jason B.; Wang, Hailan

    2015-01-01

    Evaporation from the world's oceans constitutes the largest component of the global water balance. It is important not only as the ultimate source of the moisture that is tied to the radiative processes determining Earth's energy balance, but also to freshwater availability over land, governing the habitability of the planet. Here we focus on the variability of ocean evaporation on scales from interannual to decadal by appealing to three sources of data: the new MERRA-2 (Modern-Era Retrospective analysis for Research and Applications-2); climate models run with historical sea-surface temperatures, ice and atmospheric constituents (so-called AMIP experiments); and state-of-the-art satellite retrievals from the SeaFlux and HOAPS (Hamburg Ocean-Atmosphere Parameters and Fluxes from Satellite) projects. Each of these sources has distinct advantages as well as drawbacks. MERRA-2, like other reanalyses, synthesizes evaporation estimates consistent with observationally constrained physical and dynamical models, but data stream discontinuities are a major problem for interpreting multi-decadal records. The climate models used in data assimilation can also be run with fewer constraints, such as only SSTs and sea ice (i.e., AMIP runs), or with minimal additional observations of surface pressure and marine reports, which have longer and less fragmentary records; we use the new ERA-20C reanalysis produced by ECMWF, embodying the latter methodology. Still, the physics biases of climate models and the lack of a predicted surface energy balance are of concern. Satellite retrievals and comparisons to ship-based measurements offer the most observationally based estimates, but sensor inter-calibration, algorithm retrieval assumptions, and short records are dominant issues. Our strategy depends on maximizing the advantages of these combined records. The primary diagnostic tool used here is an analysis of the bulk aerodynamic computations produced by these sources.
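
    The bulk aerodynamic computation mentioned above follows E = ρₐ · C_E · U · (q_s − q_a). A toy evaluation with illustrative values; exchange coefficients and saturation-humidity formulations differ between products and are themselves a source of spread:

```python
# Bulk aerodynamic estimate of ocean evaporation (illustrative values only).
rho_a = 1.2          # air density [kg m-3]
C_E = 1.2e-3         # turbulent exchange coefficient for moisture [-]
U = 8.0              # 10-m wind speed [m s-1]
q_s = 0.020          # saturation specific humidity at SST [kg/kg]
q_a = 0.014          # near-surface specific humidity [kg/kg]

E = rho_a * C_E * U * (q_s - q_a)            # evaporation [kg m-2 s-1]
print(E * 86400.0, "mm/day of evaporation")  # 1 kg m-2 = 1 mm of water
```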

  15. RECOVERY OF LARGE ANGULAR SCALE CMB POLARIZATION FOR INSTRUMENTS EMPLOYING VARIABLE-DELAY POLARIZATION MODULATORS

    Energy Technology Data Exchange (ETDEWEB)

    Miller, N. J.; Marriage, T. A.; Appel, J. W.; Bennett, C. L.; Eimer, J.; Essinger-Hileman, T.; Harrington, K.; Rostem, K.; Watts, D. J. [Department of Physics and Astronomy, Johns Hopkins University, 3400 N. Charles St., Baltimore, MD 21218 (United States); Chuss, D. T. [Department of Physics, Villanova University, 800 E Lancaster, Villanova, PA 19085 (United States); Wollack, E. J.; Fixsen, D. J.; Moseley, S. H.; Switzer, E. R., E-mail: Nathan.J.Miller@nasa.gov [Observational Cosmology Laboratory, Code 665, NASA Goddard Space Flight Center, Greenbelt, MD 20771 (United States)

    2016-02-20

    Variable-delay Polarization Modulators (VPMs) are currently being implemented in experiments designed to measure the polarization of the cosmic microwave background on large angular scales because of their capability for providing rapid, front-end polarization modulation and control over systematic errors. Despite the advantages provided by the VPM, it is important to identify and mitigate any time-varying effects that leak into the synchronously modulated component of the signal. In this paper, the effect of emission from a 300 K VPM on the system performance is considered and addressed. Though instrument design can greatly reduce the influence of modulated VPM emission, some residual modulated signal is expected. VPM emission is treated in the presence of rotational misalignments and temperature variation. Simulations of time-ordered data are used to evaluate the effect of these residual errors on the power spectrum. The analysis and modeling in this paper guides experimentalists on the critical aspects of observations using VPMs as front-end modulators. By implementing the characterizations and controls as described, front-end VPM modulation can be very powerful for mitigating 1/f noise in large angular scale polarimetric surveys. None of the systematic errors studied fundamentally limit the detection and characterization of B-modes on large scales for a tensor-to-scalar ratio of r = 0.01. Indeed, r < 0.01 is achievable with commensurately improved characterizations and controls.

  16. Instrument to determine prestress remaining in a damaged bridge girder

    Science.gov (United States)

    Civjan, Scott A.; Jirsa, James O.; Carrasquillo, Ramon L.; Fowler, David W.

    1998-03-01

    An instrument has been developed to estimate stress levels in prestress strands in existing members. The prototype instrument applies a lateral load to an exposed prestressing strand and measures the resulting displacements. The instrument was calibrated for 0.5-inch (12.7 mm) diameter seven-wire strand with exposed lengths of 1.5 feet (0.46 m) to 3.75 feet (1.14 m). It was tested to determine its accuracy, precision, and usefulness in the field. Strand forces were consistently estimated to within ten percent of the actual load. The device was also utilized in the placement of strand splices and was found to be more reliable in checking induced strand tensions than the standard torque wrench method.
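
    The principle behind such an instrument is the taut-wire relation between midspan lateral force, deflection, and axial tension. A sketch of the idealised small-deflection formula; the real device was calibrated empirically, and seven-wire strand also has bending stiffness that the string model ignores:

```python
def strand_tension_estimate(lateral_force, exposed_length, deflection):
    """Taut-wire approximation: a midspan lateral load F on a strand under
    tension T with exposed length L produces deflection delta satisfying
    F = 4 * T * delta / L (small deflections), so T = F * L / (4 * delta).
    Idealised principle only, not the instrument's calibration.
    """
    return lateral_force * exposed_length / (4.0 * deflection)

# Example: 1 kN lateral load, 1.0 m exposed strand, 2 mm deflection.
print(strand_tension_estimate(1000.0, 1.0, 0.002), "N")  # 125 kN
```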

  17. Confirming theoretical pay constructs of a variable pay scheme

    Directory of Open Access Journals (Sweden)

    Sibangilizwe Ncube

    2013-05-01

    Orientation: Return on the investment in variable pay programmes remains controversial because their cost versus contribution cannot be empirically justified. Research purpose: This study validates the findings of the model developed by De Swardt on the factors related to successful variable pay programmes. Motivation for the study: Many organisations blindly implement variable pay programmes without any means to assess the impact these programmes have on the company’s performance. This study was necessary to validate the findings of an existing instrument that validates the contribution of variable pay schemes. Research design, approach and method: The study was conducted using quantitative research. A total of 300 completed questionnaires from a non-purposive sample of 3000 participants in schemes across all South African industries were returned and analysed. Main findings: Using exploratory and confirmatory factor analysis, it was found that the validation instrument developed by De Swardt is still largely valid in evaluating variable pay schemes. The differences between the study and the model were reported. Practical/managerial implications: The study confirmed the robustness of an existing model that enables practitioners to empirically validate the use of variable pay plans. This model assists in the design and implementation of variable pay programmes that meet critical success factors. Contribution/value-add: The study contributed to the development of a measurement instrument that will assess whether a variable pay plan contributes to an organisation’s success.

  18. Remote and Virtual Instrumentation Platform for Distance Learning

    Directory of Open Access Journals (Sweden)

    Tom Eppes

    2010-08-01

    This article presents distance learning using the National Instruments ELVIS II and shows how Multisim can be combined with ELVIS II for distance learning. National Instruments' ELVIS II is a new version that can easily be used for e-learning. It features 12 of the instruments commonly used in engineering and science laboratories, including an oscilloscope, a function generator, a variable power supply, and an isolated digital multimeter, in a low-cost and easy-to-use platform, and it completes integration with Multisim software for SPICE simulation, which simplifies the teaching of circuit design. As NI ELVIS II is based on LabVIEW, designers can easily customize the 12 instruments or can create their own using the provided source code for the instruments.

  19. Positive random variables with a discrete probability mass at the origin: Parameter estimation for left-censored samples with application to air quality monitoring data

    International Nuclear Information System (INIS)

    Gogolak, C.V.

    1986-11-01

    The concentration of a contaminant measured in a particular medium might be distributed as a positive random variable when it is present, but it may not always be present. If there is a level below which the concentration cannot be distinguished from zero by the analytical apparatus, a sample from such a population will be censored on the left. The presence of both zeros and positive values in the censored portion of such samples complicates the problem of estimating the parameters of the underlying positive random variable and the probability of a zero observation. Using the method of maximum likelihood, it is shown that the solution to this estimation problem reduces largely to that of estimating the parameters of the distribution truncated at the point of censorship. The maximum likelihood estimate of the proportion of zero values follows directly. The derivation of the maximum likelihood estimates for a lognormal population with zeros is given in detail, and the asymptotic properties of the estimates are examined. The estimation method was used to fit several different distributions to a set of severely censored 85Kr monitoring data from six locations at the Savannah River Plant chemical separations facilities.
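
    A sketch of the estimation problem described above, for a lognormal with a discrete mass at zero and left-censoring at a detection limit: a censored observation is either a true zero or a lognormal value below the limit, so its likelihood is p + (1 − p) · Φ((ln L − μ)/σ), while detected values contribute (1 − p) times the lognormal density. The parameterisation and optimizer choice here are illustrative:

```python
import numpy as np
from scipy import stats, optimize

def fit_delta_lognormal(detects, n_censored, limit):
    """ML fit of a zero-inflated lognormal under left-censoring at `limit`.
    `detects` is an array of measured (above-limit) values; `n_censored`
    is the count of observations reported only as below the limit.
    Returns (mu, sigma, p) with p the probability of a true zero.
    """
    logx = np.log(np.asarray(detects, dtype=float))

    def nll(theta):
        mu, log_sigma, logit_p = theta
        sigma = np.exp(log_sigma)
        p = 1.0 / (1.0 + np.exp(-logit_p))
        cens_prob = p + (1 - p) * stats.norm.cdf((np.log(limit) - mu) / sigma)
        # Lognormal log-density of x is normal logpdf of ln x, minus ln x.
        ll = (n_censored * np.log(cens_prob)
              + np.sum(np.log1p(-p) + stats.norm.logpdf(logx, mu, sigma) - logx))
        return -ll

    start = np.array([logx.mean(), np.log(logx.std() + 1e-6), 0.0])
    res = optimize.minimize(nll, start, method="Nelder-Mead")
    mu, log_sigma, logit_p = res.x
    return mu, np.exp(log_sigma), 1.0 / (1.0 + np.exp(-logit_p))
```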

  20. Template-Directed Instrumentation Reduces Cost and Improves Efficiency for Total Knee Arthroplasty: An Economic Decision Analysis and Pilot Study.

    Science.gov (United States)

    McLawhorn, Alexander S; Carroll, Kaitlin M; Blevins, Jason L; DeNegre, Scott T; Mayman, David J; Jerabek, Seth A

    2015-10-01

    Template-directed instrumentation (TDI) for total knee arthroplasty (TKA) may streamline operating room (OR) workflow and reduce costs by preselecting implants and minimizing instrument tray burden. A decision model simulated the economics of TDI. Sensitivity analyses determined thresholds for model variables to ensure TDI success. A clinical pilot was reviewed. The accuracy of preoperative templates was validated, and 20 consecutive primary TKAs were performed using TDI. The model determined that preoperative component size estimation should be accurate to ±1 implant size for 50% of TKAs to implement TDI. The pilot showed that preoperative template accuracy exceeded 97%. There were statistically significant improvements in OR turnover time and in-room time for TDI compared to an historical cohort of TKAs. TDI reduces costs and improves OR efficiency. Copyright © 2015 Elsevier Inc. All rights reserved.

  1. Methodology for estimation of secondary meteorological variables to be used in local dispersion of air pollutants

    International Nuclear Information System (INIS)

    Turtos, L.; Sanchez, M.; Roque, A.; Soltura, R.

    2003-01-01

    This paper covers the main work carried out within the framework of the project 'Atmospheric environmental externalities of electricity generation in Cuba', which aimed to develop methodologies and corresponding software to improve the quality of the secondary meteorological data used in atmospheric pollutant dispersion calculations; specifically, the wind profile coefficients, urban and rural mixing heights, and temperature gradients.

  2. The Dynamics, Ecological Variability and Estimated Carbon Stocks of Mangroves in Mahajamba Bay, Madagascar

    Directory of Open Access Journals (Sweden)

    Trevor G. Jones

    2015-08-01

    Full Text Available Mangroves are found throughout the tropics, providing critical ecosystem goods and services to coastal communities and supporting rich biodiversity. Globally, mangroves are being rapidly degraded and deforested at rates exceeding loss in many tropical inland forests. Madagascar contains around 2% of the global distribution, >20% of which has been deforested since 1990, primarily from over-harvest for forest products and conversion for agriculture and aquaculture. While historically not prominent, mangrove loss in Madagascar’s Mahajamba Bay is increasing. Here, we focus on Mahajamba Bay, presenting long-term dynamics calculated using United States Geological Survey (USGS national-level mangrove maps contextualized with socio-economic research and ground observations, and the results of contemporary (circa 2011 mapping of dominant mangrove types. The analysis of the USGS data indicated 1050 hectares (3.8% lost from 2000 to 2010, which socio-economic research suggests is increasingly driven by commercial timber extraction. Contemporary mapping results permitted stratified sampling based on spectrally distinct and ecologically meaningful mangrove types, allowing for the first-ever vegetation carbon stock estimates for Mahajamba Bay. The overall mean carbon stock across all mangrove classes was estimated to be 100.97 ± 10.49 Mg C ha−1. High stature closed-canopy mangroves had the highest average carbon stock estimate (i.e., 166.82 ± 15.28 Mg C ha−1. These estimates are comparable to other published values in Madagascar and elsewhere in the Western Indian Ocean and demonstrate the ecological variability of Mahajamba Bay’s mangroves and their value towards climate change mitigation.

  3. Optimal estimation of spatially variable recharge and transmissivity fields under steady-state groundwater flow. Part 2. Case study

    Science.gov (United States)

    Graham, Wendy D.; Neff, Christina R.

    1994-05-01

    The first-order analytical solution of the inverse problem for estimating spatially variable recharge and transmissivity under steady-state groundwater flow, developed in Part 1, is applied to the Upper Floridan Aquifer in NE Florida. Parameters characterizing the statistical structure of the log-transmissivity and head fields are estimated from 152 measurements of transmissivity and 146 measurements of hydraulic head available in the study region. Optimal estimates of the recharge, transmissivity and head fields are produced throughout the study region by conditioning on the nearest 10 available transmissivity measurements and the nearest 10 available head measurements. Head observations are shown to provide valuable information for estimating both the transmissivity and the recharge fields. Accurate numerical groundwater model predictions of the aquifer flow system are obtained using the optimal transmissivity and recharge fields as input parameters, and the optimal head field to define boundary conditions. For this case study, both the transmissivity field and the uncertainty of the transmissivity field prediction are poorly estimated when the effects of random recharge are neglected.

  4. Age-Related Changes in Bimanual Instrument Playing with Rhythmic Cueing

    Directory of Open Access Journals (Sweden)

    Soo Ji Kim

    2017-09-01

    Deficits in bimanual coordination of older adults have been demonstrated to significantly limit their functioning in daily life. As a bimanual sensorimotor task, instrument playing has great potential for motor and cognitive training in advanced age. While the process of matching a person’s repetitive movements to auditory rhythmic cueing during instrument playing has been documented to involve motor and attentional control, investigation into whether the level of cognitive functioning influences the ability to rhythmically coordinate movement to an external beat in older populations is relatively limited. Therefore, the current study aimed to examine how timing accuracy during bimanual instrument playing with rhythmic cueing differed depending on the degree of participants’ cognitive aging. Twenty-one young adults, 20 healthy older adults, and 17 older adults with mild dementia participated in this study. Each participant tapped an electronic drum in time to the rhythmic cueing provided, using both hands simultaneously and in alternation. During bimanual instrument playing with rhythmic cueing, the mean and variability of synchronization errors were measured and compared across the groups and across the tempi of cueing during each type of tapping task. Correlations of these timing parameters with cognitive measures were also analyzed. The results showed that the group factor resulted in significant differences in the synchronization-error parameters. During bimanual tapping tasks, cognitive decline resulted in differences in synchronization errors between younger adults and older adults with mild dementia. Also, in terms of variability of synchronization errors, younger adults showed significant differences in maintaining timing performance from older adults with and without mild dementia, which may be attributed to decreased processing time for bimanual coordination due to aging. Significant correlations were observed between the variability of synchronization errors and cognitive measures.

  5. Determinants of The Application of Macro Prudential Instruments

    Directory of Open Access Journals (Sweden)

    Zakaria Firano

    2017-09-01

    The use of macro prudential instruments today gives rise to a major debate within the walls of central banks and other authorities in charge of financial stability. Contrary to micro prudential instruments, whose effects remain limited, macro prudential instruments are different in nature and can affect the stability of the financial system. By influencing the financial cycle and the financial structure of financial institutions, the use of such instruments should be conducted with great vigilance as well as macroeconomic and financial expertise. But the experience of central banks in this area is sketchy, and only some emerging countries have experience using these types of instruments in different ways. This paper presents an analysis of the instruments of macro prudential policy and attempts to demonstrate empirically that these instruments should be used only in specific economic and financial situations. Indeed, the results obtained, using bivariate panel modeling, confirm that these instruments are more effective when used to mitigate the euphoria of financial and economic cycles. In this sense, the output gap, describing the economic cycle, and the Z-score are the intermediate variables for the activation of capital instruments. Moreover, the liquidity ratio and changes in bank profitability are the two early warning indicators for the activation of liquidity instruments.

  6. Another look at the Grubbs estimators

    KAUST Repository

    Lombard, F.; Potgieter, C.J.

    2012-01-01

    …of the estimate is to be within reasonable bounds and if negative precision estimates are to be avoided. We show that the two-instrument Grubbs estimator can be improved considerably if fairly reliable preliminary information regarding the ratio of sampling unit…

  7. Uncertainty and variability in updated estimates of potential dose and risk at a US Nuclear Test Site - Bikini Atoll

    International Nuclear Information System (INIS)

    Bogen, K.T.; Conrado, C.L.; Robison, W.L.

    1997-01-01

    Uncertainty and interindividual variability were assessed in estimated doses for a rehabilitation scenario for Bikini Island at Bikini Atoll, in which the top 40 cm of soil would be removed in the housing and village area, and the rest of the island would be treated with potassium fertilizer, prior to an assumed resettlement date of 1999. Doses were estimated for ingested 137Cs and 90Sr, external gamma exposure, and inhalation + ingestion of 241Am + 239+240Pu. Two dietary scenarios were considered: imported foods available (IA); and imported foods unavailable (IUA), with only local foods consumed. After ∼5 y of Bikini residence under either IA or IUA assumptions, upper and lower 95% confidence limits on interindividual variability in calculated dose were estimated to lie within a ∼threefold factor of its population-average value; upper and lower 95% confidence limits on uncertainty in calculated dose were estimated to lie within a ∼twofold factor of its expected value. For reference, the expected values of population-average dose at age 70 y were estimated to be 16 and 52 mSv under IA and IUA dietary assumptions, respectively. Assuming that 200 Bikini resettlers would be exposed to local foods (under both IA and IUA assumptions), the maximum 1-y dose received by any Bikini resident is most likely to be approximately 2 and 8 mSv under the IA and IUA assumptions, respectively. Under the most likely dietary scenario, involving access to imported foods, this analysis indicates that it is most likely that no additional cancer fatalities (above those normally expected) would arise from the increased radiation exposures considered. 33 refs., 4 figs., 4 tabs

  8. Impact of instrument response variations on health physics measurements

    International Nuclear Information System (INIS)

    Armantrout, G.A.

    1984-10-01

    Uncertainties in estimating the potential health impact of a given radiation exposure include instrument measurement error in determining exposure and difficulty in relating this exposure to an effective dose value. Instrument error can be due to design or manufacturing deficiencies, limitations of the sensing element used, and calibration and maintenance of the instrument. This paper evaluates the errors which can be introduced by design deficiencies and limitations of the sensing element for a wide variety of commonly used survey instruments. The results indicate little difference among sensing element choice for general survey work, with variations among specific instrument designs being the major factor. Ion chamber instruments tend to be the best for all around use, while scintillator-based units should not be used where accurate measurements are required. The need to properly calibrate and maintain an instrument appears to be the most important factor in instrument accuracy. 8 references, 6 tables

  9. High-Frequency X-ray Variability Detection in A Black Hole Transient with USA.

    Energy Technology Data Exchange (ETDEWEB)

    Shabad, Gayane

    2000-10-16

    Studies of high-frequency variability (above ~100 Hz) in X-ray binaries provide a unique opportunity to explore the fundamental physics of spacetime and matter, since the orbital timescale on the order of several milliseconds is a timescale of the motion of matter through the region located in close proximity to a compact stellar object. The detection of weak high-frequency signals in X-ray binaries depends on how well we understand the level of Poisson noise due to the photon counting statistics, i.e. how well we can understand and model the detector deadtime and other instrumental systematic effects. We describe the preflight timing calibration work performed on the Unconventional Stellar Aspect (USA) X-ray detector to study deadtime and timing issues. We developed a Monte Carlo deadtime model and deadtime correction methods for the USA experiment. The instrumental noise power spectrum can be estimated within ~0.1% accuracy in the case when no energy-dependent instrumental effect is present. We also developed correction techniques to account for an energy-dependent instrumental effect. The developed methods were successfully tested on USA Cas A and Cygnus X-1 data. This work allowed us to make a detection of a weak signal in a black hole candidate (BHC) transient.
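
    Two classical deadtime idealisations give a feel for the size of the effect, although the USA work used a detector-specific Monte Carlo model rather than these closed forms; the values below are illustrative:

```python
import numpy as np

tau = 16e-6            # deadtime per event [s], illustrative
true_rate = 5000.0     # incident photon rate [counts/s]

# Non-paralyzable deadtime: events arriving during deadtime are simply lost.
meas_nonpar = true_rate / (1.0 + true_rate * tau)
# Paralyzable deadtime: each arrival restarts the dead interval.
meas_par = true_rate * np.exp(-true_rate * tau)

# Inverting the non-paralyzable relation recovers the incident rate.
recovered = meas_nonpar / (1.0 - meas_nonpar * tau)
print(meas_nonpar, meas_par, recovered)   # recovered ~= true_rate
```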

  10. SPECIES-SPECIFIC FOREST VARIABLE ESTIMATION USING NON-PARAMETRIC MODELING OF MULTI-SPECTRAL PHOTOGRAMMETRIC POINT CLOUD DATA

    Directory of Open Access Journals (Sweden)

    J. Bohlin

    2012-07-01

    The recent development in software for automatic photogrammetric processing of multispectral aerial imagery, and the growing nation-wide availability of Digital Elevation Model (DEM) data, are about to revolutionize data capture for forest management planning in Scandinavia. Using only already available aerial imagery and ALS-assessed DEM data, raster estimates of the forest variables mean tree height, basal area, total stem volume, and species-specific stem volumes were produced and evaluated. The study was conducted at a coniferous hemi-boreal test site in southern Sweden (lat. 58° N, long. 13° E). Digital aerial images from the Zeiss/Intergraph Digital Mapping Camera system were used to produce 3D point-cloud data with spectral information. Metrics were calculated from the point-cloud data for 696 field plots (10 m radius) and used in k-MSN to estimate forest variables. For these stands, the tree height ranged from 1.4 to 33.0 m (18.1 m mean), stem volume from 0 to 829 m3 ha-1 (249 m3 ha-1 mean) and basal area from 0 to 62.2 m2 ha-1 (26.1 m2 ha-1 mean), with a mean stand size of 2.8 ha. Estimates made using digital aerial images corresponding to the standard acquisition of the Swedish National Land Survey (Lantmäteriet) showed RMSEs (in percent of the surveyed stand mean) of 7.5% for tree height, 11.4% for basal area, 13.2% for total stem volume, 90.6% for pine stem volume, 26.4% for spruce stem volume, and 72.6% for deciduous stem volume. The results imply that photogrammetric matching of digital aerial images has significant potential for operational use in forestry.

  11. Centile estimation for a proportion response variable.

    Science.gov (United States)

    Hossain, Abu; Rigby, Robert; Stasinopoulos, Mikis; Enea, Marco

    2016-03-15

    This paper introduces two general models for computing centiles when the response variable Y can take values between 0 and 1, inclusive of 0 or 1. The models developed are more flexible alternatives to the beta inflated distribution. The first proposed model employs a flexible four parameter logit skew Student t (logitSST) distribution to model the response variable Y on the unit interval (0, 1), excluding 0 and 1. This model is then extended to the inflated logitSST distribution for Y on the unit interval, including 1. The second model developed in this paper is a generalised Tobit model for Y on the unit interval, including 1. Applying these two models to (1-Y) rather than Y enables modelling of Y on the unit interval including 0 rather than 1. An application of the new models to real data shows that they can provide superior fits. Copyright © 2015 John Wiley & Sons, Ltd.

  12. Estimating the reliability of glycemic index values and potential sources of methodological and biological variability.

    Science.gov (United States)

    Matthan, Nirupa R; Ausman, Lynne M; Meng, Huicui; Tighiouart, Hocine; Lichtenstein, Alice H

    2016-10-01

    The utility of glycemic index (GI) values for chronic disease risk management remains controversial. Although absolute GI value determinations for individual foods have been shown to vary significantly in individuals with diabetes, there is a dearth of data on the reliability of GI value determinations and potential sources of variability among healthy adults. We examined the intra- and inter-individual variability in glycemic response to a single food challenge and the methodologic and biological factors that potentially mediate this response. The GI value for white bread was determined by using standardized methodology in 63 volunteers free from chronic disease and recruited to differ by sex, age (18-85 y), and body mass index (BMI, in kg/m2: 20-35). Volunteers randomly underwent 3 sets of food challenges involving glucose (reference) and white bread (test food), both providing 50 g available carbohydrates. Serum glucose and insulin were monitored for 5 h postingestion, and GI values were calculated by using different area under the curve (AUC) methods. Biochemical variables were measured by using standard assays and body composition by dual-energy X-ray absorptiometry. The mean ± SD GI value for white bread was 62 ± 15 when calculated by using the recommended method. Mean intra- and interindividual CVs were 20% and 25%, respectively. Increasing sample size, replication of reference and test foods, and length of blood sampling, as well as the AUC calculation method, did not improve the CVs. Among the biological factors assessed, insulin index and glycated hemoglobin values explained 15% and 16% of the variability in the mean GI value for white bread, respectively. These data indicate that there is substantial variability in individual responses to GI value determinations, demonstrating that it is unlikely to be a good approach to guiding food choices. Additionally, even in healthy individuals, glycemic status significantly contributes to the variability in GI value determinations.
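
    The underlying GI calculation is a ratio of incremental areas under the glucose curve. A sketch using a common baseline-clipped trapezoidal rule, one of several AUC variants of the kind the study compares; all data points below are made up:

```python
import numpy as np

def incremental_auc(t, glucose):
    """Incremental area under the curve by the trapezoidal rule, counting
    only area above the fasting (t=0) baseline. Clipping dips to zero is an
    approximation of the usual geometric treatment of baseline crossings.
    """
    inc = np.clip(glucose - glucose[0], 0.0, None)
    return np.trapz(inc, t)

# GI of the test food = 100 * iAUC(test) / iAUC(glucose reference),
# averaged over repeated reference tests within a subject.
t = np.array([0, 15, 30, 45, 60, 90, 120])            # minutes
ref = np.array([5.0, 6.5, 7.8, 7.2, 6.4, 5.6, 5.1])    # mmol/L, glucose
test = np.array([5.0, 6.0, 7.0, 6.8, 6.2, 5.5, 5.0])   # mmol/L, white bread
print(100.0 * incremental_auc(t, test) / incremental_auc(t, ref))
```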

  13. A stochastic analysis of the influence of soil and climatic variability on the estimate of pesticide groundwater pollution potential

    Science.gov (United States)

    Jury, William A.; Gruber, Joachim

    1989-12-01

    Soil and climatic variability contribute in an unknown manner to the leaching of pesticides below the surface soil zone where degradation occurs at maximum levels. In this paper we couple the climatic variability model of Eagleson (1978) to the soil variability transport model of Jury (1982) to produce a probability density distribution of residual mass fraction (RMF) remaining after leaching below the surface degradation zone. Estimates of the RMF distribution are shown to be much more sensitive to soil variability than climatic variability, except when the residence time of the chemical is shorter than one year. When soil variability dominates climatic variability, the applied water distribution may be replaced by a constant average water application rate without serious error. Simulations of leaching are run with 10 pesticides in two climates and in two representative soil types with a range of soil variability. Variability in soil or climate act to produce a nonnegligible probability of survival of a small value of residual mass even for relatively immobile compounds which are predicted to degrade completely by a simple model which neglects variability. However, the simpler model may still be useful for screening pesticides for groundwater pollution potential if somewhat larger residual masses of a given compound are tolerated. Monte Carlo simulations of the RMF distribution agreed well with model predictions over a wide range of pesticide properties.
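
    The flavor of such a calculation can be sketched by Monte Carlo: residual mass fraction RMF = exp(−μ·t), with travel time through the degradation zone t = L·θ·R/q and soil variability entering through a lognormal water flux. All parameter values below are illustrative, not those of the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
mu = np.log(2) / 30.0   # first-order degradation rate [1/day] (30-day half-life)
L = 0.3                 # depth of the surface degradation zone [m]
theta = 0.30            # volumetric water content [-]
R = 3.0                 # retardation factor from sorption [-]
# Lognormal drainage flux stands in for combined soil/climate variability.
q = rng.lognormal(mean=np.log(0.005), sigma=0.8, size=100_000)  # [m/day]

rmf = np.exp(-mu * L * theta * R / q)   # fraction surviving past the zone
print("mean RMF:", rmf.mean(), " P(RMF > 1e-6):", (rmf > 1e-6).mean())
```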

  14. Estimating pushrim temporal and kinetic measures using an instrumented treadmill during wheelchair propulsion: A concurrent validity study.

    Science.gov (United States)

    Gagnon, Dany H; Jouval, Camille; Chénier, Félix

    2016-06-14

    Using ground reaction forces recorded while propelling a manual wheelchair on an instrumented treadmill may represent a valuable alternative to using an instrumented pushrim to calculate temporal and kinetic parameters during propulsion. Sixteen manual wheelchair users propelled their wheelchair, equipped with instrumented pushrims (i.e., the SMARTWheel), on an instrumented dual-belt treadmill set at 1 m/s during a 1-minute period. Spatio-temporal measures (i.e., duration of the push and recovery phases) and kinetic measures (i.e., propulsive moments) were calculated for 20 consecutive strokes for each participant. Strong associations were confirmed between the treadmill and the instrumented pushrim for the mean duration of the push phase (r=0.98) and of the recovery phase (r=0.99). Good agreement between these two measurement instruments was also confirmed, with mean differences of only 0.028 s for the push phase and 0.012 s for the recovery phase. Strong associations were confirmed between the instrumented wheelchair pushrim and treadmill for mean (r=0.97) and peak (r=0.96) propulsive moments. Good agreement between these two measurement instruments was also confirmed, with mean differences of 0.50 Nm (mean moment) and 0.71 Nm (peak moment). The use of a dual-belt instrumented treadmill represents an alternative for characterizing temporal parameters and propulsive moments during manual wheelchair propulsion. Copyright © 2016 Elsevier Ltd. All rights reserved.
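
    The association and agreement summaries reported above (correlations plus mean differences) are straightforward to compute; a sketch with made-up per-stroke values:

```python
import numpy as np

def concurrent_validity(pushrim, treadmill):
    """Pearson r plus the mean difference (bias) and its 95% limits of
    agreement, Bland-Altman style. Inputs are per-stroke values of the same
    quantity from the two instruments.
    """
    r = np.corrcoef(pushrim, treadmill)[0, 1]
    diff = treadmill - pushrim
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return r, bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Example with hypothetical push-phase durations [s] for a few strokes:
pushrim = np.array([0.81, 0.84, 0.79, 0.86, 0.82])
treadmill = np.array([0.84, 0.86, 0.82, 0.88, 0.85])
print(concurrent_validity(pushrim, treadmill))
```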

  15. Hemispherical photography to estimate biophysical variables of cotton

    Directory of Open Access Journals (Sweden)

    Ziany N. Brandão

    The Leaf Area Index (LAI) is a key parameter for evaluating the vegetation spectral response and estimating plant nutrition and water requirements. However, in large fields it is difficult to obtain accurate data for LAI determination. Therefore, the objective of this study was the estimation of LAI, biomass and yield of irrigated cotton through digital hemispherical photography. The treatments consisted of four nitrogen doses (0, 90, 180 and 270 kg ha-1) and four phosphorus doses (0, 120, 240 and 360 kg ha-1). Digital hemispherical photographs were collected under similar sky brightness conditions at 60 and 75 days after emergence (DAE), using the Digital Plant Canopy Imager CI-110® of CID Inc. Biomass and LAI measurements were made on the same dates. LAI was also determined by destructive and non-destructive methods, through a leaf area integrator (LI-COR® LI-3100C model) and by measurements based on the midrib length of all leaves, respectively. The results indicate that the hemispherical images were appropriate for estimating the LAI and biomass production of irrigated cotton, while for the estimation of yield more research is needed to improve the method.

  16. Comparison of small-footprint discrete return and full waveform airborne lidar data for estimating multiple forest variables

    OpenAIRE

    Sumnall, Matthew J.; Hill, Ross A.; Hinsley, Shelley A.

    2016-01-01

    The quantification of forest ecosystems is important for a variety of purposes, including the assessment of wildlife habitat, nutrient cycles, timber yield and fire propagation. This research assesses the estimation of forest structure, composition and deadwood variables from small-footprint airborne lidar data, both discrete return (DR) and full waveform (FW), acquired under leaf-on and leaf-off conditions. The field site, in the New Forest, UK, includes managed plantation and ancient, semi-natural woodland.

  17. Estimating inter-annual variability in winter wheat sowing dates from satellite time series in Camargue, France

    Science.gov (United States)

    Manfron, Giacinto; Delmotte, Sylvestre; Busetto, Lorenzo; Hossard, Laure; Ranghetti, Luigi; Brivio, Pietro Alessandro; Boschetti, Mirco

    2017-05-01

    Crop simulation models are commonly used to forecast the performance of cropping systems under different hypotheses of change. Their use on a regional scale is generally constrained, however, by a lack of information on the spatial and temporal variability of environment-related input variables (e.g., soil) and agricultural practices (e.g., sowing dates) that influence crop yields. Satellite remote sensing data can shed light on such variability by providing timely information on crop dynamics and conditions over large areas. This paper proposes a method for analyzing time series of MODIS satellite data in order to estimate the inter-annual variability of winter wheat sowing dates. A rule-based method was developed to automatically identify a reliable sample of winter wheat field time series, and to infer the corresponding sowing dates. The method was designed for a case study in the Camargue region (France), where winter wheat is characterized by vernalization, as in other temperate regions. The detection criteria were chosen on the grounds of agronomic expertise and by analyzing high-confidence time-series vegetation index profiles for winter wheat. This automatic method identified the target crop on more than 56% (four-year average) of the cultivated areas, with low commission errors (11%). It also captured the seasonal variability in sowing dates with errors of ±8 and ±16 days in 46% and 66% of cases, respectively. Extending the analysis to the years 2002-2012 showed that sowing in the Camargue was usually done on or around November 1st (±4 days). Comparing inter-annual sowing date variability with the main local agro-climatic drivers showed that the type of preceding crop and the weather conditions during the summer season before the wheat sowing had a prominent role in influencing winter wheat sowing dates.
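
    A toy version of a rule-based detector conveys the idea: locate the autumn minimum of a smoothed vegetation-index profile that is followed by a sustained rise (crop emergence), and take that minimum as a proxy for sowing. The paper's actual MODIS rules, thresholds and quality checks are more elaborate; everything below is illustrative:

```python
import numpy as np

def detect_sowing(dates, ndvi, min_rise=0.15):
    """Return the date of the first local minimum of a lightly smoothed
    vegetation-index series that is followed by a sustained rise of at
    least `min_rise`, or None if no such point is found.
    """
    s = np.convolve(ndvi, np.ones(3) / 3.0, mode="same")  # 3-point smoothing
    for i in range(1, len(s) - 2):
        is_local_min = s[i] <= s[i - 1] and s[i] <= s[i + 1]
        sustained_rise = (s[i + 2] - s[i]) > min_rise
        if is_local_min and sustained_rise:
            return dates[i]
    return None
```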

  18. www.common-metrics.org: a web application to estimate scores from different patient-reported outcome measures on a common scale.

    Science.gov (United States)

    Fischer, H Felix; Rose, Matthias

    2016-10-19

    Recently, a growing number of Item-Response Theory (IRT) models has been published which allow estimation of a common latent variable from data derived with different Patient-Reported Outcomes (PROs). When using data from different PROs, direct estimation of the latent variable has some advantages over the use of sum-score conversion tables, but it requires substantial proficiency in the field of psychometrics to fit such models using contemporary IRT software. We developed a web application (http://www.common-metrics.org) which allows easier estimation of latent variable scores using IRT models that calibrate different measures on instrument-independent scales. Currently, the application allows estimation using six different IRT models for Depression, Anxiety, and Physical Function. Based on published item parameters, users of the application can directly estimate latent traits using expected a posteriori (EAP) estimation for sum scores as well as for specific response patterns, Bayes modal (MAP), weighted likelihood estimation (WLE) and maximum likelihood (ML) methods, under three different prior distributions. The obtained estimates can be downloaded and analyzed using standard statistical software. This application enhances the usability of IRT modeling for researchers by allowing comparison of latent trait estimates across different PROs, such as the Patient Health Questionnaire Depression (PHQ-9) and Anxiety (GAD-7) scales, the Center for Epidemiologic Studies Depression Scale (CES-D), the Beck Depression Inventory (BDI), PROMIS Anxiety and Depression Short Forms and others. Advantages of this approach include comparability of data derived with different measures and tolerance against missing values. The validity of the underlying models needs to be investigated in the future.
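
    The kind of computation such a service performs can be sketched for one case: an EAP estimate under a 2PL model with a standard normal prior, by simple grid quadrature. The item parameters and responses below are hypothetical, and the application itself supports further models, estimators and priors:

```python
import numpy as np

def eap_2pl(responses, a, b, grid=np.linspace(-4, 4, 81)):
    """EAP estimate (and posterior SD) of a latent trait under a 2PL IRT
    model with a standard normal prior, using grid quadrature. `a` and `b`
    are item discrimination and difficulty arrays, assumed already
    calibrated; `responses` is a 0/1 vector of the same length.
    """
    prior = np.exp(-0.5 * grid**2)                     # unnormalised N(0, 1)
    p = 1.0 / (1.0 + np.exp(-a[:, None] * (grid[None, :] - b[:, None])))
    lik = np.prod(np.where(np.array(responses)[:, None] == 1, p, 1 - p), axis=0)
    post = prior * lik
    post /= post.sum()
    theta = np.sum(grid * post)
    se = np.sqrt(np.sum((grid - theta) ** 2 * post))
    return theta, se

# Example: five hypothetical items; the respondent endorses the first three.
a = np.array([1.2, 1.0, 1.5, 0.8, 1.1])
b = np.array([-1.0, -0.3, 0.2, 0.7, 1.4])
print(eap_2pl([1, 1, 1, 0, 0], a, b))
```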

  19. Estimation of macular pigment optical density in the elderly: test-retest variability and effect of optical blur in pseudophakic subjects

    NARCIS (Netherlands)

    Gallaher, Kevin T.; Mura, Marco; Todd, Wm Andrew; Harris, Tarsha L.; Kenyon, Emily; Harris, Tamara; Johnson, Karen C.; Satterfield, Suzanne; Kritchevsky, Stephen B.; Iannaccone, Alessandro

    2007-01-01

    The reproducibility of macular pigment optical density (MPOD) estimates in the elderly was assessed in 40 subjects (age: 79.1 ± 3.5). Test-retest variability was good (Pearson's r coefficient: 0.734), with an average coefficient of variation (CV) of 18.4% and an intraclass correlation coefficient…

  20. Statistical methods for biodosimetry in the presence of both Berkson and classical measurement error

    Science.gov (United States)

    Miller, Austin

    In radiation epidemiology, the true dose received by those exposed cannot be assessed directly. Physical dosimetry uses a deterministic function of the source term, distance and shielding to estimate dose. For the atomic bomb survivors, the physical dosimetry system is well established. The classical measurement errors plaguing the location and shielding inputs to the physical dosimetry system are well known. Adjusting for the associated biases requires an estimate for the classical measurement error variance, for which no data-driven estimate exists. In this case, an instrumental variable solution is the most viable option to overcome the classical measurement error indeterminacy. Biological indicators of dose may serve as instrumental variables. Specification of the biodosimeter dose-response model requires identification of the radiosensitivity variables, for which we develop statistical definitions and variables. More recently, researchers have recognized Berkson error in the dose estimates, introduced by averaging assumptions for many components in the physical dosimetry system. We show that Berkson error induces a bias in the instrumental variable estimate of the dose-response coefficient, and then address the estimation problem. This model is specified by developing an instrumental variable mixed measurement error likelihood function, which is then maximized using a Monte Carlo EM Algorithm. These methods produce dose estimates that incorporate information from both physical and biological indicators of dose, as well as the first instrumental variable based data-driven estimate for the classical measurement error variance.

  1. Risk-adjusted econometric model to estimate postoperative costs: an additional instrument for monitoring performance after major lung resection.

    Science.gov (United States)

    Brunelli, Alessandro; Salati, Michele; Refai, Majed; Xiumé, Francesco; Rocco, Gaetano; Sabbatini, Armando

    2007-09-01

    The objectives of this study were to develop a risk-adjusted model to estimate individual postoperative costs after major lung resection and to use it for internal economic audit. Variable and fixed hospital costs were collected for 679 consecutive patients who underwent major lung resection from January 2000 through October 2006 at our unit. Several preoperative variables were used to develop a risk-adjusted econometric model from all patients operated on during the period 2000 through 2003 by stepwise multiple regression analysis (validated by bootstrap). The model was then used to estimate the postoperative costs in the patients operated on during the 3 subsequent periods (years 2004, 2005, and 2006). Observed and predicted costs were then compared within each period by the Wilcoxon signed rank test. Multiple regression and bootstrap analysis yielded the following model for predicting postoperative cost: 11,078 + 1,340.3 × (age > 70 years) + 1,927.8 × (cardiac comorbidity) − 95 × ppoFEV1%. No differences between predicted and observed costs were noted in the first 2 periods analyzed (year 2004, $6188.40 vs $6241.40, P = .3; year 2005, $6308.60 vs $6483.60, P = .4), whereas in the most recent period (2006) observed costs were significantly lower than predicted ($3457.30 vs $6162.70). This model may be used as a methodologic template for economic audit in our specialty and complement more traditional outcome measures in the assessment of performance.
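
    The published equation can be applied directly; a sketch, assuming the conventional 0/1 coding of the two indicator variables:

```python
def predicted_postop_cost(age, cardiac_comorbidity, ppo_fev1_pct):
    """Risk-adjusted cost estimate using the regression reported above:
    cost = 11078 + 1340.3 * (age > 70) + 1927.8 * (cardiac comorbidity)
           - 95 * ppoFEV1%.
    Indicator variables are 0/1; units and currency follow the paper.
    """
    return (11078.0
            + 1340.3 * (1 if age > 70 else 0)
            + 1927.8 * (1 if cardiac_comorbidity else 0)
            - 95.0 * ppo_fev1_pct)

# A 72-year-old with cardiac comorbidity and a ppoFEV1 of 60%:
print(predicted_postop_cost(72, True, 60))   # 8646.1
```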

  2. Estimation of new production in the North Sea: consequences for temporal and spatial variability of phytoplankton

    DEFF Research Database (Denmark)

    Richardson, Katherine; Bo Pedersen, Flemming

    1998-01-01

    By coupling knowledge of oceanographic processes and phytoplankton responses to light and nutrient availability, we estimate a total potential new (sensu Dugdale and Goering, 1967) production for the North Sea of approximately 15.6 million tons C per year. In a typical year, about 40% of this production will be associated with the spring bloom in the surface waters of the seasonally stratified (central and northern) North Sea. About 40% is predicted to occur in the coastal waters, while the remaining new production is predicted to take place in sub-surface chlorophyll peaks occurring in association with fronts in the North Sea during the summer months. By considering the inter-annual variation in heat, wind and nutrient availability (light and tidal energy input are treated as non-varying from year to year), the inter-annual variability in the new production occurring in these different regions is estimated.

  3. Einstein x-ray observations of cataclysmic variables

    International Nuclear Information System (INIS)

    Mason, K.O.; Cordova, F.A.

    1982-01-01

    Observations with the imaging x-ray detectors on the Einstein Observatory have led to a large increase in the number of low luminosity x-ray sources known to be associated with cataclysmic variable stars (CVs). The high sensitivity of the Einstein instrumentation has permitted study of their short timescale variability and spectra. The data are adding significantly to our knowledge of the accretion process in cataclysmic variables and forcing some revision in our ideas concerning the origin of the optical variability in these stars.

  4. Variable Stars in the Field of V729 Aql

    Science.gov (United States)

    Cagaš, P.

    2017-04-01

    Wide field instruments can be used to acquire light curves of tens or even hundreds of variable stars per night, which increases the probability of new discoveries of interesting variable stars and generally increases the efficiency of observations. At the same time, wide field instruments produce a large amount of data, which must be processed using advanced software. The traditional approach, typically used by amateur astronomers, requires an unacceptable amount of time to process each data set. New functionality, built into the SIPS software package, can shorten the time needed to obtain light curves by several orders of magnitude. Also, the newly introduced SILICUPS software is intended for post-processing of stored light curves. It can be used to visualize observations from many nights, to find variable star periods, to evaluate types of variability, etc. This work provides an overview of the tools used to process data from the large field of view around the variable star V729 Aql and demonstrates the results.

  5. Estimation of Staphylococcus aureus growth parameters from turbidity data: characterization of strain variation and comparison of methods.

    Science.gov (United States)

    Lindqvist, R

    2006-07-01

    Turbidity methods offer possibilities for generating the data required for addressing microorganism variability in risk modeling, given that the results of these methods correspond to those of viable count methods. The objectives of this study were to identify the best approach for determining growth parameters based on turbidity data from a Bioscreen instrument, and to characterize variability in growth parameters of 34 Staphylococcus aureus strains of different biotypes isolated from broiler carcasses. Growth parameters were estimated by fitting primary growth models to turbidity growth curves or to detection times of serially diluted cultures, either directly or by using an analysis of variance (ANOVA) approach. The maximum specific growth rates in chicken broth at 17 degrees C estimated by time-to-detection methods were in good agreement with viable count estimates, whereas growth models (exponential and Richards) underestimated growth rates. Time-to-detection methods were therefore selected for strain characterization. The variation of growth parameters among strains was best described by either the logistic or the lognormal distribution, but definitive conclusions require a larger data set. The physiological state parameter ranged from 0.01 to 0.92, and its distribution was not significantly different from a normal distribution. Strain variability was important, and the coefficient of variation of growth parameters was up to six times larger among strains than within strains. We suggest applying a time-to-detection (ANOVA) approach to turbidity measurements for convenient and accurate estimation of growth parameters. The results emphasize the need to consider the implications of strain variability for predictive modeling and risk assessment.
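
    One standard way to obtain a maximum specific growth rate from detection times of serially diluted cultures (one of the approaches compared above) exploits the fact that each tenfold dilution delays detection by ln(10)/mu. The detection times below are invented for illustration.

```python
import numpy as np

# Hypothetical detection times (hours) for tenfold serial dilutions d = 0..5.
# Model: t_d = t_0 + d * ln(10) / mu, so mu = ln(10) / slope.
dilution = np.arange(6)
t_detect = np.array([5.1, 7.0, 8.8, 10.9, 12.7, 14.6])

slope, intercept = np.polyfit(dilution, t_detect, 1)
mu = np.log(10.0) / slope     # maximum specific growth rate, per hour
print(f"mu = {mu:.2f} per hour")
```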

  6. Surface-temperature trends and variability in the low-latitude North Atlantic since 1552

    KAUST Repository

    Saenger, Casey

    2009-06-21

    Sea surface temperature variability in the North Atlantic Ocean recorded since about 1850 has been ascribed to a natural multidecadal oscillation superimposed on a background warming trend [1-6]. It has been suggested that the multidecadal variability may be a persistent feature [6-8], raising the possibility that the associated climate impacts may be predictable [7,8]. However, our understanding of the multidecadal ocean variability before the instrumental record is based on interpretations of high-latitude terrestrial proxy records. Here we present an absolutely dated and annually resolved record of sea surface temperature from the Bahamas, based on a 440-year time series of coral growth rates. The reconstruction indicates that temperatures were as warm as today from about 1552 to 1570, then cooled by about 1 °C from 1650 to 1730 before warming until the present. Our estimates of background variability suggest that much of the warming since 1900 was driven by anthropogenic forcing. Interdecadal variability with a period of 15-25 years is superimposed on most of the record, but multidecadal variability becomes significant only after 1730. We conclude that the multidecadal variability in sea surface temperatures in the low-latitude western Atlantic Ocean may not be persistent, potentially making accurate decadal climate forecasts more difficult to achieve. © 2009 Macmillan Publishers Limited. All rights reserved.

  7. Variable selection in multivariate calibration based on clustering of variable concept.

    Science.gov (United States)

    Farrokhnia, Maryam; Karimi, Sadegh

    2016-01-01

    Recently we proposed a new variable selection algorithm based on the clustering of variables concept (CLoVA) for classification problems. With the same idea, this concept is here applied to a regression problem, and the results are compared with conventional variable selection strategies for PLS. The basic idea behind the clustering of variables is that the instrument channels are grouped into different clusters via clustering algorithms. Then, the spectral data of each cluster are subjected to PLS regression. Different real data sets (Cargill corn, Biscuit dough, ACE QSAR, Soy, and Tablet) have been used to evaluate the influence of the clustering of variables on the prediction performance of PLS. In almost all cases, the statistical parameters, especially the prediction error, show the superiority of CLoVA-PLS with respect to other variable selection strategies. Finally, synergy clustering of variables (sCLoVA-PLS), which uses a combination of clusters, is proposed as an efficient modification of the CLoVA algorithm. The obtained statistical parameters indicate that variable clustering can separate the useful variables from the redundant ones, so that a stable model can be built from the informative clusters. Copyright © 2015 Elsevier B.V. All rights reserved.
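
    A minimal sketch of the clustering-of-variables idea, assuming scikit-learn is available: instrument channels (columns) are clustered, a PLS model is fit per cluster, and cross-validation identifies the informative cluster. The synthetic data, cluster count and component count are placeholders, not the paper's settings.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# Hypothetical spectra: 100 samples x 400 channels; channels 50-59 share a
# latent factor that also drives the response y.
latent = rng.normal(size=100)
X = rng.normal(size=(100, 400))
X[:, 50:60] += latent[:, None]
y = latent + 0.1 * rng.normal(size=100)

# Cluster the channels (columns) by their profiles across samples.
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X.T)

# Fit a PLS model per cluster; cross-validated R^2 flags the informative one.
scores = {}
for c in range(5):
    Xc = X[:, labels == c]
    pls = PLSRegression(n_components=min(5, Xc.shape[1]))
    scores[c] = cross_val_score(pls, Xc, y, cv=5).mean()

best = max(scores, key=scores.get)
print(f"most informative cluster: {best} (CV R^2 = {scores[best]:.2f})")
```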

  8. Intrajudge and Interjudge Reliability of the Stuttering Severity Instrument-Fourth Edition.

    Science.gov (United States)

    Davidow, Jason H; Scott, Kathleen A

    2017-11-08

    The Stuttering Severity Instrument (SSI) is a tool used to measure the severity of stuttering. Previous versions of the instrument have known limitations (e.g., Lewis, 1995). The present study examined the intra- and interjudge reliability of the newest version, the Stuttering Severity Instrument-Fourth Edition (SSI-4) (Riley, 2009). Twelve judges who were trained on the SSI-4 protocol participated. Judges collected SSI-4 data while viewing 4 videos of adults who stutter at Time 1 and 4 weeks later at Time 2. Data were analyzed for intra- and interjudge reliability of the SSI-4 subscores (for Frequency, Duration, and Physical Concomitants), total score, and final severity rating. Intra- and interjudge reliability across the subscores and total score concurred with the manual's reported reliability when calculated using the methods described in the manual. New calculations of judge agreement produced values different from those in the manual (for the 3 subscores, total score, and final severity rating) and provided data absent from the manual. Clinicians and researchers who use the SSI-4 should carefully consider the limitations of the instrument. Investigation into the multitasking demands of the instrument may clarify whether separating the collection of data for specific variables would improve intra- and interjudge reliability of those variables.

  9. Variable-Structure Control of a Model Glider Airplane

    Science.gov (United States)

    Waszak, Martin R.; Anderson, Mark R.

    2008-01-01

    A variable-structure control system designed to enable a fuselage-heavy airplane to recover from spin has been demonstrated in a hand-launched, instrumented model glider airplane. Variable-structure control is a high-speed switching feedback control technique that has been developed for control of nonlinear dynamic systems.

  10. Use of genetic variability estimates and interrelationships

    African Journals Online (AJOL)

    Prof. Adipala Ekwamu

    of 11 agronomic and biochemical traits to water stress based on estimation of genetic ... of primary branches and 100 seed weight under W0, and number of primary ... selection of superior drought-tolerant genotype (LR1) with good yield ...

  11. Digital Cover Photography for Estimating Leaf Area Index (LAI) in Apple Trees Using a Variable Light Extinction Coefficient

    Directory of Open Access Journals (Sweden)

    Carlos Poblete-Echeverría

    2015-01-01

    Leaf area index (LAI) is one of the key biophysical variables required for crop modeling. Direct LAI measurements are time consuming and difficult to obtain for experimental and commercial fruit orchards. Devices used to estimate LAI have shown considerable errors when compared to ground-truth or destructive measurements, requiring tedious site-specific calibrations. The objective of this study was to test the performance of a modified digital cover photography method to estimate LAI in apple trees using conventional digital photography and instantaneous measurements of incident radiation (Io) and transmitted radiation (I) through the canopy. Leaf area of 40 single apple trees was measured destructively to obtain the real leaf area index (LAID), which was compared with the LAI estimated by the proposed digital photography method (LAIM). Results showed that LAIM was able to estimate LAID with an error of 25% using a constant light extinction coefficient (k = 0.68). However, when k was estimated using an exponential function based on the fraction of foliage cover (ff) derived from the images, the error was reduced to 18%. Furthermore, when measurements of light intercepted by the canopy (Ic) were used as a proxy value for k, the method presented an error of only 9%. These results show that using a proxy k value estimated from Ic helps to increase the accuracy of LAI estimates from digital cover images for apple trees with different canopy sizes under field conditions.

  12. Digital cover photography for estimating leaf area index (LAI) in apple trees using a variable light extinction coefficient.

    Science.gov (United States)

    Poblete-Echeverría, Carlos; Fuentes, Sigfredo; Ortega-Farias, Samuel; Gonzalez-Talice, Jaime; Yuri, Jose Antonio

    2015-01-28

    Leaf area index (LAI) is one of the key biophysical variables required for crop modeling. Direct LAI measurements are time consuming and difficult to obtain for experimental and commercial fruit orchards. Devices used to estimate LAI have shown considerable errors when compared to ground-truth or destructive measurements, requiring tedious site-specific calibrations. The objective of this study was to test the performance of a modified digital cover photography method to estimate LAI in apple trees using conventional digital photography and instantaneous measurements of incident radiation (Io) and transmitted radiation (I) through the canopy. Leaf area of 40 single apple trees was measured destructively to obtain the real leaf area index (LAI(D)), which was compared with the LAI estimated by the proposed digital photography method (LAI(M)). Results showed that LAI(M) was able to estimate LAI(D) with an error of 25% using a constant light extinction coefficient (k = 0.68). However, when k was estimated using an exponential function based on the fraction of foliage cover (f(f)) derived from the images, the error was reduced to 18%. Furthermore, when measurements of light intercepted by the canopy (Ic) were used as a proxy value for k, the method presented an error of only 9%. These results show that using a proxy k value estimated from Ic helps to increase the accuracy of LAI estimates from digital cover images for apple trees with different canopy sizes under field conditions.
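
    The constant-k estimate in both records is an inversion of the Beer-Lambert extinction law, so the calculation itself is short. The radiation readings below are made up; only the k = 0.68 value comes from the abstracts.

```python
import numpy as np

def lai_from_transmission(Io, I, k):
    """Invert the Beer-Lambert law I = Io * exp(-k * LAI) for LAI."""
    return -np.log(I / Io) / k

# Hypothetical incident and transmitted radiation readings (W/m^2).
Io, I = 850.0, 210.0

print(round(lai_from_transmission(Io, I, k=0.68), 2))  # constant k
print(round(lai_from_transmission(Io, I, k=0.55), 2))  # proxy k from Ic
```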

  13. Observing System Simulations for Small Satellite Formations Estimating Bidirectional Reflectance

    Science.gov (United States)

    Nag, Sreeja; Gatebe, Charles K.; de Weck, Olivier

    2015-01-01

    The bidirectional reflectance distribution function (BRDF) gives the reflectance of a target as a function of illumination geometry and viewing geometry, and hence carries information about the anisotropy of the surface. BRDF is needed in remote sensing for the correction of view and illumination angle effects (for example in image standardization and mosaicing), for deriving albedo, for land cover classification, for cloud detection, for atmospheric correction, and for other applications. However, current spaceborne instruments provide sparse angular sampling of BRDF, and airborne instruments are limited in spatial and temporal coverage. To fill the gaps in angular coverage within spatial, spectral and temporal requirements, we propose a new measurement technique: use of small satellites in formation flight, each with a VNIR (visible and near infrared) imaging spectrometer, to make multi-spectral, near-simultaneous measurements of every ground spot in the swath at multiple angles. This paper describes an observing system simulation experiment (OSSE) to evaluate the proposed concept and select the optimal formation architecture that minimizes BRDF uncertainties. The variables of the OSSE are identified: number of satellites, measurement spread in view zenith and relative azimuth with respect to the solar plane, solar zenith angle, BRDF model, and wavelength of reflection. Analyzing the sensitivity of BRDF estimation errors to these variables allows simplification of the OSSE, enabling its use to rapidly evaluate formation architectures. A 6-satellite formation is shown to produce lower BRDF estimation errors, purely in terms of angular sampling as evaluated by the OSSE, than a single spacecraft with 9 forward-aft sensors. We demonstrate the ability to use OSSEs to design small satellite formations as complements to flagship mission data. The formations can fill angular sampling gaps and enable better BRDF products than currently possible.

  14. Reconstructing pre-instrumental streamflow in Eastern Australia using a water balance approach

    Science.gov (United States)

    Tozer, C. R.; Kiem, A. S.; Vance, T. R.; Roberts, J. L.; Curran, M. A. J.; Moy, A. D.

    2018-03-01

    Streamflow reconstructions based on paleoclimate proxies provide much longer records than the short instrumental period records on which water resource management plans are currently based. In Australia there is a lack of in-situ high resolution paleoclimate proxy records, but remote proxies with teleconnections to Australian climate have utility in producing streamflow reconstructions. Here we investigate, via a case study for a catchment in eastern Australia, the novel use of an Antarctic ice-core based rainfall reconstruction within a Budyko framework to reconstruct ∼1000 years of annual streamflow. The resulting streamflow reconstruction captures interannual to decadal variability in the instrumental streamflow, validating both the use of the ice core rainfall proxy record and the Budyko-framework method. In the pre-instrumental era the streamflow reconstruction shows longer wet and dry epochs and periods of streamflow variability that are higher than observed in the instrumental era. Importantly, for both the instrumental record and pre-instrumental reconstructions, the wet (dry) epochs in the rainfall record are shorter (longer) in the streamflow record, and this non-linearity must be considered when inferring hydroclimatic risk or historical water availability directly from rainfall proxy records alone. These insights provide a better understanding of present infrastructure vulnerability in the context of past climate variability for eastern Australia. The streamflow reconstruction presented here also provides a better understanding of the range of hydroclimatic variability possible, and therefore represents a more realistic baseline on which to quantify the potential impacts of anthropogenic climate change on water security.
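
    A Budyko-type water balance, such as the framework referenced above, maps annual rainfall to runoff through the aridity index. The sketch below uses the classic Budyko curve with invented forcing values; the authors' exact formulation may differ.

```python
import numpy as np

def budyko_runoff(P, PET):
    """Budyko (1974) curve: E/P = sqrt(phi * tanh(1/phi) * (1 - exp(-phi)))
    with aridity index phi = PET / P; annual runoff is Q = P - E."""
    phi = PET / P
    evap_ratio = np.sqrt(phi * np.tanh(1.0 / phi) * (1.0 - np.exp(-phi)))
    return P * (1.0 - evap_ratio)

# Hypothetical annual rainfall (mm), e.g. from a proxy reconstruction,
# with potential evapotranspiration held constant.
P = np.array([900.0, 650.0, 1100.0, 780.0])
print(budyko_runoff(P, PET=1000.0).round(0))
```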

  15. The Role of Heart-Rate Variability Parameters in Activity Recognition and Energy-Expenditure Estimation Using Wearable Sensors.

    Science.gov (United States)

    Park, Heesu; Dong, Suh-Yeon; Lee, Miran; Youn, Inchan

    2017-07-24

    Human-activity recognition (HAR) and energy-expenditure (EE) estimation are major functions in the mobile healthcare system. Both functions have been investigated for a long time; however, several challenges remain unsolved, such as the confusion between activities and the recognition of energy-consuming activities involving little or no movement. To solve these problems, we propose a novel approach using an accelerometer and electrocardiogram (ECG). First, we collected a database of six activities (sitting, standing, walking, ascending, resting and running) of 13 voluntary participants. We compared the HAR performances of three models with respect to the input data type (with none, all, or some of the heart-rate variability (HRV) parameters). The best recognition performance was 96.35%, which was obtained with some selected HRV parameters. EE was also estimated for different choices of the input data type (with or without HRV parameters) and the model type (single and activity-specific). The best estimation performance was found in the case of the activity-specific model with HRV parameters. Our findings indicate that the use of human physiological data, obtained by wearable sensors, has a significant impact on both HAR and EE estimation, which are crucial functions in the mobile healthcare system.
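
    Since the models above take time-domain HRV parameters as inputs, it may help to see how such features are typically computed from RR intervals. The definitions (SDNN, RMSSD, pNN50) are standard; the RR series here is synthetic.

```python
import numpy as np

def hrv_features(rr_ms):
    """Standard time-domain HRV parameters from RR intervals in ms."""
    rr = np.asarray(rr_ms, dtype=float)
    d = np.diff(rr)
    return {
        "mean_rr": rr.mean(),
        "sdnn": rr.std(ddof=1),                    # overall variability
        "rmssd": np.sqrt(np.mean(d ** 2)),         # short-term variability
        "pnn50": 100.0 * np.mean(np.abs(d) > 50),  # % successive diffs > 50 ms
    }

# Synthetic RR series around 800 ms (~75 beats per minute).
rng = np.random.default_rng(2)
rr = 800.0 + np.cumsum(rng.normal(0.0, 5.0, 120)) + rng.normal(0.0, 25.0, 120)
print({k: round(v, 1) for k, v in hrv_features(rr).items()})
```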

  16. Use of an improved radiation amplification factor to estimate the effect of total ozone changes on action spectrum weighted irradiances and an instrument response function

    Science.gov (United States)

    Herman, Jay R.

    2010-12-01

    Multiple scattering radiative transfer results are used to calculate action spectrum weighted irradiances and fractional irradiance changes in terms of a power law in ozone Ω, U(Ω/200)^(−RAF), where the new radiation amplification factor (RAF) is just a function of solar zenith angle. Including Rayleigh scattering caused small differences in the estimated 30-year changes in action spectrum weighted irradiances compared to estimates that neglect multiple scattering. The radiative transfer results are applied to several action spectra and to an instrument response function corresponding to the Solar Light 501 meter. The effect of changing ozone on two plant damage action spectra is shown for plants with high sensitivity to UVB (280-315 nm) and those with lower sensitivity, showing that the probability of plant damage for the latter has increased since 1979, especially at middle to high latitudes in the Southern Hemisphere. Similarly, there has been an increase in the rates of erythemal skin damage and pre-vitamin D3 production corresponding to measured ozone decreases. An example conversion function is derived to obtain erythemal irradiances and the UV index from measurements with the Solar Light 501 instrument response function. An analytic expression is given to convert changes in erythemal irradiances to changes in CIE vitamin D action spectrum weighted irradiances.
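
    The quoted power law makes fractional irradiance changes easy to compute once an action spectrum's RAF is known. In this sketch the ozone values and the RAF of 1.1 (roughly the erythemal value) are illustrative only.

```python
def irradiance_ratio(omega_new, omega_ref, raf):
    """E(omega) is proportional to (omega / 200)**(-RAF), so the ratio of
    action spectrum weighted irradiances is (omega_new / omega_ref)**(-RAF)."""
    return (omega_new / omega_ref) ** (-raf)

# Hypothetical 5% ozone decrease (300 -> 285 DU) with RAF = 1.1:
print(irradiance_ratio(285.0, 300.0, raf=1.1))   # ~1.06, i.e. about +6% UV
```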

  17. Is foreign direct investment good for health in low and middle income countries? An instrumental variable approach.

    Science.gov (United States)

    Burns, Darren K; Jones, Andrew P; Goryakin, Yevgeniy; Suhrcke, Marc

    2017-05-01

    There is a scarcity of quantitative research into the effect of FDI on population health in low and middle income countries (LMICs). This paper investigates the relationship using annual panel data from 85 LMICs between 1974 and 2012. When controlling for time trends, country fixed effects, correlation between repeated observations, relevant covariates, and endogeneity via a novel instrumental variable approach, we find FDI to have a beneficial effect on overall health, proxied by life expectancy. When investigating age-specific mortality rates, we find a stronger beneficial effect of FDI on adult mortality, yet no association with either infant or child mortality. Notably, FDI effects on health remain undetected in all models which do not control for endogeneity. Exploring the effect of sector-specific FDI on health in LMICs, we provide preliminary evidence of a weak inverse association between secondary (i.e. manufacturing) sector FDI and overall life expectancy. Our results thus suggest that FDI has provided an overall benefit to population health in LMICs, particularly in adults, yet investments into the secondary sector could be harmful to health. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. A neural network-based estimate of the seasonal to inter-annual variability of the Atlantic Ocean carbon sink

    DEFF Research Database (Denmark)

    Landschützer, P.; Gruber, N.; Bakker, D.C.E.

    2013-01-01

    The Atlantic Ocean is one of the most important sinks for atmospheric carbon dioxide (CO2), but this sink is known to vary substantially in time. Here we use surface ocean CO2 observations to estimate this sink and its temporal variability from 1998 to 2007 in the Atlantic Ocean. Surface ocean pCO2 increased faster than that of the atmosphere poleward of 40° N, but many other parts of the North Atlantic increased more slowly, resulting in a barely changing Atlantic carbon sink north of the equator (−0.007 Pg C yr−1 decade−1). Surface ocean pCO2 also increased less than that of the atmosphere over most of the Atlantic south of the equator, leading to a substantial trend toward a stronger CO2 sink for the entire South Atlantic (−0.14 Pg C yr−1 decade−1). The Atlantic carbon sink varies relatively little on inter-annual time-scales (±0.04 Pg C yr−1; 1σ).

  2. Explicit estimating equations for semiparametric generalized linear latent variable models

    KAUST Repository

    Ma, Yanyuan; Genton, Marc G.

    2010-01-01

    Explicit estimating equations are derived for semiparametric generalized linear latent variable models by exploiting a property similar to that of a sufficient complete statistic, which enables the estimating procedure to be simplified and the semiparametric estimating equations to be formulated explicitly. The explicit estimators are further shown to have the usual root-n consistency.

  3. Definition study for variable cycle engine testbed engine and associated test program

    Science.gov (United States)

    Vdoviak, J. W.

    1978-01-01

    The product/study double bypass variable cycle engine (VCE) was updated to incorporate recent improvements. The effect of these improvements on mission range and noise levels was determined. This engine design was then compared with current existing high-technology core engines in order to define a subscale testbed configuration that simulated many of the critical technology features of the product/study VCE. Detailed preliminary program plans were then developed for the design, fabrication, and static test of the selected testbed engine configuration. These plans included estimated costs and schedules for the detail design, fabrication and test of the testbed engine and the definition of a test program, test plan, schedule, instrumentation, and test stand requirements.

  4. Cost-effective design of economic instruments in nutrition policy

    Directory of Open Access Journals (Sweden)

    Smed Sinne

    2007-04-01

    This paper addresses the potential for using economic regulation, e.g. taxes or subsidies, as instruments to combat the increasing problems of inappropriate diets, which lead to health problems such as obesity, type 2 diabetes, cardiovascular diseases, etc. in most countries. Such policy measures may be considered as alternatives or supplements to other regulation instruments, including information campaigns, bans, or enhancement of technological solutions to the problems of obesity or related diseases. Seven different food tax and subsidy instruments, or combinations of instruments, are analysed quantitatively. The analyses demonstrate that the average cost-effectiveness with regard to changing the intake of selected nutritional variables can be improved by 10-30 per cent if taxes/subsidies are targeted against these nutrients, compared with targeting selected food categories. Finally, the paper raises a range of issues which need to be investigated further before firm conclusions about the suitability of economic instruments in nutrition policy can be drawn.

  5. The Plasma Instrument for Magnetic Sounding (PIMS) onboard the Europa Clipper Mission

    Science.gov (United States)

    Westlake, Joseph H.; McNutt, Ralph L.; Kasper, Justin C.; Rymer, Abigail; Case, Anthony; Battista, Corina; Cochrane, Corey; Coren, David; Crew, Alexander; Grey, Matthew; Jia, Xianzhe; Khurana, Krishan; Kim, Cindy; Kivelson, Margaret G.; Korth, Haje; Krupp, Norbert; Paty, Carol; Roussos, Elias; Stevens, Michael; Slavin, James A.; Smith, Howard T.; Saur, Joachim

    2017-10-01

    Europa is embedded in a complex Jovian magnetospheric plasma, which rotates with the tilted planetary field and interacts dynamically with Europa's ionosphere, affecting the magnetic induction signal. Plasma from Io's temporally varying torus diffuses outward and mixes with the charged particles in Europa's own torus, producing highly variable plasma conditions. Onboard the Europa Clipper spacecraft, the Plasma Instrument for Magnetic Sounding (PIMS) works in conjunction with the Interior Characterization of Europa using Magnetometry (ICEMAG) investigation to probe Europa's subsurface ocean. This investigation exploits currents induced in Europa's interior by the moon's exposure to variable magnetic fields in the Jovian system to infer properties of Europa's subsurface ocean such as its depth, thickness, and conductivity. This technique was successfully applied to Galileo observations and demonstrated that Europa indeed has a subsurface ocean. While these Galileo observations contributed to the renewed interest in Europa, due to limitations in the observations the results raised major questions that remain unanswered. PIMS will greatly refine our understanding of Europa's global liquid ocean by accounting for contributions to the magnetic field from plasma currents. The Europa Clipper mission is equipped with a sophisticated suite of 9 instruments to study Europa's interior and ocean, geology, chemistry, and habitability from a Jupiter-orbiting spacecraft. PIMS on Europa Clipper is a Faraday Cup based plasma instrument whose heritage dates back to the Voyager spacecraft. PIMS will measure the plasma that populates Jupiter's magnetosphere and Europa's ionosphere. The science goals of PIMS are to: 1) estimate the ocean salinity and thickness by determining Europa's magnetic induction response, corrected for plasma contributions; 2) assess mechanisms responsible for weathering and releasing material from Europa's surface into the atmosphere and ionosphere; and 3) understand how Europa influences its local space environment and Jupiter's magnetosphere, and vice versa.

  6. The Plasma Instrument for Magnetic Sounding (PIMS) on The Europa Clipper Mission

    Science.gov (United States)

    Westlake, J. H.; McNutt, R. L., Jr.; Kasper, J. C.; Battista, C.; Case, A. W.; Cochrane, C.; Grey, M.; Jia, X.; Kivelson, M.; Kim, C.; Korth, H.; Khurana, K. K.; Krupp, N.; Paty, C. S.; Roussos, E.; Rymer, A. M.; Stevens, M. L.; Slavin, J. A.; Smith, H. T.; Saur, J.; Coren, D.

    2017-12-01

    The Europa Clipper mission is equipped with a sophisticated suite of 9 instruments to study Europa's interior and ocean, geology, chemistry, and habitability from a Jupiter-orbiting spacecraft. The Plasma Instrument for Magnetic Sounding (PIMS) on Europa Clipper is a Faraday Cup based plasma instrument whose heritage dates back to the Voyager spacecraft. PIMS will measure the plasma that populates Jupiter's magnetosphere and Europa's ionosphere. The science goals of PIMS are to: 1) estimate the ocean salinity and thickness by determining Europa's magnetic induction response, corrected for plasma contributions; 2) assess mechanisms responsible for weathering and releasing material from Europa's surface into the atmosphere and ionosphere; and 3) understand how Europa influences its local space environment and Jupiter's magnetosphere and vice versa. Europa is embedded in a complex Jovian magnetospheric plasma, which rotates with the tilted planetary field and interacts dynamically with Europa's ionosphere, affecting the magnetic induction signal. Plasma from Io's temporally varying torus diffuses outward and mixes with the charged particles in Europa's own torus, producing highly variable plasma conditions at Europa. PIMS works in conjunction with the Interior Characterization of Europa using Magnetometry (ICEMAG) investigation to probe Europa's subsurface ocean. This investigation exploits currents induced in Europa's interior by the moon's exposure to variable magnetic fields in the Jovian system to infer properties of Europa's subsurface ocean such as its depth, thickness, and conductivity. This technique was successfully applied to Galileo observations and demonstrated that Europa indeed has a subsurface ocean. While these Galileo observations contributed to the renewed interest in Europa, due to limitations in the observations the results raised major questions that remain unanswered. PIMS will greatly refine our understanding of Europa's global liquid ocean by accounting for contributions to the magnetic field from plasma currents.

  7. Practical aspects of a maximum likelihood estimation method to extract stability and control derivatives from flight data

    Science.gov (United States)

    Iliff, K. W.; Maine, R. E.

    1976-01-01

    A maximum likelihood estimation method was applied to flight data and procedures to facilitate the routine analysis of a large amount of flight data were described. Techniques that can be used to obtain stability and control derivatives from aircraft maneuvers that are less than ideal for this purpose are described. The techniques involve detecting and correcting the effects of dependent or nearly dependent variables, structural vibration, data drift, inadequate instrumentation, and difficulties with the data acquisition system and the mathematical model. The use of uncertainty levels and multiple maneuver analysis also proved to be useful in improving the quality of the estimated coefficients. The procedures used for editing the data and for overall analysis are also discussed.

  8. Teaching Confirmatory Factor Analysis to Non-Statisticians: A Case Study for Estimating Composite Reliability of Psychometric Instruments

    Science.gov (United States)

    Gajewski, Byron J.; Jiang, Yu; Yeh, Hung-Wen; Engelman, Kimberly; Teel, Cynthia; Choi, Won S.; Greiner, K. Allen; Daley, Christine Makosky

    2013-01-01

    Texts and software that we are currently using for teaching multivariate analysis to non-statisticians are lacking in their delivery of confirmatory factor analysis (CFA). The purpose of this paper is to provide educators with a complement to these resources that includes CFA and its computation. We focus on how to use CFA to estimate the “composite reliability” of a psychometric instrument. This paper provides guidance for introducing the non-statistician to CFA via a case study. As a complement to our instruction in the more traditional SPSS, we successfully piloted the software R for estimating CFA with nine non-statisticians. This approach can be used with healthcare graduate students taking a multivariate course, and can be modified for community stakeholders of our Center for American Indian Community Health (e.g. community advisory boards, summer interns, and research team members). The placement of CFA at the end of the class is strategic and gives us an opportunity to do some innovative teaching: (1) build ideas for understanding the case study using previous course work (such as ANOVA); (2) incorporate multi-dimensional scaling (which students have already learned) into the selection of a factor structure (a new concept); (3) use interactive data from the students (active learning); (4) review matrix algebra and its importance to psychometric evaluation; (5) show students how to do the calculation on their own; and (6) give students access to an actual recent research project. PMID:24772373
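
    For readers who want the computation itself: composite reliability has a closed form in the standardized loadings of a fitted CFA, assuming uncorrelated measurement errors. The loadings below are made up.

```python
import numpy as np

def composite_reliability(loadings):
    """CR = (sum l)^2 / ((sum l)^2 + sum(1 - l^2)) for standardized
    loadings l with uncorrelated measurement errors."""
    l = np.asarray(loadings, dtype=float)
    num = l.sum() ** 2
    return num / (num + np.sum(1.0 - l ** 2))

# Hypothetical standardized loadings for a five-item instrument.
print(round(composite_reliability([0.72, 0.65, 0.80, 0.58, 0.70]), 3))
```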

  9. Propensity-score matching in economic analyses: comparison with regression models, instrumental variables, residual inclusion, differences-in-differences, and decomposition methods.

    Science.gov (United States)

    Crown, William H

    2014-02-01

    This paper examines the use of propensity score matching in economic analyses of observational data. Several excellent papers have previously reviewed practical aspects of propensity score estimation and other aspects of the propensity score literature. The purpose of this paper is to compare the conceptual foundation of propensity score models with alternative estimators of treatment effects. References are provided to empirical comparisons among methods that have appeared in the literature. These comparisons are available for a subset of the methods considered in this paper. However, in some cases, no pairwise comparisons of particular methods are yet available, and there are no examples of comparisons across all of the methods surveyed here. Irrespective of the availability of empirical comparisons, the goal of this paper is to provide some intuition about the relative merits of alternative estimators in health economic evaluations where nonlinearity, sample size, availability of pre/post data, heterogeneity, and missing variables can have important implications for choice of methodology. Also considered is the potential combination of propensity score matching with alternative methods such as differences-in-differences and decomposition methods that have not yet appeared in the empirical literature.
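
    A bare-bones sketch of the propensity score matching estimator discussed above, assuming scikit-learn: greedy 1:1 nearest-neighbour matching with replacement on simulated data. It is meant to show the mechanics, not to be a production implementation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n = 2000

x = rng.normal(size=(n, 3))                        # observed confounders
p_treat = 1.0 / (1.0 + np.exp(-(x @ [0.5, -0.4, 0.3])))
t = rng.binomial(1, p_treat)
y = 1.0 * t + x @ [0.8, 0.2, -0.5] + rng.normal(size=n)  # true effect = 1.0

# 1. Estimate propensity scores from the observed covariates.
ps = LogisticRegression().fit(x, t).predict_proba(x)[:, 1]

# 2. Greedy 1:1 nearest-neighbour matching (with replacement) on the score.
treated = np.where(t == 1)[0]
controls = np.where(t == 0)[0]
nearest = np.abs(ps[controls][None, :] - ps[treated][:, None]).argmin(axis=1)
matches = controls[nearest]

# 3. Average treatment effect on the treated from the matched pairs.
print("ATT estimate:", round((y[treated] - y[matches]).mean(), 2))
```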

  10. ASPECT OF LANGUAGE ON A QUALITATIVE ANALYSIS OF STUDENT’S EVALUATION INSTRUMENT

    Directory of Open Access Journals (Sweden)

    Ismanto Ismanto

    2016-11-01

    This article examines the characteristics of a good student evaluation instrument. There are at least two requirements that must be met: the instrument must be valid and reliable. The validity of an instrument can be seen from its ability to measure what it is supposed to measure. Evidence for the validity of an instrument may come from item content, response processes, internal structure, relationships with other variables, and the consequences of administering the instrument. Analysis of the content is known as content validity, i.e. a rational analysis of the domain to be measured to determine how well each item on the instrument represents the ability being measured. Content validity involves submitting the blueprint and the items of the instrument to experts to be analyzed quantitatively and qualitatively.

  11. An econometric method for estimating population parameters from non-random samples: An application to clinical case finding.

    Science.gov (United States)

    Burger, Rulof P; McLaren, Zoë M

    2017-09-01

    The problem of sample selection complicates the process of drawing inference about populations. Selective sampling arises in many real world situations when agents such as doctors and customs officials search for targets with high values of a characteristic. We propose a new method for estimating population characteristics from these types of selected samples. We develop a model that captures key features of the agent's sampling decision. We use a generalized method of moments with instrumental variables and maximum likelihood to estimate the population prevalence of the characteristic of interest and the agents' accuracy in identifying targets. We apply this method to tuberculosis (TB), which is the leading infectious cause of death worldwide. We use a national database of TB test data from South Africa to examine testing for multidrug resistant TB (MDR-TB). Approximately one quarter of MDR-TB cases went undiagnosed between 2004 and 2010. The official estimate of 2.5% is therefore too low, and MDR-TB prevalence is as high as 3.5%. Signal-to-noise ratios are estimated to be between 0.5 and 1. Our approach is widely applicable because of the availability of routinely collected data and the abundance of potential instruments. Using routinely collected data to monitor population prevalence can guide evidence-based policy making. Copyright © 2017 John Wiley & Sons, Ltd.

  12. Treatment of thoracolumbar burst fractures with variable screw placement or Isola instrumentation and arthrodesis: case series and literature review.

    Science.gov (United States)

    Alvine, Gregory F; Swain, James M; Asher, Marc A; Burton, Douglas C

    2004-08-01

    The controversy of burst fracture surgical management is addressed in this retrospective case study and literature review. The series consisted of 40 consecutive patients, index included, with 41 fractures treated with stiff, limited segment transpedicular bone-anchored instrumentation and arthrodesis from 1987 through 1994. No major acute complications such as death, paralysis, or infection occurred. For the 30 fractures with pre- and postoperative computed tomography studies, spinal canal compromise was 61% and 32%, respectively. Neurologic function improved in 7 of 14 patients (50%) and did not worsen in any. The principal problem encountered was screw breakage, which occurred in 16 of the 41 (39%) instrumented fractures. As we have previously reported, transpedicular anterior bone graft augmentation significantly decreased variable screw placement (VSP) implant breakage. However, it did not prevent Isola implant breakage in two-motion segment constructs. Compared with VSP, Isola provided better sagittal plane realignment and constructs that have been found to be significantly stiffer. Unplanned reoperation was necessary in 9 of the 40 patients (23%). At 1- and 2-year follow-up, 95% and 79% of patients were available for study, and a satisfactory outcome was achieved in 84% and 79%, respectively. These satisfaction and reoperation rates are consistent with the literature of the time. Based on these observations and the loads to which implant constructs are exposed following posterior realignment and stabilization of burst fractures, we recommend that three- or four-motion segment constructs, rather than two motion, be used. To save valuable motion segments, planned construct shortening can be used. An alternative is sequential or staged anterior corpectomy and structural grafting.

  13. Design and validation of a standards-based science teacher efficacy instrument

    Science.gov (United States)

    Kerr, Patricia Reda

    National standards for K-12 science education address all aspects of science education, with their main emphasis on curriculum, both science subject matter and the process involved in doing science. Standards for science teacher education programs have been developing along a parallel plane, as has self-efficacy research involving classroom teachers. Generally, studies about efficacy have been dichotomous, basing their theoretical underpinnings either on Rotter's Locus of Control theory or on Bandura's explanations of efficacy beliefs and outcome expectancy. This study brings all three threads together (K-12 science standards, teacher education standards, and efficacy beliefs) in an instrument designed to measure science teacher efficacy, with items based on identified critical attributes of standards-based science teaching and learning. Based on Bandura's explanation of efficacy being task-specific and having outcome expectancy, a developmental, systematic progression from standards-based strategies and activities to tasks to critical attributes was used to craft items for a standards-based science teacher efficacy instrument. Demographic questions related to school characteristics, teacher characteristics, preservice background, science teaching experience, and post-certification professional development were included in the instrument. The instrument was completed by 102 middle level science teachers, with complete data for 87 teachers. A principal components analysis of the science teachers' responses resulted in two components: Standards-Based Science Teacher Efficacy: Beliefs About Teaching (BAT, reliability = .92) and Standards-Based Science Teacher Efficacy: Beliefs About Student Achievement (BASA, reliability = .82). Variables characteristic of professional development activities, science content preparation, and school environment were identified as members of the sets of variables predicting the BAT and BASA components.

  14. UV Reconstruction Algorithm And Diurnal Cycle Variability

    Science.gov (United States)

    Curylo, Aleksander; Litynska, Zenobia; Krzyscin, Janusz; Bogdanska, Barbara

    2009-03-01

    UV reconstruction is a method of estimating surface UV using available actinometric and aerological measurements. UV reconstruction is necessary for the study of long-term UV change, since a typical series of UV measurements is not longer than 15 years, which is too short for trend estimation. The essential problem in the reconstruction algorithm is a good parameterization of clouds. In our previous algorithm we used an empirical relation between the Cloud Modification Factor (CMF) in global radiation and the CMF in UV. The CMF is defined as the ratio between measured and modelled irradiances. Clear-sky irradiance was calculated with a solar radiative transfer model. In the proposed algorithm, the time variability of global radiation during the diurnal cycle is used as an additional source of information. To develop the improved reconstruction algorithm, relevant data from Legionowo (52.4° N, 21.0° E, 96 m a.s.l.), Poland, were collected with the following instruments: a NILU-UV multi-channel radiometer, a Kipp&Zonen pyranometer, and radiosondes providing profiles of ozone, humidity and temperature. The proposed algorithm has been used for reconstruction of UV at four Polish sites (Mikolajki, Kolobrzeg, Warszawa-Bielany and Zakopane) since the early 1960s. Krzyscin's reconstruction of total ozone has been used in the calculations.

  15. Risk estimates for hip fracture from clinical and densitometric variables and impact of database selection in Lebanese subjects.

    Science.gov (United States)

    Badra, Mohammad; Mehio-Sibai, Abla; Zeki Al-Hazzouri, Adina; Abou Naja, Hala; Baliki, Ghassan; Salamoun, Mariana; Afeiche, Nadim; Baddoura, Omar; Bulos, Suhayl; Haidar, Rachid; Lakkis, Suhayl; Musharrafieh, Ramzi; Nsouli, Afif; Taha, Assaad; Tayim, Ahmad; El-Hajj Fuleihan, Ghada

    2009-01-01

    Bone mineral density (BMD) and fracture incidence vary greatly worldwide. The data, if any, on clinical and densitometric characteristics of patients with hip fractures from the Middle East are scarce. The objective of the study was to define risk estimates from clinical and densitometric variables and the impact of database selection on such estimates. Clinical and densitometric information were obtained in 60 hip fracture patients and 90 controls. Hip fracture subjects were 74 yr (9.4) old, were significantly taller, lighter, and more likely to be taking anxiolytics and sleeping pills than controls. National Health and Nutrition Examination Survey (NHANES) database selection resulted in a higher sensitivity and almost equal specificity in identifying patients with a hip fracture compared with the Lebanese database. The odds ratio (OR) and its confidence interval (CI) for hip fracture per standard deviation (SD) decrease in total hip BMD was 2.1 (1.45-3.05) with the NHANES database, and 2.11 (1.36-2.37) when adjusted for age and body mass index (BMI). Risk estimates were higher in male compared with female subjects. In Lebanese subjects, BMD- and BMI-derived hip fracture risk estimates are comparable to western standards. The study validates the universal use of the NHANES database, and the applicability of BMD- and BMI-derived risk fracture estimates in the World Health Organization (WHO) global fracture risk model, to the Lebanese.

  16. NASA SMD Airborne Science Capabilities for Development and Testing of New Instruments

    Science.gov (United States)

    Fladeland, Matthew

    2015-01-01

    The SMD NASA Airborne Science Program operates and maintains a fleet of highly modified aircraft to support instrument development, satellite instrument calibration, data product validation and earth science process studies. This poster will provide an overview of aircraft available to NASA researchers including performance specifications and modifications for instrument support, processes for requesting aircraft time and developing cost estimates for proposals, and policies and procedures required to ensure safety of flight.

  17. Real-time instrument-failure detection in the LOFT pressurizer using functional redundancy

    International Nuclear Information System (INIS)

    Tylee, J.L.

    1982-07-01

    The functional redundancy approach to detecting instrument failures in a pressurized water reactor (PWR) pressurizer is described and evaluated. This real-time method uses a bank of Kalman filters (one for each instrument) to generate optimal estimates of the pressurizer state. By performing consistency checks between the output of each filter, failed instruments can be identified. Simulation results and actual pressurizer data are used to demonstrate the capabilities of the technique
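
    The bank-of-filters idea lends itself to a compact illustration: one scalar Kalman filter per instrument, with consistency checked by comparing each filter's estimate against the bank median. The noise levels, injected bias and 0.1 threshold below are invented for the sketch and are not the LOFT implementation.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical setup: three redundant sensors track one slowly varying level.
n, q, r = 200, 1e-4, 0.05
truth = 10.0 + np.cumsum(rng.normal(0.0, np.sqrt(q), n))
sensors = truth + rng.normal(0.0, np.sqrt(r), (3, n))
sensors[2, 120:] += 0.8                    # inject a bias failure in sensor 3

def kalman_estimate(z, q, r):
    """Scalar random-walk Kalman filter; returns the state estimate series."""
    x, p, xs = z[0], 1.0, [z[0]]
    for zk in z[1:]:
        p += q                             # predict
        k = p / (p + r)                    # Kalman gain
        x += k * (zk - x)                  # update with the measurement
        p *= 1 - k
        xs.append(x)
    return np.array(xs)

# One filter per instrument; cross-check each estimate against the median.
est = np.array([kalman_estimate(z, q, r) for z in sensors])
dev = np.abs(est - np.median(est, axis=0)).mean(axis=1)
for i, d in enumerate(dev):
    print(f"sensor {i + 1}: mean deviation {d:.3f}", "FAIL" if d > 0.1 else "ok")
```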

  18. Two-Sample Two-Stage Least Squares (TSTSLS) estimates of earnings mobility: how consistent are they?

    Directory of Open Access Journals (Sweden)

    John Jerrim

    2016-08-01

    Academics and policymakers have shown great interest in cross-national comparisons of intergenerational earnings mobility. However, producing consistent and comparable estimates of earnings mobility is not a trivial task. In most countries researchers are unable to observe earnings information for two generations. They are thus forced to rely upon imputed data from different surveys instead. This paper builds upon previous work by considering the consistency of the intergenerational correlation (ρ) as well as the elasticity (β), how this changes when using a range of different instrumental (imputer) variables, and by highlighting an important but infrequently discussed measurement issue. Our key finding is that, while TSTSLS estimates of β and ρ are both likely to be inconsistent, the magnitude of this problem is much greater for the former than for the latter. We conclude by offering advice on estimating earnings mobility using this methodology.

  19. Estimation of genetic variability and heritability of wheat agronomic traits resulted from some gamma rays irradiation techniques

    International Nuclear Information System (INIS)

    Wijaya Murti Indriatama; Trikoesoemaningtyas; Syarifah Iis Aisyah; Soeranto Human

    2016-01-01

    Gamma irradiation techniques have a significant effect on the frequency and spectrum of macro-mutations, but studies of their effect on micro-mutations, which are related to genetic variability in mutated populations, are very limited. The aim of this research was to study the effect of gamma irradiation techniques on the genetic variability and heritability of wheat agronomic characters in the M2 generation. The research was conducted from July to November 2014 at Cibadak experimental station, Indonesian Center for Agricultural Biotechnology and Genetic Resources Research and Development, Ministry of Agriculture. Three introduced wheat breeding lines (F-44, Kiran-95 & WL-711) were treated with 3 gamma irradiation techniques (acute, fractionated and intermittent). The M1 generation of the combined treatments was planted, and spikes were harvested individually per plant. For the M2 generation, seeds of 75 M1 spikes were planted in the field using the one-row-one-spike method and evaluated for agronomic characters and their genetic components. The gamma irradiation techniques decreased the means but increased the ranges of agronomic traits in the M2 populations. Fractionated irradiation induced a higher mean and a wider range in spike length and number of spikelets per spike than the other irradiation techniques. Fractionated and intermittent irradiation resulted in greater variability of grain weight per plant than acute irradiation. The number of tillers, spike weight, grain weight per spike and grain weight per plant in the M2 populations resulting from the three gamma irradiation techniques had high estimated heritability and broad-sense genetic variability coefficient values. The three gamma irradiation techniques increased the genetic variability of agronomic traits in the M2 populations, except for plant height. (author)

  20. Absorption coefficient instrument for turbid natural waters

    Science.gov (United States)

    Friedman, E.; Cherdak, A.; Poole, L.; Houghton, W.

    1980-01-01

    The paper presents an instrument that directly measures the multispectral absorption coefficient of turbid natural water. Attention is given to the design, which is shown to incorporate methods for compensating for variation in the internal light source intensity, correcting for the spectrally dependent nature of the optical elements, and correcting for variation in the background light level. In addition, when used in conjunction with a spectrally matched total attenuation instrument, the spectrally dependent scattering coefficient can also be derived. Finally, it is reported that systematic errors associated with multiple scattering have been estimated using Monte Carlo techniques.

  1. Instrumenting an upland research catchment in Canterbury, New Zealand to study controls on variability of soil moisture, shallow groundwater and streamflow

    Science.gov (United States)

    McMillan, Hilary; Srinivasan, Ms

    2015-04-01

    Hydrologists recognise the importance of vertical drainage and deep flow paths in runoff generation, even in headwater catchments. Both soil and groundwater stores are highly variable over multiple scales, and the distribution of water has a strong control on flow rates and timing. In this study, we instrumented an upland headwater catchment in New Zealand to measure the temporal and spatial variation in unsaturated and saturated-zone responses. In NZ, upland catchments are the source of much of the water used in lowland agriculture, but the hydrology of such catchments and their role in water partitioning, storage and transport is poorly understood. The study area is the Langs Gully catchment in the North Branch of the Waipara River, Canterbury: this catchment was chosen to be representative of the foothills environment, with lightly managed dryland pasture and native Matagouri shrub vegetation cover. Over a period of 16 months we measured continuous soil moisture at 32 locations and the near-surface water table at paired sites (near-stream versus hillslope locations, and convergent versus divergent hillslopes). We found that temporal variability is strongly controlled by the climatic seasonal cycle, for both soil moisture and water table, and for both the mean and extremes of their distributions. Groundwater is a larger water storage component than soil moisture, and the difference increases with catchment wetness. The spatial standard deviation of both soil moisture and groundwater is larger in winter than in summer. It peaks during rainfall events due to partial saturation of the catchment, and also rises in spring as different locations dry out at different rates. The most important controls on spatial variability are aspect and distance from stream. South-facing and near-stream locations have higher water tables and more, larger soil moisture wetting events. Typical hydrological models do not explicitly account for aspect, but our results suggest that it is an important factor in hillslope

  2. Estimation of expected number of accidents and workforce unavailability through Bayesian population variability analysis and Markov-based model

    International Nuclear Information System (INIS)

    Chagas Moura, Márcio das; Azevedo, Rafael Valença; Droguett, Enrique López; Chaves, Leandro Rego; Lins, Isis Didier

    2016-01-01

    Occupational accidents pose several negative consequences for employees, employers, the environment and people surrounding the locale where the accident takes place. Some types of accidents correspond to low frequency-high consequence (long sick leave) events, and classical statistical approaches are ineffective in these cases because the available dataset is generally sparse and contains censored records. In this context, we propose a Bayesian population variability method for estimating the distributions of the rates of accident and recovery. Given these distributions, a Markov-based model is used to estimate the uncertainty over the expected number of accidents and the work time loss. Thus, the use of Bayesian analysis along with the Markov approach aims at investigating future trends regarding occupational accidents in a workplace as well as enabling better management of the labor force and prevention efforts. An application example is presented in order to validate the proposed approach; this case uses data gathered from a hydropower company in Brazil. - Highlights: • This paper proposes a Bayesian method to estimate rates of accident and recovery. • The model requires simple data likely to be available in the company database. • The results show the proposed model is not too sensitive to the prior estimates.
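
    The following sketch illustrates the general idea of propagating uncertainty in accident and recovery rates through a two-state (working / on leave) Markov model. The gamma "posteriors" and all numbers are invented placeholders, not the paper's fitted distributions.

    ```python
    # Propagate rate uncertainty through a two-state Markov chain (sketch only).
    import numpy as np

    rng = np.random.default_rng(1)
    days, n_draws = 365, 2000

    expected_accidents = []
    for _ in range(n_draws):
        lam = rng.gamma(shape=2.0, scale=0.005)   # daily accident probability (assumed posterior)
        mu = rng.gamma(shape=5.0, scale=0.02)     # daily recovery probability (assumed posterior)
        p_work, accidents = 1.0, 0.0              # start in the working state
        for _ in range(days):
            accidents += p_work * lam             # expected accidents accrued today
            p_work = p_work * (1 - lam) + (1 - p_work) * mu
        expected_accidents.append(accidents)

    expected_accidents = np.array(expected_accidents)
    # Mean and a 90% uncertainty band over the expected yearly number of accidents
    print(expected_accidents.mean(), np.percentile(expected_accidents, [5, 95]))
    ```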

  3. Aversive pavlovian responses affect human instrumental motor performance.

    Science.gov (United States)

    Rigoli, Francesco; Pavone, Enea Francesco; Pezzulo, Giovanni

    2012-01-01

    In neuroscience and psychology, an influential perspective distinguishes between two kinds of behavioral control: instrumental (habitual and goal-directed) and Pavlovian. Understanding the instrumental-Pavlovian interaction is fundamental for the comprehension of decision-making. Animal studies (such as those using the negative auto-maintenance paradigm) have demonstrated that Pavlovian mechanisms can have maladaptive effects on instrumental performance. However, evidence for a similar effect in humans is scarce. In addition, the mechanisms modulating the impact of Pavlovian responses on instrumental performance are largely unknown, both in human and non-human animals. The present paper describes a behavioral experiment investigating the effects of Pavlovian conditioned responses on performance in humans, focusing on the aversive domain. Results showed that Pavlovian responses influenced human performance and, similar to animal studies, could have maladaptive effects. In particular, Pavlovian responses either impaired or increased performance depending on modulator variables such as threat distance, task controllability, punishment history, amount of training, and explicit punishment expectancy. Overall, these findings help elucidate the computational mechanisms underlying the instrumental-Pavlovian interaction, which might be at the base of apparently irrational phenomena in economics, social behavior, and psychopathology.

  4. Aversive Pavlovian responses affect human instrumental motor performance

    Directory of Open Access Journals (Sweden)

    Francesco eRigoli

    2012-10-01

    In neuroscience and psychology, an influential perspective distinguishes between two kinds of behavioural control: instrumental (habitual and goal-directed) and Pavlovian. Understanding the instrumental-Pavlovian interaction is fundamental for the comprehension of decision-making. Animal studies (such as those using the negative auto-maintenance paradigm) have demonstrated that Pavlovian mechanisms can have maladaptive effects on instrumental performance. However, evidence for a similar effect in humans is scarce. In addition, the mechanisms modulating the impact of Pavlovian responses on instrumental performance are largely unknown, both in human and non-human animals. The present paper describes a behavioural experiment investigating the effects of Pavlovian conditioned responses on performance in humans, focusing on the aversive domain. Results showed that Pavlovian responses influenced human performance and, similar to animal studies, could have maladaptive effects. In particular, Pavlovian responses either impaired or increased performance depending on modulator variables such as threat distance, task controllability, punishment history, amount of training, and explicit punishment expectancy. Overall, these findings help elucidate the computational mechanisms underlying the instrumental-Pavlovian interaction, which might be at the base of apparently irrational phenomena in economics, social behaviour, and psychopathology.

  5. Investigation on Motorcyclist Riding Behaviour at Curve Entry Using Instrumented Motorcycle

    Science.gov (United States)

    Yuen, Choon Wah; Karim, Mohamed Rehan; Saifizul, Ahmad

    2014-01-01

    This paper details a study of the changes in riding behaviour, such as changes in speed as well as the brake force and throttle force applied, when motorcyclists ride over a curve section of road, using an instrumented motorcycle. In this study, an instrumented motorcycle equipped with various types of sensors, on-board cameras, and data loggers was developed in order to collect riding data at the study site. Results from the statistical analysis showed that riding characteristics, such as changes in speed, brake force, and throttle force applied, are influenced by the distance from the curve entry, riding experience, and travel mileage of the riders. Structural equation modeling was used to study the impact of these variables on the change of riding behaviour in the curve entry section. Four regression equations were formed to study the relationship between the four dependent variables, which are speed, throttle force, front brake force, and rear brake force applied, and the independent variables. PMID:24523660

  6. An Assessment of the Need for Standard Variable Names for Airborne Field Campaigns

    Science.gov (United States)

    Beach, A. L., III; Chen, G.; Northup, E. A.; Kusterer, J.; Quam, B. M.

    2017-12-01

    The NASA Earth Venture Program has led to a dramatic increase in airborne observations, requiring updated data management practices with clearly defined data standards and protocols for metadata. An airborne field campaign can involve multiple aircraft and a variety of instruments. It is quite common to have different instruments/techniques measure the same parameter on one or more aircraft platforms. This creates a need to allow instrument Principal Investigators (PIs) to name their variables in a way that would distinguish them across various data sets. A lack of standardization of variable names presents a challenge for data search tools in enabling discovery of similar data across airborne studies, aircraft platforms, and instruments. This was also identified by data users as one of the top issues in data use. One effective approach for mitigating this problem is to enforce variable name standardization, which can effectively map the unique PI variable names to fixed standard names. In order to ensure consistency amongst the standard names, it will be necessary to choose them from a controlled list. However, no such list currently exists despite a number of previous efforts to establish a sufficient list of atmospheric variable names. The Atmospheric Composition Variable Standard Name Working Group was established under the auspices of NASA's Earth Science Data Systems Working Group (ESDSWG) to solicit research community feedback and create a list of standard names that are acceptable to data providers and data users. This presentation will discuss the challenges and recommendations of standard variable names in an effort to demonstrate how airborne metadata curation/management can be improved to streamline data ingest and improve interoperability and discoverability for a broader user community.

  7. Introducing instrumental variables in the LS-SVM based identification framework

    NARCIS (Netherlands)

    Laurain, V.; Zheng, W-X.; Toth, R.

    2011-01-01

    Least-Squares Support Vector Machines (LS-SVM) represent a promising approach to identify nonlinear systems via nonparametric estimation of the nonlinearities in a computationally and stochastically attractive way. All the methods dedicated to the solution of this problem rely on the minimization of

  8. Estimating evaporative fraction from readily obtainable variables in mangrove forests of the Everglades, U.S.A.

    Science.gov (United States)

    Yagci, Ali Levent; Santanello, Joseph A.; Jones, John W.; Barr, Jordan G.

    2017-01-01

    A remote-sensing-based model to estimate evaporative fraction (EF) – the ratio of latent heat (LE; the energy equivalent of evapotranspiration, ET) to total available energy – from easily obtainable remotely sensed and meteorological parameters is presented. This research specifically addresses the shortcomings of existing ET retrieval methods, such as the calibration requirement of extensive, accurate in situ micrometeorological and flux tower observations, or of a large set of coarse-resolution or model-derived input datasets. The trapezoid model is capable of generating spatially varying EF maps from standard products such as land surface temperature (Ts), normalized difference vegetation index (NDVI) and daily maximum air temperature (Ta). The 2009 model results were validated at an eddy-covariance tower (Fluxnet ID: US-Skr) in the Everglades using Ts and NDVI products from Landsat as well as the Moderate Resolution Imaging Spectroradiometer (MODIS) sensors. Results indicate that the model accuracy is within the range of instrument uncertainty, and is dependent on the spatial resolution and the selection of end-members (i.e. wet/dry edges). The most accurate results were achieved with Ts from Landsat relative to Ts from the MODIS instruments flown on the Terra and Aqua platforms, due to the fine spatial resolution of Landsat (30 m). The bias, mean absolute percentage error and root mean square percentage error were as low as 2.9% (3.0%), 9.8% (13.3%), and 12.1% (16.1%) for Landsat-based (MODIS-based) EF estimates, respectively. Overall, this methodology shows promise for bridging the gap between temporally limited ET estimates at Landsat scales and more complex and difficult-to-constrain global ET remote-sensing models.
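
    A minimal sketch of the trapezoid interpolation idea is given below; the linear dry-edge and wet-edge definitions (slope, offset, and the use of Ta as the wet edge) are simplified assumptions for illustration, not the calibrated end-members used in the study.

    ```python
    # Evaporative fraction from a pixel's position between assumed dry and wet edges.
    import numpy as np

    def evaporative_fraction(ts, ndvi, ta_max):
        """EF in [0, 1] from land surface temperature, NDVI, and max air temperature."""
        ts_dry = ta_max + 12.0 * (1.0 - ndvi)        # assumed linear dry edge (K)
        ts_wet = ta_max                              # wet edge ~ daily max air temp (K)
        ef = (ts_dry - ts) / np.maximum(ts_dry - ts_wet, 1e-6)  # guard full-cover pixels
        return float(np.clip(ef, 0.0, 1.0))

    # Latent heat then follows as LE = EF * (Rn - G), the available energy.
    print(evaporative_fraction(ts=302.0, ndvi=0.6, ta_max=300.0))
    ```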

  9. On the growth estimates of entire functions of double complex variables

    Directory of Open Access Journals (Sweden)

    Sanjib Datta

    2017-08-01

    Recently, Datta et al. (2016) introduced the idea of the relative type and relative weak type of entire functions of two complex variables with respect to another entire function of two complex variables, and proved some related growth properties. In this paper, we further study some growth properties of entire functions of two complex variables on the basis of their relative types and relative weak types as introduced by Datta et al. (2016).

  10. Sharp or broad pulse peak for high resolution instruments? Choice of moderator performance

    International Nuclear Information System (INIS)

    Arai, M.; Watanabe, N.; Teshigawara, M.

    2001-01-01

    We demonstrate a concept for choosing moderator performance so as to realize the performance required by instruments. A neutron burst pulse can be characterized by its peak intensity, peak width and tail. These can be controlled through moderator design, i.e. material, temperature, shape, decoupling, poisoning and the use of a premoderator. Hence there is a large number of variable parameters to be determined. Here we discuss the required moderator performance for some typical examples, i.e. a high-resolution powder instrument, a chopper instrument, and a high-resolution backscattering machine. (author)

  11. Estimation of Hedonic Single-Family House Price Function Considering Neighborhood Effect Variables

    Directory of Open Access Journals (Sweden)

    Chihiro Shimizu

    2014-05-01

    In the formulation of hedonic models, in addition to the locational factors and building structures that affect house prices, omitted variable bias is thought to arise when local environmental variables and the individual characteristics of house buyers are not taken into consideration. However, since it is difficult to obtain local environmental information for small neighborhood units and to observe the individual characteristics of house buyers, these variables have not been sufficiently considered in previous studies. We demonstrate that non-negligible levels of omitted variable bias are generated if these variables are not considered.

  12. Correcting the Standard Errors of 2-Stage Residual Inclusion Estimators for Mendelian Randomization Studies.

    Science.gov (United States)

    Palmer, Tom M; Holmes, Michael V; Keating, Brendan J; Sheehan, Nuala A

    2017-11-01

    Mendelian randomization studies use genotypes as instrumental variables to test for and estimate the causal effects of modifiable risk factors on outcomes. Two-stage residual inclusion (TSRI) estimators have been used when researchers are willing to make parametric assumptions. However, researchers are currently reporting uncorrected or heteroscedasticity-robust standard errors for these estimates. We compared several different forms of the standard error for linear and logistic TSRI estimates in simulations and in real-data examples. Among others, we consider standard errors modified from the approach of Newey (1987), Terza (2016), and bootstrapping. In our simulations Newey, Terza, bootstrap, and corrected 2-stage least squares (in the linear case) standard errors gave the best results in terms of coverage and type I error. In the real-data examples, the Newey standard errors were 0.5% and 2% larger than the unadjusted standard errors for the linear and logistic TSRI estimators, respectively. We show that TSRI estimators with modified standard errors have correct type I error under the null. Researchers should report TSRI estimates with modified standard errors instead of reporting unadjusted or heteroscedasticity-robust standard errors. © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health.
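
    For intuition, the sketch below computes a linear TSRI estimate and a nonparametric bootstrap standard error, one of the corrected options the paper compares. The data-generating process, instrument, and all names are simulated assumptions, not the paper's examples.

    ```python
    # Two-stage residual inclusion (TSRI) with a bootstrap SE, on simulated data.
    import numpy as np

    rng = np.random.default_rng(2)
    n = 2000
    g = rng.binomial(2, 0.3, size=n)            # genetic instrument (0/1/2 alleles)
    u = rng.normal(size=n)                      # unobserved confounder
    x = 0.5 * g + u + rng.normal(size=n)        # exposure
    y = 0.3 * x + u + rng.normal(size=n)        # outcome; true causal effect = 0.3

    def tsri(idx):
        gg, xx, yy = g[idx], x[idx], y[idx]
        # Stage 1: regress exposure on instrument, keep the residuals
        G = np.column_stack([np.ones(len(idx)), gg])
        resid = xx - G @ np.linalg.lstsq(G, xx, rcond=None)[0]
        # Stage 2: regress outcome on exposure plus first-stage residuals
        X2 = np.column_stack([np.ones(len(idx)), xx, resid])
        return np.linalg.lstsq(X2, yy, rcond=None)[0][1]

    est = tsri(np.arange(n))
    boot = np.array([tsri(rng.integers(0, n, n)) for _ in range(500)])
    print(est, boot.std(ddof=1))                # point estimate and bootstrap SE
    ```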

  13. Internal Interdecadal Variability in CMIP5 Control Simulations

    Science.gov (United States)

    Cheung, A. H.; Mann, M. E.; Frankcombe, L. M.; England, M. H.; Steinman, B. A.; Miller, S. K.

    2015-12-01

    Here we make use of control simulations from the CMIP5 models to quantify the amplitude of the interdecadal internal variability component in Atlantic, Pacific, and Northern Hemisphere mean surface temperature. We compare against estimates derived from observations using a semi-empirical approach wherein the forced component as estimated using CMIP5 historical simulations is removed to yield an estimate of the residual, internal variability. While the observational estimates are largely consistent with those derived from the control simulations for both basins and the Northern Hemisphere, they lie in the upper range of the model distributions, suggesting the possibility of differences between the amplitudes of observed and modeled variability. We comment on some possible reasons for the disparity.

  14. Virtual Instrumentation in Biomedical Equipment

    Directory of Open Access Journals (Sweden)

    Tiago Faustino Andrade

    2013-01-01

    Nowadays, the assessment of body composition by estimating the percentage of body fat has a great impact in many fields such as nutrition, health, sports and chronic diseases. The main purpose of this work is the development of a virtual instrument that permits more effective assessment of body fat, automatic data processing, and the recording and storage of results in a database, with high potential for conducting new studies (http://lipotool.com).

  15. A Compact, Low Resource Instrument to Measure Atmospheric Methane and Carbon Dioxide From Orbit

    Science.gov (United States)

    Rafkin, Scot; Davis, Michael; Varner, Ruth; Basu, Sourish; Bruhwiler, Lori; Luspay-Kuti, Adrienn; Mandt, Kathy; Roming, Pete; Soto, Alejandro; Tapley, Mark

    2017-04-01

    Methane is the second most important radiatively active trace gas forcing anthropogenic climate change. Methane has ~28 times more warming potential than carbon dioxide on a 100-year time horizon, and the background atmospheric concentration of methane has increased by more than 150% compared to pre-industrial levels. The increase in methane abundance is driven by a combination of direct human activity, such as fossil fuel extraction and agriculture, and natural feedback processes that respond to human-induced climate change, such as increased wetland production. Accurate accounting of the exchange between the atmosphere and the natural and anthropogenic methane reservoirs is necessary to predict how methane concentration will increase going forward, how that increase will modulate the natural methane cycle, and how effective policy decisions might be at mitigating methane-induced climate change. Monitoring and quantifying methane source intensity and spatial-temporal variability has proven challenging; there are unresolved and scientifically significant discrepancies between flux estimates based on limited surface measurements (the so-called "bottom-up" method) and the values derived from limited, remotely-sensed estimates from orbit and modeling (the so-called "top-down" method). A major source of the discrepancy between bottom-up and top-down estimates is likely a result of insufficient accuracy and resolution of space-based instrumentation. Methane releases, especially anthropogenic sources, are often at kilometer-scale (or less), whereas past remote sensing instruments have at least an order of magnitude greater footprint areas. Natural sources may be larger in areal extent, but the enhancement over background levels can be just a few percent, which demands high spectral resolution and signal-to-noise ratios from monitoring instrumentation. In response to the need for higher performance space-based methane monitoring, we have developed a novel, compact, low

  16. Variability of floods, droughts and windstorms over the past 500 years in Central Europe based on documentary and instrumental data

    Science.gov (United States)

    Brazdil, Rudolf

    2016-04-01

    Hydrological and meteorological extremes (HMEs) in Central Europe during the past 500 years can be reconstructed from instrumental and documentary data. Documentary data about weather and related phenomena represent the basic source of information for historical climatology and hydrology, dealing with the reconstruction of past climate and HMEs, their perception and their impacts on human society. The paper presents the basic distribution of documentary data into (i) direct descriptions of HMEs and their proxies on the one hand and (ii) individual and institutional data sources on the other. Several groups of documentary evidence, such as narrative written records (annals, chronicles, memoirs), visual daily weather records, official and personal correspondence, special prints, financial and economic records (with particular attention to taxation data), newspapers, pictorial documentation, chronograms, epigraphic data, early instrumental observations, early scientific papers and communications, are discussed with respect to extracting information about HMEs, usually concerning their occurrence, severity, seasonality, meteorological causes, perception and human impacts. The paper further presents an analysis of the 500-year variability of floods, droughts and windstorms based on series created by combining documentary and instrumental data. Results, advantages and drawbacks of this approach are documented with examples from the Czech Lands. The analysis of floods concentrates on the River Vltava (Prague) and the River Elbe (Děčín), which show the highest frequency of floods in the 19th century (mainly of the winter synoptic type) and in the second half of the 16th century (summer synoptic type). Also reported are the most disastrous floods (August 1501, March and August 1598, February 1655, June 1675, February 1784, March 1845, February 1862, September 1890, August 2002) and the European context of floods in the severe winter 1783/84. Drought

  17. Transient response of level instruments in a research reactor

    International Nuclear Information System (INIS)

    Cheng, Lap Y.

    1989-01-01

    A numerical model has been developed to simulate the dynamics of water level instruments in a research nuclear reactor. A bubbler device, with helium gas as the working fluid, is used to monitor liquid level by sensing the static head pressure due to the height of liquid in the reactor vessel. A finite-difference model is constructed to study the transient response of the water level instruments to pressure perturbations. The field equations which describe the hydraulics of the helium gas in the bubbler device are arranged in the form of a tridiagonal matrix, and the field variables are solved at each time step by the Thomas algorithm. Simulation results indicate that the dynamic response of the helium gas depends mainly on the volume and the inertia of the gas in the level instrument tubing. Anomalies in the simulated level indication are attributed to the inherent lag in the level instrument due to the hydraulics of the system. 1 ref., 5 figs
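
    As context for the solver the abstract names, a generic Thomas-algorithm implementation for a tridiagonal system looks like the following; the example system is a small diffusion-like matrix for illustration, not the paper's field equations.

    ```python
    # Thomas algorithm: O(n) solver for tridiagonal linear systems.
    import numpy as np

    def thomas(a, b, c, d):
        """Solve a tridiagonal system: a = sub-, b = main, c = super-diagonal, d = rhs."""
        n = len(d)
        cp, dp = np.empty(n), np.empty(n)
        cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
        for i in range(1, n):                     # forward elimination
            m = b[i] - a[i] * cp[i - 1]
            cp[i] = c[i] / m if i < n - 1 else 0.0
            dp[i] = (d[i] - a[i] * dp[i - 1]) / m
        x = np.empty(n)
        x[-1] = dp[-1]
        for i in range(n - 2, -1, -1):            # back substitution
            x[i] = dp[i] - cp[i] * x[i + 1]
        return x

    # Example: tridiag(-1, 2, -1) with rhs [1, 0, 0, 1] has solution [1, 1, 1, 1]
    a = np.array([0.0, -1.0, -1.0, -1.0])
    b = np.array([2.0, 2.0, 2.0, 2.0])
    c = np.array([-1.0, -1.0, -1.0, 0.0])
    d = np.array([1.0, 0.0, 0.0, 1.0])
    print(thomas(a, b, c, d))
    ```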

  18. Mobile device-based optical instruments for agriculture

    Science.gov (United States)

    Sumriddetchkajorn, Sarun

    2013-05-01

    Realizing that a current smart mobile device such as a cell phone or tablet can be considered a pocket-size computer with a built-in digital camera, this paper reviews and demonstrates how a mobile device can function as a portable optical instrument for agricultural applications. The paper highlights several mobile device-based optical instruments designed for searching for small pests, measuring illumination level, analyzing the spectrum of light, identifying nitrogen status in the rice field, estimating chlorine in water, and determining the ripeness level of fruit. They are suitable for individual use as well as for small and medium enterprises.

  19. Inspector-instrument interface in portable NDA instrumentation

    International Nuclear Information System (INIS)

    Halbig, J.K.; Klosterbuer, S.F.

    1981-01-01

    Recent electronics technology advances make it possible to design sophisticated instruments in small packages for convenient field implementation. An inspector-instrument interface design that allows communication of procedures, responses, and results between the instrument and user is presented. This capability has been incorporated into new spent-fuel instrumentation and a battery-powered multichannel analyzer

  20. Inspector-instrument interface in portable NDA instrumentation

    International Nuclear Information System (INIS)

    Halbig, J.K.; Klosterbuer, S.F.

    1981-01-01

    Recent electronics technology advances make it possible to design sophisticated instruments in small packages for convenient field implementation. This report describes an inspector-instrument interface design which allows communication of procedures, responses, and results between the instrument and user. The interface has been incorporated into new spent-fuel instrumentation and a battery-powered multichannel analyzer

  1. Methane Flux Estimation from Point Sources using GOSAT Target Observation: Detection Limit and Improvements with Next Generation Instruments

    Science.gov (United States)

    Kuze, A.; Suto, H.; Kataoka, F.; Shiomi, K.; Kondo, Y.; Crisp, D.; Butz, A.

    2017-12-01

    Atmospheric methane (CH4) has an important role in global radiative forcing of climate, but its emission estimates have larger uncertainties than those of carbon dioxide (CO2). The area of anthropogenic emission sources is usually much smaller than 100 km2. The Thermal And Near infrared Sensor for carbon Observation Fourier-Transform Spectrometer (TANSO-FTS) onboard the Greenhouse gases Observing SATellite (GOSAT) has measured CO2 and CH4 column density using sunlight reflected from the earth's surface. It has an agile pointing system and its footprint covers 87 km2 with a single detector. By specifying pointing angles and observation times for every orbit, TANSO-FTS can target various CH4 point sources together with reference points every 3 days over years. We selected a reference point that represents the CH4 background density before or after targeting a point source. By combining the satellite-measured enhancement of the CH4 column density with surface-measured wind data or estimates from the Weather Research and Forecasting (WRF) model, we estimated CH4 emission amounts. Here, we picked two sites on the US West Coast, where clear-sky frequency is high and a series of data are available. The natural gas leak at Aliso Canyon showed a large enhancement and its decrease with time since the initial blowout. We present a time series of flux estimates assuming the source is a single point without influx. The observation of the cattle feedlot in Chino, California has a weather station within the TANSO-FTS footprint. The wind speed is monitored continuously and the wind direction is stable at the time of the GOSAT overpass. The large TANSO-FTS footprint and strong wind decrease the enhancement below the noise level. Weak wind shows enhancements in CH4, but the velocity data have large uncertainties. We show the detection limit of single samples and how to reduce uncertainty using a time series of satellite data. We will propose that the next generation instruments for accurate anthropogenic CO2 and CH

  2. A Survey Instrument for Measuring the Experiential Value of Employee-Tourist Encounters

    DEFF Research Database (Denmark)

    Mattsson, Jan; Sørensen, Flemming; Jensen, Jens Friis

    In this paper, we develop and test a survey instrument that aims at estimating the experiential value of employee-tourist encounters in destination-based tourism companies, as well as the characteristics of encounters that affect such experiential value. We suggest that such an instrument can … for their visitors, rather than simply delivering service quality …

  3. Improving risk estimates of runoff producing areas: formulating variable source areas as a bivariate process.

    Science.gov (United States)

    Cheng, Xiaoya; Shaw, Stephen B; Marjerison, Rebecca D; Yearick, Christopher D; DeGloria, Stephen D; Walter, M Todd

    2014-05-01

    Predicting runoff producing areas and their corresponding risks of generating storm runoff is important for developing watershed management strategies to mitigate non-point source pollution. However, few methods for making these predictions have been proposed, especially operational approaches that would be useful in areas where variable source area (VSA) hydrology dominates storm runoff. The objective of this study is to develop a simple approach to estimate spatially-distributed risks of runoff production. By considering the development of overland flow as a bivariate process, we incorporated both rainfall and antecedent soil moisture conditions into a method for predicting VSAs based on the Natural Resource Conservation Service-Curve Number equation. We used base-flow immediately preceding storm events as an index of antecedent soil wetness status. Using nine sub-basins of the Upper Susquehanna River Basin, we demonstrated that our estimated runoff volumes and extent of VSAs agreed with observations. We further demonstrated a method for mapping these areas in a Geographic Information System using a Soil Topographic Index. The proposed methodology provides a new tool for watershed planners for quantifying runoff risks across watersheds, which can be used to target water quality protection strategies. Copyright © 2014 Elsevier Ltd. All rights reserved.
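
    The Curve Number relation underlying this approach can be sketched as follows; the antecedent-wetness adjustment via an effective CN is shown only schematically, not as the paper's exact bivariate (rainfall × base-flow index) formulation.

    ```python
    # NRCS Curve Number event runoff (depths in mm).
    def scs_runoff_mm(p_mm, cn):
        """Event runoff depth Q from rainfall P and curve number CN."""
        s = 25400.0 / cn - 254.0        # potential maximum retention (mm)
        ia = 0.2 * s                    # initial abstraction (common convention)
        if p_mm <= ia:
            return 0.0
        return (p_mm - ia) ** 2 / (p_mm - ia + s)

    # Treating runoff risk as bivariate: wetter antecedent conditions (indexed by
    # base flow in the paper) can be mimicked crudely by raising the effective CN.
    for cn in (70, 80, 90):             # dry -> wet antecedent conditions (assumed)
        print(cn, round(scs_runoff_mm(50.0, cn), 1))
    ```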

  4. Validation of a Job Satisfaction Instrument for Residential-Care Employees.

    Science.gov (United States)

    Sluyter, Gary V.; Mukherjee, Ajit K.

    1986-01-01

    A new job satisfaction instrument for employees of a residential care facility for mentally retarded persons effectively measures the employees' satisfaction with 12 work-related variables: salary, company policies, supervision, working conditions, interpersonal relations, security, advancement, recognition, achievement, work responsibility, and…

  5. Process instrumentation for nuclear power station

    International Nuclear Information System (INIS)

    Yanai, Katsuya; Shinohara, Katsuhiko

    1978-01-01

    Nuclear power stations are large-scale compound systems composed of many process systems. Accordingly, for safe and highly reliable plant operation, it is necessary to grasp the conditions of each process exactly and to control operation correctly. For this purpose, process instrumentation performs the important function of monitoring plant operation. Hitachi Ltd. has long exerted ceaseless effort to establish the basic technology for process instrumentation in nuclear power stations, to develop and improve hardware of high reliability, and to establish a quality control system. Features of process instrumentation in nuclear power stations include the enormous quantity of measurements, the diversity of measured variables, remote measurement and monitoring, and the need to ensure high reliability. The hardware must also withstand earthquakes, loss-of-coolant accidents, radiation, leaks and fires. The Hitachi Unitrol Sigma Series is a measurement system suitable for general process instrumentation in nuclear power stations and fully satisfies the basic requirements described above. It offers various features for nuclear applications, such as high reliability through the use of ICs, calculation and transmission methods that take signal linkage into account, a loop-controller architecture and small size. The HIACS-1000 Series is a high-reliability analog controller for water control. (Kako, I.)

  6. A Monte Carlo study comparing PIV, ULS and DWLS in the estimation of dichotomous confirmatory factor analysis.

    Science.gov (United States)

    Nestler, Steffen

    2013-02-01

    We conducted a Monte Carlo study to investigate the performance of the polychoric instrumental variable estimator (PIV) in comparison to unweighted least squares (ULS) and diagonally weighted least squares (DWLS) in the estimation of a confirmatory factor analysis model with dichotomous indicators. The simulation involved 144 conditions (1,000 replications per condition) that were defined by a combination of (a) two types of latent factor models, (b) four sample sizes (100, 250, 500, 1,000), (c) three factor loadings (low, moderate, strong), (d) three levels of non-normality (normal, moderately, and extremely non-normal), and (e) whether the factor model was correctly specified or misspecified. The results showed that when the model was correctly specified, PIV produced estimates that were as accurate as ULS and DWLS. Furthermore, the simulation showed that PIV was more robust to structural misspecifications than ULS and DWLS. © 2012 The British Psychological Society.

  7. MINIMUM VARIANCE BETA ESTIMATION WITH DYNAMIC CONSTRAINTS,

    Science.gov (United States)

    developed (at AFETR) and is being used to isolate the primary error sources in the beta estimation task. This computer program is additionally used to ... determine what success in beta estimation can be achieved with foreseeable instrumentation accuracies. Results are included that illustrate the effects on

  8. Psychological variables involved in teacher’s job performance

    OpenAIRE

    Torres Valladares, Manuel; Lajo Lazo, Rosario

    2014-01-01

    The purpose of this study is to analyze the causal relations that may exist between some psychological variables (Type A personality, stress coping and burnout syndrome) and the job performance of university teachers from five faculties of medicine of Lima Metropolitana. The instruments used were: Blumenthal's self-report inventory of Type A behaviour, COPE, Maslach's Burnout Inventory, and a teacher job performance scale developed by Manuel Fernández Arata. All these instruments were subj...

  9. Ordered random variables theory and applications

    CERN Document Server

    Shahbaz, Muhammad Qaiser; Hanif Shahbaz, Saman; Al-Zahrani, Bander M

    2016-01-01

    Ordered random variables have attracted the attention of several authors. The basic building block of ordered random variables is order statistics, which have several applications in extreme value theory and ordered estimation. The general model for ordered random variables, known as generalized order statistics, was introduced relatively recently by Kamps (1995).

  10. Spatiotemporal estimation of historical PM2.5 concentrations using PM10, meteorological variables, and spatial effect

    Science.gov (United States)

    Li, Lianfa; Wu, Anna H.; Cheng, Iona; Chen, Jiu-Chiuan; Wu, Jun

    2017-10-01

    Monitoring of fine particulate matter with diameter < 2.5 μm (PM2.5) is important for studying its association with health outcomes such as cancer. In this study, we aimed to design a flexible approach to reliably estimate historical PM2.5 concentrations by incorporating spatial effects and the measurements of existing co-pollutants, such as particulate matter with diameter < 10 μm (PM10), in an additive non-linear model. The spatiotemporal model was evaluated using leave-one-site-month-out cross validation. Our final daily model had an R2 of 0.81, with PM10, meteorological variables, and spatial autocorrelation explaining 55%, 10%, and 10% of the variance in PM2.5 concentrations, respectively. The model had a cross-validation R2 of 0.83 for monthly PM2.5 concentrations (N = 8170) and 0.79 for daily PM2.5 concentrations (N = 51,421), with few extreme values in prediction. Further, the incorporation of spatial effects reduced bias in predictions. Our approach achieved a cross-validation R2 of 0.61 for the daily model when PM10 was replaced by total suspended particulate. Our model can robustly estimate historical PM2.5 concentrations in California when PM2.5 measurements were not available.

  11. Infectious complications in head and neck cancer patients treated with cetuximab: propensity score and instrumental variable analysis.

    Directory of Open Access Journals (Sweden)

    Ching-Chih Lee

    BACKGROUND: To compare infection rates between cetuximab-treated patients with head and neck cancers (HNC) and untreated patients. METHODOLOGY: A national cohort of 1083 HNC patients identified in 2010 from the Taiwan National Health Insurance Research Database was established. After patients were followed for one year, propensity score analysis and instrumental variable analysis were performed to assess the association between cetuximab therapy and infection rates. RESULTS: HNC patients receiving cetuximab (n = 158) were older, had lower SES, and resided more frequently in rural areas than those without cetuximab therapy. Of the patients, 125 presented infections: 32 (20.3%) in the group using cetuximab and 93 (10.1%) in the group not using it. The propensity score analysis revealed a 2.3-fold (adjusted odds ratio [OR] = 2.27; 95% CI, 1.46-3.54; P = 0.001) increased risk of infection in HNC patients treated with cetuximab. However, using instrumental variable analysis, the average treatment effect of cetuximab was not statistically associated with an increased risk of infection (OR, 0.87; 95% CI, 0.61-1.14). CONCLUSIONS: Cetuximab therapy was not statistically associated with the infection rate in HNC patients. However, older HNC patients using cetuximab may incur up to a 33% infection rate during one year. Particular attention should be given to older HNC patients treated with cetuximab.

  12. Using cost-effectiveness estimates from survey data to guide commissioning: an application to home care.

    Science.gov (United States)

    Forder, Julien; Malley, Juliette; Towers, Ann-Marie; Netten, Ann

    2014-08-01

    The aim is to describe and trial a pragmatic method to produce estimates of the incremental cost-effectiveness of care services from survey data. The main challenge is in estimating the counterfactual; that is, what the patient's quality of life would be if they did not receive that level of service. A production function method is presented, which seeks to distinguish the variation in care-related quality of life in the data that is due to service use as opposed to other factors. A problem is that relevant need factors also affect the amount of service used and therefore any missing factors could create endogeneity bias. Instrumental variable estimation can mitigate this problem. This method was applied to a survey of older people using home care as a proof of concept. In the analysis, we were able to estimate a quality-of-life production function using survey data with the expected form and robust estimation diagnostics. The practical advantages with this method are clear, but there are limitations. It is computationally complex, and there is a risk of misspecification and biased results, particularly with IV estimation. One strategy would be to use this method to produce preliminary estimates, with a full trial conducted thereafter, if indicated. Copyright © 2013 John Wiley & Sons, Ltd.

  13. A Review of the LHC Beam Instrumentation

    CERN Document Server

    Fischer, Cl

    1998-01-01

    This is a review of the diagnostics presently considered for LHC running-in and operation. The purpose of each instrument is given and, except for the pick-ups of the Beam Position Monitoring system, which have their positions already reserved, the optimal location and an estimate of the required length along the vacuum chamber are given.

  14. Impact of ground motion characterization on conservatism and variability in seismic risk estimates

    International Nuclear Information System (INIS)

    Sewell, R.T.; Toro, G.R.; McGuire, R.K.

    1996-07-01

    This study evaluates the impact, on estimates of seismic risk and its uncertainty, of alternative methods in treatment and characterization of earthquake ground motions. The objective of this study is to delineate specific procedures and characterizations that may lead to less biased and more precise seismic risk results. This report focuses on sources of conservatism and variability in risk that may be introduced through the analytical processes and ground-motion descriptions which are commonly implemented at the interface of seismic hazard and fragility assessments. In particular, implication of the common practice of using a single, composite spectral shape to characterize motions of different magnitudes is investigated. Also, the impact of parameterization of ground motion on fragility and hazard assessments is shown. Examination of these results demonstrates the following. (1) There exists significant conservatism in the review spectra (usually, spectra characteristic of western U.S. earthquakes) that have been used in conducting past seismic risk assessments and seismic margin assessments for eastern U.S. nuclear power plants. (2) There is a strong dependence of seismic fragility on earthquake magnitude when PGA is used as the ground-motion characterization. When, however, magnitude-dependent spectra are anchored to a common measure of elastic spectral acceleration averaged over the appropriate frequency range, seismic fragility shows no important nor consistent dependence on either magnitude or strong-motion duration. Use of inelastic spectral acceleration (at the proper frequency) as the ground spectrum anchor demonstrates a very similar result. This study concludes that a single, composite-magnitude spectrum can generally be used to characterize ground motion for fragility assessment without introducing significant bias or uncertainty in seismic risk estimates

  16. Histogram Estimators of Bivariate Densities

    National Research Council Canada - National Science Library

    Husemann, Joyce A

    1986-01-01

    One-dimensional fixed-interval histogram estimators of univariate probability density functions are less efficient than the analogous variable-interval estimators which are constructed from intervals...

  17. Biological Sampling Variability Study

    Energy Technology Data Exchange (ETDEWEB)

    Amidan, Brett G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hutchison, Janine R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-11-08

    There are many sources of variability that exist in the sample collection and analysis process. This paper addresses many, but not all, sources of variability. The main focus of this paper was to better understand and estimate variability due to differences between samplers. Variability between days was also studied, as well as random variability within each sampler. Experiments were performed using multiple surface materials (ceramic and stainless steel), multiple contaminant concentrations (10 spores and 100 spores), and with and without the presence of interfering material. All testing was done with sponge sticks using 10-inch by 10-inch coupons. Bacillus atrophaeus was used as the BA surrogate. Spores were deposited using wet deposition. Grime was coated on the coupons which were planned to include the interfering material (Section 3.3). Samples were prepared and analyzed at PNNL using CDC protocol (Section 3.4) and then cultured and counted. Five samplers were trained so that samples were taken using the same protocol. Each sampler randomly sampled eight coupons each day, four coupons with 10 spores deposited and four coupons with 100 spores deposited. Each day consisted of one material being tested. The clean samples (no interfering materials) were run first, followed by the dirty samples (coated with interfering material). There was a significant difference in recovery efficiency between the coupons with 10 spores deposited (mean of 48.9%) and those with 100 spores deposited (mean of 59.8%). There was no general significant difference between the clean and dirty (containing interfering material) coupons or between the two surface materials; however, there was a significant interaction between concentration amount and presence of interfering material. The recovery efficiency was close to the same for coupons with 10 spores deposited, but for the coupons with 100 spores deposited, the recovery efficiency for the dirty samples was significantly larger (65

  18. A Fast and Sensitive New Satellite SO2 Retrieval Algorithm based on Principal Component Analysis: Application to the Ozone Monitoring Instrument

    Science.gov (United States)

    Li, Can; Joiner, Joanna; Krotkov, A.; Bhartia, Pawan K.

    2013-01-01

    We describe a new algorithm to retrieve SO2 from satellite-measured hyperspectral radiances. We employ the principal component analysis technique in regions with no significant SO2 to capture radiance variability caused by both physical processes (e.g., Rayleigh and Raman scattering and ozone absorption) and measurement artifacts. We use the resulting principal components and SO2 Jacobians calculated with a radiative transfer model to directly estimate SO2 vertical column density in one step. Application to the Ozone Monitoring Instrument (OMI) radiance spectra in 310.5-340 nm demonstrates that this approach can greatly reduce biases in the operational OMI product and decrease the noise by a factor of 2, providing greater sensitivity to anthropogenic emissions. The new algorithm is fast, eliminates the need for instrument-specific radiance correction schemes, and can be easily adapted to other sensors. These attributes make it a promising technique for producing long-term, consistent SO2 records for air quality and climate research.
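
    The core one-step fit can be sketched as follows: principal components trained on (assumed) SO2-free spectra absorb the background variability, and the coefficient on the SO2 Jacobian gives the column. Spectra, Jacobian shape, and all scales below are synthetic stand-ins, not OMI data.

    ```python
    # PCA-based trace-gas retrieval idea in one least-squares step (synthetic data).
    import numpy as np

    rng = np.random.default_rng(3)
    n_wl, n_train = 120, 500

    # Smooth random-walk spectra stand in for SO2-free background radiances
    background = rng.normal(size=(n_train, n_wl)).cumsum(axis=1) * 0.01
    # Gaussian bump stands in for the SO2 absorption Jacobian
    jacobian = np.exp(-0.5 * ((np.arange(n_wl) - 30) / 8.0) ** 2)

    # Principal components of the (assumed) SO2-free radiance residuals
    mean = background.mean(axis=0)
    _, _, vt = np.linalg.svd(background - mean, full_matrices=False)
    pcs = vt[:8]                                   # leading PCs capture the background

    # A "measured" spectrum = background + true SO2 signal + noise
    true_vcd = 2.5
    measured = mean + 0.7 * vt[0] + true_vcd * jacobian + rng.normal(scale=0.01, size=n_wl)

    # One-step fit: PCs absorb the background, Jacobian coefficient is the column
    design = np.column_stack([pcs.T, jacobian])
    coef, *_ = np.linalg.lstsq(design, measured - mean, rcond=None)
    print(coef[-1])                                # estimated SO2 column (~2.5)
    ```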

  19. An integrated approach to endoscopic instrument tracking for augmented reality applications in surgical simulation training.

    Science.gov (United States)

    Loukas, Constantinos; Lahanas, Vasileios; Georgiou, Evangelos

    2013-12-01

    Despite the popular use of virtual and physical reality simulators in laparoscopic training, the educational potential of augmented reality (AR) has not received much attention. A major challenge is the robust tracking and three-dimensional (3D) pose estimation of the endoscopic instrument, which are essential for achieving interaction with the virtual world and for realistic rendering when the virtual scene is occluded by the instrument. In this paper we propose a method that addresses these issues, based solely on visual information obtained from the endoscopic camera. Two different tracking algorithms are combined for estimating the 3D pose of the surgical instrument with respect to the camera. The first tracker creates an adaptive model of a colour strip attached to the distal part of the tool (close to the tip). The second algorithm tracks the endoscopic shaft, using a combined Hough-Kalman approach. The 3D pose is estimated with perspective geometry, using appropriate measurements extracted by the two trackers. The method has been validated on several complex image sequences for its tracking efficiency, pose estimation accuracy and applicability in AR-based training. Using a standard endoscopic camera, the absolute average error of the tip position was 2.5 mm for working distances commonly found in laparoscopic training. The average error of the instrument's angle with respect to the camera plane was approximately 2°. The results are also supplemented by video segments of laparoscopic training tasks performed in a physical and an AR environment. The experiments yielded promising results regarding the potential of applying AR technologies for laparoscopic skills training, based on a computer vision framework. The issue of occlusion handling was adequately addressed. The estimated trajectory of the instruments may also be used for surgical gesture interpretation and assessment. Copyright © 2013 John Wiley & Sons, Ltd.
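
    For the perspective-geometry step, a generic pose solve from tracked 2D points looks like the sketch below, using OpenCV's PnP solver as a stand-in; the paper's pipeline combines colour-strip and Hough-Kalman shaft tracking, and all coordinates and camera parameters here are hypothetical.

    ```python
    # Recover a tool's 3D pose from tracked 2D image points via perspective geometry.
    import numpy as np
    import cv2

    # Hypothetical 3D coordinates (mm) of marker corners on the instrument shaft
    object_pts = np.array([[0, 0, 0], [10, 0, 0], [10, 5, 0], [0, 5, 0]], dtype=np.float64)
    # Their tracked pixel locations in the endoscopic image (hypothetical)
    image_pts = np.array([[320, 240], [352, 242], [351, 259], [319, 257]], dtype=np.float64)

    fx = fy = 800.0                     # assumed focal length in pixels
    K = np.array([[fx, 0, 320],
                  [0, fy, 240],
                  [0,  0,   1]], dtype=np.float64)

    # Perspective-n-Point solve: rotation (rvec) and translation (tvec) of the marker
    ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, None)
    print(ok, tvec.ravel())             # tip position follows up to the known marker offset
    ```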

  20. Can Reliability of Multiple Component Measuring Instruments Depend on Response Option Presentation Mode?

    Science.gov (United States)

    Menold, Natalja; Raykov, Tenko

    2016-01-01

    This article examines the possible dependency of composite reliability on presentation format of the elements of a multi-item measuring instrument. Using empirical data and a recent method for interval estimation of group differences in reliability, we demonstrate that the reliability of an instrument need not be the same when polarity of the…

  1. MANU. Instrumentation of Buffer Demo. Preliminary Study

    International Nuclear Information System (INIS)

    Laaksonen, R.

    2010-01-01

    The purpose of this work is to describe feasible measuring and monitoring alternatives which can be used, if needed, in medium to full scale nuclear waste repository deposition hole mock-up tests. The focus of the work was to determine what variables can actually be measured, how to achieve the measurements and what kind of demands comes from the modelling, scientific, and technical points of view. This project includes a review of the previous waste repository mock-up tests carried out in several European countries such as Belgium, Czech Republic, Spain and Sweden. Also information was gathered by interviewing domestic and foreign scientists specialized in the fields of measurement instrumentation and related in-situ and laboratory work. On the basis of this review, recommendations were developed for the necessary actions needed to be done from the instrumentation point of view for future tests. It is possible to measure and monitor the processes going on in a deposition hole in-situ conditions. The data received during a test in real repository conditions enables to follow the processes and to verify the hypothesis made on the behaviour of various components of the repository: buffer, canister, rock and backfill. Because full scale testing is expensive, the objectives and hypothesis must be carefully set and the test itself with its instrumentation must serve very specific objectives. The main purpose of mock-up tests is to verify that the conditions surrounding the canister are according to the design requirements. A whole mock-up test and demonstration process requires a lot of time and effort. The instrumentation part of the work must also start at early stages to ensure that the instrumentation itself will not become bottlenecked nor suffer from low quality solutions. The planning of the instrumentation work could be done in collaboration with foreign scientists which have participated to previous instrumentation projects. (orig.)

  2. A Tool for Estimating Variability in Wood Preservative Treatment Retention

    Science.gov (United States)

    Patricia K. Lebow; Adam M. Taylor; Timothy M. Young

    2015-01-01

    Composite sampling is standard practice for evaluation of preservative retention levels in preservative-treated wood. Current protocols provide an average retention value but no estimate of uncertainty. Here we describe a statistical method for calculating uncertainty estimates using the standard sampling regime with minimal additional chemical analysis. This tool can...

  3. Instrumentation

    International Nuclear Information System (INIS)

    Prieur, G.; Nadi, M.; Hedjiedj, A.; Weber, S.

    1995-01-01

    This second chapter on instrumentation gives little general consideration on history and classification of instrumentation, and two specific states of the art. The first one concerns NMR (block diagram of instrumentation chain with details on the magnets, gradients, probes, reception unit). The first one concerns precision instrumentation (optical fiber gyro-meter and scanning electron microscope), and its data processing tools (programmability, VXI standard and its history). The chapter ends with future trends on smart sensors and Field Emission Displays. (D.L.). Refs., figs

  4. The variability of interconnected wind plants

    International Nuclear Information System (INIS)

    Katzenstein, Warren; Fertig, Emily; Apt, Jay

    2010-01-01

    We present the first frequency-dependent analyses of the geographic smoothing of wind power's variability, analyzing the interconnected measured output of 20 wind plants in Texas. Reductions in variability occur at frequencies corresponding to times shorter than ~24 h and are quantified by measuring the departure from a Kolmogorov spectrum. At a frequency of 2.8×10^-4 Hz (corresponding to 1 h), an 87% reduction of the variability of a single wind plant is obtained by interconnecting 4 wind plants. Interconnecting the remaining 16 wind plants produces only an additional 8% reduction. We use step change analyses and correlation coefficients to compare our results with previous studies, finding that wind power ramps up faster than it ramps down for each of the step change intervals analyzed and that the correlation between the power output of wind plants 200 km apart is half that of co-located wind plants. To examine variability at very low frequencies, we estimate yearly wind energy production in the Great Plains region of the United States from automated wind observations at airports covering 36 years. The estimated wind power has significant inter-annual variability and the severity of wind drought years is estimated to be about half that observed nationally for hydroelectric power.
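
    The frequency-domain comparison can be sketched as follows: compute Welch power spectra of a single plant and of the pooled output, then compare them at the frequency of interest. The "shared slow weather plus plant-specific noise" AR(1) series below are crude stand-ins for measured wind power, so the numbers are illustrative only.

    ```python
    # Frequency-dependent smoothing of pooled wind plant output (synthetic series).
    import numpy as np
    from scipy.signal import welch

    rng = np.random.default_rng(4)
    fs = 1.0 / 900.0                            # one sample per 15 minutes, in Hz
    n = 96 * 365                                # one year of 15-min samples

    def ar1(n, phi):
        """AR(1) series; mimics persistent, slowly varying wind conditions."""
        x = np.zeros(n)
        for t in range(1, n):
            x[t] = phi * x[t - 1] + rng.normal()
        return x

    common = ar1(n, 0.999)                      # slow weather shared by all plants
    plants = [common + rng.normal(size=n) for _ in range(4)]  # plant-specific noise

    f, p_single = welch(plants[0], fs=fs, nperseg=4096)
    _, p_pooled = welch(np.mean(plants, axis=0), fs=fs, nperseg=4096)

    idx = np.argmin(np.abs(f - 2.8e-4))         # ~1-hour fluctuations
    print("reduction at 1 h:", 1 - p_pooled[idx] / p_single[idx])
    print("reduction at lowest frequency:", 1 - p_pooled[1] / p_single[1])
    ```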

  5. Benefits of using customized instrumentation in total knee arthroplasty: results from an activity-based costing model.

    Science.gov (United States)

    Tibesku, Carsten O; Hofer, Pamela; Portegies, Wesley; Ruys, C J M; Fennema, Peter

    2013-03-01

    The growing demand for total knee arthroplasty (TKA), together with efforts to contain healthcare expenditure in advanced economies, necessitates the use of economically effective technologies in TKA. The present analysis, based on an activity-based costing (ABC) model, was carried out to estimate the economic value of patient-matched instrumentation (PMI) compared to standard surgical instrumentation in TKA. The costs of the two approaches, PMI and standard instrumentation, were determined by the use of ABC, which measures the cost of a particular procedure by determining the activities involved and adding the cost of each activity. Improvement in productivity due to faster operating room (OR) turn-around times was determined, and the potential additional revenue to the hospital from efficient utilization of the gained OR time was estimated. Increased efficiency in the usage of the OR and in the utilization of surgical trays was noted with the patient-specific approach. Potential revenues to the hospital with the use of PMI were estimated from the efficient utilization of the time saved in the OR. Additional revenues of 78,240 per year were estimated, considering utilization of the gained OR time to perform surgeries other than TKA. The analysis suggests that the use of PMI in TKA is economically effective when compared to standard instrumentation.
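
    An ABC comparison reduces to summing activity costs per care pathway and pricing the freed OR time. The sketch below uses invented placeholder times and rates, not the study's figures.

    ```python
    # Activity-based costing comparison of two TKA pathways (placeholder numbers).
    OR_MINUTE_RATE = 20.0                  # assumed cost of one OR minute

    def procedure_cost(activities):
        """Sum activity costs; each entry is (minutes, cost_per_minute)."""
        return sum(minutes * rate for minutes, rate in activities)

    standard = procedure_cost([(90, OR_MINUTE_RATE),   # OR time incl. alignment jigs
                               (30, 4.0),              # instrument tray reprocessing
                               (15, 2.0)])             # setup / turn-around

    pmi = procedure_cost([(75, OR_MINUTE_RATE),        # shorter OR time
                          (12, 4.0),                   # fewer trays to reprocess
                          (10, 2.0)]) + 180.0          # plus per-case guide cost

    saved_or_minutes = 90 - 75
    print("cost difference per case:", standard - pmi)
    print("OR minutes freed per case:", saved_or_minutes)  # priceable as extra cases
    ```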

  6. Work-nonwork interference: Preliminary results on the psychometric properties of a new instrument

    Directory of Open Access Journals (Sweden)

    Eileen Koekemoer

    2010-11-01

    Research purpose: The objectives of this study were to investigate the internal validity (construct, discriminant and convergent validity), reliability and external validity (relationships with theoretically relevant variables, including job characteristics, home characteristics, burnout, ill health and life satisfaction) of the instrument. Motivation for the study: Work-family interaction is a key topic receiving significant research attention. In order to facilitate comparison across work-family studies, the use of psychometrically sound instruments is of great importance. Research design, approach and method: A cross-sectional survey design was used for the target population of married employees with children working at a tertiary institution in the North West province (n = 366). In addition to the new instrument, job characteristics, home characteristics, burnout, ill health and life satisfaction were measured. Main findings: The results provided evidence for construct, discriminant and convergent validity, reliability and significant relations with external variables. Practical/managerial implications: The new instrument can be used by researchers and managers, as a test under development, to investigate the interference between work and different nonwork roles (i.e. parental role, spousal role, work role, domestic role) and specific relations with antecedents (e.g. job/home characteristics) and well-being (e.g. burnout, ill health and life satisfaction). Contribution/value-add: This study provides preliminary information on the psychometric properties of a new instrument that measures the interference between work and nonwork.

  7. A framework for assessing Health Economic Evaluation (HEE) quality appraisal instruments.

    Science.gov (United States)

    Langer, Astrid

    2012-08-16

    Health economic evaluations support the health care decision-making process by providing information on costs and consequences of health interventions. The quality of such studies is assessed by health economic evaluation (HEE) quality appraisal instruments. At present, there is no instrument for measuring and improving the quality of such HEE quality appraisal instruments. Therefore, the objectives of this study are to establish a framework for assessing the quality of HEE quality appraisal instruments to support and improve their quality, and to apply this framework to those HEE quality appraisal instruments which have been subject to more scrutiny than others, in order to test the framework and to demonstrate the shortcomings of existing HEE quality appraisal instruments. To develop the quality assessment framework for HEE quality appraisal instruments, the experiences of using appraisal tools for clinical guidelines are used. Based on a deductive iterative process, clinical guideline appraisal instruments identified through a literature search are reviewed, consolidated, and adapted to produce the final quality assessment framework for HEE quality appraisal instruments. The final quality assessment framework for HEE quality appraisal instruments consists of 36 items organized within 7 dimensions, each of which captures a specific domain of quality. Applying the quality assessment framework to four existing HEE quality appraisal instruments, it is found that these four quality appraisal instruments are of variable quality. The framework described in this study should be regarded as a starting point for appraising the quality of HEE quality appraisal instruments. This framework can be used by HEE quality appraisal instrument producers to support and improve the quality and acceptance of existing and future HEE quality appraisal instruments. By applying this framework, users of HEE quality appraisal instruments can become aware of methodological deficiencies in the instruments they use.

  8. Variation in posture quality across musical instruments and its impact during performances.

    Science.gov (United States)

    Blanco-Piñeiro, Patricia; Díaz-Pereira, M Pino; Martínez Vidal, Aurora

    2018-06-01

    Bad posture increases the risk that a musician may suffer from musculoskeletal disorders. This study compared the posture quality required by different instruments or families of instruments. Using an ad hoc postural observation instrument comprising 11 postural variables, four experts evaluated the postures of 100 students attending a Spanish higher conservatory of music. The agreement of the experts' evaluations was statistically confirmed by Cohen's κ values between 0.855 and 1.000 and Kendall values between 0.709 and 1.000 (all statistically significant). Analysis showed an association between instrument families and seated posture with respect to pelvic attitude, dorsal curvature and head alignment in both sagittal and frontal planes. This analysis also showed an association between instrument families and standing posture with respect to the frontal plane of the axis of gravity, pelvic attitude, head alignment in the frontal plane, the sagittal plane of the shoulders and overall posture. While certain postural defects appear to be common to all families of instruments, others are more characteristic of some families than others. The instrument associated with the best posture quality was the bagpipe, followed by percussion and strings.

  9. Beam Instrumentation and Diagnostics

    CERN Document Server

    Strehl, Peter

    2006-01-01

    This treatise covers all aspects of the design and the daily operation of a beam diagnostic system for a large particle accelerator. A very interdisciplinary field, it involves contributions from physicists, electrical and mechanical engineers and computer experts alike, so as to satisfy the ever-increasing demands for beam parameter variability across a vast range of operating modes and particle types. The author draws upon 40 years of research and work, most of them spent as the head of the beam diagnostics group at GSI. He has illustrated the more theoretical aspects with many real-life examples that will provide beam instrumentation designers with ideas and tools for their work.

  10. Investigation of load reduction for a variable speed, variable pitch, and variable coning wind turbine

    Energy Technology Data Exchange (ETDEWEB)

    Pierce, K. [Univ. of Utah, Salt Lake City, UT (United States)

    1997-12-31

    A two bladed, variable speed and variable pitch wind turbine was modeled using ADAMS® to evaluate load reduction abilities of a variable coning configuration as compared to a teetered rotor, and also to evaluate control methods. The basic dynamic behavior of the variable coning turbine was investigated and compared to the teetered rotor under constant wind conditions as well as turbulent wind conditions. Results indicate the variable coning rotor has larger flap oscillation amplitudes and much lower root flap bending moments than the teetered rotor. Three methods of control were evaluated for turbulent wind simulations. These were a standard IPD control method, a generalized predictive control method, and a bias estimate control method. Each control method was evaluated for both the variable coning configuration and the teetered configuration. The ability of the different control methods to maintain the rotor speed near the desired set point is evaluated from the RMS error of rotor speed. The activity of the control system is evaluated from cycles per second of the blade pitch angle. All three of the methods were found to produce similar results for the variable coning rotor and the teetered rotor, as well as similar results to each other.

  11. Varying ultrasound power level to distinguish surgical instruments and tissue.

    Science.gov (United States)

    Ren, Hongliang; Anuraj, Banani; Dupont, Pierre E

    2018-03-01

    We investigate a new framework of surgical instrument detection based on power-varying ultrasound images with simple and efficient pixel-wise intensity processing. Without using complicated feature extraction methods, we identified the instrument with an estimated optimal power level and by comparing pixel values of varying transducer power level images. The proposed framework exploits the physics of the ultrasound imaging system by varying the transducer power level to effectively distinguish metallic surgical instruments from tissue. This power-varying image guidance is motivated by our observation that ultrasound imaging at different power levels exhibits different contrast enhancement capabilities between tissue and instruments in ultrasound-guided robotic beating-heart surgery. Using lower transducer power levels (ranging from 40 to 75% of the rated lowest ultrasound power levels of the two tested ultrasound scanners) can effectively suppress the strong imaging artifacts from metallic instruments and thus can be utilized together with the images from normal transducer power levels to enhance the separability between instrument and tissue, improving intraoperative instrument tracking accuracy from the acquired noisy ultrasound volumetric images. We performed experiments in phantoms and ex vivo hearts in water tank environments. The proposed multi-level power-varying ultrasound imaging approach can identify robotic instruments of high acoustic impedance from low-signal-to-noise-ratio ultrasound images by power adjustments.

  12. Instrumentation

    Energy Technology Data Exchange (ETDEWEB)

    Decreton, M

    2001-04-01

    SCK-CEN's research and development programme on instrumentation involves the assessment and the development of sensitive measurement systems used within a radiation environment. Particular emphasis is on the assessment of optical fibre components and their adaptability to radiation environments. The evaluation of ageing processes of instrumentation in fission plants, the development of specific data evaluation strategies to compensate for ageing induced degradation of sensors and cable performance form part of these activities. In 2000, particular emphasis was on in-core reactor instrumentation applied to fusion, accelerator driven and water-cooled fission reactors. This involved the development of high performance instrumentation for irradiation experiments in the BR2 reactor in support of new instrumentation needs for MYRRHA, and for diagnostic systems for the ITER reactor.

  14. Diabetes and quality of life: Comparing results from utility instruments and Diabetes-39.

    Science.gov (United States)

    Chen, Gang; Iezzi, Angelo; McKie, John; Khan, Munir A; Richardson, Jeff

    2015-08-01

    To compare the Diabetes-39 (D-39) with six multi-attribute utility (MAU) instruments (15D, AQoL-8D, EQ-5D, HUI3, QWB, and SF-6D), and to develop mapping algorithms which could be used to transform the D-39 scores into the MAU scores. Self-reported diabetes sufferers (N=924) and members of the healthy public (N=1760), aged 18 years and over, were recruited from 6 countries (Australia 18%, USA 18%, UK 17%, Canada 16%, Norway 16%, and Germany 15%). Apart from the QWB, which was normally distributed, non-parametric rank tests were used to compare subgroup utilities and D-39 scores. Mapping algorithms were estimated using ordinary least squares (OLS) and generalised linear models (GLM). MAU instruments discriminated between diabetes patients and the healthy public; however, utilities varied between instruments. The 15D, SF-6D, and AQoL-8D had the strongest correlations with the D-39. Except for the HUI3, there were significant differences by gender. Mapping algorithms based on the OLS estimator consistently gave better goodness-of-fit results. The mean absolute error (MAE) values ranged from 0.061 to 0.147, the root mean square error (RMSE) values from 0.083 to 0.198, and the R-square statistics from 0.428 to 0.610. Based on MAE and RMSE values the preferred mapping is D-39 into 15D. R-square statistics and the range of predicted utilities indicate the preferred mapping is D-39 into AQoL-8D. Utilities estimated from different MAU instruments differ significantly and the outcome of a study could depend upon the instrument used. The algorithms reported in this paper enable D-39 data to be mapped into utilities predicted from any of six instruments. This provides choice for those conducting cost-utility analyses. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
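
    The mapping step the abstract describes is, at its core, ordinary least squares of a utility score on D-39 scores, judged by MAE and RMSE. A minimal Python sketch with synthetic data (the column layout and coefficients are assumptions, not the study's):

        import numpy as np

        rng = np.random.default_rng(6)
        n = 924
        d39 = rng.uniform(0, 100, size=(n, 5))       # stand-ins for D-39 dimension scores
        utility = np.clip(0.95 - 0.004 * d39.mean(axis=1)
                          + rng.normal(0, 0.05, n), -0.2, 1.0)

        X = np.column_stack([np.ones(n), d39])       # OLS with intercept
        beta, *_ = np.linalg.lstsq(X, utility, rcond=None)
        pred = X @ beta
        mae = np.abs(utility - pred).mean()
        rmse = np.sqrt(((utility - pred) ** 2).mean())
        print(f"MAE {mae:.3f}, RMSE {rmse:.3f}")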

  15. Manufacturing challenge: An employee perception of the impact of BEM variables on motivation

    Science.gov (United States)

    Nyaude, Alaster

    The study examines the impact of Thomas F. Gilbert's Behavior Engineering Model (BEM) variables on employee perception of motivation at an aerospace equipment manufacturing plant in Georgia. The research process involved a literature review and the determination of an appropriate survey instrument for the study. The Hersey-Chevalier modified PROBE instrument (Appendix C) was used with Dr Roger Chevalier's validation. The participants' responses were further examined to determine the influence of the demographic control variables of age, gender, length of service with the company and education on employee perception of motivation. The results indicated that the top three motivating variables were knowledge and skills, capacity and resources: knowledge and skills was perceived as the most motivating, capacity as the second and resources as the third. Information ranked fourth, motives fifth and incentives sixth. The results also showed that the demographic control variables had no influence on employee perception of motivation. Further research may be required to understand to what extent these BEM variables impact employee perceptions of motivation.

  16. Reactivity estimation using digital nonlinear H∞ estimator for VHTRC experiment

    International Nuclear Information System (INIS)

    Suzuki, Katsuo; Nabeshima, Kunihiko; Yamane, Tsuyoshi

    2003-01-01

    On-line, real-time estimation of time-varying reactivity in a nuclear reactor is necessary for early detection of reactivity anomalies and safe operation. Using a digital nonlinear H∞ estimator, an experiment on real-time dynamic reactivity estimation was carried out in the Very High Temperature Reactor Critical Assembly (VHTRC) of the Japan Atomic Energy Research Institute. Some technical issues of the experiment are described, such as reactivity insertion, data sampling frequency, the anti-aliasing filter, the experimental circuit and the digitalized nonlinear H∞ reactivity estimator. We then discuss the experimental results obtained by the digital nonlinear H∞ estimator with sampled data of the nuclear instrumentation signal for the power responses under various reactivity insertions. Good performance of the estimated reactivity was observed, with almost no delay with respect to the true reactivity and sufficient accuracy, between 0.05 cent and 0.1 cent. The experiment shows that real-time reactivity estimation with a data sampling period of 10 ms can certainly be realized. From the results of the experiment, it is concluded that the digital nonlinear H∞ reactivity estimator can be applied as an on-line real-time reactivity meter for actual nuclear plants. (author)
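
    For orientation only: a far simpler route to on-line reactivity estimation than the paper's nonlinear H∞ filter is discrete inverse point kinetics with one effective delayed-neutron group. The Python sketch below uses generic textbook constants (not VHTRC parameters) and the 10 ms sampling period mentioned in the abstract:

        import numpy as np

        beta, lam, Lam = 0.0065, 0.08, 1e-4   # delayed fraction, decay const (1/s), generation time (s)
        dt = 0.01                             # 10 ms sampling period

        def reactivity(power):
            """Reactivity (in dollars) from a sampled power trace via one-group inverse point kinetics."""
            C = beta * power[0] / (Lam * lam)          # equilibrium precursor level
            rho = np.empty(power.size)
            for k in range(power.size):
                dndt = (power[min(k + 1, power.size - 1)] - power[max(k - 1, 0)]) / (2 * dt)
                rho[k] = (beta + Lam * dndt / power[k] - Lam * lam * C / power[k]) / beta
                C += dt * (beta / Lam * power[k] - lam * C)
            return rho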

  17. The Instrumentation Channel for the MUCOOL Experiment

    International Nuclear Information System (INIS)

    Kahn, S. A.; Guler, H.; Lu, C.; McDonald, K. T.; Prebys, E. J.; Vahsen, S. E.

    1999-01-01

    The MUCOOL facility is proposed to examine cooling techniques that could be used in a muon collider. The solenoidal beam channels before and after the cooling test section are instrumented to measure the beam emittance. This instrumentation channel includes a bent solenoid to provide dispersion and time projection chambers to measure the beam variables before and after the bend. The momentum of the muons is obtained from a measurement of the drift of the muon trajectory in the bent solenoid. The timing measurement is made by determining the phase from the momentum of the muon before and after it traverses RF cavities, or by the use of a fast Cherenkov chamber. A computer simulation of the muon solenoidal channel is performed using GEANT. This study evaluates the resolution of the beam emittance measurement for MUCOOL.

  18. Leaf area index estimation in a pine plantation with LAI-2000 under direct sunlight conditions: relationship with inventory and hydrologic variables

    International Nuclear Information System (INIS)

    Molina, A.; Campo, A. D. del

    2011-01-01

    LAI is a key factor in light and rainfall interception processes in forest stands and, for this reason, is called to play an important role in global-change adaptive silviculture. Therefore, it is necessary to develop practical and operative methodologies to measure this parameter, as well as simple relationships with other silvicultural variables. This work studied 1) the feasibility of the LAI-2000 sensor for estimating stand LAI when readings are taken under direct sunlight conditions; and 2) the usefulness of LAI for studying rainfall partitioning into throughfall (T) in an Aleppo pine stand after different thinning intensities, as well as its relationships to basal area (G), cover (FCC) and tree density (D). Results showed that the angular correction scheme applied to LAI-2000 direct-sunlight readings stabilized them for different solar angles, allowing a better operational use of the LAI-2000 in Mediterranean areas, where uniform overcast conditions are difficult to meet and predict. Forest cover showed the highest predictive ability for LAI (R² = 0.98; S = 0.28), followed by G (R² = 0.96; S = 0.43) and D (R² = 0.50; S = 0.28). On the hydrological side, T increased with thinning intensity, with G being the most explanatory variable (R² = 0.81; S = 3.07) and LAI the one showing the poorest relation with it (R² = 0.69; S = 3.95). These results open a way for forest hydrologic modeling taking LAI as an input variable, either estimated from the LAI-2000 or deduced from inventory data. (Author) 36 refs.

  19. On-line testing of calibration of process instrumentation channels in nuclear power plants. Phase 2, Final report

    International Nuclear Information System (INIS)

    Hashemian, H.M.

    1995-11-01

    The nuclear industry is interested in automating the calibration of process instrumentation channels; this report provides key results of one of the sponsored projects to determine the validity of automated calibrations. The conclusion is that the normal outputs of instrument channels in nuclear plants can be monitored over a fuel cycle while the plant is operating, to determine calibration drift in the field sensors and associated signal conversion and signal conditioning equipment. The procedure for on-line calibration tests involves calculating the deviation of each instrument channel from the best estimate of the process parameter that the instrument is measuring. Methods were evaluated for determining the best estimate. The deviation of each signal from the best estimate is updated frequently while the plant is operating and plotted versus time for the entire fuel cycle, thereby providing time history plots that can reveal channel drift and other anomalies. Any instrument channel that exceeds the allowable drift or channel accuracy band is then scheduled for calibration during a refueling outage or sooner. This provides calibration test results at the process operating point, one of the most critical points of channel operation. This should suffice for most narrow-range instruments, although the calibration of some instruments can be verified at other points throughout their range. It should be pointed out that the calibration of some process signals, such as the high pressure coolant injection flow in BWRs, which are normally off-scale during plant operation, cannot be tested on-line.
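
    The monitoring procedure itself is easy to sketch in Python: take a best estimate (here, the median of redundant channels; the report evaluates several estimators), track each channel's deviation over the fuel cycle, and flag channels whose deviation leaves the allowable band. Data and thresholds below are illustrative only:

        import numpy as np

        rng = np.random.default_rng(9)
        hours = np.arange(0, 12_000, 6)                    # samples over one fuel cycle
        true = 155.0 + 0.5 * np.sin(hours / 800.0)         # process parameter (bar)
        channels = true + rng.normal(0, 0.05, (4, hours.size))
        channels[2] += 1e-4 * hours                        # channel 2 slowly drifts

        best = np.median(channels, axis=0)                 # best estimate of the process
        deviation = channels - best
        drifting = np.abs(deviation[:, -500:]).mean(axis=1) > 0.25   # allowable band (bar)
        print("channels flagged for calibration:", np.flatnonzero(drifting))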

  20. Small area estimation for semicontinuous data.

    Science.gov (United States)

    Chandra, Hukum; Chambers, Ray

    2016-03-01

    Survey data often contain measurements for variables that are semicontinuous in nature, i.e. they either take a single fixed value (we assume this is zero) or they have a continuous, often skewed, distribution on the positive real line. Standard methods for small area estimation (SAE) based on the use of linear mixed models can be inefficient for such variables. We discuss SAE techniques for semicontinuous variables under a two part random effects model that allows for the presence of excess zeros as well as the skewed nature of the nonzero values of the response variable. In particular, we first model the excess zeros via a generalized linear mixed model fitted to the probability of a nonzero, i.e. strictly positive, value being observed, and then model the response, given that it is strictly positive, using a linear mixed model fitted on the logarithmic scale. Empirical results suggest that the proposed method leads to efficient small area estimates for semicontinuous data of this type. We also propose a parametric bootstrap method to estimate the MSE of the proposed small area estimator. These bootstrap estimates of the MSE are compared to the true MSE in a simulation study. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
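
    A stripped-down sketch of the two-part idea (fixed effects only; the paper's area-level random effects and bootstrap MSE estimation are omitted) can be written with statsmodels, with all data simulated:

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        df = pd.DataFrame({"x1": rng.normal(size=500), "x2": rng.normal(size=500)})
        lin = 0.5 * df.x1 - 0.3 * df.x2
        positive = rng.random(500) < 1 / (1 + np.exp(-lin))          # excess zeros
        df["y"] = np.where(positive,
                           np.exp(0.2 + 0.8 * df.x1 + rng.normal(0, 0.5, 500)), 0.0)

        X = sm.add_constant(df[["x1", "x2"]])
        part1 = sm.Logit((df.y > 0).astype(int), X).fit(disp=0)      # P(y > 0)
        pos = df.y > 0
        part2 = sm.OLS(np.log(df.y[pos]), X[pos]).fit()              # log scale, y > 0

        # plug-in prediction with lognormal back-transformation correction
        pred = part1.predict(X) * np.exp(part2.predict(X) + part2.scale / 2)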

  1. Environmental Variables and Pupils' Academic Performance in ...

    African Journals Online (AJOL)

    This causal-comparative study was carried out to investigate the influence of environmental variables on pupils' academic performance in primary science in Cross River State, Nigeria. Three hypotheses were formulated to guide the study. Two instruments were used to collect data for the study namely: environmental ...

  2. agronomic performance and estimate of genetic variability of upland ...

    African Journals Online (AJOL)

    Admin

    importance of rice, it has many industrial uses. For example ... environmental constraints. Particularly ... of Variance (ANOVA) according to Gomez and Gomez. (1984) and ... selection of genotypes for increased grain yield. For grain ..... yield components in wheat, Crop Science ... variability, stability and correlation studies in.

  3. Instrumental variable analysis

    NARCIS (Netherlands)

    Stel, Vianda S.; Dekker, Friedo W.; Zoccali, Carmine; Jager, Kitty J.

    2013-01-01

    The main advantage of the randomized controlled trial (RCT) is the random assignment of treatment that prevents selection by prognosis. Nevertheless, only a few RCTs can be performed given their high cost and the difficulties in conducting such studies. Therefore, several analytical methods for
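
    One such analytical method is instrumental variable estimation via two-stage least squares (2SLS). A minimal, self-contained Python sketch with simulated data (instrument z, unmeasured confounder u, true effect 0.5; all values invented) shows how instrumenting removes the confounding that biases ordinary regression:

        import numpy as np

        rng = np.random.default_rng(0)
        n = 10_000
        u = rng.normal(size=n)                      # unmeasured confounder
        z = rng.normal(size=n)                      # instrument
        x = 0.8 * z + u + rng.normal(size=n)        # exposure (treatment)
        y = 0.5 * x - u + rng.normal(size=n)        # outcome

        Z = np.column_stack([np.ones(n), z])        # stage 1: regress x on z
        xhat = Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
        Xh = np.column_stack([np.ones(n), xhat])    # stage 2: regress y on fitted x
        beta_2sls = np.linalg.lstsq(Xh, y, rcond=None)[0][1]

        Xo = np.column_stack([np.ones(n), x])
        beta_ols = np.linalg.lstsq(Xo, y, rcond=None)[0][1]
        print(f"OLS {beta_ols:.2f} (biased, ~0.12), 2SLS {beta_2sls:.2f} (~0.50)")

    Note that naive second-stage standard errors are wrong; dedicated IV routines should be used for inference.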

  4. Investigation of Music Student Efficacy as Influenced by Age, Experience, Gender, Ethnicity, and Type of Instrument Played in South Carolina

    Science.gov (United States)

    White, Norman

    2010-01-01

    The purpose of this research study was to quantitatively examine South Carolina high school instrumental music students' self-efficacy as measured by the Generalized Self-Efficacy (GSE) instrument (Schwarzer & Jerusalem, 1993). The independent variables of age, experience, gender, ethnicity, and type of instrument played were correlated with…

  5. Instrument comparison for Aerosolized Titanium Dioxide

    Science.gov (United States)

    Ranpara, Anand

    Recent toxicological studies have shown that the surface area of ultrafine particles (UFP, i.e., particles with diameters less than 0.1 micrometer) has a stronger correlation with adverse health effects than does the mass of these particles. Ultrafine titanium dioxide (TiO2) particles are widely used in industry, and their use is associated with adverse health outcomes, such as microvascular dysfunction and pulmonary damage. The primary aim of this experimental study was to compare a variety of laboratory and industrial hygiene (IH) field study instruments all measuring the same aerosolized TiO2. The study also observed intra-instrument variability between measurements made by two apparently identical devices of the same type of instrument placed side-by-side. The types of instruments studied were (1) DustTrak(TM) DRX, (2) Personal Data RAMs(TM) (PDR), (3) GRIMM, (4) Diffusion charger (DC) and (5) Scanning Mobility Particle Sizer (SMPS). Two devices of each of the four IH field study instrument types were used to measure six levels of mass concentration of fine and ultrafine TiO2 aerosols in controlled chamber tests. Metrics evaluated included real-time mass, active surface area and number/geometric surface area distributions, and off-line gravimetric mass and morphology on filters. DustTrak(TM) DRXs and PDRs were used for mass concentration measurements. DCs were used for active surface area concentration measurements. GRIMMs were used for number concentration measurements. The SMPS was used for inter-instrument comparisons of surface area and number concentrations. The results indicated that the two apparently identical devices of each of the DRX and PDR types were not statistically different from each other for all the trials of both powder sizes (p < 5%). The mean difference between mass concentrations measured by the two DustTrak DRX devices was smaller than that measured by the two PDR devices. DustTrak DRX measurements were closer to the reference method, gravimetric mass concentration.

  6. The MCLIB library: Monte Carlo simulation of neutron scattering instruments

    Energy Technology Data Exchange (ETDEWEB)

    Seeger, P.A.

    1995-09-01

    Monte Carlo is a method to integrate over a large number of variables. Random numbers are used to select a value for each variable, and the integrand is evaluated. The process is repeated a large number of times and the resulting values are averaged. For a neutron transport problem, first select a neutron from the source distribution, and project it through the instrument using either deterministic or probabilistic algorithms to describe its interaction whenever it hits something, and then (if it hits the detector) tally it in a histogram representing where and when it was detected. This is intended to simulate the process of running an actual experiment (but it is much slower). This report describes the philosophy and structure of MCLIB, a Fortran library of Monte Carlo subroutines which has been developed for design of neutron scattering instruments. A pair of programs (LQDGEOM and MC_RUN) which use the library are shown as an example.
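
    The loop the abstract describes (sample a neutron from the source, project it through the instrument, tally a detector histogram) fits in a few lines of Python. A toy example with invented geometry, straight-line transport and no scattering kernel:

        import numpy as np

        rng = np.random.default_rng(42)
        n_neutrons = 100_000
        L = 10.0                                    # source-to-detector distance (m)

        y0 = rng.normal(0.0, 0.005, n_neutrons)     # source position (m)
        dy = rng.normal(0.0, 0.001, n_neutrons)     # divergence (rad)
        y_det = y0 + L * dy                         # project through an empty instrument

        hist, edges = np.histogram(y_det, bins=100, range=(-0.05, 0.05))
        print("peak counts per bin:", hist.max())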

  8. Instrumentation requirements for radiological defense of the U.S. population in community shelters. Final report

    International Nuclear Information System (INIS)

    Haaland, C.M.; Gant, K.S.

    1978-08-01

    Estimates are made of requirements for instruments for radiological defense of the U.S. population in the event of a nuclear attack. A detailed Community Shelter Plan posture is developed for each of 42,000 Standard Location Areas. Travel distance from residence to shelter in urban areas is limited to approximately 1 mile. Sixty percent of the U.S. population is sheltered in home basements, thirty-one percent in National Shelter Survey shelters, and nine percent is in neither. Three minimum allocations of instruments are developed. Allocation A, one radiological defense set per shelter, is essentially the same as the current civil defense allocations but is found to be inadequate for about 100,000 shelters having more than 100 occupants. Allocation B requires 3.4 million new dosimeters based on estimated shelter occupancy and provides a minimum instrumentation for radiological defense but not enough instruments to maintain individual dose records. Allocation C would require 18.1 million new dosimeters and would provide adequate instrumentation to maintain dose records for all shelter occupants

  9. Ergonomic investigation of weight distribution of laparoscopic instruments.

    Science.gov (United States)

    Lin, Chiuhsiang Joe; Chen, Hung-Jen; Lo, Ying-Chu

    2011-06-01

    Laparoscopic surgery procedures require highly specialized visually controlled movements. Investigations of industrial applications indicate that the length as well as the weight of hand-held tools substantially affects movement time (MT). Different weight distributions may have similar effects on long-shafted laparoscopic instruments when performing surgical procedures. For this reason, the current experiment aimed at finding direct evidence of the weight distribution effect in an accurate task. Ten right-handed subjects performed continuous Fitts' pointing tasks using a long laparoscopic instrument. The factors and levels were target width (2.5, 3, 3.5, and 4 cm), target distance (14, 23, and 37 cm), and weight distribution (uniform, front, middle, and rear). Weight distribution was varied by chips of lead attached to the laparoscopic instrument. MT, error rate, and throughput (TP) were recorded as dependent variables. There were significant differences between the weight distributions in MT and in TP. The middle position was found to require the least time to manipulate the laparoscopic instrument in pointing tasks and also obtained the highest TP. These analyses and findings point to a design direction for the ergonomics and usability of long hand-held tools such as the laparoscopic instrument in this study. To optimize efficiency in using these tools, the consideration of a better weight design is important and should not be neglected.
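
    For readers unfamiliar with Fitts-type analysis: the index of difficulty combines target distance and width, and throughput is difficulty divided by movement time. A Python sketch with the study's task geometry but placeholder movement times (the Shannon formulation of the index is assumed):

        import numpy as np

        W = np.array([2.5, 3.0, 3.5, 4.0])           # target widths (cm)
        D = np.array([14.0, 23.0, 37.0])             # target distances (cm)

        ID = np.log2(D[:, None] / W[None, :] + 1.0)  # index of difficulty (bits)
        MT = 0.3 + 0.25 * ID                         # hypothetical movement times (s)
        TP = ID / MT                                 # throughput (bits/s)
        print(np.round(TP, 2))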

  10. Simple Instrumental and Visual Tests for Nonlaboratory Environmental Control

    Directory of Open Access Journals (Sweden)

    L. P. Eksperiandova

    2016-01-01

    Simple and readily available techniques are proposed that can be used for rapid and reliable environmental control, specifically of natural water, by means of instrumental and visual tests in outdoor conditions. Chemical colorimetric methods were developed for fast detection of socially dangerous trace impurities in water such as Co(II), Pd(II) and Rh(III), as well as NO2- ions and Fe(III) serving as model impurities. Application of portable digital devices and a scanner allows estimating the color coordinates and increasing the accuracy and sensitivity of the tests. The combination of complex formation with preconcentration of colored complexes replaces the sensitive but time-consuming and capricious kinetic method, usually used for this purpose, with the more convenient and reliable colorimetric method. As test tools, the following were worked out: polyurethane foam tablets with sorbed colored complexes, a two-layer paper sandwich packaged in a slide adapter and saturated with reagents, and a polyethylene terephthalate blister with dried reagents. Fast analysis of the polyurethane foam tablets is realized using a pocket digital RGB colorimeter or a portable photometer. Express analysis of the two-layer paper sandwich or the polyethylene terephthalate blister is realized by visual and instrumental tests. The metrological characteristics of the developed visual and instrumental express analysis techniques are estimated.

  11. Cross-section and panel estimates of peer effects in early adolescent cannabis use: With a little help from my 'friends once removed'.

    Science.gov (United States)

    Moriarty, John; McVicar, Duncan; Higgins, Kathryn

    2016-08-01

    Peer effects in adolescent cannabis use are difficult to estimate, due in part to the lack of appropriate data on behaviour and social ties. This paper exploits survey data that have many desirable properties and have not previously been used for this purpose. The data set, collected from teenagers in three annual waves from 2002 to 2004, contains longitudinal information about friendship networks within schools (N = 5020). We exploit these data on network structure to estimate peer effects on adolescents from their nominated friends within school using two alternative approaches to identification. First, we present a cross-sectional instrumental variable (IV) estimate of peer effects that exploits network structure at the second degree, i.e. using information on friends of friends who are not themselves ego's friends to instrument for the cannabis use of friends. Second, we present an individual fixed effects estimate of peer effects using the full longitudinal structure of the data. Both innovations allow a greater degree of control for correlated effects than is commonly the case in the substance-use peer effects literature, improving our chances of obtaining estimates of peer effects that can be plausibly interpreted as causal. Both estimates suggest positive peer effects of non-trivial magnitude, although the IV estimate is imprecise. Furthermore, when we specify identical models with behaviour and characteristics of randomly selected school peers in place of friends', we find effectively zero effect from these 'placebo' peers, lending credence to our main estimates. We conclude that cross-sectional data can be used to estimate plausible positive peer effects on cannabis use where network structure information is available and appropriately exploited. Copyright © 2016 Elsevier Ltd. All rights reserved.
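
    A sketch of the cross-sectional IV idea with a synthetic friendship network: friends' mean use (endogenous) is instrumented by the mean use of friends-of-friends who are not ego's own friends. Network size, density and prevalence below are invented, and with purely random data the estimate is just noise; the point is the construction of the instrument:

        import numpy as np

        rng = np.random.default_rng(4)
        n = 500
        A = (rng.random((n, n)) < 0.02).astype(float)        # friendship adjacency
        np.fill_diagonal(A, 0)
        use = (rng.random(n) < 0.15).astype(float)           # cannabis use indicator

        friends_mean = A @ use / np.maximum(A.sum(1), 1)     # endogenous peer measure
        A2 = ((A @ A > 0) & (A == 0)).astype(float)          # friends of friends only
        np.fill_diagonal(A2, 0)
        fof_mean = A2 @ use / np.maximum(A2.sum(1), 1)       # the instrument

        Z = np.column_stack([np.ones(n), fof_mean])          # 2SLS: stage 1, then 2
        fhat = Z @ np.linalg.lstsq(Z, friends_mean, rcond=None)[0]
        X = np.column_stack([np.ones(n), fhat])
        print("peer-effect estimate:", np.linalg.lstsq(X, use, rcond=None)[0][1])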

  12. The Total Deviation Index estimated by Tolerance Intervals to evaluate the concordance of measurement devices

    Directory of Open Access Journals (Sweden)

    Ascaso Carlos

    2010-04-01

    Background: In an agreement assay, it is of interest to evaluate the degree of agreement between the different methods (devices, instruments or observers) used to measure the same characteristic. We propose in this study a technical simplification for inference about the total deviation index (TDI) estimate to assess agreement between two devices of normally-distributed measurements, and describe its utility to evaluate inter- and intra-rater agreement if more than one reading per subject is available for each device. Methods: We propose to estimate the TDI by constructing a probability interval of the difference in paired measurements between devices, and thereafter we derive a tolerance interval (TI) procedure as a natural way to make inferences about probability limit estimates. We also describe how the proposed method can be used to compute bounds of the coverage probability. Results: The approach is illustrated in a real case example where the agreement between two instruments, a hand-held mercury sphygmomanometer device and an OMRON 711 automatic device, is assessed in a sample of 384 subjects where measures of systolic blood pressure were taken twice by each device. A simulation study is implemented to evaluate and compare the accuracy of the approach to two already established methods, showing that the TI approximation produces accurate empirical confidence levels which are reasonably close to the nominal confidence level. Conclusions: The method proposed is straightforward since the TDI estimate is derived directly from a probability interval of a normally-distributed variable in its original scale, without further transformations. Thereafter, a natural way of making inferences about this estimate is to derive the appropriate TI. Constructions of TIs based on normal populations are implemented in most standard statistical packages, thus making it simpler for any practitioner to implement our proposal to assess agreement.
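
    The quantity itself is easy to compute once paired differences are in hand. A Python sketch of the TDI point estimate under normality (Lin's approximation), with an empirical quantile as a cross-check; the data are simulated, and the paper's tolerance-interval inference step is not reproduced:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)
        d = rng.normal(loc=1.5, scale=4.0, size=384)   # paired differences (mmHg)

        p = 0.90                                       # coverage proportion
        mu, sd = d.mean(), d.std(ddof=1)
        tdi_norm = stats.norm.ppf((1 + p) / 2) * np.sqrt(mu**2 + sd**2)
        tdi_emp = np.quantile(np.abs(d), p)
        print(f"TDI({p:.0%}): normal approx {tdi_norm:.2f}, empirical {tdi_emp:.2f}")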

  13. Estimating intervention effects of prevention programs: accounting for noncompliance.

    Science.gov (United States)

    Stuart, Elizabeth A; Perry, Deborah F; Le, Huynh-Nhu; Ialongo, Nicholas S

    2008-12-01

    Individuals not fully complying with their assigned treatments are a common problem encountered in randomized evaluations of behavioral interventions. Treatment group members rarely attend all sessions or do all "required" activities; control group members sometimes find ways to participate in aspects of the intervention. As a result, there is often interest in estimating both the effect of being assigned to participate in the intervention and the impact of actually participating and doing all of the required activities. Methods known broadly as "complier average causal effects" (CACE) or "instrumental variables" (IV) methods have been developed to estimate this latter effect, but they are more commonly applied in medical and treatment research. Since the use of these statistical techniques in prevention trials has been less widespread, many prevention scientists may not be familiar with the underlying assumptions and limitations of CACE and IV approaches. This paper provides an introduction to these methods, described in the context of randomized controlled trials of two preventive interventions: one for perinatal depression among at-risk women and the other for aggressive disruptive behavior in children. Through these case studies, the underlying assumptions and limitations of these methods are highlighted.
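
    In the simplest case (no covariates, one-sided noncompliance), the CACE/IV estimate is the Wald ratio: the intention-to-treat effect divided by the difference in participation rates between arms. A simulated Python sketch with an assumed true participation effect of 0.4:

        import numpy as np

        rng = np.random.default_rng(3)
        n = 5_000
        assigned = rng.integers(0, 2, n)              # randomized assignment
        complier = rng.random(n) < 0.6                # latent complier status
        took = assigned & complier                    # participation (one-sided)
        y = 0.4 * took + rng.normal(size=n)           # effect only if participated

        itt = y[assigned == 1].mean() - y[assigned == 0].mean()
        uptake = took[assigned == 1].mean() - took[assigned == 0].mean()
        print(f"ITT {itt:.3f}, CACE = ITT/uptake {itt / uptake:.3f}")   # ~0.24, ~0.40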

  14. Field estimation of soil water content. A practical guide to methods, instrumentation and sensor technology

    International Nuclear Information System (INIS)

    2008-01-01

    During a period of five years, an international group of soil water instrumentation experts were contracted by the International Atomic Energy Agency to carry out a range of comparative assessments of soil water sensing methods under laboratory and field conditions. The detailed results of those studies are published elsewhere. Most of the devices examined worked well some of the time, but most also performed poorly in some circumstances. The group was also aware that the choice of a water measurement technology is often made for economic, convenience and other reasons, and that there was a need to be able to obtain the best results from any device used. The choice of a technology is sometimes not made by the ultimate user, or even if it is, the main constraint may be financial rather than technical. Thus, this guide is presented in a way that allows the user to obtain the best performance from any instrument, while also providing guidance as to which instruments perform best under given circumstances. That said, this expert group of the IAEA reached several important conclusions: (1) the field calibrated neutron moisture meter (NMM) remains the most accurate and precise method for soil profile water content determination in the field, and is the only indirect method capable of providing accurate soil water balance data for studies of crop water use, water use efficiency, irrigation efficiency and irrigation water use efficiency, with a minimum number of access tubes; (2) those electromagnetic sensors known as capacitance sensors exhibit much more variability in the field than either the NMM or direct soil water measurements, and they are not recommended for soil water balance studies for this reason (impractically large numbers of access tubes and sensors are required) and because they are rendered inaccurate by changes in soil bulk electrical conductivity (including temperature effects) that often occur in irrigated soils, particularly those containing

  15. Interobserver agreement in fusion status assessment after instrumental desis of the lower lumbar spine using 64-slice multidetector computed tomography

    DEFF Research Database (Denmark)

    Laoutliev, Borislav; Havsteen, Inger; Bech, Birthe Højlund

    2012-01-01

    Persistent lower back pain after instrumental posterolateral desis may arise from incomplete fusion. We investigate the impact of experience on interobserver agreement in fusion estimation.

  16. THE EVOLUTION OF ANNUAL MEAN TEMPERATURE AND PRECIPITATION QUANTITY VARIABILITY BASED ON ESTIMATED CHANGES BY THE REGIONAL CLIMATIC MODELS

    Directory of Open Access Journals (Sweden)

    Paula Furtună

    2013-03-01

    Climatic changes represent one of the major challenges of our century; they are forecast by means of climate scenarios and models, which provide plausible and concrete images of future climatic conditions. The comparison of climate model results regarding future water resources and temperature trends can become a useful instrument for decision makers in choosing the most effective measures at the economic, social and ecological levels. The aim of this article is the analysis of temperature and pluviometric variability at the grid point closest to Cluj-Napoca, based on data provided by six different regional climate models (RCMs). Analysed over 30-year periods (2001-2030, 2031-2060 and 2061-2090), the mean temperature has a generally ascending trend, with great variability between periods. The precipitation, expressed through percentage deviation, shows a generally descending trend, which is more emphasized during 2031-2060 and 2061-2090.

  17. Spectral estimates of net radiation and soil heat flux

    International Nuclear Information System (INIS)

    Daughtry, C.S.T.; Kustas, W.P.; Moran, M.S.; Pinter, P.J. Jr.; Jackson, R.D.; Brown, P.W.; Nichols, W.D.; Gay, L.W.

    1990-01-01

    Conventional methods of measuring surface energy balance are point measurements and represent only a small area. Remote sensing offers a potential means of measuring outgoing fluxes over large areas at the spatial resolution of the sensor. The objective of this study was to estimate net radiation (Rn) and soil heat flux (G) using remotely sensed multispectral data acquired from an aircraft over large agricultural fields. Ground-based instruments measured Rn and G at nine locations along the flight lines. Incoming fluxes were also measured by ground-based instruments. Outgoing fluxes were estimated using remotely sensed data. Remote Rn, estimated as the algebraic sum of incoming and outgoing fluxes, slightly underestimated Rn measured by the ground-based net radiometers. The mean absolute errors for remote Rn minus measured Rn were less than 7%. Remote G, estimated as a function of a spectral vegetation index and remote Rn, slightly overestimated measured G; however, the mean absolute error for remote G was 13%. Some of the differences between measured and remote values of Rn and G are associated with differences in instrument designs and measurement techniques. The root mean square error for available energy (Rn - G) was 12%. Thus, methods using both ground-based and remotely sensed data can provide reliable estimates of the available energy, which can be partitioned into sensible and latent heat under non-advective conditions.
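
    The bookkeeping the abstract describes can be condensed to a few lines of Python. Everything below is a hypothetical illustration: the inputs are invented, and the linear G/Rn-versus-vegetation-index relation merely stands in for the study's calibrated function:

        # invented inputs, not the study's measurements
        albedo, ndvi = 0.22, 0.65                     # from multispectral data
        Rs_in, RL_in = 820.0, 350.0                   # incoming short/longwave (W/m2)
        Ts = 305.0                                    # radiometric surface temp (K)
        RL_out = 0.98 * 5.67e-8 * Ts ** 4             # outgoing longwave (W/m2)

        Rn = (1 - albedo) * Rs_in + RL_in - RL_out    # net radiation
        G = Rn * (0.32 - 0.21 * ndvi)                 # hypothetical G/Rn = f(NDVI)
        print(f"Rn = {Rn:.0f}, G = {G:.0f}, available energy = {Rn - G:.0f} W/m2")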

  18. Frequency-Zooming ARMA Modeling for Analysis of Noisy String Instrument Tones

    Directory of Open Access Journals (Sweden)

    Paulo A. A. Esquef

    2003-09-01

    This paper addresses model-based analysis of string instrument sounds. In particular, it reviews the application of autoregressive (AR) modeling to sound analysis/synthesis purposes. Moreover, a frequency-zooming autoregressive moving average (FZ-ARMA) modeling scheme is described. The performance of the FZ-ARMA method on modeling the modal behavior of isolated groups of resonance frequencies is evaluated for both synthetic and real string instrument tones immersed in background noise. We demonstrate that the FZ-ARMA modeling is a robust tool to estimate the decay time and frequency of partials of noisy tones. Finally, we discuss the use of the method in synthesis of string instrument sounds.
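
    The frequency-zooming step can be imitated in a few lines: heterodyne the partial of interest down to 0 Hz, lowpass and decimate, then fit a low-order model whose dominant pole gives the decay time. The Python sketch below is a crude stand-in for FZ-ARMA (moving-average lowpass, complex AR(1) fit) applied to a synthetic decaying partial:

        import numpy as np

        fs, f0, tau = 44_100, 440.0, 0.8
        rng = np.random.default_rng(2)
        t = np.arange(int(1.5 * fs)) / fs
        x = np.exp(-t / tau) * np.sin(2 * np.pi * f0 * t) + 1e-3 * rng.standard_normal(t.size)

        z = x * np.exp(-2j * np.pi * f0 * t)                      # shift f0 to 0 Hz
        M = 256                                                   # zoom factor
        zz = np.convolve(z, np.ones(M) / M, mode="valid")[::M]    # lowpass + decimate
        fs_z = fs / M

        a = np.vdot(zz[:-1], zz[1:]) / np.vdot(zz[:-1], zz[:-1])  # AR(1) least squares
        print(f"decay time {-1.0 / (fs_z * np.log(abs(a))):.2f} s (true {tau} s)")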

  19. Estimator's electrical man-hour manual

    CERN Document Server

    Page, John S

    1999-01-01

    This manual's latest edition continues to be the best source available for making accurate, reliable man-hour estimates for electrical installation. This new edition is revised and expanded to include installation of electrical instrumentation, which is used in monitoring various process systems.

  20. 8 years of Solar Spectral Irradiance Observations from the ISS with the SOLAR/SOLSPEC Instrument

    Science.gov (United States)

    Damé, L.; Bolsée, D.; Meftah, M.; Irbah, A.; Hauchecorne, A.; Bekki, S.; Pereira, N.; Cessateur, G.; Marchand, M.; Thiéblemont, R.; Foujols, T.

    2016-12-01

    Accurate measurements of Solar Spectral Irradiance (SSI) are of primary importance for a better understanding of solar physics and of the impact of solar variability on climate (via Earth's atmospheric photochemistry). The acquisition of a top-of-atmosphere reference solar spectrum and of its temporal and spectral variability during the unusual solar cycle 24 is of prime interest for these studies. These measurements have been performed since April 2008 with the SOLSPEC spectro-radiometer, from the far ultraviolet to the infrared (166 nm to 3088 nm). This instrument, developed under a fruitful LATMOS/BIRA-IASB collaboration, is part of the Solar Monitoring Observatory (SOLAR) payload, externally mounted on the Columbus module of the International Space Station (ISS). The SOLAR mission, now 8 years in duration, will cover almost the entire solar cycle 24. We present here the in-flight operations and performance of the SOLSPEC instrument, including the engineering corrections, calibrations and improved procedures for aging corrections. Accordingly, an SSI reference spectrum from the UV to the NIR will be presented, together with its UV variability, as measured by SOLAR/SOLSPEC. Uncertainties on these measurements and comparisons with other instruments will be briefly discussed.

  1. Instruments for the promotion of environmental innovations. Stocktaking, evaluation and deficit analysis; Instrumente zur Foerderung von Umweltinnovationen. Bestandsaufnahme, Bewertung und Defizitanalyse

    Energy Technology Data Exchange (ETDEWEB)

    Rennings, Klaus; Rammer, Christian; Oberndorfer, Ulrich [Zentrum fuer Europaeische Wirtschaftsforschung (ZEW), GmbH, Mannheim (DE)] (and others)

    2008-03-15

    In its 2006 ecology report 'Environment - Innovation - Employment', the Federal Ministry for the Environment, Nature Conservation and Nuclear Safety (Berlin, Federal Republic of Germany) outlines the concept of an innovation-oriented environmental policy. An innovative environmental policy contributes to reducing environmental pollution and the ecological damage caused by industrial production. Against this background, the question arises of how environmental policy can support appropriate innovative activities. To this end, the research project under consideration presents the following contributions: (a) an overview of the most important instruments for the promotion of environmental innovations in Germany; (b) an assessment of the effectiveness of the instruments, including a gap and deficit analysis; (c) a presentation of foreign best-practice examples (instruments and environmental policy approaches); (d) proposals for priority environmental policy starting points for the further development of the instrument mix, and an indication of further research needs.

  2. Radioisotope instruments

    CERN Document Server

    Cameron, J F; Silverleaf, D J

    1971-01-01

    International Series of Monographs in Nuclear Energy, Volume 107: Radioisotope Instruments, Part 1 focuses on the design and applications of instruments based on the radiation released by radioactive substances. The book first offers information on the physical basis of radioisotope instruments; the technical and economic advantages of radioisotope instruments; and radiation hazards. The manuscript then discusses commercial radioisotope instruments, including radiation sources and detectors, computing and control units, and measuring heads. The text then describes the applications of radioisotope instruments.

  3. Hydroclimate variability: comparing dendroclimatic records and future GCM scenarios

    International Nuclear Information System (INIS)

    Lapp, S.

    2008-01-01

    Drought events of the 20th century in western North America have been linked to teleconnections that influence climate variability on inter-annual and decadal to multi-decadal time scales. These teleconnections represent changes in sea surface temperatures (SSTs) in the tropical and extra-tropical Pacific Ocean (ENSO, the El Nino-Southern Oscillation, and the PDO, the Pacific Decadal Oscillation, respectively) and in the Atlantic Ocean (AMO, the Atlantic Multidecadal Oscillation), and are also related to atmospheric circulation patterns (PNA: Pacific-North American). A network of precipitation-sensitive tree-ring chronologies from Montana, Alberta, Saskatchewan and the NWT correlates highly with the climate moisture index (CMI) of precipitation minus potential evapotranspiration (P-PET), thus capturing the long-term hydroclimatic variability of the region. Reconstructions of annual and seasonal CMI identify drought events in previous centuries that are more extreme in magnitude, frequency and duration than those recorded during the instrumental period. Variability in the future climate will include these natural climate cycles as well as modulations of these cycles by human-induced global warming. The proxy hydroclimate records derived from tree rings provide information on decadal and multi-decadal hydroclimatic variability for the past millennium, therefore offering a unique opportunity to validate the climate variability simulated by GCMs (global climate models) on time scales longer than the shorter observational records allow. Developing scenarios of future variability depends on: (1) our understanding of the interaction of these teleconnections; and (2) identifying climate models that are able to accurately simulate the hydroclimatic variability detected in the instrumental and proxy records. (author)

  4. Response of Nuclear Power Plant Instrumentation Cables Exposed to Fire Conditions.

    Energy Technology Data Exchange (ETDEWEB)

    Muna, Alice Baca [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); LaFleur, Chris Bensdotter [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Brooks, Dusty Marie [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-09-01

    This report presents the results of instrumentation cable tests sponsored by the US Nuclear Regulatory Commission (NRC) Office of Nuclear Regulatory Research and performed at Sandia National Laboratories (SNL). The goal of the tests was to assess thermal and electrical response behavior under fire-exposure conditions for instrumentation cables and circuits. The test objective was to assess how severe radiant heating conditions surrounding an instrumentation cable affect current or voltage signals in an instrumentation circuit. A total of thirty-nine small-scale tests were conducted. Ten different instrumentation cables were tested, ranging from one conductor to eight twisted pairs. Because the focus of the tests was thermoset (TS) cables, only two of the ten cables had thermoplastic (TP) insulation and jacket material, and the remaining eight cables used one of three different TS insulation and jacket materials. Two instrumentation cables from previous cable fire testing were included, one TS and one TP. Three test circuits were used to simulate instrumentation circuits present in nuclear power plants: a 4–20 mA current loop, a 10–50 mA current loop and a 1–5 VDC voltage loop. A regression analysis was conducted to determine key variables affecting signal leakage time.

  5. Risk assessment of groundwater level variability using variable Kriging methods

    Science.gov (United States)

    Spanoudaki, Katerina; Kampanis, Nikolaos A.

    2015-04-01

    Assessment of the water table level spatial variability in aquifers provides useful information for optimal groundwater management. This information becomes more important in basins where the water table level has fallen significantly. The spatial variability of the water table level in this work is estimated based on hydraulic head measured during the wet period of the hydrological year 2007-2008, in a sparsely monitored basin in Crete, Greece, which is of high socioeconomic and agricultural interest. Three Kriging-based methodologies are elaborated in the Matlab environment to estimate the spatial variability of the water table level in the basin. The first methodology is based on the Ordinary Kriging approach, the second involves auxiliary information from a Digital Elevation Model in terms of Residual Kriging, and the third calculates, by means of Indicator Kriging, the probability of the groundwater level falling below a predefined minimum value that could cause significant problems for groundwater resource availability. The Box-Cox methodology is applied to normalize both the data and the residuals for improved prediction results. In addition, various classical variogram models are applied to determine the spatial dependence of the measurements. The Matérn model proves to be optimal and, in combination with the Kriging methodologies, provides the most accurate cross-validation estimates. Groundwater level and probability maps are constructed to examine the spatial variability of the groundwater level in the basin and the risk that certain locations exhibit with respect to a predefined minimum value set for the sustainability of the basin's groundwater resources. Acknowledgement The work presented in this paper has been funded by the Greek State Scholarships Foundation (IKY), Fellowships of Excellence for Postdoctoral Studies (Siemens Program), 'A simulation-optimization model for assessing the best practices for the
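
    A kriging-style interpolation with a Matérn covariance can be sketched with a Gaussian-process regressor, which also yields the pointwise probability of the head falling below a threshold (an indicator-kriging-like output). Locations, heads and the 45 m threshold below are synthetic, and the study's ordinary/residual/indicator variants are not reproduced:

        import numpy as np
        from scipy.stats import norm
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import Matern, WhiteKernel

        rng = np.random.default_rng(5)
        xy = rng.uniform(0, 10, size=(40, 2))                  # well locations (km)
        head = 50 - 0.8 * xy[:, 0] + rng.normal(0, 0.5, 40)    # water-table heads (m)

        gp = GaussianProcessRegressor(kernel=Matern(nu=1.5) + WhiteKernel(),
                                      normalize_y=True)
        gp.fit(xy, head)

        grid = np.stack(np.meshgrid(np.linspace(0, 10, 50),
                                    np.linspace(0, 10, 50)), -1).reshape(-1, 2)
        mean, sd = gp.predict(grid, return_std=True)
        p_below = norm.cdf((45.0 - mean) / sd)                 # P(head < 45 m)
        print("max P(head < 45 m) on grid:", p_below.max().round(2))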

  6. Enhancing feedforward controller tuning via instrumental variables: with application to nanopositioning

    NARCIS (Netherlands)

    Boeren, F.A.J.; Bruijnen, D.J.H.; Oomen, T.A.E.

    2017-01-01

    Feedforward control enables high performance of a motion system. Recently, algorithms have been proposed that eliminate bias errors in tuning the parameters of a feedforward controller. The aim of this paper is to develop a new algorithm that combines unbiased parameter estimates with optimal

  7. Estimation of time-variable fast flow path chemical concentrations for application in tracer-based hydrograph separation analyses

    Science.gov (United States)

    Kronholm, Scott C.; Capel, Paul D.

    2016-01-01

    Mixing models are a commonly used method for hydrograph separation, but can be hindered by the subjective choice of the end-member tracer concentrations. This work tests a new variant of mixing model that uses high-frequency measures of two tracers and streamflow to separate total streamflow into water from slowflow and fastflow sources. The ratio between the concentrations of the two tracers is used to create a time-variable estimate of the concentration of each tracer in the fastflow end-member. Multiple synthetic data sets, and data from two hydrologically diverse streams, are used to test the performance and limitations of the new model (two-tracer ratio-based mixing model: TRaMM). When applied to the synthetic streams under many different scenarios, the TRaMM produces results that were reasonable approximations of the actual values of fastflow discharge (±0.1% of maximum fastflow) and fastflow tracer concentrations (±9.5% and ±16% of maximum fastflow nitrate concentration and specific conductance, respectively). With real stream data, the TRaMM produces high-frequency estimates of slowflow and fastflow discharge that align with expectations for each stream based on their respective hydrologic settings. The use of two tracers with the TRaMM provides an innovative and objective approach for estimating high-frequency fastflow concentrations and contributions of fastflow water to the stream. This provides useful information for tracking chemical movement to streams and allows for better selection and implementation of water quality management strategies.
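
    The building block TRaMM extends is the classic two-component mass balance: with fixed slowflow and fastflow end-member concentrations for a single tracer, the fastflow fraction is (c - c_slow)/(c_fast - c_slow). The Python sketch below shows that fixed-end-member baseline (TRaMM's time-variable fastflow end-member derived from the two-tracer ratio is not reproduced); all values are invented:

        import numpy as np

        c = np.array([620.0, 540.0, 410.0, 380.0, 450.0])   # stream tracer conc.
        Q = np.array([1.2, 1.8, 4.5, 6.1, 3.9])             # streamflow (m3/s)
        c_slow, c_fast = 650.0, 120.0                       # assumed end-members

        f_fast = np.clip((c - c_slow) / (c_fast - c_slow), 0, 1)
        print("fastflow discharge:", np.round(f_fast * Q, 2))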

  8. Stimulus-specific variability in color working memory with delayed estimation.

    Science.gov (United States)

    Bae, Gi-Yeul; Olkkonen, Maria; Allred, Sarah R; Wilson, Colin; Flombaum, Jonathan I

    2014-04-08

    Working memory for color has been the central focus in an ongoing debate concerning the structure and limits of visual working memory. Within this area, the delayed estimation task has played a key role. An implicit assumption in color working memory research generally, and delayed estimation in particular, is that the fidelity of memory does not depend on color value (and, relatedly, that experimental colors have been sampled homogeneously with respect to discriminability). This assumption is reflected in the common practice of collapsing across trials with different target colors when estimating memory precision and other model parameters. Here we investigated whether or not this assumption is secure. To do so, we conducted delayed estimation experiments following standard practice with a memory load of one. We discovered that different target colors evoked response distributions that differed widely in dispersion and that these stimulus-specific response properties were correlated across observers. Subsequent experiments demonstrated that stimulus-specific responses persist under higher memory loads and that at least part of the specificity arises in perception and is eventually propagated to working memory. Posthoc stimulus measurement revealed that rendered stimuli differed from nominal stimuli in both chromaticity and luminance. We discuss the implications of these deviations for both our results and those from other working memory studies.
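
    A sketch of the stimulus-specific analysis described here: instead of collapsing across trials, group response errors by target color and compare circular dispersions. The responses below are simulated, with two hues deliberately given lower precision.

    ```python
    # Sketch: per-stimulus dispersion in a delayed-estimation task, rather than
    # collapsing across target colors. Responses are simulated, not real data.
    import numpy as np

    rng = np.random.default_rng(1)
    n_trials = 2000
    targets = rng.choice(np.arange(0, 360, 30), size=n_trials)  # hue angles (deg)
    # Simulate stimulus-specific precision: some hues remembered less precisely.
    kappa = np.where(np.isin(targets, [90, 120]), 4.0, 12.0)
    errors = rng.vonmises(0.0, kappa)                           # radians

    for hue in np.unique(targets):
        e = errors[targets == hue]
        # Circular SD from the mean resultant length R.
        R = np.hypot(np.cos(e).mean(), np.sin(e).mean())
        circ_sd = np.degrees(np.sqrt(-2 * np.log(R)))
        print(f"target {hue:3d} deg: circular SD = {circ_sd:5.1f} deg")
    ```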

  9. Comparing Accuracy of Airborne Laser Scanning and TerraSAR-X Radar Images in the Estimation of Plot-Level Forest Variables

    Directory of Open Access Journals (Sweden)

    Juha Hyyppä

    2010-01-01

    In this study we compared the accuracy of low-pulse airborne laser scanning (ALS) data, multi-temporal high-resolution non-interferometric TerraSAR-X radar data, and a combined feature set derived from these data in the estimation of forest variables at plot level. The TerraSAR-X data set consisted of seven dual-polarized (HH/HV or VH/VV) Stripmap mode images from all seasons of the year. We were especially interested in distinguishing between tree species. The dependent variables estimated included mean volume, basal area, mean height, mean diameter, and tree species-specific mean volumes. Selection of the best possible feature set was based on a genetic algorithm (GA). The nonparametric k-nearest neighbour (k-NN) algorithm was applied to the estimation. The research material consisted of 124 circular plots measured at tree level and located in the vicinity of Espoo, Finland. There are large variations in elevation and forest structure in the study area, making it demanding for image interpretation. The best feature set contained 12 features, nine of them originating from the ALS data and three from the TerraSAR-X data. The relative RMSEs for the best-performing feature set were 34.7% (mean volume), 28.1% (basal area), 14.3% (mean height), 21.4% (mean diameter), 99.9% (mean volume of Scots pine), 61.6% (mean volume of Norway spruce) and 91.6% (mean volume of deciduous tree species). The combined feature set outperformed an ALS-based feature set only marginally; in fact, the latter was better for the species-specific volumes. Features from TerraSAR-X alone performed poorly. However, due to its favorable temporal resolution, satellite-borne radar imaging is a promising data source for updating large-area forest inventories based on low-pulse ALS.
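
    A minimal sketch of the k-NN estimation step with a cross-validated relative RMSE, using random placeholder features in place of the GA-selected ALS/TerraSAR-X metrics:

    ```python
    # Sketch: k-NN estimation of a plot-level forest variable from remote-sensing
    # features, with leave-one-out relative RMSE. Features are random placeholders
    # standing in for the GA-selected ALS/TerraSAR-X metrics.
    import numpy as np
    from sklearn.neighbors import KNeighborsRegressor
    from sklearn.model_selection import LeaveOneOut, cross_val_predict

    rng = np.random.default_rng(2)
    X = rng.normal(size=(124, 12))                  # 124 plots, 12 features
    volume = 150 + 40 * X[:, 0] - 15 * X[:, 3] + rng.normal(0, 25, 124)

    knn = KNeighborsRegressor(n_neighbors=5, weights="distance")
    pred = cross_val_predict(knn, X, volume, cv=LeaveOneOut())

    rmse = np.sqrt(np.mean((pred - volume) ** 2))
    print(f"relative RMSE = {100 * rmse / volume.mean():.1f} %")
    ```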

  10. FEATURES OF AN ESTIMATION OF INVESTMENT PROJECTS AT THE ENTERPRISES OF AVIATION INSTRUMENT

    Directory of Open Access Journals (Sweden)

    Petr P. Dobrov

    2016-01-01

    The relevance of this study is due to the fact that the current situation in Russia is complicated by the negative effects of market reforms in the economy and by the economic sanctions adopted against the country, and in particular against companies at different levels. In view of this, issues related to the assessment of investment projects are highly relevant for effectively managing the activities and development of aviation instrument-making companies and enterprises of different ownership forms. The general crisis that engulfed almost all industry in Russia demanded the application of a new ideology for the organization and management of investment projects, as well as for their assessment at aviation instrument-making enterprises. In Russia, a new stage in the development of project management has begun: the establishment of a domestic methodology, a complex set of tools, and training for professional project management on the basis of domestic achievements, global experience, and its creative adaptation to the actual conditions of the country. The need for project management methodology in Russia is determined by two factors: the increasing complexity of projects and of the organizations that operate them, and the fact that project management is widely used in countries with market economies. Projects at aviation instrument-making enterprises, and their evaluation, are characterized by complexity and uncertainty and by a significant dependence on a dynamic environment, including socio-economic, political, financial, economic and legislative influences of both the state and competing companies. This paper studies modern methods of evaluating investment projects at aviation instrument-making enterprises. Methodology. The methodological basis of this paper comprises comparative and economic-mathematical analysis methods. Results. As part of the presentation of the present article, the author found that the activity of modern companies is not linear and is

  11. Evaluation of software sensors for on-line estimation of culture conditions in an Escherichia coli cultivation expressing a recombinant protein.

    Science.gov (United States)

    Warth, Benedikt; Rajkai, György; Mandenius, Carl-Fredrik

    2010-05-03

    Software sensors for monitoring and on-line estimation of critical bioprocess variables have mainly been used with standard bioreactor sensors, such as electrodes and gas analyzers, where algorithms in the software model have generated the desired state variables. In this article we propose that other on-line instruments, such as NIR probes and on-line HPLC, should be used to make software sensors more reliable and flexible. Five software sensor architectures were compared and evaluated: (1) biomass concentration from an on-line NIR probe, (2) biomass concentration from titrant addition, (3) specific growth rate from titrant addition, (4) specific growth rate from the NIR probe, and (5) specific substrate uptake rate and by-product rate from on-line HPLC and NIR probe signals. The software sensors were demonstrated on an Escherichia coli cultivation expressing a recombinant protein, green fluorescent protein (GFP), but the results can be extrapolated to other production organisms and product proteins. We conclude that well-maintained on-line instrumentation (hardware sensors) can increase the potential of software sensors. This would also strongly support the intentions of process analytical technology and quality-by-design concepts. 2010 Elsevier B.V. All rights reserved.
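
    As a sketch of one of the listed architectures, number (4), a specific growth rate can be derived from a smoothed biomass signal via mu = d(ln X)/dt; the series and constants below are illustrative, not from the cited cultivation.

    ```python
    # Sketch: software sensor for specific growth rate mu = d(ln X)/dt from a
    # (noisy) NIR-derived biomass signal, using a simple smoothing filter.
    # Signal and constants are illustrative, not from the cited cultivation.
    import numpy as np

    t = np.linspace(0, 10, 201)                     # h
    x_true = 0.2 * np.exp(0.45 * t)                 # g/L, true mu = 0.45 1/h
    x_meas = x_true * (1 + np.random.default_rng(3).normal(0, 0.02, t.size))

    # Moving-average smoothing, then a finite-difference derivative.
    w = 11
    x_smooth = np.convolve(x_meas, np.ones(w) / w, mode="same")
    mu = np.gradient(np.log(x_smooth), t)           # d(ln X)/dt = mu

    print(f"estimated mu around mid-culture: {mu[80:120].mean():.3f} 1/h")
    ```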

  12. Instrument Remote Control via the Astronomical Instrument Markup Language

    Science.gov (United States)

    Sall, Ken; Ames, Troy; Warsaw, Craig; Koons, Lisa; Shafer, Richard

    1998-01-01

    The Instrument Remote Control (IRC) project ongoing at NASA's Goddard Space Flight Center's (GSFC) Information Systems Center (ISC) supports NASA's mission by defining an adaptive intranet-based framework that provides robust interactive and distributed control and monitoring of remote instruments. An astronomical IRC architecture has been developed that combines the platform-independent processing capabilities of Java with the power of the Extensible Markup Language (XML) to express hierarchical data in an equally platform-independent, as well as human-readable, manner. This architecture is implemented using a variety of XML support tools and Application Programming Interfaces (APIs) written in Java. IRC will enable trusted astronomers from around the world to easily access infrared instruments (e.g., telescopes, cameras, and spectrometers) located in remote, inhospitable environments, such as the South Pole, a high Chilean mountaintop, or an airborne observatory aboard a Boeing 747. Using IRC's frameworks, an astronomer or other scientist can easily define the type of onboard instrument, control the instrument remotely, and return monitoring data, all through the intranet. The Astronomical Instrument Markup Language (AIML) is the first implementation of the more general Instrument Markup Language (IML). The key aspects of our approach to instrument description and control apply to many domains, from medical instruments to machine assembly lines. The concepts behind AIML apply equally well to the description and control of instruments in general. IRC enables us to apply our techniques to several instruments, preferably from different observatories.
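
    The actual AIML schema is not reproduced in this record; the snippet below is a hypothetical instrument description in the same spirit, with invented element and attribute names, parsed with Python's standard library.

    ```python
    # Sketch: parsing a *hypothetical* XML instrument description in the spirit
    # of IML/AIML (the real AIML schema is not reproduced here). Element and
    # attribute names are invented for illustration only.
    import xml.etree.ElementTree as ET

    doc = """
    <instrument name="example-spectrometer">
      <command name="setIntegrationTime">
        <argument name="seconds" type="float" min="0.1" max="300"/>
      </command>
      <telemetry name="detectorTemperature" type="float" units="K"/>
    </instrument>
    """

    root = ET.fromstring(doc)
    print("instrument:", root.get("name"))
    for cmd in root.iter("command"):
        args = [(a.get("name"), a.get("type")) for a in cmd.iter("argument")]
        print("command:", cmd.get("name"), "args:", args)
    for tm in root.iter("telemetry"):
        print("telemetry:", tm.get("name"), "in", tm.get("units"))
    ```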

  13. A comparison of small-area estimation techniques to estimate selected stand attributes using LiDAR-derived auxiliary variables

    Science.gov (United States)

    Michael E. Goerndt; Vicente J. Monleon; Hailemariam. Temesgen

    2011-01-01

    One of the challenges often faced in forestry is the estimation of forest attributes for smaller areas of interest within a larger population. Small-area estimation (SAE) is a set of techniques well suited to estimation of forest attributes for small areas in which the existing sample size is small and auxiliary information is available. Selected SAE methods were...

  14. Teachers' Pedagogical Management and Instrumental Performance in Students of an Artistic Higher Education School

    Science.gov (United States)

    De La Cruz Bautista, Edwin

    2017-01-01

    This research aims to know the relationship between the variables teachers' pedagogical management and instrumental performance in students from an Artistic Higher Education School. It is a descriptive and correlational research that seeks to find the relationship between both variables. The sample of the study consisted of 30 students of the…

  15. Relationship of motivation for motherhood with some sociodemographic variables and gender identity

    Directory of Open Access Journals (Sweden)

    Vuletić Georgije M.

    2016-01-01

    The main goal of the research was to explore the relationship between motivation for motherhood and some sociodemographic variables identified as significant in similar research by other authors, as well as its relation to gender roles and gender identity according to the model proposed by Sandra Bem. The study was conducted on a sample of 571 female students in Belgrade. Statistically significant correlations were confirmed between motivation for motherhood and the number of siblings, the age of the subject's mother, and the age of the subject's mother at first birth. The highest correlation was found between motivation for motherhood and femininity. A preliminary questionnaire is also proposed as the first step in constructing an adequate instrument for measuring motivation for motherhood; this questionnaire was used to estimate motivation for motherhood in this research.

  16. A new mathematical approach for the estimation of the AUC and its variability under different experimental designs in preclinical studies.

    Science.gov (United States)

    Navarro-Fontestad, Carmen; González-Álvarez, Isabel; Fernández-Teruel, Carlos; Bermejo, Marival; Casabó, Vicente Germán

    2012-01-01

    The aim of the present work was to develop a new mathematical method for estimating the area under the curve (AUC) and its variability that could be applied in different preclinical experimental designs and implemented in standard calculation worksheets. In order to assess the usefulness of the new approach, different experimental scenarios were studied and the results were compared with those obtained with commonly used software: WinNonlin® and Phoenix WinNonlin®. The results do not show statistical differences between the AUC values obtained by the two procedures, but the new method appears to be a better estimator of the AUC standard error, measured as the coverage of the 95% confidence interval. In this way, the new method proves to be as useful as the WinNonlin® software where the latter is applicable. Copyright © 2011 John Wiley & Sons, Ltd.
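
    The abstract does not give the estimator itself; as a point of reference, here is a sketch of the standard linear-trapezoidal AUC, the quantity being estimated, with a naive between-subject standard error for a complete serial-sampling design on synthetic data:

    ```python
    # Sketch: linear-trapezoidal AUC per subject and a between-subject standard
    # error in a serial (complete-data) design. The paper's own variance
    # estimator for sparse designs is not reproduced; the data are synthetic.
    import numpy as np

    t = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0])            # h
    conc = np.array([                                        # ng/mL, one row/animal
        [0.0, 12.1, 18.3, 14.2, 7.9, 2.1],
        [0.0, 10.4, 16.9, 13.1, 6.8, 1.8],
        [0.0, 13.5, 20.2, 15.8, 8.6, 2.5],
    ])

    auc = np.trapz(conc, t, axis=1)                          # AUC 0-8 h per animal
    se = auc.std(ddof=1) / np.sqrt(len(auc))
    print(f"mean AUC = {auc.mean():.1f} ng*h/mL, SE = {se:.2f}")
    ```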

  17. Thermal architecture of the SPICA/SAFARI instrument

    Science.gov (United States)

    Charles, Ivan; Duband, Lionel; Duval, Jean-Marc; Jackson, Brian; Jellema, Willem; Kooijman, Peter Paul; Luchier, Nicolas; Tirolien, Thierry; van Weers, Henk

    2012-09-01

    The SAFARI instrument is a far-infrared imaging spectrometer that is a core instrument of the SPICA mission. Thanks to the large (3 meter) SPICA cold telescope, ultra-sensitive detectors and a powerful Fourier Transform Spectrometer, this instrument will give access to the faintest light ever observed in the 34-210 μm band with high spectral resolution. To achieve this goal, TES detectors, which need to be cooled to a temperature as low as 50 mK, have been chosen. The thermal architecture of the SAFARI focal plane unit (FPU), which fulfils the TES detector thermal requirements, is presented. In particular, an original 50 mK cooler concept based on a sorption cooler in series with an adiabatic demagnetization refrigerator will be used. The thermal design of the detector focal plane array (FPA), which uses three temperature stages to limit the loads on the lowest temperature stage, will also be described. The current SAFARI thermal budget estimates are presented and discussed with regard to the limited SPICA allocations. Finally, a preliminary thermal sensitivity analysis dealing with thermal stability requirements is presented.

  18. WFIRST: Data/Instrument Simulation Support at IPAC

    Science.gov (United States)

    Laine, Seppo; Akeson, Rachel; Armus, Lee; Bennett, Lee; Colbert, James; Helou, George; Kirkpatrick, J. Davy; Meshkat, Tiffany; Paladini, Roberta; Ramirez, Solange; Wang, Yun; Xie, Joan; Yan, Lin

    2018-01-01

    As part of WFIRST Science Center preparations, the IPAC Science Operations Center (ISOC) maintains a repository of 1) WFIRST data and instrument simulations, 2) tools to facilitate scientific performance and feasibility studies with WFIRST, and 3) parameters summarizing the current design and predicted performance of the WFIRST telescope and instruments. The simulation repository provides the science community with access to simulation code, tools, and resulting analyses. Examples of simulation code with ISOC-built web-based interfaces include EXOSIMS (for estimating exoplanet yields in CGI surveys) and the Galaxy Survey Exposure Time Calculator. In the future the repository will provide an interface for users to run custom simulations of a wide range of coronagraph instrument (CGI) observations, as well as sophisticated tools for designing microlensing experiments. We encourage those who are generating simulations or writing tools for exoplanet observations with WFIRST to contact the ISOC team so we can work with you to bring these to the attention of the broader astronomical community as we prepare for the exciting science that will be enabled by WFIRST.

  19. A comparison on parameter-estimation methods in multiple regression analysis with existence of multicollinearity among independent variables

    Directory of Open Access Journals (Sweden)

    Hukharnsusatrue, A.

    2005-11-01

    The objective of this research is to compare methods for estimating multiple regression coefficients when multicollinearity exists among the independent variables. The estimation methods are the Ordinary Least Squares method (OLS), the Restricted Least Squares method (RLS), the Restricted Ridge Regression method (RRR) and the Restricted Liu method (RL), considered both when the restrictions are true and when they are not. The study used the Monte Carlo simulation method, with the experiment repeated 1,000 times under each situation. The results are as follows. CASE 1: The restrictions are true. In all cases, the RRR and RL methods have a smaller Average Mean Square Error (AMSE) than the OLS and RLS methods, respectively. The RRR method provides the smallest AMSE when the level of correlation is high, and also for all levels of correlation and all sample sizes when the standard deviation equals 5. The RL method provides the smallest AMSE when the level of correlation is low or middle, except that for a standard deviation of 3 and small sample sizes the RRR method provides the smallest AMSE. The AMSE varies directly with (in order of decreasing influence) the level of correlation, the standard deviation and the number of independent variables, and inversely with the sample size. CASE 2: The restrictions are not true. In all cases, the RRR method provides the smallest AMSE, except that for a standard deviation of 1 and a restriction error of 5%, the OLS method provides the smallest AMSE when the level of correlation is low or medium and the sample size is large, while for small sample sizes the RL method provides the smallest AMSE. In addition, when the restriction error increases, the OLS method provides the smallest AMSE for all levels of correlation and all sample sizes, except when the level of correlation is high and the sample size is small. Moreover, in the cases where the OLS method provides the smallest AMSE, the RLS method mostly has a smaller AMSE than
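
    A sketch of the Monte Carlo logic of such comparisons, here for plain OLS versus ridge regression under collinear predictors (the restricted RLS/RRR/RL estimators of the paper are not standard library estimators and are not reproduced):

    ```python
    # Sketch: Monte Carlo comparison of coefficient estimators under
    # multicollinearity. AMSE = average mean squared error of the coefficient
    # estimates over replications. OLS vs. ridge stands in for the paper's
    # OLS/RLS/RRR/RL comparison.
    import numpy as np

    rng = np.random.default_rng(4)
    beta = np.array([1.0, 2.0, -1.5])
    n, reps, rho, lam = 30, 1000, 0.95, 1.0
    cov = rho + (1 - rho) * np.eye(3)                # highly correlated X columns

    mse_ols, mse_ridge = [], []
    for _ in range(reps):
        X = rng.multivariate_normal(np.zeros(3), cov, size=n)
        y = X @ beta + rng.normal(0, 3, n)
        b_ols = np.linalg.lstsq(X, y, rcond=None)[0]
        b_ridge = np.linalg.solve(X.T @ X + lam * np.eye(3), X.T @ y)
        mse_ols.append(np.sum((b_ols - beta) ** 2))
        mse_ridge.append(np.sum((b_ridge - beta) ** 2))

    print(f"AMSE OLS   = {np.mean(mse_ols):.3f}")
    print(f"AMSE ridge = {np.mean(mse_ridge):.3f}")
    ```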

  20. Update of Earthquake Strong-Motion Instrumentation at Lawrence Livermore National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Murray, Robert C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-09-01

    Following the January 1980 earthquake that was felt at Lawrence Livermore National Laboratory (LLNL), a network of strong-motion accelerographs was installed at LLNL. Prior to the 1980 earthquake, no accelerographs were installed. The ground motion from the 1980 earthquake was estimated from USGS instruments around the Laboratory to be between 0.2 and 0.3 g horizontal peak ground acceleration. These instruments were located at the Veterans Hospital, 5 miles southwest of LLNL, and in San Ramon, about 12 miles west of LLNL. In 2011, the Department of Energy (DOE) requested the status of our seismic instruments. We conducted a survey of our instrumentation systems and responded to DOE in a letter. During this survey, it was found that the recorders in Buildings 111 and 332 were not operational. The instruments on Nova had been removed, and only three of the 10 NIF instruments installed in 2005 were operational (two were damaged and five had been removed from operation at the request of the program). After the survey, it was clear that the site seismic instrumentation had degraded substantially and would benefit from an overhaul and more attention to ongoing maintenance. LLNL management decided to update the LLNL seismic instrumentation system. The updated system is documented in this report.

  1. Estimated effect of climatic variables on the transmission of Plasmodium vivax malaria in the Republic of Korea.

    Science.gov (United States)

    Kim, Young-Min; Park, Jae-Won; Cheong, Hae-Kwan

    2012-09-01

    Climate change may affect Plasmodium vivax malaria transmission in a wide region including both subtropical and temperate areas. We aimed to estimate the effects of climatic variables on the transmission of P. vivax in temperate regions. We estimated the effects of climatic factors on P. vivax malaria transmission using data on weekly numbers of malaria cases for the years 2001-2009 in the Republic of Korea. Generalized linear Poisson models and distributed lag nonlinear models (DLNM) were adopted to estimate the effects of temperature, relative humidity, temperature fluctuation, duration of sunshine, and rainfall on malaria transmission while adjusting for seasonal variation, between-year variation, and other climatic factors. A 1°C increase in temperature was associated with a 17.7% [95% confidence interval (CI): 16.9, 18.6%] increase in malaria incidence after a 3-week lag, a 10% rise in relative humidity was associated with a 40.7% (95% CI: -44.3, -36.9%) decrease in malaria after a 7-week lag, a 1°C increase in the diurnal temperature range was associated with a 24.1% (95% CI: -26.7, -21.4%) decrease in malaria after a 7-week lag, and a 10-hr increase in sunshine per week was associated with a 5.1% (95% CI: -8.4, -1.7%) decrease in malaria after a 2-week lag. The cumulative relative risk for a 10-mm increase in rainfall (≤ 350 mm) on P. vivax malaria was 3.61 (95% CI: 1.69, 7.72) based on a DLNM with a 10-week maximum lag. Our findings suggest that malaria transmission in temperate areas is highly dependent on climatic factors. In addition, the lagged estimates of the effect of rainfall on malaria are consistent with the time necessary for mosquito development and P. vivax incubation.
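
    A sketch of the kind of lagged Poisson regression underlying such estimates, reduced to a single fixed lag rather than the full distributed-lag nonlinear model, on simulated weekly counts (statsmodels provides the GLM):

    ```python
    # Sketch: Poisson regression of weekly case counts on a lagged temperature
    # term (one fixed lag, not the study's full DLNM). Data are simulated so
    # that exp(0.163) - 1 ~ +17.7% per degC, echoing the reported effect size.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(5)
    weeks = 470
    temp = 12 + 10 * np.sin(2 * np.pi * np.arange(weeks) / 52) + rng.normal(0, 2, weeks)

    lag = 3
    temp_lag = np.roll(temp, lag)[lag:]               # temperature 3 weeks earlier
    season = np.sin(2 * np.pi * np.arange(weeks) / 52)[lag:]
    log_mu = -1.0 + 0.163 * temp_lag + 0.3 * season
    cases = rng.poisson(np.exp(log_mu))

    X = sm.add_constant(np.column_stack([temp_lag, season]))
    fit = sm.GLM(cases, X, family=sm.families.Poisson()).fit()
    pct = 100 * (np.exp(fit.params[1]) - 1)
    print(f"estimated effect: {pct:.1f} % per degC at lag {lag} weeks")
    ```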

  2. Willingness-to-pay and policy-instrument choice for climate-change policy in the United States

    International Nuclear Information System (INIS)

    Kotchen, Matthew J.; Boyle, Kevin J.; Leiserowitz, Anthony A.

    2013-01-01

    This paper provides the first willingness-to-pay (WTP) estimates in support of a national climate-change policy that are comparable with the costs of actual legislative efforts in the U.S. Congress. Based on a survey of 2034 American adults, we find that households are, on average, willing to pay between $79 and $89 per year in support of reducing domestic greenhouse-gas (GHG) emissions 17% by 2020. Even very conservative estimates yield an average WTP at or above $60 per year. Taking advantage of randomized treatments within the survey valuation question, we find that mean WTP does not vary substantially among the policy instruments of a cap-and-trade program, a carbon tax, or a GHG regulation. But there are differences in the sociodemographic characteristics of those willing to pay across policy instruments. Greater education always increases WTP. Older individuals have a lower WTP for a carbon tax and a GHG regulation, while greater household income increases WTP for these same two policy instruments. Republicans, along with those indicating no political party affiliation, have a significantly lower WTP regardless of the policy instrument. But many of these differences are no longer evident after controlling for respondent opinions about whether global warming is actually happening.

  3. Incorporation of personal computers in a research reactor instrumentation system for data monitoring and analysis

    International Nuclear Information System (INIS)

    Leopando, L.S.

    1998-01-01

    The research contract was implemented by obtaining off-the-shelf personal computer hardware and data acquisition cards, designing the interconnection with the instrumentation system, writing and debugging the software, and assembling and testing the set-up. The hardware was designed to allow all variables monitored by the instrumentation system to be accessible to the computers, without requiring any major modification of the instrumentation system and without compromising reactor safety in any way. The computer hardware addition was also designed to have no effect on any existing function of the instrumentation system. The software was designed to implement only graphical display and automated logging of reactor variables. Additional functionality could easily be added in the future through software revision, because all the reactor variables are already available in the computer. It would even be possible to "close the loop" and control the reactor through software. It was found that most of the effort in an undertaking of this sort goes into software development, but the job can be done even by non-computer-specialized reactor staff working with programming languages they are already familiar with. It was also found that the continuing rapid advance of personal computer technology makes it essential that such a project be undertaken with the inevitability of future hardware upgrades in mind. The hardware techniques and the software developed may find applicability in other research reactors, especially those with a generic analog TRIGA research reactor console. (author)

  4. Identification of variables for site calibration and power curve assessment in complex terrain. Task 8, a literature survey on theory and practice of parameter identification, specification and estimation (ISE) techniques

    Energy Technology Data Exchange (ETDEWEB)

    Verhoef, J.P.; Leendertse, G.P. [ECN Wind, Petten (Netherlands)

    2001-04-01

    This document presents the results of a literature survey on Identification, Specification and Estimation (ISE) techniques for variables within the SiteParIden project. Besides an overview of the different general techniques, an overview is also given of EU-funded wind energy projects in which some of these techniques have been applied more specifically. The main problem in applications like power performance assessment and site calibration is to establish an appropriate model for predicting the dependent variable under consideration from measured independent (explanatory) variables. In these applications, detailed knowledge of which variables are relevant and of how precisely they should appear in the model is typically missing. Therefore, the identification (of variables) and the specification (of the model relation) are important steps in the model-building phase. For the determination of the parameters in the model, a reliable estimation technique is required. In EU-funded wind energy projects, the linear regression technique is the most commonly applied estimation tool. The linear regression technique may fail to find reliable parameter estimates when the model variables are strongly correlated, either due to the experimental set-up or because of their particular appearance in the model. This situation of multicollinearity sometimes results in unrealistic parameter values, e.g. with the wrong algebraic sign. It is concluded that different approaches, like multi-binning, can provide a better way of identifying the relevant variables. However, further research in these applications is needed, and it is recommended that alternative methods (neural networks, singular value decomposition etc.) also be tested for their usefulness in a succeeding project. Increased interest in complex terrain, as feasible locations for wind farms, has also emphasised the need for adequate models. A common standard procedure to prescribe the statistical
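
    A sketch of two standard diagnostics for the multicollinearity problem raised here, the condition number and variance inflation factors (VIF), on simulated site-calibration variables:

    ```python
    # Sketch: flagging multicollinearity before a regression-based site
    # calibration, via the condition number and variance inflation factors.
    # The meteorological variables below are simulated placeholders.
    import numpy as np

    rng = np.random.default_rng(6)
    n = 500
    wind_speed = rng.weibull(2.0, n) * 8
    shear = 0.2 + 0.02 * wind_speed + rng.normal(0, 0.01, n)  # correlated w/ speed
    turbulence = rng.normal(0.12, 0.03, n)
    X = np.column_stack([wind_speed, shear, turbulence])

    # Condition number of the standardized design matrix.
    Z = (X - X.mean(0)) / X.std(0)
    print("condition number:", np.linalg.cond(Z))

    # VIF_j = 1 / (1 - R^2_j), regressing column j on the other columns.
    for j, name in enumerate(["wind_speed", "shear", "turbulence"]):
        others = np.delete(Z, j, axis=1)
        coef, *_ = np.linalg.lstsq(others, Z[:, j], rcond=None)
        resid = Z[:, j] - others @ coef
        r2 = 1 - resid.var() / Z[:, j].var()
        print(f"VIF({name}) = {1 / (1 - r2):.1f}")
    ```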

  5. Validation of the Organizational Culture Assessment Instrument

    Science.gov (United States)

    Heritage, Brody; Pollock, Clare; Roberts, Lynne

    2014-01-01

    Organizational culture is a commonly studied area in industrial/organizational psychology due to its important role in workplace behaviour, cognitions, and outcomes. Jung et al.'s [1] review of the psychometric properties of organizational culture measurement instruments noted many instruments have limited validation data despite frequent use in both theoretical and applied situations. The Organizational Culture Assessment Instrument (OCAI) has had conflicting data regarding its psychometric properties, particularly regarding its factor structure. Our study examined the factor structure and criterion validity of the OCAI using robust analysis methods on data gathered from 328 (females = 226, males = 102) Australian employees. Confirmatory factor analysis supported a four factor structure of the OCAI for both ideal and current organizational culture perspectives. Current organizational culture data demonstrated expected reciprocally-opposed relationships between three of the four OCAI factors and the outcome variable of job satisfaction but ideal culture data did not, thus indicating possible weak criterion validity when the OCAI is used to assess ideal culture. Based on the mixed evidence regarding the measure's properties, further examination of the factor structure and broad validity of the measure is encouraged. PMID:24667839

  6. Validation of the organizational culture assessment instrument.

    Directory of Open Access Journals (Sweden)

    Brody Heritage

    Organizational culture is a commonly studied area in industrial/organizational psychology due to its important role in workplace behaviour, cognitions, and outcomes. Jung et al.'s [1] review of the psychometric properties of organizational culture measurement instruments noted many instruments have limited validation data despite frequent use in both theoretical and applied situations. The Organizational Culture Assessment Instrument (OCAI) has had conflicting data regarding its psychometric properties, particularly regarding its factor structure. Our study examined the factor structure and criterion validity of the OCAI using robust analysis methods on data gathered from 328 (females = 226, males = 102) Australian employees. Confirmatory factor analysis supported a four factor structure of the OCAI for both ideal and current organizational culture perspectives. Current organizational culture data demonstrated expected reciprocally-opposed relationships between three of the four OCAI factors and the outcome variable of job satisfaction but ideal culture data did not, thus indicating possible weak criterion validity when the OCAI is used to assess ideal culture. Based on the mixed evidence regarding the measure's properties, further examination of the factor structure and broad validity of the measure is encouraged.

  7. Estimating Above-Ground Biomass in Sub-Tropical Buffer Zone Community Forests, Nepal, Using Sentinel 2 Data

    Directory of Open Access Journals (Sweden)

    Santa Pandit

    2018-04-01

    Accurate assessment of above-ground biomass (AGB) is important for the sustainable management of forests, especially buffer zone (areas within the protected area where restrictions are placed upon resource use and special measures are undertaken to intensify the conservation value of the protected area) areas with a high dependence on forest products. This study presents a new AGB estimation method and demonstrates the potential of medium-resolution Sentinel-2 Multi-Spectral Instrument (MSI) data as an alternative to hyperspectral data in inaccessible regions. Sentinel-2 performance was evaluated for a buffer zone community forest in Parsa National Park, Nepal, using field-based AGB as the dependent variable and spectral band values and spectral-derived vegetation indices as independent variables in the Random Forest (RF) algorithm. Ten-fold cross-validation was used to evaluate model effectiveness. The effect of the number of input variables on AGB prediction was also investigated. The model using all extracted spectral information plus all derived spectral vegetation indices provided the best AGB estimates (R2 = 0.81 and RMSE = 25.57 t ha−1). Incorporating the optimal subset of key variables did not improve model variance but reduced the error slightly. This result is explained by the technically advanced nature of Sentinel-2, which combines fine spatial resolution (10, 20 m) and strategically positioned bands (red-edge), applied in flat topography with an advanced machine learning algorithm. However, assessing its transferability to other forest types at varying altitudes would enable future assessments of Sentinel-2 performance and interpretability.
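
    A sketch of the described pipeline, Random Forest regression with 10-fold cross-validation, using random placeholder band values rather than real Sentinel-2 reflectances:

    ```python
    # Sketch: Random-Forest AGB regression with 10-fold cross-validation, in the
    # spirit of the described pipeline. The band/index features are random
    # placeholders, not real Sentinel-2 reflectances.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import KFold, cross_val_predict

    rng = np.random.default_rng(7)
    n_plots = 120
    bands = rng.uniform(0.02, 0.4, size=(n_plots, 10))       # 10 spectral bands
    ndvi = (bands[:, 7] - bands[:, 3]) / (bands[:, 7] + bands[:, 3])
    X = np.column_stack([bands, ndvi])
    agb = 20 + 250 * ndvi + rng.normal(0, 20, n_plots)       # t/ha, synthetic

    rf = RandomForestRegressor(n_estimators=500, random_state=0)
    pred = cross_val_predict(rf, X, agb, cv=KFold(10, shuffle=True, random_state=0))

    ss_res = np.sum((agb - pred) ** 2)
    ss_tot = np.sum((agb - agb.mean()) ** 2)
    rmse = np.sqrt(ss_res / n_plots)
    print(f"CV R^2 = {1 - ss_res / ss_tot:.2f}, RMSE = {rmse:.1f} t/ha")
    ```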

  8. Instruments evaluating the quality of the clinical learning environment in nursing education: A systematic review of psychometric properties.

    Science.gov (United States)

    Mansutti, Irene; Saiani, Luisa; Grassetti, Luca; Palese, Alvisa

    2017-03-01

    The clinical learning environment is fundamental to nursing education paths, capable of affecting learning processes and outcomes. Several instruments have been developed in nursing education, aimed at evaluating the quality of the clinical learning environments; however, no systematic review of the psychometric properties and methodological quality of these studies has been performed to date. The aims of the study were: 1) to identify validated instruments evaluating the clinical learning environments in nursing education; 2) to evaluate critically the methodological quality of the psychometric property estimation used; and 3) to compare psychometric properties across the instruments available. A systematic review of the literature (using the Preferred Reporting Items for Systematic Reviews and Meta-Analysis guidelines) and an evaluation of the methodological quality of psychometric properties (using the COnsensus-based Standards for the selection of health Measurement INstruments guidelines). The Medline and CINAHL databases were searched. Eligible studies were those that satisfied the following criteria: a) validation studies of instruments evaluating the quality of clinical learning environments; b) in nursing education; c) published in English or Italian; d) before April 2016. The included studies were evaluated for the methodological quality of the psychometric properties measured and then compared in terms of both the psychometric properties and the methodological quality of the processes used. The search strategy yielded a total of 26 studies and eight clinical learning environment evaluation instruments. A variety of psychometric properties have been estimated for each instrument, with differing qualities in the methodology used. Concept and construct validity were poorly assessed in terms of their significance and rarely judged by the target population (nursing students). Some properties were rarely considered (e.g., reliability, measurement error

  9. Policy instruments to decrease the climate impact of housing, personal transport and food. Detailed instrument descriptions; Ohjauskeinoja asumisen, henkiloeliikenteen ja ruoan ilmastovaikutusten hillintaeaen. Yksityiskohtaiset ohjauskeinokuvaukset

    Energy Technology Data Exchange (ETDEWEB)

    Heiskanen, E.; Perrels, A.; Nissinen, A.; Berghaell, E.; Liesimaa, V.; Mattinen, M. (eds.)

    2012-03-15

    Reducing consumption volumes or introducing climate-conscious consumption patterns can be efficient ways to mitigate climate change. Twenty existing policy instruments affecting the greenhouse gas emissions of housing, passenger traffic and food are described in this report of the KUILU project. The policy instruments and the possibilities for developing effective instrument packages were discussed in two expert workshops, the results of which are presented in the annexes of this report. There are already several policy instruments that target housing and passenger traffic. The differences in their estimated emission reductions are large, which can ease the prioritization and selection of instruments for further development. So far, only one policy instrument exists that aims to reduce the climate impacts of food choices, namely a Council of State Decision of Principle on Promoting Sustainability in Public Purchasing. However, it includes several measures that can be used to influence private companies and citizens, and it thus opens the field of policy instruments for mitigating the climate impacts of food. According to the expert survey, the most effective policy instrument in the analysis was building regulations, followed by a graduated car procurement tax based on emissions, graduation of the car tax, taxation of transport fuels, energy taxes on housing, and the effect of the ecodesign directive on appliances. The effectiveness of food-related policy instruments was assessed to be average. The five least effective instruments were the EU energy label, voluntary energy experts, the tax on beverage packaging, energy certificates, and subsidies for energy-efficiency renovation in buildings. However, expert opinions on the effectiveness of the policy instruments varied significantly. After this report, the KUILU project continued with an analysis of the policy instrument packages and with suggestions

  10. Policy instruments to decrease the climate impact of housing, personal transport and food. Detailed instrument descriptions; Ohjauskeinoja asumisen, henkiloeliikenteen ja ruoan ilmastovaikutusten hillintaeaen. Yksityiskohtaiset ohjauskeinokuvaukset

    Energy Technology Data Exchange (ETDEWEB)

    Heiskanen, E.; Perrels, A.; Nissinen, A.; Berghaell, E.; Liesmaa, V.; Mattinen, M. (eds.)

    2012-07-01

    Reducing consumption volumes or introducing climate-conscious consumption patterns can be efficient ways to mitigate climate change. Twenty existing policy instruments affecting the greenhouse gas emissions of housing, passenger traffic and food are described in this report of the KUILU project. The policy instruments and the possibilities for developing effective instrument packages were discussed in two expert workshops, the results of which are presented in the annexes of this report. There are already several policy instruments that target housing and passenger traffic. The differences in their estimated emission reductions are large, which can ease the prioritization and selection of instruments for further development. So far, only one policy instrument exists that aims to reduce the climate impacts of food choices, namely a Council of State Decision of Principle on Promoting Sustainability in Public Purchasing. However, it includes several measures that can be used to influence private companies and citizens, and it thus opens the field of policy instruments for mitigating the climate impacts of food. According to the expert survey, the most effective policy instrument in the analysis was building regulations, followed by a graduated car procurement tax based on emissions, graduation of the car tax, taxation of transport fuels, energy taxes on housing, and the effect of the ecodesign directive on appliances. The effectiveness of food-related policy instruments was assessed to be average. The five least effective instruments were the EU energy label, voluntary energy experts, the tax on beverage packaging, energy certificates, and subsidies for energy-efficiency renovation in buildings. However, expert opinions on the effectiveness of the policy instruments varied significantly. After this report, the KUILU project continued with an analysis of the policy instrument packages and with suggestions

  11. Cost-effective design of economic instruments in nutrition policy

    DEFF Research Database (Denmark)

    Jensen, Jørgen Dejgård; Smed, Sinne

    2007-01-01

    This paper addresses the potential for using economic regulation, e.g. taxes or subsidies, as instruments to combat the increasing problems of inappropriate diets, which lead to health problems such as obesity, type 2 diabetes, cardiovascular diseases etc. in most countries. Such policy measures may.... The analyses demonstrate that the average cost-effectiveness with regard to changing the intake of selected nutritional variables can be improved by 10-30 per cent if taxes/subsidies are targeted against these nutrients, compared with targeting selected food categories. Finally, the paper raises a range... of issues, which need to be investigated further before firm conclusions about the suitability of economic instruments in nutrition policy can be drawn....

  12. Uncertainty in CH4 and N2O emission estimates from a managed fen meadow using EC measurements

    International Nuclear Information System (INIS)

    Kroon, P.S.; Hensen, A.; Van 't Veen, W.H.; Vermeulen, A.T.; Jonker, H.

    2009-02-01

    The overall uncertainty in annual flux estimates derived from chamber measurements may be as high as 50% due to the temporal and spatial variability in the fluxes. As even a large number of chamber plots still covers typically less than 1% of the total field area, the field-scale integrated emission necessarily remains a matter of speculation. High-frequency micrometeorological methods are a good option for obtaining integrated estimates on a hectare scale with continuous coverage in time. Instrumentation is now becoming available that meets the requirements for CH4 and N2O eddy covariance (EC) measurements. A system consisting of a quantum cascade laser (QCL) spectrometer and a sonic anemometer has recently been proven suitable for performing EC measurements. This study analyses the EC flux measurements of CH4 and N2O and their corrections, such as calibration, the Webb correction, and corrections for high- and low-frequency losses, and assesses the magnitude of the uncertainties associated with the precision of the measurement instruments, the measurement set-up and the methodology. The uncertainty of a single EC flux measurement and of daily, monthly and 3-monthly average EC fluxes is estimated. In addition, the cumulative emissions of C-CH4 and N-N2O and their uncertainties are determined over several fertilizing events at a dairy farm site in the Netherlands. These fertilizing events are selected from the continuous EC flux measurements from August 2006 to September 2008. The EC flux uncertainties are compared with the overall uncertainty in annual flux estimates derived from chamber measurements. It will be shown that EC flux measurements can decrease the overall uncertainty in annual flux estimates.

  13. Uncertainty in CH4 and N2O emission estimates from a managed fen meadow using EC measurements

    Energy Technology Data Exchange (ETDEWEB)

    Kroon, P.S.; Hensen, A.; Van ' t Veen, W.H.; Vermeulen, A.T. [ECN Biomass, Coal and Environment, Petten (Netherlands); Jonker, H. [Delft University of Technology, Delft (Netherlands)

    2009-02-15

    The overall uncertainty in annual flux estimates derived from chamber measurements may be as high as 50% due to the temporal and spatial variability in the fluxes. As even a large number of chamber plots still covers typically less than 1% of the total field area, the field-scale integrated emission necessarily remains a matter of speculation. High-frequency micrometeorological methods are a good option for obtaining integrated estimates on a hectare scale with continuous coverage in time. Instrumentation is now becoming available that meets the requirements for CH4 and N2O eddy covariance (EC) measurements. A system consisting of a quantum cascade laser (QCL) spectrometer and a sonic anemometer has recently been proven suitable for performing EC measurements. This study analyses the EC flux measurements of CH4 and N2O and their corrections, such as calibration, the Webb correction, and corrections for high- and low-frequency losses, and assesses the magnitude of the uncertainties associated with the precision of the measurement instruments, the measurement set-up and the methodology. The uncertainty of a single EC flux measurement and of daily, monthly and 3-monthly average EC fluxes is estimated. In addition, the cumulative emissions of C-CH4 and N-N2O and their uncertainties are determined over several fertilizing events at a dairy farm site in the Netherlands. These fertilizing events are selected from the continuous EC flux measurements from August 2006 to September 2008. The EC flux uncertainties are compared with the overall uncertainty in annual flux estimates derived from chamber measurements. It will be shown that EC flux measurements can decrease the overall uncertainty in annual flux estimates.
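
    A sketch of the aggregation step common to both approaches: combining per-interval fluxes and their random uncertainties into a cumulative emission, assuming independent errors (the systematic and correlated components treated in the study are ignored here):

    ```python
    # Sketch: aggregating half-hourly EC fluxes and their random uncertainties
    # into a cumulative emission with an error bar, assuming independent errors.
    # All numbers are synthetic placeholders.
    import numpy as np

    rng = np.random.default_rng(8)
    n = 48 * 90                                     # half-hours in ~3 months
    flux = rng.normal(20, 8, n)                     # ng N2O-N m^-2 s^-1
    sigma = np.full(n, 6.0)                         # 1-sigma per measurement

    dt = 1800.0                                     # seconds per half-hour
    cumulative = np.sum(flux) * dt * 1e-9           # g N m^-2 over the period
    # Independent random errors: variances add.
    cum_sigma = np.sqrt(np.sum((sigma * dt * 1e-9) ** 2))

    print(f"cumulative emission = {cumulative:.3f} +/- {cum_sigma:.3f} g N m^-2")
    ```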

  14. Estimating pole/zero errors in GSN-IRIS/USGS network calibration metadata

    Science.gov (United States)

    Ringler, A.T.; Hutt, C.R.; Aster, R.; Bolton, H.; Gee, L.S.; Storm, T.

    2012-01-01

    Mapping the digital record of a seismograph into true ground motion requires the correction of the data by some description of the instrument's response. For the Global Seismographic Network (Butler et al., 2004), as well as many other networks, this instrument response is represented as a Laplace domain pole–zero model and published in the Standard for the Exchange of Earthquake Data (SEED) format. This Laplace representation assumes that the seismometer behaves as a linear system, with any abrupt changes described adequately via multiple time-invariant epochs. The SEED format allows for published instrument response errors as well, but these typically have not been estimated or provided to users. We present an iterative three-step method to estimate the instrument response parameters (poles and zeros) and their associated errors using random calibration signals. First, we solve a coarse nonlinear inverse problem using a least-squares grid search to yield a first approximation to the solution. This approach reduces the likelihood of poorly estimated parameters (a local-minimum solution) caused by noise in the calibration records and enhances algorithm convergence. Second, we iteratively solve a nonlinear parameter estimation problem to obtain the least-squares best-fit Laplace pole–zero–gain model. Third, by applying the central limit theorem, we estimate the errors in this pole–zero model by solving the inverse problem at each frequency in a two-thirds octave band centered at each best-fit pole–zero frequency. This procedure yields error estimates at the 99% confidence level. We demonstrate the method by applying it to a number of recent Incorporated Research Institutions for Seismology/United States Geological Survey (IRIS/USGS) network calibrations (network code IU).
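
    A sketch of the second step, a nonlinear least-squares fit of a pole–zero–gain model to a measured frequency response (the coarse grid-search initialization and the per-band error estimation are omitted; the data are synthetic):

    ```python
    # Sketch: least-squares fit of a simple pole-zero-gain Laplace response
    # (one conjugate pole pair, two zeros at the origin) to a "measured"
    # transfer function. Synthetic data; scipy does the optimization.
    import numpy as np
    from scipy.optimize import least_squares

    f = np.logspace(-2, 1, 200)                    # Hz
    s = 2j * np.pi * f

    def response(params, s):
        gain, re, im = params                      # conjugate pole pair
        p1, p2 = re + 1j * im, re - 1j * im
        return gain * s**2 / ((s - p1) * (s - p2))

    true = np.array([1.0, -0.037, 0.037])          # ~120 s seismometer-like pair
    noise = 1 + 0.01 * np.random.default_rng(9).normal(size=f.size)
    data = response(true, s) * noise

    def resid(params):
        r = response(params, s) - data
        return np.concatenate([r.real, r.imag])    # least_squares needs reals

    fit = least_squares(resid, x0=[0.8, -0.05, 0.05])
    print("fitted gain and pole (re, im):", fit.x)
    ```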

  15. Instrumentation

    International Nuclear Information System (INIS)

    Umminger, K.

    2008-01-01

    A proper measurement of the relevant single- and two-phase flow parameters is the basis for understanding many complex thermal-hydraulic processes. Reliable instrumentation is therefore necessary for the interaction between analysis and experiment, especially in the field of nuclear safety research, where postulated accident scenarios have to be simulated in experimental facilities and predicted by complex computer code systems. The so-called conventional instrumentation for the measurement of, e.g., pressures, temperatures, pressure differences and single-phase flow velocities is still a solid basis for the investigation and interpretation of many phenomena and especially for the understanding of the overall system behavior. Measurement data from such instrumentation still serve in many cases as a database for thermal-hydraulic system codes. However, some special instrumentation, such as on-line concentration measurement of boric acid in the water phase or of non-condensables in a steam atmosphere, as well as flow visualization techniques, was further developed and successfully applied during recent years. Concerning the modeling needs of advanced thermal-hydraulic codes, significant advances have been accomplished in the last few years in local instrumentation technology for two-phase flow through the application of new sensor techniques, optical or beam methods and electronic technology. This paper will give insight into the current state of instrumentation technology for safety-related thermal-hydraulic experiments. Advantages and limitations of some measurement processes and systems will be indicated, as well as trends and possibilities for further development. Aspects of instrumentation in operating reactors will also be mentioned.

  16. Kinetic characteristics of the gait of a musician carrying or not his instrument

    Directory of Open Access Journals (Sweden)

    Carlos Bolli Mota

    2009-01-01

    The integrity of the locomotor system can be compromised by the transport of certain objects, especially when done in an inadequate manner. Due to their weight and size, the transport of musical instruments can contribute to body dysfunctions in musicians who frequently have to carry their instruments, influencing balance and body posture. Thus, the ground reaction force was investigated during the gait of a musician carrying or not carrying his instrument. Two AMTI (Advanced Mechanical Technologies, Inc.) platforms were used for kinetic data acquisition. A total of 40 measurements were obtained for gait and balance: 20 without the instrument and 20 while carrying it. The t test showed significant differences between the two situations for all variables analyzed. The results suggest that the locomotor system undergoes alterations when carrying any kind of load, as was the case here, in which the subject carried 7.75% of his own body weight.

  17. Current Status of the Validation of the Atmospheric Chemistry Instruments on Envisat

    Science.gov (United States)

    Lecomte, P.; Koopman, R.; Zehner, C.; Laur, H.; Attema, E.; Wursteisen, P.; Snoeij, P.

    2003-04-01

    Envisat is ESA's advanced Earth observing satellite, launched in March 2002 and designed to provide measurements of the atmosphere, ocean, land and ice over a five-year period. After the launch and switch-on period, a six-month commissioning phase took place for instrument calibration and geophysical validation, concluding with the Envisat Calibration Review held in September 2002. In addition to ESA and its industrial partners in the Envisat consortium, many other companies and research institutes have contributed to the calibration and validation programme under ESA contract as expert support laboratories (ESLs). A major contribution has also been made by the Principal Investigators of approved proposals submitted to ESA in response to a worldwide "Announcement of Opportunity for the Exploitation of the Envisat Data Products" in 1998. Working teams were formed in which the different participants worked side by side to achieve the objectives of the calibration and validation programme. Validation is a comparison of Envisat Level 2 data products with estimates of the different geophysical variables obtained by independent means, the validation instruments. Validation is closely linked to calibration because inconsistencies discovered when comparing Envisat Level 2 data products to well-known external instruments can have many different sources, including inaccuracies in the Envisat instrument calibration and the data calibration algorithms. Therefore, initial validation of the geophysical variables provided feedback to calibration, de-bugging and algorithm improvement. The initial validation phase ended in December 2002 with the Envisat Validation Workshop, at which a final quality statement was given for a number of products. Full validation of all data products available from the Atmospheric Chemistry Instruments on Envisat (MIPAS, GOMOS and SCIAMACHY) is quite a challenge, and it has therefore been decided to adopt a step-wise approach.

  18. Survival Analysis of Factors Influencing Cyclic Fatigue of Nickel-Titanium Endodontic Instruments

    Directory of Open Access Journals (Sweden)

    Eva Fišerová

    2015-01-01

    Objective. The aim of this study was to validate a survival analysis assessing the effect of the type of rotary system, canal curvature, and instrument size on cyclic fatigue resistance. Materials and Methods. Cyclic fatigue testing was carried out in stainless steel artificial canals with radii of curvature of 3 or 5 mm and an angle of curvature of 60 degrees. All instruments were new and 25 mm in working length, with ISO colour coding indicating the instrument size (yellow for size 20; red for size 25). Wizard Navigator, Mtwo, ProTaper, and Revo-S instruments were passively rotated at 250 rotations per minute, and the time to fracture was recorded. Subsequently, fractographic analysis of the broken tips was performed by scanning electron microscopy. The data were then analysed by the Kaplan-Meier estimator of the survival function, the Cox proportional hazards model, the Wald test for regression covariates, and the Wald test for significance of the regression model. Conclusion. The lifespans registered for the tested instruments were Mtwo > Wizard Navigator > Revo-S > ProTaper; 5 mm radius > 3 mm radius; and yellow > red in the ISO colour coding system.
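
    A sketch of the same style of survival analysis on simulated times-to-fracture, using the lifelines package for the Kaplan-Meier estimator and the Cox proportional hazards model (the study's own software is not named in this record):

    ```python
    # Sketch: Kaplan-Meier curves and a Cox model for cycles-to-fracture data,
    # using the lifelines package. Times and groups are simulated placeholders.
    import numpy as np
    import pandas as pd
    from lifelines import KaplanMeierFitter, CoxPHFitter

    rng = np.random.default_rng(10)
    n = 40
    df = pd.DataFrame({
        "system": rng.choice([0, 1], n),              # e.g. 0 = ProTaper, 1 = Mtwo
        "radius_mm": rng.choice([3, 5], n),
    })
    scale = 300 + 250 * df["system"] + 80 * (df["radius_mm"] == 5)
    df["seconds"] = rng.exponential(scale)
    df["fractured"] = 1                               # all fractured, no censoring

    kmf = KaplanMeierFitter()
    kmf.fit(df["seconds"], event_observed=df["fractured"])
    print("median time to fracture:", kmf.median_survival_time_)

    cph = CoxPHFitter()
    cph.fit(df, duration_col="seconds", event_col="fractured")
    cph.print_summary()
    ```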

  19. Variability of HOMA and QUICKI insulin sensitivity indices.

    Science.gov (United States)

    Žarković, Miloš; Ćirić, Jasmina; Beleslin, Biljana; Stojković, Mirjana; Savić, Slavica; Stojanović, Miloš; Lalić, Tijana

    2017-07-01

    Assessment of insulin sensitivity based on a single measurement of insulin and glucose is both easy to understand and simple to perform. The tests most often used are HOMA and QUICKI. The aim of this study was to assess the biological variability of estimates of insulin sensitivity using the HOMA and QUICKI indices. After a 12-h fast, blood was sampled for insulin and glucose determination. Sampling lasted for 90 min with an intersample interval of 2 min. A total of 56 subjects were included in the study, and in nine subjects sampling was done before and after weight reduction, so the total number of analyzed series was 65. To compute the reference value of the insulin sensitivity index, averages of all 46 insulin and glucose samples were used. We also computed point estimates (single-value estimates) of the insulin sensitivity index based on different numbers of insulin/glucose samples (1-45 consecutive samples). To compute the variability of the point estimates, a bootstrapping procedure was used with 1000 resamples for each series and for each number of samples used to average insulin and glucose. Using a single insulin/glucose sample, HOMA variability was 26.18 ± 4.31% and QUICKI variability was 3.30 ± 0.54%. For 10 samples the variability was 11.99 ± 2.22% and 1.62 ± 0.31%, respectively. The biological variability of insulin sensitivity indices is significant, and it can be reduced by increasing the number of samples. Oscillations of insulin concentration in plasma are the major cause of variability of insulin sensitivity indices.
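
    A sketch of the study's resampling logic on simulated insulin/glucose series, using the conventional index formulas (HOMA-IR = glucose[mmol/L] x insulin[uU/mL] / 22.5; QUICKI = 1 / (log10 insulin[uU/mL] + log10 glucose[mg/dL])); the series themselves are invented:

    ```python
    # Sketch: bootstrap variability of HOMA-IR and QUICKI point estimates as a
    # function of how many insulin/glucose samples are averaged. Series are
    # simulated; the index formulas are the conventional ones.
    import numpy as np

    rng = np.random.default_rng(11)
    insulin = rng.normal(10, 2.5, 46)      # uU/mL; pulsatile -> large scatter
    glucose = rng.normal(5.0, 0.2, 46)     # mmol/L; comparatively stable

    def homa(ins, glu):
        return glu * ins / 22.5

    def quicki(ins, glu):
        return 1 / (np.log10(ins) + np.log10(glu * 18.0))  # 18 -> mg/dL

    for k in (1, 10, 46):
        h, q = [], []
        for _ in range(1000):                       # bootstrap resamples
            idx = rng.integers(0, 46, size=k)
            ins, glu = insulin[idx].mean(), glucose[idx].mean()
            h.append(homa(ins, glu))
            q.append(quicki(ins, glu))
        print(f"{k:2d} samples: HOMA CV = {100 * np.std(h) / np.mean(h):4.1f} %, "
              f"QUICKI CV = {100 * np.std(q) / np.mean(q):4.2f} %")
    ```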

  20. Estimating uncertainty in resolution tests

    CSIR Research Space (South Africa)

    Goncalves, DP

    2006-05-01

    frequencies yields a biased estimate, and we provide an improved estimator. An application illustrates how the derived results can be incorporated into a larger uncertainty analysis. © 2006 Society of Photo-Optical Instrumentation Engineers. [DOI: 10.1117/1.2202914] Subject terms: resolution testing; USAF 1951 test target; resolution uncertainty.